Roxanne Persaud (@commutiny) has shared an important paper on mistakes and failure: "How Complex Systems Fail" (link - PDF) by Dr. Richard Cook, director of the Cognitive Technologies Laboratory at the University of Chicago. The 1998 paper is brief - only four pages, in bullet form. In it, Dr. Cook summarizes his thoughts about complex human-developed systems; his examples include power generation and distribution, health care, and transportation.
Some of the headlines capture things I'm thinking about a lot these days: "Complex systems are intrinsically hazardous systems" - that is, failures will occur, and not all of them can be avoided.
"Catastrophe requires multiple failures" - catastrophe occurs when "small, systemic failures join to create opportunity for an accident." This means that near misses matter and cannot be ignored: the wrong combination of near misses adds up to disaster.
"Change introduces new forms of failure." Complex systems evolve, and what worked in the past may not work in the future because the context and the actors change.
It's a terrific paper. One headline, "Human practitioners are the adaptable element of complex systems," makes me realize how important it is for workers in complex environments (that means all knowledge workers: managers, salespeople, customer service reps, nurses, and so on) to be aware that they work in a complex environment. How many of those millions of workers don't even realize they are part of a system that breaks often - and that they can help repair it by staying cognizant of small mistakes and changes, and by sharing those lessons?