They call it the "near miss": a mistake, or set of mistakes, that occurs but doesn't lead to a total catastrophe (the near miss's close relative). Tinsley et al. argue that managers overlook near misses and thereby fail to correct the mistakes that lead, on occasion, to terrible, preventable catastrophes. The Deepwater Horizon spill is only the latest example.
The typical reaction of a manager to a near miss is (and I paraphrase), "Whew! That was close. Let's move on." That last statement is emblematic of almost every company I've worked with - the desire to press ahead even (or especially) if the result wasn't perfect. But in ignoring the lessons of the near miss, they set the stage for a subsequent, more terrible, occurrence.
That's because, as the authors point out, the difference between the near miss and the catastrophe is one of circumstance and luck, not of good planning or execution. A flawed process or sloppy execution, repeated under less favorable conditions, will eventually blow up.
Here's an important paragraph:
For the past seven years, we have studied near misses in dozens of companies across industries from telecommunications to automobiles, at NASA, and in lab simulations. Our research reveals a pattern: Multiple near misses preceded (and foreshadowed) every disaster and business crisis we studied, and most of the misses were ignored or misread. Our work also shows that cognitive biases conspire to blind managers to the near misses. Two in particular cloud our judgment. The first is “normalization of deviance,” the tendency over time to accept anomalies—particularly risky ones—as normal. Think of the growing comfort a worker might feel with using a ladder with a broken rung; the more times he climbs the dangerous ladder without incident, the safer he feels it is. For an organization, such normalization can be catastrophic. Columbia University sociologist Diane Vaughan coined the phrase in her book
This work reminds me of the research of Amy Edmondson, who studied nurses and found that psychologically safe working environments enabled the sharing of mistakes, which allowed them to be corrected, while unsafe environments covered up mistakes, with the result we can all well imagine.