I've been thinking about how to teach "mistake appreciation," for lack of a better term, to knowledge professionals. As such, I am very attuned to discussions of errors/mistakes/failures, including definitions and prescriptions. So I was very interested when @whatsthepont shared the Duke "Anatomy of an Error" online course (discussed earlier this week). Here's a graphic from my recent materials (adapted from Amy Edmondson's work):
Health care is a business that sits squarely in the complex domain - the center of the spectrum. And as you can see, failures there are unavoidable. How important, then, is it to teach the people involved how to understand, prevent, and learn from errors? Very important, of course. So the Duke course is admirable.
However, my concern is that the course seems to hold the view that if workers acted carefully enough, and process designers were thorough enough, errors in health care could be eliminated completely. Here's an example of what I'm talking about from the course:
Because humans are fallible, we must rely upon systems and back-ups to prevent or detect errors before they can cause harm. Unfortunately, our systems are not always designed well to achieve this. System and design factors that can lead to bad outcomes include:
- Too many steps
- Too many people (communication issues)
- Too heavy or too light (performance is best when workload is moderate)
- Too much reliance on human vigilance/monitoring
- Focus on functionality, while ignoring the real-life user
This presumes that addressing all the bulleted items will avoid all possible bad outcomes. That thinking is just wrong. Inherent in a complex process are changing interactions between different people and the environment. Things evolve. Health care is no different. As a result, unexpected results can occur even when workers act with the utmost care and the processes have been designed to the utmost quality. Because things change around them.
This is all the more reason to take a broader view of error than one caused simply by tired workers or inadequate processes. And thriving with this open view requires a culture that does not run on "stored patterns of pre-programmed instructions," but that is mindful, aware, and welcomes observations and insights from anyone in the organization (not just doctors).
In addition to "Anatomy of an Error," Duke Medical Center would be well advised to teach its workers to be keenly aware of surprises and disappointments - to use a spy-novel term, to "expect the unexpected." If well-trained, well-rested staff perform a well-designed process and something still goes awry, that is not a defect; it is vital information.
It means something has not been taken into account, that the environment has changed, and that a cool, clearheaded review is in order. It could be the appearance of a new strain of resistant bacteria, a new designer drug, or simply a unique patient situation. The worst possible step is to hide the error or to engage in "quiet fixing."
This is the next level of training that groups like Duke need to embark on. They should be commended for training their staff on this uncomfortable topic of errors and mistakes. But they can't stop there; there's much left to do.