Peter Pronovost's New Book on Patient Safety
Peter Pronovost is a physician and patient safety leader at the Johns Hopkins School of Medicine. He’s a major thinker in the patient safety movement and has a great deal of influence on how people think about patient safety in the US and around the globe. Pronovost has just published a book entitled Safe Patients, Smart Hospitals: How One Doctor’s Checklist Can Help Us Change Health Care from the Inside Out. Pronovost’s work was also discussed at length in Atul Gawande’s The Checklist Manifesto.
Safe Patients, Smart Hospitals merits a close reading and a careful analysis, which is what I intend to do on this website over the next week or so.
I’d like to begin with a discussion of Pronovost’s views on the safety movement in aviation known as Crew Resource Management (CRM) — which has now evolved into what is called Threat and Error Management (TEM). I am, in fact, working with an airline pilot who teaches CRM, Patrick Mendenhall, and a medical educator, Bonnie O’Connor, on a new book entitled Come Fly With Me. This book takes an in-depth look at the culture transformation that occurred in aviation and explores its lessons for health care.
We are writing this book because we are convinced that the aviation safety movement is poorly understood and that a better understanding of what happened in aviation could significantly advance patient safety. Pronovost’s book is a good example of this phenomenon. While it contains many brilliant suggestions for change, the author gets some fundamental things wrong about the aviation safety movement and its applicability to hospitals and health care.
Consider, for example, what he has to say about the use of checklists in aviation. After explaining that he looked closely at the use of checklists in aviation, the author writes that, “There were also significant differences between medicine and aviation. In aviation the general acceptance that humans are fallible was fundamental to the checklist’s success. Once this truth had been universally accepted, the industry was able to design systems that could prevent or catch inevitable errors before they caused harm, or minimize harm from errors that were not identified.”
There are several problems with this assertion. First of all, in aviation, checklists are part of a much larger system that is intended to prevent, catch, or trap errors. While checklists are central, they are certainly not all there is to it. Indeed, CRM/TEM is a very sophisticated system of training that includes how pilots and captains are selected; orientation and recurrent training; instruction in communication, negotiation, and conflict-resolution skills; instruction of other crew members; serious assessment of skill, competence, and technical proficiency; and much, much more.
Secondly, the universal recognition and acceptance of human fallibility did not precede the universal use of checklists. It was the other way round – checklists were imposed on pilots who failed utterly to recognize such fallibility. Once it became clear to researchers, other aviation experts, and company leaders that accidents were the result of people’s willingness to take unacceptable risks because they refused to admit to and thus learn from errors, CRM training began. Pilots who dismissed it as “charm school” or a “Communist plot” to erode their authority were forced to attend these trainings because it was required of them, first by their companies and then by the government. Now, thirty years later, fallibility is universally accepted. On that point, Pronovost is right. But when aviation was, thirty years ago, where medicine is today, checklists started to become the de facto law of the land because the leaders of the industry (i.e., company executives and government regulators) refused to coddle pilots who thought they had the power of Zeus when, in fact, they had the wings and hubris of Icarus.
In his concluding comments on the aviation model, Pronovost also asserts that “Medicine is infinitely more complex than aviation. The amount of information that a doctor must retain to practice medicine is mind-boggling. To put this in perspective, what a pilot needs to know to fly a specific aircraft, say a Boeing 747, is the equivalent of what a doctor must remember to perform one single procedure.” Following this, and for another paragraph, Pronovost goes on to talk about the aviation model and his own use of checklists.
This comment is worth considering because it is a very common medical response to the aviation safety model. I have heard countless physicians explain why the aviation safety movement can’t really be all that helpful because medicine is so much more complicated. While Pronovost may not intend that as his message, I think it’s important to consider this line of thinking. What it does is set up a kind of competition between medicine and aviation that takes us down a path which those interested in patient safety should avoid.
The issue is not which industry is more complex, more stressful, and more challenging. To enter into a competition between pilots and physicians is to hold patient safety hostage to an exercise that is as foolish as it is futile. I am sure pilots could explain for hours how complex, stressful, and challenging their work is. I can hear it now.
Pilot to surgeon – “You try doing what you do at 36,000 feet, in the middle of turbulence, with one engine out and 300 passengers and six flight attendants in the cabin.”
Surgeon to pilot – “That’s nothing. You try operating on 12 people in a day, when you’ve been up for hours…”
You get the point. Who wins? Definitely not the patient.
The thing to remember here is that stressful, challenging work is not relative. I can assure you that in the 3 minutes and 42 seconds that transpired while Captain Chesley Sullenberger and his first officer were landing US Airways flight 1549 in the Hudson, the one thought that did not pass through their minds was, “Oh my, this could be so much worse; I could be a neurosurgeon.”
Similarly, no surgeon dealing with a patient going south in the OR is thinking, “This could be so much worse; I could be trying to land a plane with 400 people in the back and no landing gear.”
What is relevant from the patient safety point of view is the following. Safety and teamwork have become the norm in an industry in which solo aviators were socialized to think of themselves as “captain/kings” and refused to listen to the input of crew members. Crew members were so cowed by the authority of the autocratic captain that they did not share critical information with sufficient urgency. As a result of this dynamic, people died – all too often. The industry was not only characterized by a universal commitment to toxic hierarchy; it was also characterized by the interface of human beings and advanced technology, by high levels of stress, and by intense amounts of unpredictability and variation. Passengers were introduced into this brew and depended on the judgments of the “experts” for their very lives. The question is not whether performing a single procedure is more complex than flying a 747. The question we all need to ask is how aviation did it, and what lessons health care can learn from this complex example of cultural transformation.