I have told the story about the supposedly clean IV pump in the dirty utility room — see previous post — to a lot of people over the course of this last week. Kind of like a focus group exercise. Can you guess what most people immediately say? The usual. Blame the nurse. What kind of stupid person would think a pump that is supposedly clean is really clean if it’s in a dirty utility room? In this instance, as in so many others, the game is blame the individual, in this case the nurse. This is in spite of more than two decades of research telling us that creating ambiguous situations like the one I just described is a recipe for smart and well-intentioned people to make catastrophic errors. This is particularly true when people work in high-stress environments where they are overworked and fatigued (latent pathogens if there ever were ones), as most nurses, doctors, and others who work in health care are today.
One of the greatest accomplishments of the Aviation Safety Model (ASM), Crew Resource Management (CRM), has been the creation of what Schein and Bennis and Edmondson have called “psychological safety” in the airline industry. Moving from a culture that was characterized by the authoritarian (rather than authoritative) exercise of power to one in which it is safe to tell someone — even someone higher up — that they have made or are about to make a mistake was central to the creation of airline safety. As my co-authors airline pilot Patrick Mendenhall and medical educator Bonnie Blair O’Connor and I have written in our book Beyond the Checklist: What Else Health Care Can Learn from Aviation Teamwork and Safety, creating a psychologically safe environment takes a lot of work — both initially and over time. Because of this work, aviation culture has moved, as former Vice Chairman of the National Transportation Safety Board Robert T. Francis has described it, from a culture in which the captain would convey, implicitly or explicitly, that “I’m captain, I’m king. Don’t do anything! Don’t say anything! Don’t touch anything! Shut up!” to a culture where the message is “I’m captain, I’m king, please tell me if you see me making a mistake!”
News that hospital patients in Minnesota are no safer today than a decade ago comes as no surprise to those trying to reduce medical errors and injuries. The drive to make hospitalization less risky began in 1999, when a study by the prestigious Institute of Medicine estimated that more than 100,000 patients suffer from avoidable harm each year.
Since then, hundreds of millions of dollars have been spent on new patient safety initiatives of all kinds. Unfortunately, as The Star Tribune confirmed last month, hospitals are still the scene of too many botched procedures and preventable falls, infections, and misdiagnoses. What explains the persistence of failure that adds billions to our nation’s health care bill, costs many patients their lives, and leaves others with new medical problems?
Many well-intentioned safety efforts have fallen short because doctors, administrators, and other staff resist necessary changes in the hierarchical and often dysfunctional culture of hospital work. Patients get hurt or become even sicker when doctors, nurses, and other hospital staff fail to share information, don’t work effectively in teams, or ignore mechanisms – like hand-washing to prevent infection – designed to minimize human error. In too many hospital chains, fragmentation of care is an institutional shortcoming both endemic and unacceptable.
In our view, the quality of patient care in America won’t get better until the healthcare industry embraces the flight-tested model of teamwork training and workplace cooperation that has drastically reduced the dangers of commercial air travel.
In response to a series of horrific crashes four decades ago, the airlines didn’t just develop better safety checklists or new equipment to detect impending mechanical failures. The entire industry was compelled to adopt new safety protocols, known as Crew Resource Management (CRM). As a result, flying today is safer than ever before — for airline crews and their passengers.
CRM teaches those working in the cockpit, in the cabin, on the ground, and in air traffic control how to communicate and cooperate. It requires everyone to solicit and listen to relevant safety information from other employees, regardless of their rank or job responsibility, as well as to use standardized procedures that assure safety. Annual CRM training is now mandated for all flight personnel by the Federal Aviation Administration. Airline unions, like the Air Line Pilots Association and the Association of Flight Attendants, have been heavily involved in the implementation and success of the program.
Whatever other job-related complaints they may have, any longtime pilot, flight attendant, or air traffic controller will vouch for the difference that CRM has made in their industry. One of CRM’s leading advocates is Captain Chesley “Sully” Sullenberger, who arguably became the most famous pilot in America after his landing of a crippled US Airways jet in the Hudson River four years ago. Now retired, he is helping hospitals incorporate CRM methods into their patient safety programs.
“Not long ago,” Sullenberger says, “there were captains in our cockpits who acted like gods with a little ‘g’ and cowboys with a capital ‘C.’ You questioned the captain’s authority at your own peril, even if you were a fellow pilot.”
When CRM first emerged, some captains worried it would undermine their authority. According to Sullenberger, “others, like some doctors today, felt they didn’t need to learn the ‘soft skills’ of better communication and respectful interaction with co-workers.” Yet the consensus today is “that our cockpit management decisions are far better when we have regular input and information from all members of our aviation team, in the air and on the ground.”
Even where hospitals have realized the need for more “inter-professional training,” few have created a similar environment in which staff members feel free to challenge a physician or complain to an administrator. In medical schools and on the job, most doctors are not encouraged, much less required, to consult with skilled and experienced members of the same health care team who may have life-saving input.
Many tragedies could be averted and patient care improved if health care fully applied the lessons of aviation teamwork and safety. In our hospitals, it’s time to go beyond the checklist and actually reduce the terrible financial and human toll of avoidable medical errors.
I am delighted to let readers of this blog know about our new book, Beyond the Checklist: What Else Health Care Can Learn from Aviation Safety and Teamwork, which was just published in the series on The Culture and Politics of Health Care Work that I co-edit for Cornell University Press. I wrote the book with international commercial airline pilot Patrick Mendenhall and medical educator and ethnographer Bonnie Blair O’Connor.
As the title suggests, the book is a very detailed look at the aviation safety model (ASM) known as Crew Resource Management (CRM) or, in its contemporary iteration, Threat and Error Management (TEM). We were delighted that Captain Chesley “Sully” Sullenberger agreed to write a foreword for the book. The book is designed to help people in critical industries like healthcare understand how aviation became safer and how they can adapt the aviation safety model to their work — particularly in healthcare, which should be a high reliability industry but which has a ways to go to attain that status.
Anyone interested in transparency in healthcare should check out the Sugar Bowl website — specifically the chairlift safety program section, under “inspiration.” Right on the website of this famous California ski area is the story of an accident that killed seven-year-old John Marco Henderson.
As the article explains, “Last December our community suffered a great loss. John Marco Henderson, age 7, died after falling from the Mt. Lincoln chairlift while skiing with the Sugar Bowl Ski Team. Over the past 10 months, Sugar Bowl Corporation and Sugar Bowl Ski Team have worked with John’s parents to investigate how this accident occurred and determine what steps could be taken to prevent a similar tragedy. At Sugar Bowl, safety remains our highest priority.”
The article then details what could be pieced together about what happened to the boy and discusses, at length, what the ski area is doing in response. The website acknowledges that there were problems with the lifts, and the ski area does not in any way try to shirk its responsibility — both for the accident and for fixing the problems that may have caused it. “Ultimately our safety practices were insufficient to prevent this tragedy,” the article reads. Although the article acknowledges problems, it does so without apportioning blame or scapegoating anyone.
“As a result of this tragedy,” the resort explains, “we reviewed existing policies, procedures and best practices relating to the loading, riding and unloading of the chairlifts at Sugar Bowl. We have implemented a comprehensive Safety Program that includes the following changes: restraining bars will be lowered for all minors under 51 inches in height (including equipment); increased adult supervision of children in the ski school and on ski team; and installation of cameras to continuously monitor and improve chairlift loading practices. The Safety Program will be distributed widely and is available on the Sugar Bowl website.”
Imagine seeing something like this on a hospital website. How remarkable would that be?
Reading this, people might think, Suzanne has lost it. Does she really imagine that any hospital or health care professional would actually publicly post an acknowledgment of safety problems and how they are being remedied? But no, I have not lost my mind. This kind of behavior is common in high reliability organizations (HROs). In our book Beyond the Checklist: What Else Health Care Can Learn from Aviation Safety and Teamwork, we describe the kinds of reporting programs and information sharing that have made flying safer than it has ever been. These kinds of programs can be adapted to healthcare. Patients know healthcare isn’t safe. They know hospitals are dangerous places. What they cannot do is protect themselves without knowing what to do and without guidance and assistance from professionals as they try to do so.
Right now, too much of patient safety is being outsourced onto patients themselves. We are asked to check our meds, to make sure no one gives us the wrong dose or performs a more — rather than less — invasive operation. We are asked to ask professionals if they have washed their hands. And everyone expects us to do this — even when we’re unconscious — and some actually blame us if we experience a medical error or injury because, according to the outsourcing logic, we have not been vigilant enough. It was our own fault. It’s kind of like asking an airplane passenger not only to listen in on the air traffic control channel but to make sure the captain has, in fact, descended to 25,000 feet.
Patients — like child skiers — can only do so much to protect themselves. Yes, we should do what we can. We should be vigilant. But when we are most vulnerable, we are also the least able to act to protect ourselves, and if we’re lucky enough to have family and friends nearby, they too may be unable to effectively advocate for us. One has only to read the last two Narrative Matters selections in the journal Health Affairs, by Beth Swan and Jonathan R. Welch, to discover how difficult it is for even seasoned professionals to protect their loved ones.
Although I will write more on these two stories later, the take-home message is this: without the kind of high-level commitment exhibited by those who lead institutions (the kind of commitment Sugar Bowl has shown), as well as efforts to involve staff at every level, patients will never be safe.