Over the course of this last week I have told the story of the supposedly clean IV pump in the dirty utility room (see my previous post) to a lot of people, kind of like a focus group exercise. Can you guess what most people immediately say? The usual: blame the nurse. What kind of stupid person would think a pump in a dirty utility room is really clean just because it’s supposed to be? In this instance, as in so many others, the game is blame the individual, in this case the nurse. This in spite of more than two decades of research telling us that creating ambiguous situations like the one I just described is a recipe for smart, well-intentioned people to make catastrophic errors. This is particularly true when people work in high-stress environments where they are overworked and fatigued (latent pathogens if there ever were ones), as most nurses, doctors, and others who work in health care are today.
In their book Managing the Unexpected: Assuring High Performance in an Age of Complexity, Karl E. Weick and Kathleen M. Sutcliffe outline the characteristics of a High Reliability Organization (HRO). HROs, they explain, are preoccupied with failure, reluctant to embrace simple interpretations of problems, sensitive to operations (i.e., the front line where work takes place), committed to resilience, and always willing to defer to expertise (even when that expertise comes from people low on the organizational totem pole).
HROs also “worry a lot about the temptation to normalize unexpected events” and thus respect “feelings of surprise.” People who work in HROs are, as they describe it, “mindful.” “By mindfulness we mean the combination of ongoing scrutiny of existing expectations, continuous refinement and differentiation of expectations based on newer experiences, willingness and capability to invent new expectations that make sense of unprecedented events, a more nuanced appreciation of context and ways to deal with it, and identification of new dimensions of context that improve foresight and current functioning.” HROs always pay attention to human factors, which, as Kim Vicente explains in his book The Human Factor, are the “problems arising out of the relationship between people and technology, not just at the level of the individual but also at the organizational and even political level.”
As we have argued in our book Beyond the Checklist: What Else Health Care Can Learn from Aviation Teamwork and Safety, aviation shares all the characteristics of an HRO outlined above. Because of three decades of mindfulness about safety, aviation, once a very high-risk industry, is now a high-reliability one. The other day, my co-author Patrick Mendenhall and I were talking with two former aircraft accident investigators, Douglas Dotan and Ron Schleede. Both were at work several decades ago as the aviation safety model (ASM) of Crew Resource Management (CRM) was, no pun intended, just getting off the ground. They had firsthand experience investigating some terrible airline crashes.
One of the greatest accomplishments of the aviation safety model (ASM) of Crew Resource Management (CRM) has been the creation of what Schein and Bennis, and later Edmondson, have called “psychological safety” in the airline industry. Moving from a culture characterized by the authoritarian (rather than authoritative) exercise of power to one in which it is safe to tell someone, even someone higher up, that they have made or are about to make a mistake was central to the creation of airline safety. As my co-authors, airline pilot Patrick Mendenhall and medical educator Bonnie Blair O’Connor, and I have written in our book Beyond the Checklist: What Else Health Care Can Learn from Aviation Teamwork and Safety, creating a psychologically safe environment takes a lot of work, both initially and over time. Because of this work, aviation culture has moved, as former Vice Chairman of the National Transportation Safety Board Robert T. Francis has described it, from a culture in which the captain would convey, implicitly or explicitly, “I’m captain, I’m king. Don’t do anything! Don’t say anything! Don’t touch anything! Shut up!” to one where the message is “I’m captain, I’m king, please tell me if you see me making a mistake!”
News that hospital patients in Minnesota are no safer today than a decade ago comes as no surprise to those trying to reduce medical errors and injuries. The drive to make hospitalization less risky began in 1999, when a study by the prestigious Institute of Medicine estimated that more than 100,000 patients suffer from avoidable harm each year.
Since then, hundreds of millions of dollars have been spent on new patient safety initiatives of all kinds. Unfortunately, as the Star Tribune confirmed last month, hospitals are still the scene of too many botched procedures and preventable falls, infections, and misdiagnoses. What explains the persistence of failure that adds billions to our nation’s health care bill, costs many patients their lives, and leaves others with new medical problems?
Many well-intentioned safety efforts have fallen short because doctors, administrators, and other staff resist necessary changes in the hierarchical and often dysfunctional culture of hospital work. Patients get hurt or become even sicker when doctors, nurses, and other hospital staff fail to share information, don’t work effectively in teams, or ignore mechanisms, like hand-washing to prevent infection, designed to minimize human error. In too many hospital chains, fragmentation of care is an institutional shortcoming that is both endemic and unacceptable.
In our view, the quality of patient care in America won’t get better until the healthcare industry embraces the flight-tested model of teamwork training and workplace cooperation that has drastically reduced the dangers of commercial air travel.
In response to a series of horrific crashes four decades ago, the airlines didn’t just develop better safety checklists or new equipment to detect impending mechanical failures. The entire industry was compelled to adopt new safety protocols, known as Crew Resource Management (CRM). As a result, flying today is safer than ever before, for airline crews and their passengers.
CRM teaches those working in the cockpit, in the cabin, on the ground, and in air traffic control how to communicate and cooperate. It requires everyone to solicit and listen to relevant safety information from other employees, regardless of their rank or job responsibility, and to use standardized procedures that assure safety. Annual CRM training is now mandated for all flight personnel by the Federal Aviation Administration. Airline unions, like the Air Line Pilots Association and the Association of Flight Attendants, have been heavily involved in the implementation and success of the program.
Whatever other job-related complaints they may have, any longtime pilot, flight attendant, or air traffic controller will vouch for the difference that CRM has made in their industry. One of CRM’s leading advocates is Captain Chesley “Sully” Sullenberger, who arguably became the most famous pilot in America after landing a crippled US Airways jet on the Hudson River four years ago. Now retired, he is helping hospitals incorporate CRM methods into their patient safety programs.
“Not long ago,” Sullenberger says, “there were captains in our cockpits who acted like gods with a little ‘g’ and cowboys with a capital ‘C.’ You questioned the captain’s authority at your own peril, even if you were a fellow pilot.”
When CRM first emerged, some captains worried it would undermine their authority. According to Sullenberger, “others, like some doctors today, felt they didn’t need to learn the ‘soft skills’ of better communication and respectful interaction with co-workers.” Yet the consensus today is that “our cockpit management decisions are far better when we have regular input and information from all members of our aviation team, in the air and on the ground.”
Even where hospitals have realized the need for more “inter-professional training,” few have created a similar environment in which staff members feel free to challenge a physician or complain to an administrator. In medical schools and on the job, most doctors are not encouraged, much less required, to consult with skilled and experienced members of the same health care team who may have life-saving input.
Many tragedies could be averted and patient care improved if health care fully applied the lessons of aviation teamwork and safety. In our hospitals, it’s time to go beyond the checklist and actually reduce the terrible financial and human toll of avoidable medical errors.
People have made some very interesting and important comments on my latest post. I want to respond here, and also reiterate that I am not talking about patients who are truly abusive or in other ways very difficult to care for. That said, I want to add something about the conditions that may lead health care professionals (HCPs) to view patients with legitimate needs as difficult.
One barrier is, of course, workload and fatigue. Whether in the hospital or any other workplace, like primary care, people who are too tired, who have not eaten, and who have too much work tend to be irritable and have trouble attending to others. In health care settings, professionals and other staff are constantly expected to act in a professional manner even when, as Lucian Leape and his colleagues argue, their institutions do not treat them respectfully and ask them to work too hard and too long.