On the Asiana Crash
My co-author of Beyond the Checklist, Patrick Mendenhall, just posted this on our blog www.beyondthechecklist.com. There are obviously many parallels to health care, notably over-reliance on technology. In his excellent book The Lost Art of Healing, Nobel Prize-winning cardiologist Bernard Lown, for example, warned of physicians’ increasing over-reliance on technology and the problems it can produce. In our book Beyond the Checklist: What Else Health Care Can Learn from Aviation Teamwork and Safety, we also write about what happened on Qantas Airways Flight 32, when an engine failure led to catastrophic damage to the airplane and its information systems. The pilots quickly had to use their own brains, not the misfiring technology, to land the plane safely. These are lessons relevant to all high-tech, high-reliability industries.
Can We Fly and Talk at the Same Time? – PLEASE?
By Patrick Mendenhall
In the July 15, 2013 edition of Aviation Week & Space Technology, John Croft comments that “…over-reliance on automation systems appears to have trumped basic flying skills and crew resource management [CRM] in the crash of Asiana 214…” Like a child learning to walk, the aviation industry is still finding its way in managing the phenomenal levels of automation now available to us.
Properly managed, automation decreases workload and allows better situation awareness than ever before. It is a wonderful thing and can make the most complex and critical tasks appear nearly effortless. Improperly managed, it can lead us into the tragic equivalent of flying into a box canyon, from which there is no escape.
Now consider the human factor: rule number one in any study of human factors is that people make mistakes; rule number two is that machines are designed and operated by people – which leads us back to rule number one – and opportunities for failure abound! We know this all too well, yet complacency and “auto-dependency” too often override plain common sense. We must not forget that the brilliant engineers who gave us this amazing technology also gave us, with one little click of a button, the option to still actually fly the aircraft. Pilots must continue to train in and maintain those basic skills, because odds are they will be called upon to use them when least expected; they had better be ready!
Airline cockpits are designed with a crew – a team – in mind. Automation has brought to prominence a concept in the pilots’ lexicon: the “monitoring” function. Not only is the pilot flying (PF) required to constantly monitor that the aircraft is actually doing what s/he has told it to do – known as the “fact vs. fantasy” notion – but all other pilots on the flight deck are expected to fulfill the pilot monitoring (PM) function. Unlike in days long past, the PM’s involvement in this process is every bit as crucial as the PF’s. The PM must function almost as if s/he were the PF.
Essential to this process is communication, a fundamental performance indicator of crew resource management: crews must be willing and able to fly AND talk. If any member of the crew – PF or PM – sees something that is outside of their expectations, they need to speak up – to talk – and to feel encouraged and empowered to do so. Taking this one step further, if aircraft performance is outside of established parameters – such as, for example, stabilized approach criteria below 1,000 feet – any member of the crew should be expected to command the appropriate action, up to and including “GO AROUND!”
We have yet to get the whole story on Asiana 214 and should definitely withhold judgment and speculation on the myriad facts that we do not yet know. What we can surmise thus far is that there was an over-reliance on automation, that the automation was mismanaged, and that the crew apparently failed to actually fly the aircraft, failed to monitor its performance, and, when the situation had clearly deteriorated, failed to verbalize their observations with the proper level of urgency.
The lessons from this tragic event extend far beyond the crew, the airline, the manufacturer, and ATC. Any high reliability organization (HRO) can and should benefit from the lessons of Asiana 214:
· Automation is designed by humans and is therefore subject to human failings:
  o Operators must trust, but verify, any automation mode.
  o Operators must continue to train to basic skills.
· If a situation is outside of expected (or established) parameters, act.
· If something seems wrong, it likely is.
· Communicate: see something? Say something – Please!