The Technology Paradox: CRM and Flight 214


Asiana Flight 214 cartwheeled onto the runway at San Francisco International Airport, miraculously killing ‘only’ two passengers directly. Preliminary assessments of the crash highlight one paradox of modern technology: technology can help correct for human performance failures, but we sometimes rely on technology so much that we abandon our own skills and judgment—even when those skills and judgment tell us not to. We can become deskilled and lulled into deadly complacency.

The paradox unfolds: A key tenet of Crew Resource Management (CRM) is that team members must maintain ‘situational awareness’ (attention to surroundings, environmental changes, and potential problems that might arise). But the crew of Flight 214 lost that awareness and failed to restore it. Reports following the crash emphasize the failure of the captain-in-training to recognize the too-rapid descent and low airspeed of his craft as it approached the runway. Although speed and rate of descent are two of the most critical elements of an approach, the pilot appears to have assumed that both were within acceptable limits. Why? He reportedly thought that the plane’s automatic throttle was set to manage airspeed correctly. It was not, or it did not hold the setting the pilot assumed, and the plane was headed for landing well under the required minimum speed (119 mph vs. nearly 158 mph). Note that maintaining or restoring situational awareness is a crew responsibility.

“GIGO” revisited: The loss of situational awareness also included a failure to recognize that the technology was not performing as expected—at odds with what the airspeed indicator showed. Years ago, a colleague of mine coined the phrase “garbage in, gospel out”—referring to the temptation to assume that whatever computers produce must be correct, even in the face of contrary evidence. The problem intensifies in fast-moving conditions and with advanced technology. At times, operator uncertainty about what the automation is doing creates dangerous situations, a state called “mode confusion.” (Think of an automobile speedometer set to MPH when the driver assumes KPH—oops, speeding ticket. Correct information, but the wrong mode.) In aviation, mode confusion means losing track of cockpit settings—a recipe for inaccurate status assessments and impaired decision making.

Training, stress, and diminished skills: Reports of the trainee pilot’s testimony indicate that he was under great stress as he made an unfamiliar approach using some unfamiliar controls (he had been “warned” that the thrust control did not always maintain the expected setting). Increased stress reduces flexibility and narrows attention. Did stress overwhelm good piloting and focus? Was there a predisposition to let the automated systems do too much of the flying? One hypothesis reported from the investigation is that pilots have become overly reliant on automation and that basic flying skills have eroded. Although the flight crew had trained in simulators, they claimed to have had inadequate time training to land a 777 at an airport like San Francisco.

Culture, communication, and teamwork: Although several crew members were in the cockpit with the pilot, they did not warn him of the airspeed and descent problems in time. As the investigation reported, no one said that the plane was too low until the last 30 seconds of the flight. Three seconds before impact, the pilot attempted to throttle up and abort the landing. The crew found it difficult to countermand the actions of the pilot in charge of the craft. One even noted that he did not wear his sunglasses during the approach (despite initial reports of glare, an effect later retracted) because he thought it impolite. All of this happened in spite of the CRM training the crew members had received—training that emphasizes collective responsibility for maintaining and regaining situational awareness, for communicating problems, and for raising safety issues regardless of rank or seniority.

The antidote: Technology, as Edward Tenner notes, is fraught with unintended consequences; it bites back. The antidote is fourfold: (1) an understanding that technology and human systems are not discontinuous, (2) effective “human technology” to address technological advances, (3) effective management of the interface between human and technical systems, and (4) practice, practice, practice.
