
Cognitive Limitations and Vulnerabilities on the Flight Deck: What Can We Do?

Ben Berman, NASA Ames Research Center / San Jose State University

Key Dismukes, NASA Ames Research Center

Loukia Loukopoulos, NASA Ames Research Center / San Jose State University

Human Factors in Aviation Conference
San Antonio, Texas
March 6, 2007

“Most Airline Accidents Attributed to Crew Error”

What does this mean?

– Why do highly skilled pilots make fatal errors?

– How should we think about the role of errors in accidents?

Draw upon cognitive science research on skilled performance of human operators

Broken Bolt vs. Unflipped Switch

• Physical causes leave characteristic traces

• No such traces exist for human performance

• “Why” drops out of the analysis, for good reason

• We can never know with certainty why an accident crew made specific errors, but we can determine why the population of pilots is vulnerable

• Put “why” back into the analysis with an informed perspective

Approach

• Reviewed NTSB reports of the 19 U.S. airline accidents from 1991 through 2001 attributed primarily to crew error

• Followed the approach of NTSB (1994)

• Asked: Why might any airline crew in situation of accident crew – knowing only what they knew – be vulnerable?

Hindsight Bias

• Knowing the outcome of an accident flight reveals what the crew should have done differently

• But the accident crew does not know the outcome

– They respond to the situation as they perceive it at the moment

• Presumption has to be that more underlies an error than lack of skill or motivation

Immediate demands of situation & tasks

being performed

Training, experience, & personal goals Social/Organizational

Influences- Formal procedures & policies- Explicit goals & rewards- Implicit goals & rewards- Actual “norms” for line operations

Human cognition

characteristics & limitations

Crew responses to situation

American 903: May 12, 1997, West Palm Beach, Florida

• Airbus A300-600 in its initial descent for Miami

• Airplane leveled, slowed, and began a holding pattern at 16,000 feet

• Uncontrolled roll and pitch oscillations and a steep descent; recovered, with one serious injury and damage to the tail that was discovered much later

• Probable cause: “The flight crew’s failure to maintain adequate airspeed during level-off which led to an inadvertent stall, and their subsequent failure to use proper stall recovery techniques”

• Contributing factor: “The flight crew’s failure to properly use the autothrottle”

American 903: Sequence of Events

• Autothrottle became disconnected during descent from cruise; it is not clear why, and the crew did not notice or remember the disconnection

• Autopilot captured 16,000 feet and pitched up to maintain altitude, so airspeed decreased

• Vibration and buffeting began that the crew attributed to turbulence

• Airplane stalled and departed controlled flight

• Crew recovered below 13,000 feet, proceeded to safe landing in Miami

American 903: Dim Cues, Missing Cues, and Hints

• Autothrottle disconnection did not have salient annunciation

– Dim cue (compared to autopilot, for example)

– Crews often do not notice unexpected mode changes

• Autothrottles had tripped off on other flights

– Hint about accidents from everyday operations

• Crew might have disconnected autothrottle without thinking about it or remembering

– Automaticity

American 903: Out of the Loop, Missing Cues, and the Wrong Procedure

• Neither pilot noticed the airspeed loss until well below maneuvering speed

– Vulnerability from being out of the active control loop

– Crew expected aircraft to slow, just not that much

• Stall warning did not activate prior to stall

– Dependence on missing cue

• Crew performed wind shear recovery rather than stall recovery

– Primed to think about thunderstorms

– Training for wind shear recovery and minimum-altitude approach-to-stall recovery

– Again, absence of timely stall warning

American 903: Lessons

• Tendency toward degraded monitoring of normally reliable automation is rooted in basic human cognitive vulnerabilities

– Simply cautioning pilots to monitor more carefully will not, by itself, greatly reduce vulnerability

• Snowball effect of one error producing overload that undermines error-trapping and results in more errors

• Missing cues can be extremely hazardous

• Concept of reliability on the flight deck

– Variability of performance

– Two pilots, redundant procedures, several warning systems, and an airplane: the goal is to keep errors from sneaking through all those layers of Swiss cheese

Each Accident Has Unique Surface Features and Combinations of Factors

• Countermeasures to surface features of past accidents will not prevent future accidents

• Must examine deep structure of accidents to find common factors

Cross-Cutting Factors Contributing to Crew Errors

• Situations requiring rapid response

• Challenges of managing concurrent tasks

• Equipment failure and design flaws

• Misleading or missing cues normally present

• Stress

• Shortcomings in training and/or guidance

• Social/organizational issues

• Plan continuation bias

Cross-Cutting Factors: Situations requiring rapid response

• Nearly 2/3 of the 19 accidents

• Examples: upset attitudes, false stick shaker activation after rotation, anomalous airspeed indications at rotation, autopilot-induced oscillation at Decision Height, pilot-induced oscillation during flare

• Very rare occurrences, but high risk

• Surprise is a factor

• Inadequate time to think through the situation

– Automatic response required from the pilot


Cross-Cutting Factors: Challenges of managing concurrent tasks

• Workload high in some accidents (e.g., Little Rock, 1999)

– Overloaded crews failed to recognize the situation getting out of hand (“snowball effect”), even losing the ability to recognize that they were overloaded

– Monitoring and cross-checking suffered

– Crews became reactive instead of proactive/strategic

• But: adequate time was available for all tasks in many accidents

– Inherent cognitive limitations in switching attention: preoccupation with one task among many; forgetting to resume interrupted or deferred tasks


Cross-Cutting Factors: Stress

• Stress is normal physiological/behavioral response to threat

• Acute stress hampers performance

– Narrows attention (“tunneling”)

– Reduces working memory capacity

• Combination of surprise, stress, time pressure, and concurrent task demands can be lethal setup


Cross-Cutting Factors: Social/Organizational Issues

• Actual norms may deviate from Flight Operations Manual

• Accident crew judgment & decision-making may not differ from non-accident crews in similar situations:– Lincoln Lab study: Penetration of storm cells on approach not uncommon– Other flights may have landed or taken off without difficulty a minute or two

before accident flight

– Little data available on extent to which accident crews’ actions are typical/atypical

• Competing pressures are not often acknowledged

– Implicit messages from the company may conflict with formal guidance

• e.g. on-time performance vs. conservative response to ambiguous situations

– Pilots may not be consciously aware of the influence of internalized competing goals


Cross-Cutting Factors: Plan continuation bias (e.g., Burbank, 2000)

• Unconscious cognitive bias to continue original plan in spite of changing conditions

• Appears stronger as one nears completion of the activity (e.g., approach to landing)

– Why are crews reluctant to go around?

• Bias may prevent noticing subtle cues indicating original conditions have changed

• Reactive responding is easier than proactive thinking

• Default plan always worked before


Implications and Countermeasures: The Individual

• Vulnerabilities are inherent in our human cognition

– Limitations, biases

• Effects extend to error-trapping mechanisms

– Monitoring subject to task shedding under workload

– Checklists automatized, subject to slips/omissions

• Recognize the realistic, rather than theoretical, performance of humans, both in generating errors as they work and in catching errors

Implications and Countermeasures: The Aviation System

• Accept the variability of human performance

– The fact that someone makes an error does not make them deficient or complacent

– Errors are inevitable

– Recognize that errors are probabilistic

• Society’s demand for reliability is nearly unlimited

– Focus on net system reliability, made up of:

• Reliability of each error-trapping mechanism

• Independence of error-trapping mechanisms, to prevent the holes in the Swiss cheese from lining up (e.g., landing-checklist initiation cued by the callout to extend the gear); a minimal sketch of this arithmetic follows below
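To make the reliability arithmetic concrete, here is a minimal sketch, not from the original presentation, of why independence matters: assuming each trapping layer misses an error independently, the probability that an error slips through every layer is the product of the per-layer miss probabilities, and coupling two layers effectively removes one factor from that product. All miss probabilities and names below are hypothetical illustrations.

```python
# Hypothetical sketch: net reliability of stacked error-trapping layers.
# The miss probabilities are invented for illustration, not measured values.

def p_slip_through(miss_probs):
    """Probability an error evades every layer, assuming the layers fail
    independently: the product of the per-layer miss probabilities."""
    p = 1.0
    for miss in miss_probs:
        p *= miss
    return p

# Four independent layers: pilot flying, pilot monitoring, checklist, warning system.
independent_layers = [0.01, 0.05, 0.02, 0.03]
print(p_slip_through(independent_layers))   # ~3e-07: rare if truly independent

# If checklist initiation is cued by the gear-extension callout, the checklist
# is no longer independent of the monitoring pilot: miss the callout and the
# checklist is missed too. Crudely modeled, one factor drops out of the product.
coupled_layers = [0.01, 0.05, 0.03]
print(p_slip_through(coupled_layers))       # ~1.5e-05: roughly 50x worse
```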

Implications and Countermeasures: The Aviation System (cont’d)

• Causal attribution to the individual can distract attention from underlying causes and prevention

• Remediation lies with the system

– Guard against “don’t do that anymore” prescriptions

– No evidence for the Bad Apple theory

– All other remediation requires more difficult changes in the system

Net System Reliability: “Okay, what can we do?”

• Treat monitoring and cross-checking as being as important as anything else we do

• Suggestions for training & checking:– Cut down on negative training (e.g., continuing

unstabilized approaches in sim makes plan continuation bias stronger on the line

– Teach pilots to monitor, give them the opportunity to practice in the sim

– Build at least one realistic challenge and decision into every sim session

– Teach how to manage workload, practice in the sim

“What can we do?” (continued)

• Procedures design review

– Match human characteristics and limitations

– Independence

• Equipment design review

– Match human characteristics and limitations

• Safety Change:

– Heed the hints; the next accident is trying to warn you about itself right now

• AA903, AA1340, CAL1943

– How to separate the wheat from the chaff?

Dismukes, R. K., Berman, B. A., & Loukopoulos, L. L. (2007). The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. Ashgate.

www.ashgate.com

More information on NASA Human Factors Research:

http://human-factors.arc.nasa.gov/his/flightcognition/

This research was partially funded by NASA’s Aviation Safety Program and by the FAA (Eleana Edens, Program Manager).

What else can we do?