! Risk management is the core of safety management
! A lot of SMS effort, by both operators and regulators, is focused on process
! But the challenge lies in the content
  ! Hazard identification – you can't manage an unidentified hazard
  ! Risk assessment – the wrong risk level misdirects attention
! And we humans are not naturally good at either of these
! Some things we can do to improve risk management content
  ! Understand limitations
  ! Structure the approach – Strategic, Tactical and Mission
  ! Engineer the task
  ! Let the data talk
  ! Taste the cake
! Hazard Identification
! Risk Assessment
! Risk Control
[Diagram: SMS components – Policy, Risk Management, Assurance, Promotion]
! Critical Hazard Missed
! Risk Underestimated
! Six Illusions
  ! Attention – the invisible gorilla – we don't perceive the unexpected
  ! Memory – we don't replay, we reconstruct, relying on stories to fill gaps
  ! Confidence – we think confidence signals being right – but we're wrong!
  ! Knowledge – we confuse familiarity with understanding, and understand less than we think, especially about complex systems
  ! Cause – we see patterns and infer causes – one implication is that we totally misunderstand random events – Stars and Apple
  ! Potential – we think we have massive unused brain power that can be brought to bear on problems – we don't!

Source – Christopher Chabris and Daniel Simons, "The Invisible Gorilla"
! The public view of risk is grossly distorted and often irrational
! Cognitive systems combined with the structural effects of media drive the distortion
! System 1 (rapid, unconscious) and System 2 (slow, conscious)
  ! System 1 uses rules of thumb which work most of the time
  ! System 1 can overwhelm System 2 when we don't stop it
! System 1 rules of thumb
  ! Anchoring – guessing based on an anchor that may have no reasonable connection
  ! Typicality – common patterns and stereotypes overwhelm rational judgement
  ! Availability – recent examples excessively influence judgement (e.g. repeating safety events)
! Safety and operational managers use the same systems!

Source – Dan Gardner, "Risk"
! Daily review of incoming safety reports by a panel of SMEs
! Minor event types – e.g. minor loading errors
! Long-term data shows the rate is static
! Most of the time these events are judged by the panel as intrinsically low risk (although system issues still need addressing)
! Randomly, two or more events can occur close together in time
! The panel always wants to increase the risk rating
! The regulator challenged the approach when the risk was not escalated
! This leads to a focus on the circumstances of these events (stories) rather than the long-term data
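The cluster-versus-static-rate tension on this slide can be put to numbers: under a constant event rate, occasional clusters are statistically expected. A minimal sketch using the Poisson distribution (all rates below are illustrative assumptions, not figures from the talk):

```python
from math import exp

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of exactly k events when the expected count is lam."""
    p = exp(-lam)
    for i in range(1, k + 1):
        p *= lam / i
    return p

def prob_at_least(k: int, lam: float) -> float:
    """Probability of k or more events in a window with expected count lam."""
    return 1.0 - sum(poisson_pmf(i, lam) for i in range(k))

# Illustrative figure: a minor-event type occurring at a steady long-term
# rate of 12 per year, i.e. an expected count of 1.0 per month.
monthly_rate = 12 / 12

# Chance of 2+ events landing in the same month despite a static rate:
p_cluster_one_month = prob_at_least(2, monthly_rate)

# Chance that at least one of the 12 months in a year shows such a cluster:
p_cluster_any_month = 1.0 - (1.0 - p_cluster_one_month) ** 12

print(f"P(2+ events in a given month) = {p_cluster_one_month:.2f}")
print(f"P(2+ events in some month of the year) = {p_cluster_any_month:.2f}")
```

With a steady rate of one event per month, a two-event month turns up somewhere in almost every year, so a cluster alone is weak evidence for escalating the risk rating.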
! Expert predictions are hardly better than chance (Philip Tetlock)
! The world is a very complex place – way more complex than we think
! The brain suffers from cognitive wiring that leads to systematic errors
! The most confident experts are the most likely to be wrong
! Effective prediction (to the extent it is possible) requires a mind open to the evidence
! We believe experts
! We crave certainty
! We see patterns where none exist – we see the random as meaningful
! We're seduced by stories
! We believe that more confident equals more right – we're wrong

Source – Dan Gardner, "Future Babble"
! Proactive risk process – expert panel
! Hazard identification
  ! Proposed hazard – undesired/unexpected behaviour
  ! Protest – "not a hazard, because it's already covered by procedure – it tells them not to do that!"
  ! Better – evaluate the strength of the procedural barrier: does it reduce risk to negligible? Checking data (e.g. LOSA) often shows it is less than perfect
  ! Need to keep the hazard identification process open – don't close early
! Need for trained facilitators...
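The "evaluate the strength of the procedural barrier" point lends itself to a simple numeric sketch: treat the procedure as one barrier with a measurable failure probability rather than an absolute defence. All figures below are illustrative assumptions; in practice LOSA-style observation data would supply them:

```python
# Sketch of the barrier-strength idea: a procedure is not a guarantee,
# it is a barrier with a per-demand failure probability.
# All numbers are illustrative assumptions, not real data.

hazard_frequency = 0.02  # undesired behaviour attempts per flight (assumed)

# Barriers between the hazard and the outcome, with assumed per-demand
# failure probabilities (the kind of figures LOSA data would supply).
barriers = {
    "procedure followed": 0.10,   # 1 in 10 times the procedure is not applied
    "crew cross-check": 0.20,
    "warning system": 0.05,
}

residual_per_flight = hazard_frequency
for name, p_fail in barriers.items():
    residual_per_flight *= p_fail

print(f"Residual outcome likelihood: {residual_per_flight:.1e} per flight")
# Small, but not negligible - and dominated by the weakest barriers.
```

The product form also shows why the "it tells them not to do that!" protest fails: the procedure contributes only one factor, and its strength is an empirical question, not an assumption.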
! We can fail to identify critical hazards:
  ! Lack of imagination – the typical scenario is much stronger than the "odd" scenario
  ! We don't expect the unexpected
  ! Wrong mental model – a simple linear process narrative, not complex, noisy, parallel chaos
  ! Over-reliance on procedure
  ! Groupthink – social effects dominate rational judgement
! We can misestimate risk:
  ! We are poor calculators
  ! Overconfidence bias
  ! Our understanding of complex systems is actually poor
  ! Our intuitive sense of randomness is wrong!
  ! Small data window – a "one-off" may be underreporting
! Understand limitations
  ! Awareness of cognitive "wiring" effects
! Structure the approach
  ! Strategic, Tactical and Mission risk
! Engineer the task
  ! Structure the process to encourage an open mind and careful consideration
  ! Challenge expert and group thinking
  ! Facilitation
! Let the data talk
  ! Focus analysis on data which measures risk, not data as it is collected
  ! Share hazards, not data
  ! Test apparent "one-offs" – get more data
! Taste the cake
  ! Test the outcomes
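One way to read "data which measures risk, not data as it is collected" is to weight raw event counts by severity, so the trend tracks risk rather than reporting volume. A hypothetical sketch with made-up weights and counts:

```python
# Hypothetical weighted risk performance measure: raw event counts are
# weighted by an assumed severity scale so the metric reflects risk,
# not just reporting volume. All weights and counts are illustrative.

severity_weight = {"minor": 1, "major": 10, "hazardous": 100}

# Event counts and exposure (flights) for two periods - made-up numbers.
periods = {
    "Q1": {"minor": 40, "major": 2, "hazardous": 0, "flights": 10_000},
    "Q2": {"minor": 25, "major": 4, "hazardous": 1, "flights": 10_000},
}

for name, data in periods.items():
    weighted = sum(data[sev] * w for sev, w in severity_weight.items())
    rate = weighted / data["flights"] * 1000  # weighted score per 1,000 flights
    print(f"{name}: raw events = {sum(data[s] for s in severity_weight)}, "
          f"weighted rate = {rate:.1f} per 1,000 flights")
```

With these numbers the raw event count falls from Q1 to Q2 while the weighted rate nearly triples: a metric built on data "as it is collected" would show improvement where risk has actually risen.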
Strategic Risk Management
• Broad issues (e.g. runway safety)
• Timeframe: years
• Global data excellent
• Assume you're the same unless it's clear you're not

Tactical Risk Management
• Hazard identification
• Timeframe: months
• Need a risk "model", e.g. bowtie, to understand the link to strategic
• Focus on performance monitoring – Data!

Mission Risk Management
• Underused
• Timeframe: single operation (flight, turnaround)
• Link to short-term data trends
• Encourage mission debriefs
Problem: Wrong mental models of operation
Solution: LOSA-like observations of actual operation; challenge through facilitation; include "others"

Problem: Lack of imagination (availability bias)
Solution: Include non-domain people; facilitated search pattern to ransack the space of the possible

Problem: Poor calculators
Solution: Data, data, data!; structured risk models (e.g. bowtie); weighted risk performance measures

Problem: Groupthink
Solution: Include "others"; facilitation

Problem: One-off events
Solution: Survey to test extent; share hazards
! Making a cake
  ! You can have the ingredients (Input)
  ! And the recipe (Process)
  ! You can actually make the cake (Output)
  ! But – what does it taste like? (Outcome)
! Risk management
  ! You have hazard identification elements (Input)
  ! You have a documented RM process (Process)
  ! You document risk control actions (Output)
  ! But have you actually managed the risk? (Outcome)
! Both regulators and operators need to focus on content
  ! Measure risk management performance
  ! Compare outputs across operators and domains
! If risk management processes fail to produce the desired outcomes (hazards identified and correctly assessed), the SMS will fail
! The human and social factors that make risk management hard are not that different from those we investigate in safety events. We should manage them the way we want human factors in the operation managed.
! The focus of practitioners should be on structuring and engineering risk management approaches to directly address the known failure modes
! The focus of regulators and operators should be on testing outcomes, both within and across organisational boundaries