Blame Culture, No-Blame Culture and Just Culture
Keith Grint & Clare Holt
According to the World Health Organization (WHO)…
You have a 1 in 10 million chance of dying in a plane crash
You have a 1 in 300 chance of dying from a healthcare error in hospital
(The Times, 22/7/11)

“The operator of an aircraft, the surgeon performing an operation, must all foresee that their acts might cause death; but we should not describe them as reckless unless the risk taken was unjustifiable.” (Smith & Hogan, 1975, Criminal Law)
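Taking the two figures above at face value, the gap is worth spelling out; a back-of-envelope comparison (mine, not the slides’):

\[
\frac{1/300}{1/10{,}000{,}000} = \frac{10{,}000{,}000}{300} \approx 33{,}000
\]

On these numbers, dying from a healthcare error in hospital is roughly 33,000 times more likely than dying in a plane crash.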
What happens when it all goes pear-shaped?
CLASSIFICATION OF ERRORS: BASIC ERROR TYPES

UNSAFE ACT
  Unintended action
    SLIP – attentional failures: intrusion, omission, mistiming, etc.
    LAPSE – memory failures: forgetting, omission, place-losing
  Intended action
    MISTAKE
      Rule-based: misapplication of a good rule; application of a bad rule
      Knowledge-based: many variables; untested process
    VIOLATION – routine violations, exceptional violations, acts of sabotage

Taken from ‘Human Error’, James Reason (1990, 2009), p. 207
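Reason’s taxonomy is effectively a small decision tree, so it can be shown as one. A minimal sketch in Python; the function and its input labels are invented for illustration, only the taxonomy itself comes from the slide:

```python
# Reason's basic error types rendered as a decision procedure.
def classify_unsafe_act(action_intended: bool, failure: str) -> str:
    """Classify an unsafe act; `failure` is an illustrative label:
    'attention', 'memory', 'planning', or 'deliberate'."""
    if not action_intended:
        # Unintended action: the plan was fine, execution failed.
        if failure == "attention":
            return "SLIP (attentional failure)"
        if failure == "memory":
            return "LAPSE (memory failure)"
    else:
        # Intended action: it went as planned, but the plan
        # (mistake) or the intent (violation) was wrong.
        if failure == "planning":
            return "MISTAKE (rule-based or knowledge-based)"
        if failure == "deliberate":
            return "VIOLATION (routine, exceptional, or sabotage)"
    return "unclassified"

# Grabbing the wheel lever when you meant the flap lever is an
# execution failure on a sound plan: a slip.
print(classify_unsafe_act(action_intended=False, failure="attention"))
```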
BLAME CULTURE (1/2): the “sweep it under the carpet” school of management

You’ve made a mistake.
Will it show?
  NO → Bury it.
  YES → Can you hide it?
    YES → Conceal it before somebody else finds out.
    NO → Can you blame someone else, special circumstances or a difficult client?
      YES → Get in first with your version of events.
      NO → Could an admission damage your career prospects?
        YES → Sit tight and hope the problem goes away.
All routes lead to the same box: problem avoided.
BLAME CULTURE (2/2): the “sweep it under the carpet” school of management

The same flowchart – bury it, conceal it, blame someone else, or sit tight – but with the final box relabelled to show the real outcome:
Personal responsibility avoided; the organization continues to fail; no one seems to know why….
No-BLAME CULTURE (1):

You’ve made a mistake.
Will it show?
  YES → No need to hide it. It wasn’t your fault; it was probably the fault of the system. Admit it.
  NO → Ignore it.
Outcome: personal responsibility avoided; the organization continues to fail; no one seems to know why….
No-BLAME CULTURE (2):

You’ve made another mistake.
Will it show?
  YES → No need to hide it. It wasn’t your fault; it was probably the fault of the system. Admit it.
  NO → Ignore it.
Outcome: personal responsibility avoided; the organization continues to fail; no one seems to know why….
No learning!
JUST CULTURE:

You’ve made a mistake.
Will it show? YES or NO – it makes no difference:
  No need to hide it. It could be partly your fault, but it is likely that other factors are also involved, and you have a responsibility to prevent it happening again.
  Admit it → report it through the appropriate channels → it is investigated → organizational learning occurs, and the information is fed back to the individual as well as to the organization.
Outcome: personal responsibility taken; the organization continues to improve – everyone knows why….
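The three cultures route the same event very differently; a minimal sketch of those routes, with all function and label names invented for illustration:

```python
# The three flowcharts as handling paths for the same event.
def handle_mistake(culture: str, will_show: bool) -> list:
    if culture == "blame":
        # Conceal, bury, blame, or sit tight: all routes dead-end.
        return ["hide it somehow", "responsibility avoided, no learning"]
    if culture == "no-blame":
        # Admit only what shows; the system is always at fault.
        step = "admit it (system's fault)" if will_show else "ignore it"
        return [step, "responsibility avoided, no learning"]
    if culture == "just":
        # Visibility is irrelevant: the path is the same either way.
        return ["admit it",
                "report through the appropriate channels",
                "investigate",
                "feed lessons back to the individual and the organization"]
    raise ValueError("unknown culture: " + culture)

for step in handle_mistake("just", will_show=False):
    print("-", step)
```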
Just Culture: A Brief Theoretical Overview from ‘Accident’ Theory
In the beginning....
1. Human error (first story accounts)
2. Sequence of events
3. Systems (second story accounts: tight/loose coupling, icebergs, hard shell/soft shell, process/SOPs)
   a. Latent failure / Swiss cheese model
   b. Normal accident
   c. Just culture
1. HUMAN ERROR (First Story Accounts – the initial assumption)

The biggest personnel problem for the US military was getting the right people into the right jobs – a problem of selection, fixed through a competency framework.
1943: P-47s & B-17s keep crashing – wheels are retracted on landing instead of flaps.
It cannot be the planes – look at how robust they are – it must be the people.

[Photos: a B-17 before and after – B-17G-80BO 43-38172, 8th AF, 398th BG, 601st BS, damaged on a bombing mission over Cologne, Germany]

It must be HUMAN ERROR – so, what’s wrong with our pilots?
Alphonse Chapanis asked a different question: how come P-47 pilots make the same error but C-47 pilots don’t?

[Photos: P-47 Thunderbolt; its flap & wheel controls]
1. HUMAN ERROR

[Photo: C-47/DC-3]

C-47s don’t have side-by-side wheel and flap controls with identical levers & coloured toggle switches.
Chapanis’s fix: mark the wheel lever with a wheel & the flap lever with a triangle – a significant reduction in landing ‘accidents’.
HUMAN ERROR is just one possible explanation – a 1st story account. Such mistakes are likely to recur because of the connection between the human and the system.
1. HUMAN ERROR

Folk myth: systems are 100% reliable – as long as they are protected from human error.
Reification: the system is treated as an objective, stable & predictable ‘thing’ – not a moving mass of stuff.
Response: eliminate human error, especially in high-risk organizations.
Consequence: the system becomes more calcified and brittle – it allows less, not more, learning.
1. HUMAN ERROR

HARD SHELL vs SOFT SHELL: exogenous vs endogenous organization
Hard shell – externally strong, process-driven but brittle: a system designed to prevent error.
Soft shell – externally weak but flexible: a system with built-in resilience via the capacity to learn & rectify error.

Is the safety system hard or soft – prevention or recovery?
2. Sequence of events model (Heinrich, 1931) – the domino-run model

Events preceding an accident occur in a linear, fixed order, with the accident being the last in the sequence.
Solution: a sequence of barriers to reduce the hazard, absorb energy & prevent the accident.

Space Shuttle Columbia, 2003: a piece of foam strikes the wing on launch, breaching the thermal protection; on re-entry, superheated air melts the wing, which breaks off.
Solution (on this model): reinforce the wing.
The hindsight problem – the view from the investigator: a single track leading from a safe present to the critical future.
“A map that shows only those forks in the road that we decided to take” (Lubar, 1993: 1168, History from Things, Smithsonian Institution)
Sequence of Events Model: the hindsight problem – the view from the decision-maker: the present branches into many possible futures, with no way of knowing in advance which fork matters.
2. Sequence of events model (Heinrich, 1931) – the domino-run model

The hole in the wing was produced not simply by debris but by holes in organizational decision-making.
3. Systems Approaches / Second Stories

Second story accounts
• human ‘errors’ are products – symptoms – of system complexity; where a 1st story account settles on a single human cause, there are usually multiple causes (the 2nd story account)
• safety or success is less the consequence of perfect process and more the consequence of people’s operational practice
Second story accounts
• sharp end – practitioners who directly interact with a hazardous process
• blunt end – regulators, administrators & managers who provide the resources & constraints that practitioners have to integrate
• success & failure are a result of how sharp-end practitioners cope with complexity & how their actions are shaped by the resources & constraints set at the blunt end
Bricoleurs (Lévi-Strauss): people who achieve success by stitching together whatever is at hand – whatever needs stitching together to ensure practical success.

Bricoleurs & the possibility of rescue: first responders to the flooding in New Orleans (Kroll-Smith et al., 2007, Journal of Public Management & Social Policy, Fall)

The CPR (cardiopulmonary resuscitation) paradox: 5 trainee paramedics and 1 experienced paramedic were filmed performing CPR; the film was shown to three groups, who were asked to pick out the experienced one.
1. Experienced paramedics got it right 90% of the time
2. Students, 50%
3. Instructors, 30%
Why? Instructors follow training protocols; experienced paramedics know that the protocols don’t always work.

Training vs education? Bricoleurs can be undermined by over-relying on protocols. First responders in New Orleans were left to their own devices.
St Claude Bridge

People sheltered on the bridge, but the water rose rapidly. A police officer went to the National Guard base near the bridge and asked a colonel for buses to rescue the people. The colonel refused but said he would ask his general – though he wasn’t sure where the general was ... No buses left the depot.
One ambulance driver carried 42 people in one go.
A police officer commandeered (stole) a refrigerator truck and siphoned (stole) diesel from abandoned vehicles to keep it running, to feed 100 people for days.
Second story accounts – cont.
• All complex systems contain weaknesses, but these are usually transcended – stopped – by the safety-seeking actions of individuals
• Multiple weaknesses exist in all complex systems, but failure tends to occur when the weaknesses occur simultaneously
• The search for a single cause inhibits our understanding
• To understand failure you must first understand success – how people at the sharp end learn & adapt to create safety or success in a world fraught with hazards, trade-offs & multiple goals
Iceberg model:
1 accident
10 incidents
30 near misses
600 unsafe acts
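Read as rough relative frequencies (my reading, not stated on the slide), the iceberg numbers imply:

\[
P(\text{accident} \mid \text{unsafe act}) \approx \frac{1}{600}, \qquad
\frac{\text{near misses} + \text{incidents}}{\text{accidents}} = \frac{30 + 10}{1} = 40
\]

For every accident there are about 40 reportable precursor events – the learning material that open reporting is meant to capture.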
The obvious inference: reduce the unsafe acts to reduce the accidents.
But US air data suggests that the airlines with the most reported incidents & near misses have the lowest number of accidents:
30,000 near-miss/trivial reports per annum in US aviation, and almost no catastrophic crashes.
The ability to learn is critical to safety, because you cannot build a completely safe system –
a passenger would have to fly for 19,000 years to die in a plane crash.
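The 19,000-year figure is consistent in order of magnitude with the 1-in-10-million risk quoted at the start; a rough check, assuming (my assumption, not the slides’) about one flight per day:

\[
\frac{10^{7}\ \text{flights}}{365\ \text{flights/year}} \approx 27{,}000\ \text{years}
\]

The published figure presumably assumes a somewhat higher per-flight risk or flight frequency, but the point stands: at this level of safety, improvement can only come from learning, not from counting accidents.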
3a. Latent Failure Model – the Swiss Cheese Model (Reason, 1990)

Some of the factors that contribute to disaster are latent – present before the disaster: ‘hidden pathogens’ (Reason).
Active failures: unsafe acts by people at the sharp end – errors that are quickly apparent.
Latent failures: features that lie dormant & only become evident when they combine and are triggered – created by people at the blunt end.

‘People at the sharp end – operators – are not usually the cause of the accident but the inheritors of system defects created by poor design, incorrect installation, faulty maintenance & bad management decision’ (Reason, 1990: 173, Human Error)

Safety-critical systems have a series of barriers to prevent, limit or absorb danger.
But each barrier has holes in it – imperfections – and when all the holes line up and are penetrated, disaster occurs.
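The ‘holes lining up’ intuition can be made quantitative. A minimal sketch, with all probabilities invented for illustration: independent barriers multiply protection, but a latent common cause (a hidden pathogen degrading every barrier at once) erodes it.

```python
import random

def breach_probability(hole_probs, p_latent=0.0, trials=200_000):
    """Estimate P(all barriers penetrated) by simulation.
    p_latent: chance a latent condition doubles every hole probability."""
    breaches = 0
    for _ in range(trials):
        degraded = random.random() < p_latent
        probs = [min(1.0, 2 * p) if degraded else p for p in hole_probs]
        if all(random.random() < p for p in probs):
            breaches += 1
    return breaches / trials

barriers = [0.1, 0.1, 0.1, 0.1]  # four imperfect, independent barriers

print(breach_probability(barriers))                # ~0.1**4 = 1e-4
print(breach_probability(barriers, p_latent=0.2))  # ~4e-4: ~4x worse
```

Each barrier still looks individually reliable in the second case; only the whole-system view reveals the extra risk.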
Build an error-tolerant system with a long recovery interval.
If the elimination of error is impossible, we must build a system that enhances error recovery.
How good is the system at recognizing & responding to disturbances?
3b. Normal Accident Theory (Perrow)

Multiple safety systems add complexity and increase opacity: when things start to go wrong, it is difficult to see what is happening or to act appropriately.
The whole system is involved – not a single component failure or set of component failures, but the unanticipated interaction of a multitude of events in a complex system.
Accidents are not unusual events but normal events, given the complexity & tight coupling of the system.
3c. What is meant by ‘Just’? (Dekker, 2007)

Balancing safety with accountability. A JUST CULTURE:
– satisfies the demands for accountability
– contributes to learning and improvement
– does not punish ‘unintended’ errors that are part of the professional role of the individual,
BUT intentional violations and destruction are not tolerated.
What is just? What isn’t just? Where do you draw the line? Who draws the line? What is the line?

‘It’s not obvious, but it needs to be a judgement by the organisation, looking at politics, power and populism.’ (Dekker, 2011)
i.e. individuals should be included in the decision!
There needs to be some trust to encourage honesty, but what is acceptable/unacceptable needs to be clear. Some organisations are subject to regulatory bodies; some set up their own safety boards and ethics committees.
Intentional → violation; unintentional → slip or lapse.
Safety culture = just + open. Violations can be linked to culture.
A ‘no blame’ culture is neither feasible nor desirable – nor is it accountability-free.
You need to look ahead to improve (accountability), not blame the past. To encourage a ‘safety culture’ and hold individuals accountable, they must be given an appropriate level of discretion – ‘a culture of balance’.
“I’ll get away with it! Everyone does it; they’ll just turn a blind eye.”
OPEN REPORTING

Openness in reporting means providing an environment in which individuals will report even trivial near misses.
These are the signals of latent system failures that fester and can have catastrophic consequences!
Front-line professionals are the people best placed to help with future prevention – help that is lost if they are treated like criminals!
OPEN REPORTING

Open reporting requires:
– honest disclosure and transparency
– easy submission, with some immunity
– visible action, to build confidence that reports matter
– lessons learned (training, change in SOPs, etc.)
– actions and lessons disseminated (if possible across an industry!)
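As a concrete (and entirely illustrative) rendering of those requirements, a reporting record might look like the sketch below; every field and name is an assumption, not from the slides.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class NearMissReport:
    """Illustrative open-reporting record (all field names assumed)."""
    reported_on: date
    description: str                          # honest disclosure
    reporter: Optional[str] = None            # anonymity = some immunity
    actions_taken: List[str] = field(default_factory=list)  # seen to be actioned
    lessons: List[str] = field(default_factory=list)        # lessons learned
    shared_with: List[str] = field(default_factory=list)    # dissemination

report = NearMissReport(
    reported_on=date.today(),
    description="Flap and gear levers confused during a simulated landing",
)
report.actions_taken.append("Shape-code the two levers (Chapanis-style fix)")
report.lessons.append("Control confusion is a design problem, not a pilot problem")
report.shared_with.append("entire fleet")    # across the industry, if possible
```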
Things to bear in mind…

Health care, aviation, petrochemical, nuclear professionals, etc. all have a STRONG SAFETY ETHIC.
Criminalization of an unintended error can hamper a safe & just culture – it only encourages people to ‘hide’ their mistake(s).
‘Dispensing mistakes [in healthcare] happen. And even with the introduction of robots and SOPs, the Utopian ideal of a world without errors is closer to fantasy than reality.’ (Chapman, 2009, ‘A criminal mistake?’)
Conclusions:

Stop looking for psychological error mechanisms – 1st story accounts; stop blaming HUMAN ERROR.
Systematic features of the environment can trigger predictable actions that lead to ‘error’.
Safety is less a feature of the system and is better understood as being created by people in complex systems.
Are systems safe & therefore in need of protection from unreliable humans? Or does the elimination of human ‘unreliability’ make the system more brittle, so that the sources of resilience are eliminated?
‘The enemy of safety is not the human: it is complexity’ (Woods et al., 2010: 1, Behind Human Error, Ashgate)
A safe, just and learning culture can be striven for but is rarely fully attained.
It is the process that is important!