Risk in Process Control: Do Not Forget Human Imagination
Prof. Patrick Millot
[email protected], [email protected]
P. Millot RTLCS March 5, 2014
Risk Taking in Life-Critical Systems
The present workshop, Risk Taking in Life-Critical Systems, results from a collaboration between HCDi at FIT (FL, USA) and LAMIH at the University of Valenciennes (France), funded by the Partner University Fund (PUF FACE).
It follows a first edition organized at the University of Valenciennes on July 1-5, 2013 by Patrick Millot, Guy A. Boy & Frédéric Vanderhaegen, devoted to Risk Management in Life-Critical Systems, as shown below.
A book is in press: Millot P. (Ed.) (to appear 2014). Risk Management in Life-Critical Systems, ISTE Ltd., London.
Organized by:
Patrick Millot, Guy A. Boy,
Frédéric Vanderhaegen
Secretary:
Philippe Polet, Marie Claude Rossillol
Véronique Landrain, Corine Aureggi
Life-critical systems belong to domains such as:
- transportation (trains, cars, aircraft, air traffic control),
- space exploration,
- energy (nuclear plants, chemical engineering),
- health and medical care,
- telecommunication networks,
- cooperative robot fleets, manufacturing,
- and services, leading to complex and time-critical issues regarding safety, efficiency and comfort.
Life-critical systems are potentially highly automated.
We focus on "human(s) in the loop" systems and simulations, taking advantage of the human ability to cope with unexpected dangerous events on the one hand, and attempting to recover from human errors and system failures on the other hand.
Our approach deals with Human-Computer Interaction and Human-Machine Systems.
Risk management deals with prevention, decision-making, action taking, crisis management and recovery, taking into account the consequences of unexpected events.
The approach consists of three complementary steps:
- prevention, where any unexpected event could be blocked or managed before its propagation;
- recovery, before the event results in an accident, making protective measures mandatory to avoid the accident or to minimize the damage;
- and, possibly, after the accident occurs, management of the consequences to minimize or remove the most severe ones.
Global crisis-management methods and organizations are considered.
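The three complementary steps can be read as a fallback chain. A minimal sketch (all function names are illustrative, not from the original):

```python
from typing import Callable


def manage_unexpected_event(prevention: Callable[[], bool],
                            recovery: Callable[[], bool],
                            consequence_mgmt: Callable[[], None]) -> str:
    """Fallback chain: prevention, then recovery, then consequence management.

    Each hypothetical handler returns True if it contains the event.
    """
    if prevention():
        return "event blocked before propagation"
    if recovery():
        return "accident avoided by protective measures"
    consequence_mgmt()  # accident occurred: minimize the most severe consequences
    return "accident: consequences managed"
```

The point of the sketch is only the ordering: recovery is attempted when prevention fails, and consequence management runs only once the accident can no longer be avoided.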
STRATEGY FOR RISK MANAGEMENT (figure): an unexpected event first meets Prevention (predicting & anticipating); if prevention fails, Recovery (short-term corrective action); if recovery also fails, the accident occurs and Consequence Management (consequence anticipation) takes over.
Through enhancing resilience:
- adapted methods & practices
- dedicated & adaptive tools
- flexible organization
- human intervention + imagination
An interesting definition of resilience is given by Amalberti (2014, this workshop).
STRATEGY FOR RISK MANAGEMENT (detailed figure):
- Prevention (predicting & anticipating): maintenance; supervision (monitoring, fault detection); barriers (norms, procedures); DSS with H-M cooperation; situation awareness.
- Recovery (short-term corrective action): procedures; DSS for event detection and compensation, with H-M cooperation; support systems for action correction (brakes, emergency stop...); situation awareness.
- Consequence Management (consequence anticipation): system robustness; emergency care; containment devices...
Human-Centered Design Approach for Risk Management
PARAMETERS RELATED TO HUMAN-MACHINE SYSTEMS:
- Technical aspects (complexity, reliability, criticality, safety, security...)
- Human aspects (physical, cognitive, social)
- Human-machine interaction (task, organization, situations)
PARAMETERS INFLUENCING H-M INTERACTION:
- Understanding the system complexity
- Understanding the human complexity
- Levels of automation, autonomy, authority
HUMAN OPERATOR ROLES:
- Negative role = human error
- Positive role = human capability to detect and recover from technical as well as human errors
Levels of Automation (LOA) and AUTHORITY (Sheridan, 1992)
1. The computer offers no assistance; the human must do it all.
2. The computer offers a complete set of action alternatives, and
3. narrows the selection down to a few, or
4. suggests one, and
5. executes that suggestion if the human approves, or
... the human may not always be maintained as the final authority ...
6. allows the human a restricted time to veto before automatic execution, or
6.5. executes automatically after telling the human what it is going to do, or
7. executes automatically, then necessarily informs the human, or
8. informs him after execution only if he asks, or
9. informs him after execution if it, the computer, decides to.
10. The computer decides everything and acts autonomously, ignoring the human.
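Sheridan's scale can be captured as a small lookup, with the authority split at level 6 following the revisited reading. A sketch (the dictionary and function names are illustrative):

```python
# Sheridan's 10 levels of automation, keyed by level number
# (6.5 is the intermediate level of the slide).
LOA = {
    1: "computer offers no assistance; human must do it all",
    2: "computer offers a complete set of action alternatives",
    3: "narrows the selection down to a few",
    4: "suggests one",
    5: "executes that suggestion if the human approves",
    6: "allows the human a restricted time to veto before automatic execution",
    6.5: "executes automatically after telling the human what it is going to do",
    7: "executes automatically, then necessarily informs the human",
    8: "informs the human after execution only if asked",
    9: "informs the human after execution if the computer decides to",
    10: "computer decides everything and acts autonomously, ignoring the human",
}


def final_authority(level: float) -> str:
    """Below level 6 the human keeps final authority; from 6 upward it shifts to the machine."""
    return "human" if level < 6 else "machine"
```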
Levels of Automation (LOA) and AUTHORITY (revisited by Moray, Inagaki & Itoh, 2000): the same ten-level scale, read as an authority scale. Up to level 5 the authority remains with the human; from level 6 upward it shifts to the machine.
Example of car driving and road-safety management:
ABV project: automated driving at low speed (below 50 km/h)
HUMAN OPERATOR ROLES
- Negative role = human error
- Positive role = can detect and recover from human errors as well as technical errors
Automation dilemma (adapted from G. Boy)
Automation dilemma:
- Balanced level of automation
- Decision Support System (DSS)
- Human-DSS cooperation
- Situation awareness
SITUATION AWARENESS: SA model adapted from Endsley (1995) (figure). The state of the environment feeds three SA levels: perception of the elements (SA1), comprehension of the situation (SA2) and projection of future states (SA3). SA then drives decision and action execution, which in turn modifies the state of the environment. SA is shaped by goals & objectives and expectations, by individual factors (information-processing mechanisms, long-term memory stores, automaticity, abilities, experience, training) and by task and system factors (system complexity, interface design, stress & workload, automation).
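Endsley's three levels can be sketched as a staged pipeline; a minimal illustration (the class and the toy perceive/comprehend/project functions are assumptions, not part of the model itself):

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class SituationAwareness:
    """Three SA levels built in sequence from environment data."""
    perceive: Callable[[dict], dict]    # SA1: perception of the elements
    comprehend: Callable[[dict], str]   # SA2: comprehension of the situation
    project: Callable[[str], str]       # SA3: projection of future states

    def assess(self, environment: dict) -> tuple:
        elements = self.perceive(environment)     # SA1
        situation = self.comprehend(elements)     # SA2
        future = self.project(situation)          # SA3
        return elements, situation, future


# Toy usage: a single monitored temperature (illustrative thresholds)
sa = SituationAwareness(
    perceive=lambda env: {"temp": env["temp"]},
    comprehend=lambda e: "rising" if e["temp"] > 100 else "stable",
    project=lambda s: "alarm expected" if s == "rising" else "nominal",
)
```

The staging is the point: each level consumes only the output of the previous one, which mirrors the SA1 → SA2 → SA3 chain of the model.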
Team Situation Awareness (Salmon et al., 2008) (figure). Taskwork: each team member (A, B, ..., N) builds an individual SA from data through SA development (perception, comprehension, projection). Teamwork: team processes (communication, collaboration, co-operation, shared mental models) combine each member's individual SA, the SA of the other team members and the SA of the entire team into a common/shared picture: the team SA.
Automation dilemma:
- Balanced level of automation
- Decision Support System (DSS)
- Human-DSS cooperation
- Situation awareness
- ... & human imagination
Besides technical supports & H-M cooperation ... enhance human situation awareness & human imagination ...
An approach to DSS for a nuclear power plant simulator ... revisited ...
Alliance Project: the process, an NPP simulator with 700 instrumented variables (figure): reactor simulator, sodium loop, cooler, steam generator, water-steam loop.
Supervision team in the control room:
- 1 chief engineer, head of the team (KBB: deep knowledge in nuclear physics and control engineering)
- 2 technicians (RBB: good knowledge of procedures)
- 1 operator making rounds in the plant (rondier)
- Work schedule: 3 shifts of 8 hours/day, 7 days/week
Two parts of the human operators' supervision tasks (what they are supposed to do!):
1) Monitoring (surveillance), in order to detect a defect
2) Trouble shooting:
- Perception of relevant information
- Diagnosis (understanding what happens)
- Prognosis (projection of future states)
- Decision making (choose or formulate an action or a sequence of actions ... or invent it)
- Application of the action
- ... and monitoring of the action's effect (return to the top)
The same decisional steps map onto the SA levels: perception of relevant information = SA(1), diagnosis = SA(2), prognosis = SA(3).
Variable temporal evolution and alarm thresholds for monitoring and fault detection (figure). The monitored variable P04(t) evolves around its set value P0A between four thresholds: LEST (Low Emergency Stop Threshold) < LAT (Low Alarm Threshold) < set value < HAT (High Alarm Threshold) < HEST (High Emergency Stop Threshold), over time t (min).
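The four thresholds define a simple monitoring test; a minimal sketch (the function name and the numeric thresholds below are illustrative):

```python
def monitor(var: float, lest: float, lat: float, hat: float, hest: float) -> str:
    """Classify a variable reading against the four alarm thresholds:
    LEST < LAT < set value < HAT < HEST."""
    assert lest < lat < hat < hest
    if var <= lest or var >= hest:
        return "emergency stop"
    if var <= lat or var >= hat:
        return "alarm"
    return "normal"


# Illustrative thresholds around a set value of 100
assert monitor(100, 80, 90, 110, 120) == "normal"
assert monitor(112, 80, 90, 110, 120) == "alarm"
assert monitor(125, 80, 90, 110, 120) == "emergency stop"
```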
Effect of a malfunction on variable Var(t) (figure). A malfunction occurs; Var(t) drifts away from its set value and HAT risks being crossed ... alarm.
Alliance Project: Decision Support System (DSS) functionalities:
- Prediction capabilities through a process-model simulation
- Provision of a diagnosis and preventive advice based on qualitative physics principles (Gentil & Montmain, 2004)
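The prediction capability can be illustrated by a linear-trend extrapolation estimating when HAT would be crossed. This is only a sketch: the actual DSS relies on a process-model simulation, not this least-squares fit, and all names are illustrative.

```python
def time_to_threshold(samples, hat):
    """Estimate when the linear trend of (t, value) samples crosses HAT.

    Least-squares slope/intercept over the samples; returns None if the
    trend is flat or moving away from the threshold.
    """
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_v = sum(v for _, v in samples) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in samples)
    den = sum((t - mean_t) ** 2 for t, _ in samples)
    slope = num / den
    if slope <= 0:
        return None
    intercept = mean_v - slope * mean_t
    return (hat - intercept) / slope


# Variable rising 2 units/min from 100: HAT = 110 is reached at t = 5
assert time_to_threshold([(0, 100), (1, 102), (2, 104)], 110) == 5.0
```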
DSS for trouble shooting (figure). A supervision and control computer links the process to the human supervision team: process data feed a process data base and the DSS fact base; an HMI animation program presents animated images and advice to the team; the team sends back navigation commands and requests, and control actions to the process.
Human-DSS cooperation: a vertical structure (figure). The human operator (know-how, decisions) receives the objectives and compares them (+/-) with the process performance; he sends orders to the automated process, which yields the production, and assistance requirements to the DSS (know-how), which returns advice.
Scenario of an unknown malfunction on Var(t) (figure). A malfunction occurs; Var(t) leaves its set value and HAT is crossed ... alarm (thresholds HEST, HAT, LAT, LEST over t (min)).
Monitoring view of the process operation and fault detection (1) (figure): a regular star view of the 7 significant variables (P02, TD03, Q01, T03, Tna15, Dna4, Tna134) indicates the normal operation mode.
Monitoring view of the process operation and fault detection (2) (figure): an irregular star view of the same 7 variables indicates an abnormal operation mode.
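The star view amounts to a regularity test on normalized variables: the polygon stays regular while every variable sits near its nominal value. A sketch (the variable names come from the slide; the normalization and the 5% tolerance are assumptions):

```python
def star_view_mode(readings: dict, nominal: dict, tolerance: float = 0.05) -> str:
    """Normal if every variable stays within +/- tolerance of its nominal value
    (the star polygon remains regular); abnormal otherwise."""
    for name, value in readings.items():
        if abs(value - nominal[name]) / nominal[name] > tolerance:
            return f"abnormal ({name} deviates)"
    return "normal"


# The 7 significant variables, normalized so that nominal = 1.0 (illustrative)
nominal = {"P02": 1.0, "TD03": 1.0, "Q01": 1.0, "T03": 1.0,
           "Tna15": 1.0, "Dna4": 1.0, "Tna134": 1.0}
```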
H-M cooperation structure: a vertical one (figure). The same vertical structure, but the human operator's decisions and the DSS advice may diverge: possible conflict?
H-M cooperation structure: a vertical one (figure). To handle such conflicts, a know-how-to-cooperate layer is added between the human operator and the DSS.
Conflict solving by the human, supported by the DSS (figure). Steps of the operator's decisional pathway: alarm or detection of an abnormal state → observation of information → identifying the system state → predictions → evaluation of alternatives → definition of the task → selection of modifications → definition of a procedure → execution of the procedure. A rule-based behaviour shortcut links diagnosis directly to the selection of the appropriate procedure; a skill-based behaviour shortcut links alarm detection directly to execution. In parallel, the reasoning process of the DSS (process supervision, set of relevant variables, diagnosis) is presented through the human-DSS interface as a warning, a set of observations, the system state, a general strategy, a task, a procedure and a justification. Conflicts? If yes, the interface supports looking for consensus points along the decisional pathway, synthesizing the conflict pathway between the HO and the DSS (know-how KH_i, KH_j and know-how-to-cooperate KHC_ij).
1st justification level: propagation view. Defect propagation network on a simplified process synopsis (figure).
2nd justification level: significant variables. Historic and prediction of P04(t) (figure).
The same decisional-pathway figure, applied at the appearance of a disturbance: the DSS imagery synthesis supports the operator along the pathway.
But what happens when neither the DSS nor the human has a solution? ... The human must deal with the problem! ... S/he must develop a strategy for risk taking & invent a solution.
Scenario of a malfunction not solved by the DSS (1) (figure): a malfunction occurs and HAT is crossed.
Scenario of a malfunction not solved by the DSS (2) (figure): the malfunction persists and HEST risks being crossed: increasing risk for the system, and increasing risk for the human operators.
Scenario of a malfunction not solved by the DSS (3) (figure): the HO has no solution but adopts a strategy to save time: a command to stabilize Var(t) below HEST.
Scenario of a malfunction not solved by the DSS (4) (figure): the HO applies the strategy: the stabilize-to-save-time command holds Var(t) below HEST.
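The HO's time-saving strategy amounts to holding Var(t) away from HEST with a corrective command. A sketch, where the proportional hold is an assumption, not the actual operator command:

```python
def stabilize(var: float, setpoint: float, hest: float, gain: float = 0.5) -> float:
    """One step of a stabilizing command: pull Var(t) back toward the set value,
    buying time to find (or invent) a solution before HEST is crossed."""
    correction = gain * (setpoint - var)  # proportional pull toward the set value
    new_var = var + correction
    return min(new_var, hest - 1e-9)  # never let the emergency stop threshold be crossed


# Var(t) drifting toward HEST = 120 (illustrative values)
var = 118.0
for _ in range(5):
    var = stabilize(var, setpoint=100.0, hest=120.0)
```

Each step halves the distance to the set value, so the variable is held safely below HEST while the operator thinks.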
2nd example: Air Traffic Control
A cooperative function allocation between human controllers and a dedicated DSS showed:
- DSS and a coherent cooperation with the humans increase performance: air misses (errors) were divided by 2 ... but errors remain.
- A cognitive analysis of the controllers' activities showed:
  - the controllers do not lack expertise: they always know the solution;
  - but, as they solve problems in anticipation, the risk is that they forget a problem, i.e. fail to apply the solution at the right time.
Lessons & discussion
- Even with support systems and experts in the supervision loop, systems have limits and risk does not disappear.
- In process control:
  - managing risks needs humans in the loop;
  - designers should take the operators' real needs into account when designing;
  - designers should allow more liberty to human imagination.
References (1)
Boy, G.A. & Grote, G. (2011). The Authority Issue in Organizational Automation. In G. Boy (Ed.), Handbook for Human-Machine Interaction. Ashgate Publishing Ltd, Farnham, Surrey, England.
Debernard, S., Hoc, J.-M. (2001). Designing Dynamic Human-Machine Task Allocation in Air Traffic Control: Lessons Drawn From a Multidisciplinary Collaboration. In M.J. Smith, G. Salvendy, D. Harris, R. Koubek (Eds.), Usability Evaluation and Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality, volume 1. London: Lawrence Erlbaum Associates, pp. 1440-1444.
Gentil, S., Montmain, J. (2004). Hierarchical representation of complex systems for supporting human decision-making. Advanced Engineering Informatics, Elsevier, vol. 18, pp. 143-159.
Inagaki, T. (2006). Design of human-machine interactions in light of domain-dependence of human-centered automation. Cognition, Technology and Work, vol. 8, issue 3, pp. 161-167.
Millot, P., Mandiau, R. (1995). Men-Machine Cooperative Organizations: Formal and Pragmatic Implementation Methods. In Hoc, Cacciabue, Hollnagel (Eds.), Expertise and Technology: Cognition & Computer Cooperation. Lawrence Erlbaum Associates, New Jersey, chap. 13, pp. 213-228.
References (2)
Millot, P., Hoc, J.-M. (1997). Human-Machine Cooperation: Metaphor or possible reality? European Conference on Cognitive Sciences, ECCS'97, Manchester, UK, April.
Millot, P., Lemoine, M.-P. (1998). An attempt for generic concepts toward human-machine cooperation. IEEE SMC'98, San Diego, USA.
Millot, P., Debernard, S., Vanderhaegen, F. (2011). Authority and cooperation between humans and machines. In G. Boy (Ed.), Handbook for Human-Machine Interaction. Ashgate Publishing Ltd, Farnham, Surrey, England.
Millot, P., Pacaux-Lemoine, M.-P. (2013). A Common Work Space for a mutual enrichment of Human-Machine Cooperation and Team Situation Awareness. 12th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems, Las Vegas, Nevada, USA, August 11-15.
Millot, P. (Ed.) (2013). Ergonomie des Systèmes Homme-machine: conception et coopération. 387 p., Hermès-Lavoisier, Paris.
Millot, P. (Ed.) (to appear 2014). Risk Management in Life-Critical Systems. ISTE Ltd., London.
References (3)
Moray, N., Inagaki, T., Itoh, M. (2000). Situation adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, vol. 6, no. 1, pp. 44-58.
Pacaux-Lemoine, M.-P., Debernard, S. (2002). Common work space for human-machine cooperation in air traffic control. Control Engineering Practice, 10 (2002), pp. 571-576.
Rajaonah, B., Tricot, N., Anceaux, F., Millot, P. (2008). Role of intervening variables in driver-ACC cooperation. International Journal of Human-Computer Studies, 66.
Schmidt, K. (1991). Cooperative work: a conceptual framework. In J. Rasmussen, B. Brehmer & J. Leplat (Eds.), Distributed decision-making: cognitive models for cooperative work. Chichester, UK: John Wiley and Sons, pp. 75-110.
Sentouh, C., Popieul, J.-C. (2014, to appear). Human-Machine Interaction in the Automated Vehicle: The ABV Project. In P. Millot (Ed.), Risk Management in Life-Critical Systems. ISTE Ltd., London.