Safety Integrity AB
SOFTWARE SAFETY
We provide SERVICES, EDUCATION, SAFETY MANAGEMENT TOOLS, DOCUMENTATION TEMPLATES, and CHECKLISTS.
We are experts on the standards: IEC 61508 and its derivatives, e.g., EN 50128 and ISO 26262.
We provide SERVICES as
− Independent SAFETY ASSESSORS
− SAFETY PROCESS OPTIMIZERS
− SAFETY MENTORS
We offer TRAINING in
− SAFE and RELIABLE SOFTWARE DEVELOPMENT, as well as
− Tailored courses on how to put together SAFETY CASES complying with the standards IEC 61508, EN 50128 and ISO 26262
Biography
Dr. Henrik Thane
− Founded Safety Integrity AB in 2009; works as a software safety assessor/auditor
− Member of the national committees for IEC 61508 and EN 50128
− Product Manager at ENEA, responsible for all operating systems and tools
− CEO of ZealCore; co-founded ZealCore in 2001, acquired by ENEA in 2008
− Associate Professor (Docent) at Mälardalen Real-Time Center until 2008
− Ph.D. from the Royal Institute of Technology in Stockholm, 2000
− In addition to research, I have over the last 14 years worked as an expert consultant for industry and given numerous industrial courses on the design and test of software in safety-critical computer-based systems.
Outline
What is safety?
Functional Safety Standards
Process requirements on Verification
Process requirements on Test
Summary
Definition: Safety
Safety is the absence of unacceptable risk
That is…
A system is safe if the risk associated with the system is acceptable
[Figure: Total risk vs. residual risk – unacceptable risks are either eliminated or controlled; the residual risk consists of acceptable and unidentified risks.]
What computer software is dangerous?
• Software that controls or monitors safety-critical hardware is potentially dangerous ("hazardous")
• Hazards can cause accidents/damage to:
  − Human/animal – train crash
  − Equipment/vehicle – space probe goes missing
  − Facilities/environment – nuclear power plant accident
• Software that does not control or monitor "real" processes in real time is usually not hazardous
• Consequently, software is only hazardous in a context.
[Figure: A control or monitoring system connected to a machine via sensors and actuators.]
When does a system fail?
Fault → Error → Failure
A fault is executed (fault execution), infects the program state with an erroneous value (fault infection), and propagates to an observable failure (fault propagation). The delay between fault execution and failure is the incubation time.
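The fault → error → failure chain can be illustrated with a minimal hypothetical sketch (the constant, function names and shutdown behavior below are invented for illustration, not taken from the slides):

```python
GAIN = 0.1  # FAULT (dormant): suppose the correct value would be 0.01

def sample(raw):
    # Fault execution: the wrong constant is used, infecting the
    # computed value (error).
    return raw * GAIN

def control_loop(readings, limit=5.0):
    total = 0.0
    for r in readings:
        total += sample(r)      # the error accumulates in the state
        if total > limit:       # failure: observable wrong behavior
            return "SHUTDOWN"   # (here: a spurious shutdown)
    return "OK"

# Incubation time: the iterations between the first faulty
# computation and the observable failure.
print(control_loop([10, 10, 10, 10, 10, 10]))  # prints "SHUTDOWN"
```

With the correct gain the same readings would stay far below the limit; the fault lies dormant until its result finally crosses an observable threshold.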
Definition: Functional Safety
Absence of unacceptable risk due to hazards caused by
» malfunctioning behavior of E/E systems
…or, in other words, the risk for
Failure → Hazard → Accident
…is acceptable.
Build a safe system
Development and Operation:
1. What hazards exist?
   − Rank them
   − What can cause the hazards?
2. Identify requirements for safe design (Safety Goals)
   − Eliminate specific hazards
   − Control hazards that cannot be eliminated
3. Evaluate the appropriateness of the safety measures
4. Supply evidence for quality assurance
5. Evaluate planned modifications
6. Examine accidents and incidents
Prove it is safe – the Safety Case
1. How safe is the system? Arguments and evidence
2. Evaluate the threat from non-eliminated hazards
Safe Software Objectives: 1) build a safe system, 2) prove it is safe
Functional Safety Standards
IEC 61508 (SS-EN IEC 61508, 2001 & 2010) and its sector derivatives:
− Railroad: EN 50126 / EN 50129 / EN 50128 (1999/2001/2011)
− Nuclear power: IEC 61513 (2001)
− Process industry: IEC 61511 (2003)
− Safety of machinery: IEC 62061 (2003)
− Automotive: ISO 26262 (2011)
• Embedded systems safety
  – IEC 61508 (2001, 2nd ed. 2010)
• Industry specific
  – Software for machines
    • ISO 13849-1
    • IEC 62061
  – Transportation
    • EN 50128 – railway software
    • ISO 26262 – automotive/trucks/construction equipment
  – Aerospace and aviation
    • DO-178B, aviation, USA
    • NASA-STD-8719-13, NASA, USA
    • ESA PSS-05-0, space, Europe
  – Military
    • MIL-STD-882D, DoD, USA
    • 00-55/00-56, MoD, UK
    • MIL-STD-498, DoD, USA
IEC 61508 – the best-practice standard
A PROBABILISTIC APPROACH
– IEC 61508 categorizes systems based on the probability of dangerous failures occurring
– Applied in sector-specific standards
– Works well for hardware, where failure probabilities are "easy" to obtain and independence of failures can be shown
– …but does not work as well for software, which has only systematic faults
Safety integrity levels define target failure measures for a safety function operating in high-demand or continuous mode of operation.
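As a sketch of this probabilistic categorization, the bands below follow the target failure measures of IEC 61508-1 for high-demand/continuous mode (average frequency of dangerous failures per hour, PFH); the function name and return strings are ours:

```python
def sil_for_pfh(pfh_per_hour: float) -> str:
    """Map an average frequency of dangerous failures per hour (PFH)
    to a SIL band, per IEC 61508-1 target failure measures for
    high-demand/continuous mode."""
    bands = [
        (1e-9, 1e-8, "SIL 4"),  # >= 1e-9 to < 1e-8 per hour
        (1e-8, 1e-7, "SIL 3"),
        (1e-7, 1e-6, "SIL 2"),
        (1e-6, 1e-5, "SIL 1"),
    ]
    for low, high, sil in bands:
        if low <= pfh_per_hour < high:
            return sil
    return "outside SIL bands"

print(sil_for_pfh(5e-8))  # prints "SIL 3"
```

Note that, as the slide argues, such a quantification is only meaningful for random hardware failures; software failure rates cannot be measured this way.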
For software – no quantification… the process approach
IEC 61508 → EN 50128 (railroad), ISO 26262 (automotive)
Process Requirements
Fulfillment of all normative requirements by providing evidence of:
• Plans & process
• Requirements
• Verification for each phase
  • Static (reviews)
  • Dynamic (tests)
• Independence
  – between doer and verifier
  – doer – verifier – validator – assessor
• Reports for each phase
• Change management
• A complete documentation trail
• Assessment
Figure. Software Development process for EN50128
Necessary Lifecycle Artifacts – ISO 26262

Plans (16)
• Project management plan
• Safety plan
• Safety assessment plan
• Item integration and test plan
• Validation plan

Artifacts (43)
• Safety case
• Evidence of field monitoring
• Item definition
• Impact analysis (modified item)
• Hazard analysis
• Safety goals
• Functional safety concept
• Technical safety requirements spec.
• Technical safety concept
• System design spec.
• HW/SW interface spec.
• Production, operation, service, decommissioning spec.
• Integration test spec.
• HW safety requirements spec.
• HW design spec.
• Analysis of architecture w.r.t. random failures
• Analysis of safety-goal violation due to random failures
• Specification of dedicated measures for HW

Verification/reports (57) – includes records and summarizing reports
• Safety plan (confirmation)
• Safety case (confirmation)
• Hazard analysis (confirmation)
• Item integration and test plan (confirmation)
• Validation plan (confirmation)
• Safety goals
• Functional safety concept
• Technical safety requirements spec.
• Technical safety concept
• System design spec.
• HW/SW interface spec.
• Production, operation, service, decommissioning spec.
• System-level safety analysis (confirmation)
• Integration test spec.
• Integration test records
• Validation
• Safety assessment/audit (confirmation)
• Release for production
• HW safety requirements spec.
• HW design spec.
• HW safety analysis report
• Analysis of architecture for random failures
• Analysis of safety-goal violation due to random failures
• HW integration test report
Necessary Lifecycle Artifacts – ISO 26262 (cont.)

Plans
• SW verification plan
• Design and coding guidelines

Artifacts
• SW safety requirements spec.
• SW architectural design spec.
• SW unit design spec.
• SW unit implementation
• SW unit test spec.
• SW integration test spec.
• Embedded software (integrated)
• SW safety requirements test spec.
• SW configuration data spec.
• SW calibration data spec.
• Configuration data
• Calibration data
• SW configuration & configuration test spec.

Verification/reports – includes records and summarizing reports
• SW safety requirements spec.
• SW architectural design spec.
• SW safety analysis report
• SW dependent-failure analysis report
• SW unit design spec.
• SW unit implementation
• SW unit test spec.
• SW unit test records
• SW unit test report
• SW integration test spec.
• SW integration test records
• SW integration test report
• SW safety requirements test spec.
• SW safety requirements test records
• SW safety requirements test report
• SW configuration data spec.
• SW calibration data spec.
• SW configuration & configuration test spec.
• SW configuration & configuration test records
• SW configuration & configuration test report
Necessary Lifecycle Artifacts – ISO 26262 (cont.)

Plans
• Production plan
• Production control plan
• Maintenance plan
• Configuration management plan
• Change management plan
• Change request plan
• Documentation management plan
• HW qualification plan
• HW component test plan

Artifacts
• Production requirements (system, HW & SW)
• Service/maintenance requirements (system, HW & SW)
• Repair instructions
• Manual (safety-related content)
• Problem report instructions/process
• Decommissioning instructions
• Interfaces within distributed developments (includes 6 artifacts)
• Change request
• Change impact analysis
• Documentation guideline requirements
• Software component documentation
• Proven-in-use candidate documentation/evidence

Verification/reports – includes records and summarizing reports
• Production control measures report
• Production requirements (system, HW & SW)
• Assessment report for capability of the production process
• Maintenance plan
• Interfaces within distributed developments
• Change management plan
• Change request plan
• Change report
• Software tool evaluation report
• Software tool qualification report
• Software component qualification report
• HW qualification report
• Proven-in-use analysis report (confirmation)
Independence of Verifier, Validator and Assessor
According to the principle of double control
According to EN50128
Table A.8 – Software Analysis Techniques (EN 50128)
Applied to: architecture, requirements, verification
Software Integration Test (EN 50128)
Table A.13 – Functional/Black-Box Test (EN 50128)
Table A.6 – Integration Test (EN 50128)
Validation (EN 50128)
Table A.17 – Performance Testing (EN 50128)
Table A.7 – Overall Software Testing (EN 50128)
Safety Lifecycle Process Fundamentals
We CANNOT guarantee the safety of software using existing
• fault-avoiding measures (design)
• fault-detecting measures (test, verification)
• software fault-tolerance approaches (architecture)
We cannot prove the absence of faults – especially the absence of specification and design faults.
Summary 1
The principles of developing high-integrity systems rely heavily on process:
– Top-down design methods
– Modularity
– Verification of each phase of the development lifecycle
– Verified components and component libraries
– Clear documentation and traceability
– Auditable documents
– Validation
– Independent assessment
– Configuration management and change control, and
– Appropriate consideration of organization and personnel competency issues
Summary 2
• The standards list normative requirements
  – for the development process, and
  – for what evidence needs to be produced (work products)
• Hazard analysis
  – Identify hazards
  – Give each hazard a SIL or ASIL based on Severity, Exposure and Controllability
• Mitigations
  – Safety goals
  – Functional safety requirements aiming at reducing the risk of hazards occurring shall be specified
• Verification
  – for each phase
  – static (reviews) & dynamic (tests)
• Safety case
  – all lifecycle work products
  – arguments
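The Severity/Exposure/Controllability classification above can be sketched as follows. In ISO 26262-3, each hazardous event is rated S1–S3, E1–E4 and C1–C3, and the combination maps to QM or ASIL A–D; the sum-based shortcut below reproduces that mapping (S3/E4/C3 gives ASIL D, and each one-step reduction in any rating drops one level), but treat it as illustrative rather than a substitute for the standard's normative table:

```python
def asil(s: int, e: int, c: int) -> str:
    """Illustrative ASIL determination from Severity (S1-S3),
    Exposure (E1-E4) and Controllability (C1-C3) ratings.
    Sum shortcut: 10 -> D, 9 -> C, 8 -> B, 7 -> A, <= 6 -> QM."""
    assert 1 <= s <= 3 and 1 <= e <= 4 and 1 <= c <= 3
    level = s + e + c - 6
    return "QM" if level < 1 else "ASIL " + "ABCD"[level - 1]

print(asil(3, 4, 3))  # prints "ASIL D" (severe, frequent, uncontrollable)
print(asil(1, 1, 1))  # prints "QM"     (no special safety process needed)
```

The assigned (A)SIL then selects which techniques and how much independence the standard requires for each lifecycle phase.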
THANK YOU
Safety Integrity AB
Hemdalsvägen 17
723 35 Västerås
Sweden
[email protected]
Phone: +46 (0)21-18 36 00
The Inadequacy of Verification
'The radar system of HMS Sheffield identified an incoming Exocet missile as non-Soviet and thus friendly. No alarm sounded. The ship sank with substantial loss of life.' [ACM Software Engineering Notes, Vol. 8, No. 3]
All wrong despite V&V.
Incorrect specifications cause over 50% of all failures. [N. G. Leveson, Safeware: System Safety and Computers, Addison-Wesley, 1995]
We need means to see what happens out there…
Why can't the software engineer do things right like the "real" engineer?
Because software and computers have discontinuous behavior.
Software lacks physical restrictions regarding attributes like mass, energy, size, and the number of parts, or structural attributes like strength, density or form.
The only physical attribute of significance for software is time.
Software has no physical restrictions
Consequence:
• Complexity
  – design mistakes
• Never wears out
  – cannot be over-engineered (no safety margins)
  – only design faults
Fundamental problem: Discontinuity
[Figure: output vs. input for a 16-bit signed value – the output jumps discontinuously between 32767 and −32768.]
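The discontinuity the plot illustrates can be reproduced in a few lines: a 16-bit signed value wraps from 32767 to −32768, so two adjacent inputs produce outputs that differ by 65535 (the helper function is ours, emulating two's-complement arithmetic):

```python
def int16(x: int) -> int:
    """Wrap x into the two's-complement 16-bit range [-32768, 32767]."""
    return ((x + 32768) % 65536) - 32768

print(int16(32767))  # prints 32767
print(int16(32768))  # prints -32768: one input step, a jump of 65535
```

Between these two neighboring inputs no interpolation is valid, which is exactly why testing a few sample points tells us little about the points in between.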
Testing software is fundamentally hard in general:
• Cannot interpolate or extrapolate between test cases.
• "Can only show the presence of errors, not their absence" – Dijkstra
• 40 sequential if-then-else statements → 2^40 paths; at 1 test/ms, exhaustive path testing takes 34 years.
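The last claim is a quick back-of-the-envelope calculation, which can be checked directly:

```python
# 40 sequential if-then-else statements: each doubles the number of
# execution paths, giving 2**40 paths in total.
paths = 2 ** 40                          # 1_099_511_627_776 paths
seconds = paths / 1000                   # at 1 test per millisecond
years = seconds / (365.25 * 24 * 3600)   # convert seconds to years
print(f"{paths} paths ≈ {years:.1f} years")  # ≈ 34.8 years
```

And this is only 40 branches in sequence; any loop or added branch multiplies the path count further, which is why exhaustive path testing is infeasible for real programs.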