Proposal of Certification and Benchmarking for the Internet of Things
Sara N. Matheu-Garcia, University of Murcia (UMU), Spain
Jose L. Hernández-Ramos, UMU
Antonio F. Skarmeta, UMU, CTO Odins
Introduction
• Looking for a certification and labeling scheme (ENISA, DIGITALEUROPE, ECSO…)
• ARMOUR European Project
• Standardized, dynamic, automatic
• Assess, compare, label (visual)
ETSI IoT Week 2017
Objectives
• Establishment of the ARMOUR benchmarking methodology
• Facilitate a benchmarking database of secure & trusted solutions suitable for the large-scale Internet of Things
• Provide comparison of large-scale IoT security & trust solutions
• Extend the range of exploitation of the FIRE facilities
• Datasets will be made available via the FIESTA testbed
• Large-scale execution in FIT IoT-Lab
• Automatic and dynamic
Follows the ETSI approach, ISO 31000 and ISO/IEC 29119
[Process diagram, with "Communicate and Consult" and "Monitoring and Review" as cross-cutting activities:]
• Analysing the environment: understanding the business and regulatory environment; requirements and process identification; identification of security properties; identification of vulnerabilities
• Security assessment:
• Security risk assessment: risk identification (vulnerability selection) → risk estimation (CWSS) → risk evaluation (profiles) → treatment
• Security testing: test planning → test design & implementation (MBT and CertifyIT) → test environment set-up & maintenance (adapters) → test execution, analysis & summary (TITAN)
• Certification (labelling)
• Database of general security threats in IoT (not included in ISO 31000)
• Condense the oneM2M threats to simplify and adapt them to IoT
Identification of vulnerabilities
• Lack of authentication: the endpoints should be legitimate.
• Replay attack: an intermediate entity can store a data packet and replay it at a later stage.
• Insecure cryptography: the cryptographic suite and key length must be strong enough to resist attacks such as dictionary or brute-force attacks.
• DoS attacks: several endpoints can access the server at the same time in order to collapse it.
• Lack of integrity: received data must not be tampered with during transmission; otherwise, any change must be detectable.
• Lack of confidentiality: transmitted data can be read only by the communication endpoints.
• Lack of authorization: endpoint services should be accessible only to endpoints that have the right to access them.
• Lack of fault tolerance: exceptions should be controlled to avoid faults that affect the endpoints.
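The threat catalogue above can be sketched as a small lookup structure used in the risk-identification step. This is a minimal illustration; the names and the selection helper are assumptions, not the ARMOUR or oneM2M schema.

```python
# Hypothetical sketch of the IoT threat catalogue described above.
# Field names are illustrative, not the actual ARMOUR/oneM2M schema.
THREATS = {
    "lack_of_authentication": "Endpoints may not be legitimate.",
    "replay_attack": "An intermediary stores a data packet and replays it later.",
    "insecure_cryptography": "Weak suite or key length (dictionary/brute-force attacks).",
    "dos_attack": "Many endpoints access the server at once to collapse it.",
    "lack_of_integrity": "Tampering of data in transit is not detectable.",
    "lack_of_confidentiality": "Data readable by parties other than the endpoints.",
    "lack_of_authorization": "Services reachable by endpoints without access rights.",
    "lack_of_fault_tolerance": "Uncontrolled exceptions cause faults on endpoints.",
}

def select_vulnerabilities(catalogue, relevant):
    """Risk identification: pick the threats relevant to the TOE under test."""
    return {name: catalogue[name] for name in relevant if name in catalogue}

selected = select_vulnerabilities(THREATS, ["replay_attack", "lack_of_confidentiality"])
print(sorted(selected))  # ['lack_of_confidentiality', 'replay_attack']
```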
Analysing the environment
• Includes understanding the business and regulatory environment, analysing which security level is required in each domain, and planning the testing (objective, scope).
• Determine which level of security is needed in a specific domain by defining several profiles that indicate which level of security the TOE must achieve, in the context considered, to obtain each profile.
Example: the TOE needs a low replay-attack risk if it wants to fulfil profile A.
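The profile idea above can be sketched as a requirements check: each profile caps the allowed risk level per vulnerability, and a TOE fulfils the profile only if every assessed risk stays within its cap. The profile contents here are invented for illustration.

```python
# Risk levels ordered from least to most severe, as in the deck.
LEVELS = ["low", "medium", "high", "critical"]

# Hypothetical profile definition: maximum allowed risk per vulnerability.
PROFILE_A = {"replay_attack": "low", "lack_of_confidentiality": "medium"}

def fulfils(profile, assessed):
    """A TOE fulfils a profile if every required vulnerability's assessed
    risk is at or below the profile's cap; unassessed ones count as critical."""
    return all(
        LEVELS.index(assessed.get(vuln, "critical")) <= LEVELS.index(cap)
        for vuln, cap in profile.items()
    )

print(fulfils(PROFILE_A, {"replay_attack": "low", "lack_of_confidentiality": "low"}))   # True
print(fulfils(PROFILE_A, {"replay_attack": "high", "lack_of_confidentiality": "low"}))  # False
```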
Security testing
• From the vulnerabilities considered in the first phase, we produce security tests, which provide a series of security metrics such as the % of ciphered data, the % of messages protected against replay attacks, etc.
• The ETSI proposal does not contemplate the automation of this phase, but automating it eases updating the label to cope with the changing conditions in which the device operates.
• In the ARMOUR project, this process is intended to be automated:
• Model-based testing (MBT)
• Testing and Test Control Notation version 3 (TTCN-3), TITAN and CertifyIT
• Local or external large-scale testbed such as FIT IoT Lab
• Adapters to cope with the particularities of each IoT device
[Tool chain: MBT models → CertifyIT → TTCN-3 (TITAN) → adapter → FIT IoT-Lab or local device]
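The security metrics mentioned above (e.g. % of ciphered messages) reduce to counting which observed messages pass a check. A minimal sketch, assuming a hypothetical message format captured by a sniffer adapter; the field names are illustrative, not the ARMOUR tooling's API.

```python
# Illustrative sketch: turning captured test-run messages into the
# percentage-style security metrics described above.
def metric_percentage(results, check):
    """Share (0-100) of observed messages that pass a given security check."""
    if not results:
        return 0.0
    return 100.0 * sum(1 for r in results if check(r)) / len(results)

# Hypothetical messages captured during a test execution.
messages = [
    {"ciphered": True,  "replay_protected": True},
    {"ciphered": True,  "replay_protected": False},
    {"ciphered": False, "replay_protected": True},
    {"ciphered": True,  "replay_protected": True},
]

print(metric_percentage(messages, lambda m: m["ciphered"]))          # 75.0
print(metric_percentage(messages, lambda m: m["replay_protected"]))  # 75.0
```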
Security risk assessment
• Several approaches:
• OWASP Top Ten
• Microsoft's STRIDE model
• OCTAVE
• DREAD scheme
• CVSS / CWSS:
• Standardized
• Clear metrics, simple to understand and perform
• Widely adopted (for example in National Vulnerability Database)
• Repeatable (different analysts will generate the same score for a vulnerability)
• CVSS is more consistent and useful for non-experts, while CWSS is more flexible and in-depth for security experts
• CWSS recommended by the ITU-T in X.1525
• We propose adapting it to IoT and using testing to complement it
• Provides a risk score associated with each vulnerability
Security risk assessment – Risk evaluation
• We associate CWSS score intervals with risk levels (low, medium, high and critical) to compare with the profiles.
• For each vulnerability, we always choose the highest profile it fulfils.
CWSS score   Risk level
0-30         Low
31-62        Medium
63-84        High
85-100       Critical
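The interval mapping in the table above is easy to express as a small function; this is a direct sketch of that table, nothing more.

```python
# Map a CWSS score (0-100) to the risk levels used for profile comparison,
# following the score intervals given in the table above.
def risk_level(cwss_score):
    if not 0 <= cwss_score <= 100:
        raise ValueError("CWSS scores range from 0 to 100")
    if cwss_score <= 30:
        return "low"
    if cwss_score <= 62:
        return "medium"
    if cwss_score <= 84:
        return "high"
    return "critical"

print(risk_level(25), risk_level(50), risk_level(70), risk_level(99))
# low medium high critical
```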
Certification - Labelling
• As an output of the general certification process, we obtain a label associated to the risk of the scenario tested.
• Three main aspects are considered for inclusion in the label, following the Common Criteria approach (TOE, PPs, EALs):
• TOE: also includes the protocol tested and the context where it has been tested (industry, health, etc.)
• Profiles (level of protection): A, B, C or D
• Certification execution
Certification - Labelling
• Visual labelling, following the recommendations of ENISA and ECSO: the result of the evaluation needs to be communicated appropriately to the user.
• Multidimensional, like security.
• To enable a fast labelling update, we propose the use of a QR code.
[Example labels: coaps vs. coap]
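The QR code mentioned above would encode the label's contents (TOE, protocol, context, profile). A minimal sketch of such a payload as compact JSON; every field name and value here is an illustrative assumption, not a standardized ARMOUR label format.

```python
import json

# Hypothetical label payload that the QR code would encode.
# Field names and values are illustrative, not a defined standard.
label = {
    "toe": "smart object (EXP1)",
    "protocol": "coaps",
    "context": "health",
    "profile": "A",
}

payload = json.dumps(label, sort_keys=True)
print(payload)  # this compact string is what the QR code would carry
```

Re-generating and re-encoding this payload after each automated test run is what makes the fast label update possible: only the QR content changes, not the printed label layout.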
Applying benchmarking methodology to EXP1
[Testbed architecture: security MBT models and security tests (TTCN-3) feed an executable test suite run by TITAN with a credential manager; an M3 border router (tunslip) connects to the M3 smart object under test; EC serial, sniffer and recording agents deliver results to the benchmarking and labelling entity, which produces the test report and label]
Applying benchmarking methodology to EXP1
[Screenshots: sniffer capture and TITAN output]
• Integration of the certification approach with other ARMOUR components for automating security tests
• Example for Confidentiality in EXP1