Utilising human factors in the science of security
Adam Beautement, Department of Computer Science, University College London, UK
a.beautement@cs.ucl.ac.uk


Page 1

Utilising human factors in the science of security
Adam Beautement
Department of Computer Science
University College London, UK
a.beautement@cs.ucl.ac.uk

Page 2

Overview

• Background
• Limitations of common security outlooks
• Compliance as a decision-making process
• Identifying drivers for non-compliance
• Positively influencing the compliance decision

Page 3

Background
• Research associate at UCL
  – ACE-CSR
  – RISC

• Focused on optimising Information Security decision making
  – Individuals
  – Organisations

• Current research takes a utility-based view of systems that fully incorporates human factors

Page 4

Productive Security

• A project motivated by the view that:
  – Security exists to serve the primary process, not as an end goal in its own right
  – Taking a Productive Security approach can at least improve productivity without compromising security, and possibly improve both at the same time
  – Security can act as a business enabler

Page 5

The science of security

• There is no current science of security

• Security decisions are made by individuals, based on their own personal store of knowledge and experience

• Data is in short supply
  – Organisations are reluctant to release breach reports
  – What is security relevant?

Page 6

The System
[Diagram: the system as layers – Technology and Infrastructure, secured by technical controls and control of the environment; and Processes and End Users, where a wider range of interventions and approaches is needed]

Page 7

Uninformed assumptions

• Security managers assume that users:
  – Are an unlimited source of effort
  – Are motivated by security
  – Are lacking in education
• And that educating them appropriately will change their behaviour
• None of these are true!
• Security systems based on these assumptions will fail

Page 8

Hypothesis
[Pie chart: staff split into segments of roughly 10%, 10% and 80%]
• Staff who think they know better, or don't care
• Staff who know what they should do, but feel they can't
• Staff who don't know policy

Page 9

Friction

• Security is a process that sits alongside others
  – Business
  – Infrastructure
  – Social

• Where security is designed without these in mind it creates friction

Page 10

The Compliance Budget
[Graph: Effectiveness of Security Policy (y-axis) against Perceived Individual Cost (x-axis), showing a Compliance Threshold reached sooner at a Higher Spending Rate than at a Lower Spending Rate; a minimal threshold reading is sketched below]
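Read as a model, the graph treats compliance as drawing on a finite budget of effort: each security task carries a perceived cost, and compliance stops once the cumulative cost crosses the individual's threshold. The sketch below is a minimal illustration of that reading only; the class name, parameter values and the linear "spending rate" are assumptions made for demonstration, not part of the original model.

```python
# Minimal sketch of the compliance budget as a threshold model.
# Assumption: perceived costs accumulate linearly and compliance stops
# once the individual's threshold (their compliance budget) is exceeded.

class ComplianceBudget:
    def __init__(self, threshold: float, spending_rate: float = 1.0):
        self.threshold = threshold          # total effort the user will spend on security
        self.spending_rate = spending_rate  # how quickly perceived cost converts to "spend"
        self.spent = 0.0                    # cumulative perceived cost so far

    def will_comply(self, perceived_cost: float) -> bool:
        """Comply only while there is budget left for this task."""
        projected = self.spent + perceived_cost * self.spending_rate
        if projected > self.threshold:
            return False                    # budget exhausted: non-compliance
        self.spent = projected
        return True

# A higher spending rate exhausts the same budget sooner.
user = ComplianceBudget(threshold=10.0, spending_rate=2.0)
for cost in [1.0, 2.0, 1.5, 2.0]:
    print(user.will_comply(cost), round(user.spent, 1))
```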

Page 11

Outcomes of Positive and Negative Compliance Decisions
[Diagram: the compliance decision weighed as perceived benefits against perceived costs; an illustrative additive reading is sketched below]
BENEFITS:
• Protection from responsibility
• Protection from sanctions
COSTS:
• Physical load
• Cognitive load
• Missed opportunity
• Embarrassment
• Reduced availability
• 'Hassle factor'
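One way to read this slide is as an additive comparison: an individual complies when the perceived benefits outweigh the perceived costs. The sketch below is purely illustrative; the numeric weights are invented and the additive formulation is an assumption, not the authors' model.

```python
# Illustrative weighing of a single compliance decision using the benefit
# and cost factors named on the slide. The numeric weights are invented
# for demonstration; only the factor names come from the source.

BENEFITS = {
    "protection_from_responsibility": 2.0,
    "protection_from_sanctions": 3.0,
}

COSTS = {
    "physical_load": 1.0,
    "cognitive_load": 1.5,
    "missed_opportunity": 2.0,
    "embarrassment": 0.5,
    "reduced_availability": 1.0,
    "hassle_factor": 1.5,
}

def compliance_outcome(benefits: dict, costs: dict) -> str:
    """Positive decision if perceived benefits outweigh perceived costs."""
    net = sum(benefits.values()) - sum(costs.values())
    return "comply" if net >= 0 else "do not comply"

print(compliance_outcome(BENEFITS, COSTS))  # "do not comply" with these weights
```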

Page 12

Productive Security Methodology

1. Assess the scale of the problem
2. Identify problem areas and drivers of behaviour
3. Prioritise interventions
4. Design (and deploy) interventions
5. Assess impacts and outcomes

Page 13

In practice…

1. Scenario-based survey, based on interview analysis, that assesses responses to conflict situations
2. Semi-structured interviews with a vertical cross-section of the target organisation
3. Work with the organisation to determine strategy and capability
4. Select the optimal intervention, targeting appropriate socio-technical factor(s)
5. Develop and utilise metrics to measure changes in security behaviour and levels of compliance

Page 14

Empirical data gathering
• Focused on identifying ways of managing non-compliance through:
  – Changing behaviour
  – Restructuring security systems/policy
• Working with commercial partners
• 118 semi-structured interviews with staff on (non-)compliance, to identify areas and reasons
• Online survey asking staff about security behaviour and attitudes
  – 1,256 valid completed surveys
  – 800+ free-text responses

Page 15

Interview Results

• High level of awareness of corporate policies
• Every interviewee reported not complying with at least one policy
  – Hotspots include bypassing access control, not encrypting files, password sharing and tailgating
• Main drivers for non-compliance come from time and performance pressures:
  – Compliance is impossible, or inconveniently delays the primary task
  – Compliance is perceived to be damaging to individual/business performance

Page 16

Behaviour and attitude survey

• 10 scenarios describing situations in which an employee is faced with a conflict between the business and security processes

• Scenarios split between Behaviour and Attitude types

• Each participant presented with 4 scenarios
  – Clear company policy, but "no easy answers": a dilemma between business and security
  – A range of non-compliant options to deal with the dilemma
  – Participants ranked the options in order of preference
  – Participants rated the severity of the security issue created by non-compliance in each scenario

Page 17

Findings and recommendations

Finding: Employees aware of risks but still not compliant
Suggested course of action: The problem is not one of knowledge – awareness training will not solve compliance issues, so new approaches are required

Finding: Statistically significant cultural variation detected between US and UK populations
Suggested course of action: Interventions need to be tailored to the target populations – more business-focused in the US and more security-focused in the UK

Finding: Passive disposition toward security – breaches and workarounds not challenged
Suggested course of action: Provide appropriate discrete channels for security feedback, whether complaints, problems or breach reports

Finding: Main security driver is common sense, not organisational communications/policy
Suggested course of action: Seek to increase the visibility of the organisational message, and engagement with employees

Page 18

What does ‘good’ look like?

• Showing what problems exist does not necessarily allow goals to be set

• Organisations are poor at describing what desirable security outcomes look like, especially with regard to security behaviour
  – Is it ever acceptable for employees to break policy?

• We looked at existing models, particularly the Capability Maturity Model (CMM), and adapted them

Page 19

Security Behaviour Maturity Model

Page 20

The Maturity Model
• Actually expresses a relationship between the user and the policy
  – It is not just a checklist of desirable user attributes
• Individuals with a strong internal security culture will exhibit different behaviours depending on the quality of the policy they are working under
• Identifying these individuals improves organisational efficiency, as effort is not wasted in trying to retrain them

Page 21

The Knowing-Doing Gap

• Alfawaz et al. identify that information can be unintentionally leaked when a gap exists between policy and behaviour

• They describe a framework of behaviour (see the sketch after this list):
  – Not knowing, not doing (security novice)
  – Not knowing, doing (security savant)
  – Knowing, not doing (rule breaker)
  – Knowing, doing (optimal)
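The two-by-two framework maps directly onto a small classifier. In the sketch below, the function name and boolean flags are illustrative assumptions; the four category labels are Alfawaz et al.'s.

```python
# Minimal encoding of Alfawaz et al.'s knowing-doing framework.
# The boolean flags and function name are illustrative assumptions.

def knowing_doing_category(knows_policy: bool, behaves_securely: bool) -> str:
    if not knows_policy and not behaves_securely:
        return "security novice (not knowing, not doing)"
    if not knows_policy and behaves_securely:
        return "security savant (not knowing, doing)"
    if knows_policy and not behaves_securely:
        return "rule breaker (knowing, not doing)"
    return "optimal (knowing, doing)"

print(knowing_doing_category(knows_policy=True, behaves_securely=False))
```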

Page 22

Interaction with the maturity model
• Overlaying these frameworks allows a behavioural diagnostic approach to be taken
• 'Knowing, not doing' can indicate:
  – A malicious insider
  – A worthwhile employee utilising workarounds due to a poor policy implementation

• Elimination of the second category, through reducing policy friction, improves insider detection

Page 23

Key principles for mature security

• Relationship of security to productive process

• Awareness of security-relevant events
• Detection and reporting of vulnerabilities
• Action to manage vulnerabilities/risk
• Action in case of human error
• Action in case of breach
• Maintenance and improvement over time

Page 24

Managing Non-Compliance
• Compliance requires ability and willingness (a simple triage of the three cases is sketched below)
• Can't comply: security asks that are impossible to complete. These must be removed as a matter of security hygiene.
• Could comply, but won't comply: tasks that can be completed in theory, but require a high level of effort and/or reduce productivity. Re-design the task or apply SEAT.
• Can comply, and does comply: security tasks that are routinely completed. These provide the initial baseline.
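Read as a triage rule, the three categories give a simple decision procedure. The sketch below is an illustrative encoding only; the function and parameter names are assumptions, while the categories and responses come from the slide.

```python
# Illustrative triage of security tasks by ability and willingness to comply.
# Parameter names are assumptions; the categories and responses are from the slide.

def triage_security_task(can_comply: bool, willing_to_comply: bool) -> str:
    if not can_comply:
        return "can't comply: remove the ask as a matter of security hygiene"
    if not willing_to_comply:
        return "could comply but won't: re-design the task or apply SEAT"
    return "can and does comply: use as the initial baseline"

print(triage_security_task(can_comply=True, willing_to_comply=False))
```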

Page 25

Improving decision making
• The natural limitations of the user must be recognised, as well as their goals
  – Security interventions must be tailored and targeted – one size fits none
• The primary process of the business must be understood, and served
  – This will be the major motivating force behind the user's actions
• The organisation has as much responsibility to change as the user
  – Policies (e.g. health and safety, recycling, security) must be unified, not stovepiped

Page 26

Questions?