Security and Refinement John A. Clark Susan Stepney Howard Chivers Dept of Computer Science, University of York 7 September 2004


Page 1:

Security and Refinement

John A. Clark, Susan Stepney, Howard Chivers

Dept of Computer Science, University of York

7 September 2004

Page 2:

What is Security?

Page 3:

What is security?

Stopping people reading what they shouldn’t.

You could be forgiven for believing this. It’s a start.

Page 4:

Bell and La Padula

[Diagram: the classification ordering Top Secret > Secret > Confidential > Restricted > Unclassified. A subject has a Maximum Clearance and a (possibly lower) Current Clearance; relative to its current level it may read at or below it, may write at or above it, and may read and write at the level itself.]
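The read and write rules can be expressed as two small checks. A minimal Python sketch (the `RANK` encoding and function names are illustrative, not part of the original model):

```python
# Hypothetical sketch of the Bell-LaPadula access rules.
LEVELS = ["Unclassified", "Restricted", "Confidential", "Secret", "Top Secret"]
RANK = {name: i for i, name in enumerate(LEVELS)}

def may_read(subject_level, object_level):
    # Simple security property ("no read up"): read only at or below
    # the subject's current level.
    return RANK[subject_level] >= RANK[object_level]

def may_write(subject_level, object_level):
    # *-property ("no write down"): write only at or above the
    # subject's current level.
    return RANK[subject_level] <= RANK[object_level]

# A subject currently working at Confidential:
assert may_read("Confidential", "Restricted")        # read down: allowed
assert not may_read("Confidential", "Secret")        # read up: denied
assert may_write("Confidential", "Secret")           # write up: allowed
assert not may_write("Confidential", "Unclassified") # write down: denied
```

The *-property is what stops a high-level subject leaking what it has read into a low-level object.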

Page 5:

Bell and La Padula

Various foundational papers in the early seventies, and the ‘Secure Computer System: Unified Exposition and Multics Interpretation’ in 1976.

But it’s not without its problems....

Page 6:

Bell and La Padula

In practice some objects lie outside the model, and access to them is not policed -- e.g. file locks, names.

This allows such channels to act as conduits for information transfer (actually, information transfer is their intended function).

Page 8:

Covert Channels

Computer systems provide channels for inter-subject communication, e.g. FILES. These are intended to be used for communication, and access to them can be policed by the system.

It is generally possible to identify unusual means of communicating via elements of the system, so that the intent of the mandatory policy is broken. These means are known as covert channels.

Page 9:

Covert Channels

Covert channels arise because subjects share resources. The shared resources allow one subject (the transmitter) to modulate some aspect of how another subject (the receiver) perceives the system.

Covert channels are generally grouped for working purposes into:

storage channels, where the values returned to the receiver by operations are affected;

timing channels, where the times at which events are perceived by the receiver are modulated.
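A storage channel of this kind is easy to sketch. In the toy Python below (the lock-file encoding is invented for illustration, no real system is modelled), a transmitter signals bits to a receiver purely through the *existence* of a shared file -- an attribute, not content, so an access-control model that only polices reads of content never sees the flow:

```python
# Toy storage channel through a shared resource attribute.
import os
import tempfile

shared = os.path.join(tempfile.mkdtemp(), "lock")

def transmit(bits):
    received = []
    for bit in bits:
        # Transmitter: create or remove the shared lock to encode one bit.
        if bit:
            open(shared, "w").close()
        elif os.path.exists(shared):
            os.remove(shared)
        # Receiver: never reads the file's contents, only whether it
        # exists -- an aspect of the system the policy does not police.
        received.append(1 if os.path.exists(shared) else 0)
    return received

assert transmit([1, 0, 1, 1, 0]) == [1, 0, 1, 1, 0]
```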

Page 10:

System Z – the trouble continues

Start in a secure state: “no one is able to read a document above their clearance”. Make only transitions that preserve that property. Security is maintained? Not exactly.

A system could downgrade everything to the lowest classification and allow anyone access.
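The objection can be made concrete. A hypothetical Python sketch (the state representation is invented for illustration): every state satisfies the “no read above clearance” invariant, yet one transition downgrades the document and hands it over.

```python
# Every state is "secure" by the invariant, yet the system leaks.
def secure(state):
    # Invariant: every current read is at or below the reader's clearance.
    return all(state["clearance"][s] >= lvl
               for s, lvl in state["reads"].items())

state = {"clearance": {"alice": 0}, "reads": {}, "doc_level": 3}

def downgrade_and_read(state):
    # A transition that preserves the invariant by *lowering* the
    # document's classification before granting access.
    state["doc_level"] = 0
    state["reads"]["alice"] = state["doc_level"]
    return state

assert secure(state)               # secure before
state = downgrade_and_read(state)
assert secure(state)               # still "secure" -- but the secret is gone
```

The invariant alone says nothing about *how* classifications may change, which is exactly McLean's point.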

Page 11:

Non-interference

A more modern approach, 1984 (building on 1982), is non-interference, by Goguen and Meseguer. Actions by high-level subjects (e.g. Top Secret) should not be visible to lower-level subjects. Formulated in terms of trace validity.

[Diagram: High and Low level subjects]
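The trace formulation can be sketched with a “purge” function: Low's view of any trace must equal its view of the trace with High's actions deleted. A toy Python model (all action names and the output functions are invented for illustration):

```python
# Purge-style non-interference check over a finite set of traces.
def purge(trace, high={"h_read", "h_write"}):
    # Delete High's actions from the trace.
    return [a for a in trace if a not in high]

def noninterfering(traces, low_output):
    # Low's observation of each trace must match its observation of
    # the purged trace.
    return all(low_output(t) == low_output(purge(t)) for t in traces)

# A system whose low output depends only on low actions is fine:
ok = lambda t: sum(1 for a in t if a == "l_write")
# One whose low output reflects *all* activity leaks High's actions:
leaky = lambda t: len(t)

traces = [["l_write"], ["h_write", "l_write"], ["h_read", "h_write"]]
assert noninterfering(traces, ok)
assert not noninterfering(traces, leaky)
```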

Page 12:

Non-interference

This is generally considered too strong a policy to implement effectively.

In practice, ruling out every bit of information flow is not the aim. We may wish to allow communication, but only via specific channels (conditional non-interference).

Page 13:

Formal Methods Empowerment

Model-based specifications: operations are “atomic”, yet consume no power.

Mathematics: f(x) just is (cf. “Be man” from the 1960s).

Computation: you need to calculate f(x) on real hardware. This consumes power.

Page 14:

Power is a drug – just say no.

2*5 = ?? 25*25 = ?? 16554*54237 = ??

The amount and pattern of intellectual work you have to do varies according to the data you are multiplying.

So if you don’t want anyone to know (or gain information on) what you are multiplying, don’t freely accept offers of power. (Power is a drug, and it causes you to spill the beans.)
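A toy model of why the work varies: schoolbook shift-and-add multiplication performs an addition for every 1-bit of the multiplier, so a power trace that reflects those additions depends on the data. (Python sketch, purely illustrative; `work` stands in for power draw.)

```python
# Shift-and-add multiplication with an explicit "work" counter.
def multiply_with_work(a, b):
    result, work = 0, 0
    while b:
        if b & 1:
            result += a     # an addition happens only for 1-bits of b
            work += 1       # ...and each one costs observable "power"
        a <<= 1
        b >>= 1
    return result, work

assert multiply_with_work(7, 0b1000)[1] == 1   # sparse multiplier: little work
assert multiply_with_work(7, 0b1111)[1] == 4   # dense multiplier: more work
# The result is correct either way; only the power profile differs.
assert multiply_with_work(16554, 54237)[0] == 16554 * 54237
```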

Page 15:

Time to think, I think

2^16 = ?? 23^17 = ??

The time a computer takes to carry out an operation such as exponentiation typically depends on the operands.

If the exponent is a private key (e.g. as in RSA), its timing behaviour leaks information about that key.
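The dependence is visible in square-and-multiply exponentiation: every exponent bit costs a squaring, and every 1-bit costs an extra multiplication. A Python sketch that counts operations as a stand-in for time (purely illustrative; real timing attacks, in the style of Kocher's 1996 work, are statistical):

```python
# Left-to-right square-and-multiply with an operation counter.
def pow_with_ops(base, exp, mod):
    result, ops = 1, 0
    for bit in bin(exp)[2:]:
        result = (result * result) % mod    # square for every bit
        ops += 1
        if bit == "1":
            result = (result * base) % mod  # extra multiply for 1-bits only
            ops += 1
    return result, ops

assert pow_with_ops(2, 16, 1000003)[0] == pow(2, 16, 1000003)
assert pow_with_ops(23, 17, 1000003)[0] == pow(23, 17, 1000003)
# Same bit-length exponents, different operation counts => a timing leak:
assert pow_with_ops(2, 16, 1000003)[1] != pow_with_ops(2, 31, 1000003)[1]
```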

Page 16:

And is this what security is about? No. Confidentiality is just one part of security. What about:

Integrity (if we know what this means)?
Availability?
Anonymity (again, unlikely to be an all-or-nothing affair)?
Accountability / non-repudiation?

Page 17:

Breaking the Model

Slides by Susan

Page 18:

Relational model of refinement

[Diagram: initialisation, operation and finalisation steps, related by simulation]

Page 19:

what is finalisation?

The process of moving from the (abstract or concrete) world of programming variables to the “real” (but still modelled) world: how to interpret the programming variables -- the “finalisation glasses” -- covering both outputs and state components. For the abstract model it is usually the identity function.

Page 20:

example: set and sequence

global “real world” model: a set
abstract model: a set -- s : P X; FA(s) = s -- the identity finalisation
concrete model: a sequence -- t : seq X; FC(t) = ran t -- look at the sequence through finalisation glasses that see only the underlying set

successive refinements: one step’s concrete model is the next step’s abstract model
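The example can be sketched directly, modelling P X as a Python set and seq X as a list (an illustrative encoding, not Z):

```python
# The set/sequence refinement seen through finalisation "glasses".
def FA(s):
    # Abstract finalisation: the identity.
    return s

def FC(t):
    # Concrete finalisation: FC(t) = ran t -- forget order and repetition.
    return set(t)

abstract = {1, 2, 3}
concrete = [3, 1, 2, 1]   # order and duplicates are implementation detail

# Through the glasses, the refinement is faithful:
assert FC(concrete) == FA(abstract)
# Without the glasses, the ordering of the sequence is visible --
# exactly the information the finalisation was meant to throw away.
assert concrete != sorted(concrete)
```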

Page 21:

example: a clock paradox

(A) a clock that is 5 minutes slow -- never right
(B) a stopped clock -- right twice a day

(A) has a simple finalisation to get the right time -- add 5 minutes
(B) has no possible finalisation glasses
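In code, the difference is whether a function from the displayed value to the true time exists (a minutes-since-midnight encoding is assumed here purely for illustration):

```python
# Finalisation glasses for the two clocks.
def finalise_slow(shown_minutes):
    # Clock A runs 5 minutes slow, so the glasses just add 5 minutes.
    return (shown_minutes + 5) % (24 * 60)

def finalise_stopped(shown_minutes):
    # Clock B shows the same value whatever the true time is, so no
    # function of its display can recover the time.
    raise ValueError("no finalisation exists")

true_time = 9 * 60 + 30                    # 09:30
assert finalise_slow(true_time - 5) == true_time
```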

Page 22:

unwanted finalisations

finalisations “throw away” information -- for example, the order of the sequence

what if you don’t wear the glasses? instead, use the maximal identity finalisation, to see “more”: covert channels -- ordering, timing, delays, …

Page 23:

a solution?

restrict to the identity finalisation -- essentially, don’t allow output refinement

but real devices output concrete things, like bit-streams, or voltages, or … -- “extra-logical” finalisations

[Diagram: finalisation arrows f, id and x between C, G, G′ and R]

Page 24:

so, not a solution

forbidding the problem merely moves it somewhere even more covert

instead, use the concept of finalisation to categorise and expose various covert channels and various security attacks

Page 25:

vary the “glasses”

intended -- use the glasses
unintended -- remove (or de-mist) the glasses: view page faults, interrupts, power, timing, …

direct -- observe a single output
enhanced -- post-process: view sequences of outputs

single viewpoint, or multiple viewpoints (multiple different glasses)

passive -- just observe
invasive -- break into the device

Page 26:

vary the system

standard -- observe the system working as intended
perturbed -- observe a perturbed system: under load, irradiated, flexed, …
passive perturbation -- a worn-out device
invasive perturbation -- attack with an ion gun

single system, or multiple systems -- differential analyses
homogeneous multiples -- lots of the “same” system
heterogeneous -- “sameness” depends on the observation
engineered, natural

Page 27:

vary the environment

standard -- observe within specification (which can still vary)
perturbed -- observe outside specification: heated, cooled, …

single environmental attribute, or multiple attributes

passive perturbation -- naturally very cold smart cards in Sweden; cold can lower noise and defeat blinding defences
invasive perturbation -- deliberately modulate the power supply

Page 28:

higher order glasses

non-standard observations of the analysis techniques themselves

often attacks involve searches: view the trajectory of a search algorithm; view the search on a slightly different problem; …

Page 29:

illustrations

intended finalisation, single system, perturbed: fault injection, bit flips, breaking RSA

unintended finalisation, enhanced, single system: differential power analysis of smart cards

unintended finalisation, single system, perturbed environment: static RAM persists for minutes at –20°C

Page 30:

prevention

enforce use of the glasses, by use of a protective screen

implement “magic eye” glasses: noise without them -- crypto

detect unwanted finalisations (mainly invasive ones) and then destroy the secrets!

Page 31:

unusual, wanted, finalisations

multiple finalisations for different roles: user, administrator, audit, …

breaking atomicity usefully: progressive GIFs

finalise the user: observe the dynamics of hand signatures; textual analyses for authorship

anonymity vs. authentication tradeoffs

destructive quantum finalisation: essential in various security schemes