
Page 1:

Proving Security Protocols Correct— Correctly

Jonathan Herzog

21 March 2006

The author's affiliation with The MITRE Corporation is provided for identification purposes only, and is not intended to convey or imply MITRE's concurrence with, or support for, the positions, opinions or viewpoints expressed by the author.

Page 2:

Introduction

This talk: soundness of symbolic proofs for security protocols
• Think: are proofs in an ‘ideal’ world meaningful in the real world? Even when national secrets are on the line?
• Answer: mostly ‘yes,’ but sometimes ‘no’

But first: what are security protocols?

Scenario: A and B want to create a shared secret key
• Must communicate over an unsecured network

Page 3:

Needham-Schroeder protocol

(Beforehand: A and B obtain each other’s public encryption keys)

A → B: EKB(A || Na)
B → A: EKA(Na || Nb)
A → B: EKB(Nb)

A outputs (B, K); B outputs (A, K)

Version 1: K = Na    Version 2: K = Nb

Page 4:

Security goals

Authentication of A to B:
• “If B outputs (A,K), then A outputs (B,K’)”
• Mutual authentication: both A to B and B to A

Key agreement:
• If A outputs (X,K) and B outputs (Y,K’), then K = K’

Secrecy: surprisingly tricky to define
• Intuition: the only people who can know K should be A and B

Does Needham-Schroeder achieve any of these?

Page 5:

Needham-Schroeder: broken

A → M:        EKM(A || Na)
M (as A) → B: EKB(A || Na)
B → A:        EKA(Na || Nb)
A → M:        EKM(Nb)
M (as A) → B: EKB(Nb)

A outputs (M, K); B outputs (A, K)

A = Alice, B = Alice’s bank, M = on-line merchant
• Alice buys goods from merchant
• Merchant masquerades as Alice to her bank
• (Lowe, 1995)

Page 6:

Needham-Schroeder-Lowe protocol

‘Fix’ by Lowe (1995): added B’s name to the 2nd message

A → B: EKB(A || Na)
B → A: EKA(Na || Nb || B)
A → B: EKB(Nb)

A outputs (B, K); B outputs (A, K)

• Is this secure? Is TLS? Kerberos? SSH?

More importantly: how to analyze?

Page 7:

The symbolic model

Analysis framework for security protocols
• Originally proposed by Dolev & Yao (1983)

General philosophy: be as high-level as possible

Three general intuitions:

• Axiomatize the messages

• Axiomatize the adversary

• Security is unreachability

Page 8:

Axiomatize the message space

Messages are parse trees

Use symbols to represent atomic messages
• Countable symbols for keys (K, K’, KA, KB, KA-1, KB-1, …)
• Countable symbols for nonces (N, N’, Na, Nb, …)
• Countable symbols for names (A, B, …)
• Just symbols: no a priori relationships or structure

Helper functions: keyof(A) = KA, inv(KA) = KA-1

Encryption (EK(M)) and pairing (M || N) are constructors

Protocols described (mostly) by the messages sent and received
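To make the message-space axioms concrete, here is a minimal sketch (my own illustration, not part of the talk) of symbolic messages as parse trees; all class and function names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Name:            # participant names: A, B, ...
    ident: str

@dataclass(frozen=True)
class Nonce:           # nonces: Na, Nb, ...
    ident: str

@dataclass(frozen=True)
class Key:             # keys; private=True marks an inverse key such as KA-1
    ident: str
    private: bool = False

@dataclass(frozen=True)
class Pair:            # pairing constructor  M || N
    left: object
    right: object

@dataclass(frozen=True)
class Enc:             # encryption constructor  EK(M)
    key: Key
    body: object

def keyof(a: Name) -> Key:     # keyof(A) = KA
    return Key("K" + a.ident)

def inv(k: Key) -> Key:        # inv(KA) = KA-1 (and back again)
    return Key(k.ident, private=not k.private)

# The first Needham-Schroeder message, EKB(A || Na), as a parse tree:
msg1 = Enc(keyof(Name("B")), Pair(Name("A"), Nonce("Na")))
```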

Page 9:

Axiomatize the adversary

Described by explicitly enumerated powers
• Interact with a countable number of participants
   Each participant can play any role
   Adversary is also a legitimate participant
• Knowledge of all public values and non-secret keys
• Limited set of re-write rules; the adversary can (non-deterministically) compose these atomic abilities:

M1, M2      →  M1 || M2
M1 || M2    →  M1, M2
M, K        →  EK(M)
EK(M), K-1  →  M
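A minimal sketch (again my own illustration) of how these four rewrite rules can be run as a saturation procedure. It reuses the Pair/Enc/inv terms from the previous sketch; derivable and can_build are hypothetical helper names, not tooling from the talk.

```python
def derivable(known: set, goal) -> bool:
    """Can the adversary derive `goal` from the set `known` using the rules?"""
    known = set(known)
    changed = True
    while changed:                       # saturate under the two 'destructor' rules
        changed = False
        new = set()
        for m in known:
            if isinstance(m, Pair):                         # M1 || M2  ->  M1, M2
                new |= {m.left, m.right}
            if isinstance(m, Enc) and inv(m.key) in known:  # EK(M), K-1  ->  M
                new.add(m.body)
        if not new <= known:
            known |= new
            changed = True
    return can_build(known, goal)

def can_build(known: set, goal) -> bool:
    """The two 'constructor' rules (pairing, encrypting), applied on demand."""
    if goal in known:
        return True
    if isinstance(goal, Pair):                              # M1, M2  ->  M1 || M2
        return can_build(known, goal.left) and can_build(known, goal.right)
    if isinstance(goal, Enc):                               # M, K  ->  EK(M)
        return can_build(known, goal.key) and can_build(known, goal.body)
    return False
```

Saturating only under the destructor rules keeps the procedure terminating; composition is checked on demand against the goal.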

Page 10:

Security is unreachability

Some state is unreachable via any chain of adversary actions

Secrecy (symbolic model):
“If A or B output (X,K), then no composition of adversary actions can result in K”

Authentication of A to B:
“If B outputs (A,K), then no composition of adversary actions can result in A outputting (X,K’) where X≠B”

Main advantage of symbolic model: security proofs are simple
• Automatable, in fact!
• Demo 1: NSL provides all of:
   Mutual authentication
   Key agreement
   Secrecy for both Na, Nb
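A toy usage of the two sketches above, showing that symbolic secrecy of Nb is just non-derivability of Nb from the adversary’s knowledge. The particular knowledge set is my own illustration.

```python
# The adversary has watched one honest Needham-Schroeder run and knows the
# public keys, but no private keys:
KA, KB = keyof(Name("A")), keyof(Name("B"))
Na, Nb = Nonce("Na"), Nonce("Nb")

observed = {
    Enc(KB, Pair(Name("A"), Na)),   # message 1
    Enc(KA, Pair(Na, Nb)),          # message 2
    Enc(KB, Nb),                    # message 3
    KA, KB,                         # public values are always known
}

print(derivable(observed, Nb))              # False: Nb is symbolically secret
print(derivable(observed | {inv(KB)}, Nb))  # True: with B's private key it leaks
```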

Page 11:

A biased sample of previous work (symbolic model)

Analysis methods / mathematical frameworks
• Many, many, many proposed
• Two main survivors: spi calculus [AG] & strand spaces [THG]

Automation
• Undecidable in general [EG, HT, DLMS], but:
• Decidable with bounds [DLMS, RT]
• Also, the general case can be automatically verified in practice
   Cryptographic Protocol Shape Analyzer [DHGT]
   Many others

Extensions
• Diffie-Hellman [MS, H]
• Trust-management / higher-level applications [GTCHRS]

Compilation
• Cryptographic Protocol Programming Language (CPPL) [GHRS]

Page 12:

Central issue of this talk

So what?
• The symbolic model has a weak adversary and strong assumptions
• No a priori guarantees about stronger adversaries:

1. Real adversaries can make up new “ciphertexts”
2. Real adversaries can try decrypting with the wrong key
3. Real adversaries can exploit relationships between nonces/keys

Symbolic proofs may not apply!

This talk: ways in which symbolic proofs are (and are not) meaningful in the computational model

Can we trust symbolic security proofs in the ‘real world’?

Page 13:

The computational model

Outgrowth of complexity theory

                    Symbolic model          Computational model
Keys, names, etc.   Symbols                 Bit-strings
Encryption          Constructor             Poly-time algorithm
Ciphertexts         Compound parse-trees    Bit-strings
Adversary           Re-write rules          Arbitrary poly-time algorithm
Proof method        Reachability analysis   Reduction to a hard problem
Security            Unreachability          A particular asymptotic property

Page 14:

Example: semantic security [GM]

Described as game between ref and adversary:

1. Ref generates fresh key-pair

2. Ref gives public key to adversary

3. Adversary provides two messages: m0 and m1

4. Ref chooses one randomly, encrypts it

5. Adversary gets resulting ciphertext

6. Adversary guesses which was encrypted

Semantic security: no adversary can do better than chance

[Game diagram: R runs G to obtain (K, K-1) and sends K to A; A returns m0, m1; R samples b ← U(0,1) and sends c = EK(mb); A outputs guess g.]

For all poly-time A: Pr[b = g] ≈ .5
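The game above can be written down almost verbatim. This is a schematic sketch in which gen_keypair, encrypt, and the adversary object are hypothetical stand-ins for any scheme and any poly-time adversary.

```python
import secrets

def semantic_security_game(gen_keypair, encrypt, adversary) -> bool:
    pk, _sk = gen_keypair()            # 1. referee generates a fresh key pair
    m0, m1 = adversary.choose(pk)      # 2-3. adversary sees pk, submits m0 and m1
    b = secrets.randbelow(2)           # 4. referee picks one at random...
    c = encrypt(pk, (m0, m1)[b])       #    ...and encrypts it
    g = adversary.guess(c)             # 5-6. adversary sees c and guesses
    return g == b

# Semantic security: for every poly-time adversary, the winning rate over many
# independent games stays negligibly close to 1/2.
```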

Page 15:

Example II: real-or-random secrecy (‘universally composable’ version)

Another game, between adversary and protocol participants

1. Participants engage in the protocol

2. Adversary has control over network

3. When any participant finishes protocol, outputs either real key or random key

4. Other participants continue protocol, output same key

5. Adversary guesses ‘real’ or ‘random’

Real-or-random secrecy: no adversary can do better than chance

For all poly-time A: Pr[A is correct] ≈ .5

[Diagram: participants P1, P2, P3 each output the key K; the adversary A controls the network and must guess ‘real’ or ‘random’.]
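A schematic sketch of this game as well, with run_protocol and the adversary interface as hypothetical stand-ins. The point is only that every participant that finishes hands the adversary either its real key or one shared random key.

```python
import secrets

def real_or_random_game(run_protocol, adversary, key_bits: int = 128) -> bool:
    b = secrets.randbelow(2)                    # 0: hand out real keys, 1: random
    fake = secrets.randbits(key_bits)           # one fresh random key for the session
    for real_key in run_protocol(adversary):    # adversary controls the network;
        adversary.observe(real_key if b == 0 else fake)   # each finisher's key is shown
    return adversary.guess() == b               # secrecy: success stays close to 1/2
```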

Page 16:

Soundness

Computational properties are strong, but complex and hard to prove

Symbolic proofs are much easier, but unconvincing

Soundness: symbolic proofs imply computational properties

[Diagram: proving Protocol ⇒ Computational property directly is hard; Protocol ⇒ Symbolic property is easy; Symbolic property ⇒ Computational property (soundness) is hard, but done once.]

Result: automated proof-methods yield strong properties!

Page 17:

Previous work (soundness)

[AR]: soundness for indistinguishability
• Passive adversary

[MW, BPW]: soundness for general trace properties
• Includes mutual authentication; active adversary

Many, many others

Remainder of talk: two non-soundness results
• Key-cycles (joint work with Adao, Bana, Scedrov)
• Secrecy (joint work with Canetti)

Page 18:

Key cycles

When a key is used to encrypt itself:

EK(K)

More generally: K1 encrypts K2, K2 encrypts K3, … until Kn encrypts K1:

EK1(…K2…)   EK2(…K3…)   …   EKn(…K1…)

Problem for soundness
• Symbolic model: key-cycles are like any other encryption
• Computational model: standard security definitions don’t apply

Page 19:

Semantic security, revisited

The adversary generates m0 and m1 based on the public key only!

The definition says nothing about messages that depend on private keys.

Easy to devise semantically secure schemes that fail in the presence of key-cycles.

[Same semantic-security game diagram as before.]

Page 20:

Counter-example

Let E be a semantically secure encryption algorithm. Let E’ be:

E’K(M) = EK(M),  if M ≠ K
         K,      if M = K

E’ remains semantically secure, but breaks as soon as it encounters a key-cycle.

A contrived example, but a valid counterexample:
• Symbolic encryption is stronger than semantic security

Soundness requires a new computational security definition
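The contrived scheme E’ is easy to write down; a toy sketch, where encrypt stands in for the underlying semantically secure scheme E:

```python
def e_prime(encrypt, k, m):
    # E'_K(M) = K       if M = K   (a self-encryption leaks the key outright)
    #         = E_K(M)  otherwise  (the underlying semantically secure scheme)
    return k if m == k else encrypt(k, m)
```

In the semantic-security game the adversary never gets to submit K itself, so the pathological branch never fires and E’ inherits E’s security; a protocol that creates the key-cycle EK(K) breaks it immediately.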

Page 21:

Resolution: ‘KDM security’

‘Key-dependent message security’
• Proposed by [BRS/AC]
• Implies soundness in the presence of key cycles [ABHS]

Future work:
• Devise a KDM-secure encryption algorithm
• Find a non-contrived non-KDM algorithm
• Define & implement KDM-secure hashing
   Note: hash-based key-cycles occur in TLS and SSH!

Page 22:

Soundness for secrecy

Does symbolic secrecy imply computational secrecy?
• It implies a weakened notion [CW], but…
• Unfortunately, not the UC definition

Counter-example:
• Demo: NSL satisfies symbolic secrecy for Nb
• But NSL cannot provide UC real-or-random secrecy

Symbolic model: “If A or B output (X,K), then no composition of adversary actions can result in K” (the key does not totally leak)

Computational model: “No adversary can distinguish the real key from a random key” (no partial leaks either)

Page 23:

The ‘Rackoff attack’ (on NSL)

A → B: EKB(A || Na)
B → A: EKA(Na || Nb || B)
A → B: EKB(Nb)

The adversary is handed a candidate key K and must decide: K =? Nb
• It encrypts the candidate as EKB(K) and submits it to B as a third message
• B’s observable behavior differs depending on whether K = Nb or not
• So the adversary can tell the real key from a random one

Page 24:

Achieving soundness

Every single symbolic secrecy proof has been not wrong, but weak
• Symbolic secrecy implies only weak computational properties
• ‘Real’ soundness requires a new symbolic definition of secrecy

[BPW]: ‘traditional’ secrecy + ‘non-use’
• Thm: the new definition implies secrecy
• But: must analyze infinitely many concurrent sessions and all resulting protocols

Here: ‘traditional’ secrecy + symbolic real-or-random
• A non-interference property; close to ‘strong secrecy’ [B]
• Thm: the new definition is equivalent to UC real-or-random
• Demonstrably automatable (Demo 2)

Page 25:

Decidability of secrecy

                      Traditional secrecy                  Symbolic real-or-random
Unbounded sessions    Undecidable [EG, HT, DLMS]           Undecidable [B]
Bounded sessions      Decidable (NP-complete) [DLMS, RT]   Decidable (NP-complete)

Side effect of the proof method:
• Computational crypto automagically prevents cross-session interaction
• Thus, it suffices to analyze a single session in isolation

Page 26:

More future work

Soundness
• Implement a decision procedure for symbolic real-or-random
• Extend the result past public-key encryption (e.g., hashing, symmetric encryption)
• Apply the analysis to real-world protocols (TLS, SSH, etc.)
• What is traditional symbolic secrecy good for?

Symbolic model
• Apply symbolic methods to new problems (crypto APIs)
• Unify compilation and analysis tools
• Symbolic notions for new properties (e.g., anonymity)

Page 27:

Conclusion

Want to prove protocols secure
• Easy to prove security in the ‘ideal’ setting (symbolic model)
• Meaningful to prove security in the ‘real’ setting (computational model)

Soundness: an ‘ideal’ proof implies ‘real’ security

Two aspects of the symbolic model are not sound:
• Key-cycles: must strengthen the computational definition of encryption
• Secrecy: must strengthen the symbolic definition

Important side-effect: soundness for the new definition implies decidability

Page 28:

Thanks!

Page 29:

KDM-secure encryption (oversimplified)

The adversary provides two functions, f0 and f1

The referee chooses one, applies it to the private key, and encrypts the result

KDM security: no adversary can do better than chance
• Strictly stronger than semantic security

[Game diagram: R runs G to obtain (K, K-1) and sends K to A; A returns f0, f1; R samples b ← U(0,1) and sends c = EK(fb(K-1)); A outputs guess g.]
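The same game skeleton as semantic security, except that the referee encrypts f_b applied to the private key; a schematic sketch with the same hypothetical interfaces as before:

```python
import secrets

def kdm_game(gen_keypair, encrypt, adversary) -> bool:
    pk, sk = gen_keypair()
    f0, f1 = adversary.choose(pk)        # adversary submits two functions of the key
    b = secrets.randbelow(2)
    c = encrypt(pk, (f0, f1)[b](sk))     # referee encrypts f_b applied to the private key
    return adversary.guess(c) == b       # KDM security: success stays close to 1/2
```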

Page 30:

Overview

This talk: symbolic analysis can guarantee universally composable (UC) key exchange
• (Paper also includes mutual authentication)

Symbolic (Dolev-Yao) model: a high-level framework
• Messages treated symbolically; adversary extremely limited
• Despite (general) undecidability, proofs can be automated

Result: symbolic proofs are computationally sound (UC)
• For some protocols
• For a strengthened symbolic definition of secrecy

With UC theorems, it suffices to analyze a single session
• Implies decidability!

Page 31:

Two approaches to analysis

Standard (computational) approach: reduce attacks to a weakness of the encryption

Alternate approach: apply the methods of the symbolic model
• Originally proposed by Dolev & Yao (1983)
• Cryptography without: probability, security parameter, etc.
• Messages are parse trees
   Countable symbols for keys (K, K’, …), names (A, B, …) and nonces (N, N’, Na, Nb, …)
   Encryption (EK(M)) and pairing (M || N) are constructors
• Participants send/receive messages
   Output some key-symbol

Page 32:

The symbolic adversary

Explicitly enumerated powers
• Interact with a countable number of participants
• Knowledge of all public values, non-secret keys
• Limited set of re-write rules:

M1, M2      →  M1 || M2
M1 || M2    →  M1, M2
M, K        →  EK(M)
EK(M), K-1  →  M

Page 33:

‘Traditional’ symbolic secrecy

Conventional goal for symbolic secrecy proofs:
“If A or B output K, then no sequence of interactions/rewrites can result in K”

Undecidable in general [EG, HT, DLMS], but:
• Decidable with bounds [DLMS, RT]
• Also, the general case can be automatically verified in practice
   Demo 1: analysis of both NSLv1, NSLv2

So what?
• The symbolic model has a weak adversary, strong assumptions
• We want computational properties!
• …But can we harness these automated tools?

Page 34:

Two challenges

1. Traditional secrecy is undecidable for:
• Unbounded message sizes [EG, HT], or
• Unbounded numbers of concurrent sessions
   (Decidable when both are bounded) [DLMS]

2. Traditional secrecy is unsound
• Cannot imply standard security definitions for computational key exchange
• Example: NSLv2 (Demo)

Page 35:

Prior work: BPW

A new symbolic definition that implies UC key exchange
(Public-key & symmetric encryption, signatures)

[Slide diagram contrasts the theory side with the practice side.]

Page 36:

Our work

New symbolic definition: ‘real-or-random’
• Equivalent to UC key exchange
   (Public-key encryption [CH], signatures [P])

UC suffices to examine a single protocol run
• Plus a finite system ⇒ automated verification!
• Decidability?

Demo 3: UC security for NSLv1

[Slide diagram again spans theory and practice.]

Page 37:

Our work: solving the challenges

Soundness: requires a new symbolic definition of secrecy
• Ours: a purely symbolic expression of ‘real-or-random’ security
• Result: the new symbolic definition is equivalent to UC key exchange

UC theorems: sufficient to examine a single protocol in isolation
• Thus, bounded numbers of concurrent sessions
• Automated verification of our new definition is decidable! … Probably

Page 38:

Summary

Summary:
• Symbolic key-exchange is sound in the UC model
• Computational crypto can now harness symbolic tools
• Now have the best of both worlds: security and automation!

Future work

Page 39:

Secure key-exchange: UC

[Diagram: two participants P exchange a key K over a network controlled by the adversary A.]

Answer: yes, it matters
• Negative result [CH]: traditional symbolic secrecy does not imply universally composable key exchange

Page 40:

Secure key-exchange: UC

[Diagram: the participants output the key K; the UC ideal functionality F and simulator S appear alongside the adversary A.]

The adversary gets the key when it is output by the participants
• Does this matter? (Demo 2)

Page 41:

Secure key-exchange [CW]

[Diagram: participants P, P run the protocol against adversary A, which afterward receives the pair K, K’.]

The adversary interacts with the participants
• Afterward, it receives the real key and a random key
• The protocol is secure if the adversary is unable to distinguish them

NSLv1, NSLv2 satisfy the symbolic definition of secrecy
• Therefore, NSLv1 and NSLv2 meet this definition as well

Page 42:

KE

[Diagram: real protocol with participants P, P and adversary A versus the ideal world with functionality F and simulator S.]

The adversary must be unable to distinguish the real and ideal worlds
• Effectively: real or random keys
• The adversary gets the candidate key at the end of the protocol
• NSL1, NSL2 are secure by this definition

Page 43:

Analysis strategy

[Diagram:]
• Would like: Concrete protocol ⇒ UC key-exchange functionality
• Natural translation, for a large class of protocols: Concrete protocol ⇒ Dolev-Yao protocol
• Simple, automated: Dolev-Yao protocol ⇒ Dolev-Yao key-exchange
• Main result of talk (need only be done once): Dolev-Yao key-exchange ⇒ UC key-exchange functionality

Page 44:

Proof overview (soundness)

[Proof chain: Symbolic key-exchange ⇒ single-session UC KE with ideal crypto (information-theoretic simulator) ⇒ multi-session UC KE with ideal crypto (UC theorem) ⇒ multi-session KE with CCA-2 crypto (UC with joint state [CR]).]

Construct a simulator
• Information-theoretic
• Must strengthen the notion of UC public-key encryption

Intermediate step: trace properties (as in [MW, BPW])
• Every activity-trace of the UC adversary could also be produced by the symbolic adversary
• Rephrased: the UC adversary is no more powerful than the symbolic adversary

Page 45:

“Simple” protocols

Concrete protocols that map naturally to the Dolev-Yao framework

Two cryptographic operations:
• Randomness generation
• Encryption/decryption (this talk: asymmetric encryption)

Example: Needham-Schroeder-Lowe

P1 → P2: {P1, N1}K2
P2 → P1: {P2, N1, N2}K1
P1 → P2: {N2}K2

Page 46:

UC Key-Exchange Functionality

[Diagram of the UC key-exchange functionality FKE: on requests (P1 P2) from P1 and (P2 P1) from P2, it samples k ← {0,1}^n and outputs (Key, k) to both participants; the adversary A is notified of the exchange.]
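A rough sketch, in my own phrasing, of what such a functionality does; corruption, session-id bookkeeping, and adversarial delivery scheduling are all omitted, and every name here is invented for illustration.

```python
import secrets

def make_fke(n: int = 128):
    """One session of an idealized key exchange (sketch only)."""
    state = {"requests": {}, "key": None}

    def on_request(sender, peer, notify_adversary):
        state["requests"][sender] = peer
        notify_adversary(sender, peer)               # A learns that an exchange started
        if state["requests"].get(peer) == sender:    # both parties asked for each other
            if state["key"] is None:
                state["key"] = secrets.randbits(n)   # k <- {0,1}^n, chosen once
            return ("Key", state["key"])             # the same key is handed to both
        return None

    return on_request
```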

Page 47:

The Dolev-Yao model

Participants and the adversary take turns.

Participant turn:
[Diagram: the participant receives a message M1 from the adversary A, replies with a message M2, and may produce a local output L, which is not seen by the adversary.]

Page 48:

The Dolev-Yao adversary

Adversary turn:
[Diagram: the adversary A grows its knowledge set Know by applying deduction rules to what it has seen from P1 and P2.]

Page 49:

Dolev-Yao adversary powers

Already in Know          Can add to Know
M1, M2                   Pair(M1, M2)
Pair(M1, M2)             M1 and M2
M, K                     Enc(M, K)
Enc(M, K), K-1           M

Always in Know:
• Randomness generated by the adversary
• Private keys generated by the adversary
• All public keys

Page 50:

The Dolev-Yao adversary

[Diagram: the adversary A sends any message M derivable from Know to a participant.]

Page 51:

Dolev-Yao key exchange

Assume that the last step of a (successful) protocol execution is a local output of

(Finished Pi Pj K)

1. Key agreement: if P1 outputs (Finished P1 P2 K) and P2 outputs (Finished P2 P1 K’), then K = K’.

2. Traditional Dolev-Yao secrecy: if Pi outputs (Finished Pi Pj K), then K can never be in the adversary’s set Know.

Not enough!

Page 52:

Goal of the environment

Recall that the environment Z sees the outputs of the participants

Goal: distinguish the real protocol from the simulation
• In a protocol execution, the output of the participants (the session key) is related to the protocol messages
• In the ideal world, the output is independent of the simulated protocol
• If there is a detectable relationship between the session key and the protocol messages, the environment can distinguish

Example: the last message of the protocol is {“confirm”}K, where K is the session key
• Can decrypt it with the participant output from the real protocol
• Can’t in the simulated protocol

Page 53:

Real-or-random (1/3)

Need: a real-or-random property for session keys
• Can think of the traditional goal as “computational”
• Need a stronger “decisional” goal
• Expressed in the Dolev-Yao framework

Let Π be a protocol.
Let Πr be Π, except that when a participant outputs (Finished Pi Pj Kr), Kr is added to Know.
Let Πf be Π, except that when any participant outputs (Finished Pi Pj Kr), a fresh key Kf is added to the adversary set Know instead.

Want: the adversary can’t distinguish the two protocols.

Page 54:

Real-or-random (2/3)

Attempt 1: Let Traces(Π) be the traces the adversary can induce on Π. Then require:

Traces(Πr) = Traces(Πf)

Problem: Kf does not appear in any trace of Πr

Attempt 2:

Traces(Πr) = Rename(Traces(Πf), Kf → Kr)

Problem: two different traces may “look” the same
• Example protocol: if a participant receives the session key, it encrypts “yes” under its own (secret) key; otherwise, it encrypts “no” instead
• The traces are different, but the adversary can’t tell them apart

Page 55:

Real-or-random (3/3)

Observable part of a trace: its Abadi-Rogaway pattern
• Undecipherable encryptions are replaced by a “blob”

Example:

t = {N1, N2}K1, {N2}K2, K1-1
Pattern(t) = {N1, N2}K1, ▢K2, K1-1

Final condition:

Pattern(Traces(Πr)) = Pattern(Rename(Traces(Πf), Kf → Kr))
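A minimal sketch of the pattern computation, reusing the symbolic terms and the derivability check sketched earlier; Blob is introduced here as my placeholder for an undecipherable encryption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Blob:                  # stands in for an undecipherable ciphertext
    key: object

def pattern(msg, known: set):
    if isinstance(msg, Pair):
        return Pair(pattern(msg.left, known), pattern(msg.right, known))
    if isinstance(msg, Enc):
        if derivable(known, inv(msg.key)):                # decryption key recoverable:
            return Enc(msg.key, pattern(msg.body, known))  # look inside
        return Blob(msg.key)                              # otherwise: just a blob
    return msg                                            # atoms are left unchanged

# For t = {N1, N2}K1, {N2}K2, K1-1 with K1-1 known but not K2-1, the second
# component collapses to a blob, matching the example above.
```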

Page 56:

Main results

Let key-exchange in the Dolev-Yao model be:
• Key agreement
• Traditional Dolev-Yao secrecy of the session key
• Real-or-random

Let Π be a simple protocol that uses UC asymmetric encryption. Then:

DY(Π) satisfies Dolev-Yao key exchange
iff
UC(Π) securely realizes FKE

Page 57:

Future work

How to prove Dolev-Yao real-or-random?
• Needed for UC security
• Not previously considered in the Dolev-Yao literature
• Can it be automated?

Weaker forms of DY real-or-random

Similar results for symmetric encryption and signatures

Page 58:

Summary & future work

Result: symbolic proofs are computationally sound (UC)
• For some protocols
• For a strengthened symbolic definition of secrecy

With UC theorems, it suffices to analyze a single session
• Implies decidability!

Additional primitives
• Have public-key encryption, signatures [P]
• Would like symmetric encryption, MACs, PRFs…

Symbolic representation of other goals
• Commitment schemes, ZK, MPC…