Cryptography Lecture 2. Stefan Dziembowski, www.dziembowski.net, stefan@dziembowski.net


Page 1

Cryptography

Lecture 2

Stefan Dziembowski, www.dziembowski.net

stefan@dziembowski.net

Page 2

Plan

1. Information-theoretic cryptography

2. Introduction to cryptography based on computational assumptions

3. Provable security

4. Pseudorandom generators

Page 3

The scenario from the previous lecture

Eve

Alice Bob

Shannon’s theorem: perfect secrecy is possible only if the key is at least as long as the plaintext.

In real life this is completely impractical.

Page 4

What to do?

Idea: limit the power of the adversary.

How?

Classical (computationally-secure) cryptography: bound his computational power.

Alternative options exist (but they are not very practical).

Page 5

Quantum cryptography: Stephen Wiesner (1970s), Charles H. Bennett and Gilles Brassard (1984)

quantum link

Eve

Alice Bob

Quantum indeterminacy: quantum states cannot be measured without disturbing the original state.

Hence Eve cannot read the bits without being noticed.

Page 6

Quantum cryptography

Advantage: security is based on the laws of quantum physics.

Disadvantage: it needs dedicated equipment.

Practicality?

Currently: successful transmissions over distances of around 150 km. Commercial products are available.

Warning: quantum cryptography should not be confused with quantum computing.

Page 7

A satellite scenario

Eve

Alice Bob

000110100111010010011010111001110111111010011101010101010010010100111100001001111111100010101001000101010010001010010100101011010101001010010101

A third party (a satellite) is broadcasting random bits.

Does it help? No...

(Shannon’s theorem of course also holds in this case.)

Page 8

Ueli Maurer (1993): noisy channel.

Assumption: the data that the adversary receives is noisy (some bits get flipped because of the noise). The data that Alice and Bob receive may be even noisier.

(Figure: the broadcast random bit string and the noisy copies received by Alice, Bob, and Eve.)

Page 9

Bounded-Storage Model. Another idea: bound the size of the adversary’s memory.

000110100111010010011010111001110111111010011101010101010010010100111100001001111111100010101001000101010010001010010100101011010101001010010101

too large to fit in Eve’s memory

Page 10

Real (computationally-secure) cryptography starts here:

Eve is computationally-bounded

But what does it mean?

Ideas:

1. She can use at most 1000 Intel Core 2 Extreme X6800 Dual Core processors

for at most 100 years...

2. She can buy equipment worth 1 million euro and use it for 30 years...

it’s hard to reason formally about it

Page 11

A better idea: “The adversary has access to a Turing Machine that can make at most 10^30 steps.”

More generally, we could have definitions of a type:

“a system X is (t,ε)-secure if every Turing Machine

that operates in time t

can break it with probability at most ε.”

This would be quite precise, but...

We would need to specify exactly what we mean by a “Turing Machine”:

• how many tapes does it have?
• how does it access these tapes? (maybe “random access memory” is a more realistic model...)
• ...

Moreover, this approach often leads to ugly formulas...

Page 12

What to do with “(t,ε)-security”?

Idea:

• t steps of a Turing Machine = “efficient computation”

• ε – a value “very close to zero”.

How to formalize it?

Use the asymptotics!

Page 13

Efficiently computable?

“efficiently computable” = “polynomial-time computable on a Turing Machine”

that is: running in time O(n^c) (for some constant c)

Here we assume that Turing Machines are the right model for real-life computation.

This may no longer be true if a quantum computer is built...

Page 14

Very small?

“very small” = “negligible” = approaches 0 faster than the inverse of any polynomial

Formally:
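In symbols (the slide’s own formula is not reproduced here; this is the standard formulation):

\[ \varepsilon \text{ is negligible} \iff \forall c > 0 \;\exists N \;\forall n > N:\; \varepsilon(n) < n^{-c} \]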

Page 15

Negligible or not?

(Quiz on the slide: several example functions, each classified as negligible or not.)
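For illustration, typical examples of this kind (not necessarily the ones on the original slide):

\[ 2^{-n},\quad 2^{-\sqrt{n}},\quad n^{-\log n}\ \text{are negligible;}\qquad n^{-1000},\quad \frac{1}{n^{2}\log n}\ \text{are not.} \]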

Page 16

Security parameter

The terms “negligible” and “polynomial” make sense only if X (and the adversary) take an additional input n called

a security parameter.

In other words: we consider an infinite sequence X(1),X(2),...

of schemes.

Typically, we will say that a scheme X is secure if for every polynomial-time Turing Machine M, P(M breaks the scheme X) is negligible (as a function of the security parameter n).

Page 17

Example: consider the authentication scheme from last week.

Page 18

Nice properties of these notions:

• A sum of two polynomials is a polynomial: poly + poly = poly
• A product of two polynomials is a polynomial: poly * poly = poly
• A sum of two negligible functions is a negligible function: negl + negl = negl

Moreover:

• A negligible function multiplied by a polynomial is negligible: negl * poly = negl
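A quick check of the last property: suppose p(n) ≤ C·n^d and ε is negligible. Fix any c > 0. For large enough n we have ε(n) < n^{-(c+d+1)}, hence

\[ p(n)\,\varepsilon(n) \;\le\; C n^{d}\cdot n^{-(c+d+1)} \;=\; C\,n^{-(c+1)} \;\le\; n^{-c} \quad\text{for all } n \ge C, \]

so p·ε is negligible.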

Page 19

A new definition of an encryption scheme
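The definition itself is on the slide image; a standard formulation, consistent with the game on Page 27 (where k := Gen(1^n)), is: an encryption scheme is a triple (Gen, Enc, Dec) of polynomial-time algorithms such that
• k := Gen(1^n) outputs a key, given the security parameter 1^n,
• c := Enc(k, m) outputs a ciphertext of a message m,
• Dec(k, Enc(k, m)) = m for every key k output by Gen and every message m.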

Page 20

Is this the right approach?

Advantages

1. All types of Turing Machines are “equivalent” up to a “polynomial reduction”.

Therefore we do not need to specify the details of the model.

2. The formulas get much simpler.

Disadvantage

Asymptotic results don’t tell us anything about the security of concrete systems.

However

Usually one can prove formally an asymptotic result and then argue informally that “the constants are reasonable”

(and can be calculated if one really wants).

Page 21

Provable security

We want to construct schemes that are provably secure.

But...

• why do we want to do it?
• how to define it?
• and is it possible to achieve it?

Page 22

Provable security – the motivation

In many areas of computer science formal proofs are not essential.

For example, instead of proving that an algorithm is efficient, we can just simulate it on a

“typical input”.

In cryptography it’s not true, because

there cannot exist an experimental proof that a scheme is secure.

Why?

Because the notion of a “typical adversary” does not make sense.

Page 23

How did we define perfect secrecy?

Experiment (m – a message):
1. the key k is chosen randomly
2. the message m is encrypted using k: c := Enc_k(m)
3. c is given to the adversary

Idea 1: The adversary should not be able to compute k.

Idea 2: The adversary should not be able to compute m.

Idea 3: The adversary should not be able to compute any information about m.

Idea 4: The adversary should not be able to compute any additional information about m.

(Idea 4 is the one that makes more sense.)

Page 24

Idea: The adversary should not be able to compute any additional information about m.

Page 25

Towards the definition of computational secrecy...

Recall that perfect secrecy says: for every m and c, P(C = c) = P(C = c | M = m). Equivalently, for every pair of messages m0, m1 and every ciphertext c:

P(C = c | M = m0) = P(C = c | M = m1)
P(Enc(K,M) = c | M = m0) = P(Enc(K,M) = c | M = m1)
P(Enc(K,m0) = c | M = m0) = P(Enc(K,m1) = c | M = m1)
P(Enc(K,m0) = c) = P(Enc(K,m1) = c)

(the last step uses the independence of K and M).

(Diagram: the adversary A chooses m0, m1 and receives a ciphertext c.)

Page 26

Indistinguishability

P(Enc(K,m0) = c) = P(Enc(K,m1) = c)


In other words: the distributions of Enc(K,m0) and Enc(K,m1) are identical.

IDEA: change “are identical” to “are indistinguishable by a polynomial-time adversary”.

Page 27

A game

For an encryption scheme (Gen,Enc,Dec), the game between an adversary (a polynomial-time Turing Machine) and an oracle, on security parameter 1^n, runs as follows:

1. the adversary chooses m0, m1 such that |m0| = |m1| and sends them to the oracle;
2. the oracle selects k := Gen(1^n), chooses a random bit b ∈ {0,1}, and computes c := Enc(k, m_b);
3. c is sent to the adversary, who has to guess b.

Security definition:We say that (Gen,Enc,Dec) has indistinguishable encryptions if any polynomial time adversary guesses b correctly with probability at most 0.5 + ε(n), where ε is negligible.


Alternative name: semantically secure (sometimes we will say “computationally secure” if the context is clear).
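To make the game concrete, here is a minimal Python sketch of one run of the experiment (the adversary, gen, and enc callables are hypothetical placeholders, not part of the lecture):

```python
import secrets

def ind_experiment(adversary, gen, enc, n: int) -> bool:
    """One run of the indistinguishability game for security parameter n.

    Returns True iff the adversary guesses the hidden bit b correctly.
    `adversary`, `gen`, `enc` are placeholder callables used for illustration.
    """
    m0, m1, state = adversary.choose(n)   # adversary picks m0, m1 with |m0| == |m1|
    assert len(m0) == len(m1)
    k = gen(n)                            # oracle: k := Gen(1^n)
    b = secrets.randbits(1)               # oracle: random bit b
    c = enc(k, m1 if b else m0)           # oracle: c := Enc(k, m_b)
    guess = adversary.guess(c, state)     # adversary must guess b
    return guess == b
```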

Page 28

Testing the definition

1. Suppose the adversary can compute k from some Enc(k,m). Can he win the game? YES!

2. Suppose the adversary can compute some bit of m from Enc(k,m). Can he win the game? YES!

Page 29

Is it possible to prove security?

(Gen,Enc,Dec) – an encryption scheme. For simplicity suppose that:
1. for security parameter n the key is of length n;
2. Enc is deterministic.

Consider the following language:
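A natural reconstruction of the language (the exact formula is on the slide image; this form is an assumption, chosen so that k is indeed the NP-witness mentioned below):

\[ L = \{\, (m, c) \;:\; \exists\, k \in \{0,1\}^{n} \ \text{such that}\ \mathrm{Enc}(k, m) = c \,\}. \]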

Q: What if L is polynomial-time decidable?

A: Then the scheme is broken (exercise)

On the other hand: L is in NP. (k is the NP-witness)

So, if P = NP, then any semantically-secure encryption is broken.

Is it really true?

Page 30

“If P = NP, then semantically-secure encryption is broken.”

Is it 100% true?

Not really...

This is because even if P = NP, we do not know what the constants are.

Maybe P=NP in a very “inefficient way”...

Page 31

In any case, to prove security of a cryptographic scheme we would need to show

a lower bound on the computational complexity of some problem.

In the “asymptotic setting” that would mean showing at least that P ≠ NP.

Does the implication in the other direction hold?(that is: does P ≠ NP imply anything for cryptography?)

No! (at least as far as we know)

Intuitively: because NP is a notion from the “worst case complexity”, and cryptography concerns the “average case complexity”.

Therefore

proving that an encryption scheme is secure is probably much harder than proving that P ≠ NP.

Page 32

What can we prove?

We can prove conditional results.

That is, we can show theorems of the type:

“If some scheme Y is secure, then scheme X is secure.”

“If some computational assumption A holds, then scheme X is secure.”

Page 33

Research program in cryptography: base the security of cryptographic schemes on a small number of well-specified “computational assumptions”.

If some “computational assumption A” holds (this part we have to “believe”), then scheme X is secure (this part is provable).

Examples of A:

“decisional Diffie-Hellman assumption”

“strong RSA assumption”

Page 34

Example

We are now going to show an example of such reasoning:

If G is a “cryptographic pseudorandom generator” (this is the computational assumption A), then from G we can construct a secure encryption scheme (this is the scheme X).

Page 35

Pseudorandom generators

A PRG G takes a short random seed s and stretches it into a much longer output G(s).

Page 36

If we use a “normal” PRG, this idea (using G(k) as a one-time pad: c := G(k) xor m) doesn’t work (exercise).

It works only with cryptographic PRGs.
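A minimal Python sketch of the construction, assuming (as the proof on Page 41 suggests with c := x xor m_b) that Enc(k, m) = G(k) xor m; the SHAKE-based G below is only an illustrative stand-in for a cryptographic PRG:

```python
import hashlib

def G(seed: bytes, out_len: int) -> bytes:
    # Illustrative stand-in for a cryptographic PRG: expand the seed with SHAKE-256.
    return hashlib.shake_256(seed).digest(out_len)

def enc(key: bytes, msg: bytes) -> bytes:
    # Enc(k, m) := G(k) XOR m  (one-time-pad style, with the pad generated from k)
    pad = G(key, len(msg))
    return bytes(p ^ m for p, m in zip(pad, msg))

def dec(key: bytes, ctx: bytes) -> bytes:
    # Dec(k, c) := G(k) XOR c; works because XOR is its own inverse.
    return enc(key, ctx)

# Example: a 16-byte key encrypting a short message (in practice the key must be
# chosen uniformly at random and, as with the one-time pad, used only once).
key = bytes(16)
assert dec(key, enc(key, b"hello")) == b"hello"
```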

Page 37

“Looks random”

What does it mean?

Non-cryptographic applications:

should pass some statistical tests.

Cryptography:

should pass all polynomial-time tests.

Page 38

Cryptographic PRG

A polynomial-time distinguisher D receives either a random string R or G(S) (where S is random) and should not be able to tell which one it got. D outputs: 0 if it thinks the input is R, and 1 if it thinks the input is G(S).

Page 39

Constructions

There exist constructions of cryptographic pseudorandom generators that are conjectured to be secure.

Some of them are extremely efficient and widely used in practice.

They are called “stream ciphers” (we will discuss them later).

Page 40

Theorem. If G is a cryptographic PRG, then the encryption scheme constructed before is semantically secure (i.e., it has indistinguishable encryptions).

cryptographic PRGs → computationally-secure encryption

Proof (sketch)

Suppose that it is not secure.

Therefore there exists an adversary that wins the “guessing game” with probability 0.5 + δ(n), where δ(n) is not negligible.

Page 41

We construct a distinguisher D that, given a string x (which is either truly random or equal to G(S)), simulates the guessing game for the adversary:

1. the adversary chooses m0, m1 and sends them to D;
2. D chooses a random bit b ∈ {0,1} and computes c := x xor m_b;
3. the adversary has to guess b.

If the adversary guessed b correctly, D outputs 1: “x is pseudorandom”. Otherwise D outputs 0: “x is random”.
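A minimal Python sketch of this reduction (the adversary object is a hypothetical placeholder with the same interface as in the game sketch on Page 27):

```python
import secrets

def distinguisher(adversary, x: bytes) -> int:
    """Given x (either truly random or G(S)), simulate the guessing game.

    Outputs 1 ("x is pseudorandom") iff the adversary wins the simulated game.
    """
    m0, m1, state = adversary.choose(len(x))     # adversary picks equal-length m0, m1
    b = secrets.randbits(1)                      # D's secret random bit
    mb = m1 if b else m0
    c = bytes(xi ^ mi for xi, mi in zip(x, mb))  # c := x XOR m_b
    guess = adversary.guess(c, state)
    return 1 if guess == b else 0
```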

Page 42

If x is a random string R: the adversary guesses b correctly with probability 0.5, so D outputs 1 with probability 0.5.

If x = G(S): the adversary guesses b correctly with probability 0.5 + δ(n), so D outputs 1 with probability 0.5 + δ(n) (and 0 with probability 0.5 - δ(n)).

Hence D distinguishes G(S) from a random string with advantage δ(n), which is not negligible. This contradicts the assumption that G is a cryptographic PRG.

QED

Page 43

Moral

To construct secure encryption it suffices to construct a secure PRG.

cryptographic PRGs → semantically-secure encryption

Page 44

Outlook

Cryptography

• one-time pad,
• quantum cryptography,
• “the satellite scenario”

often called:
• “information-theoretic”,
• “unconditional”

“computationally-secure”

based on 2 assumptions:

1. some problems are computationally difficult

2. our understanding of what “computational difficulty” means is correct.