
Public-Key Encryption from Different Assumptions

Benny Applebaum Boaz Barak Avi Wigderson

Plan

• Background

• Our results

- assumptions & constructions

• Proof idea

• Conclusions and Open Problems

Private Key Cryptography (2000BC-1970’s): share a secret key k and then talk securely.

Public Key Cryptography (1976-…): talk securely with no shared key.

Public-key crypto: beautiful math, few candidates

• Discrete Logarithm [DiffieHellman76,Miller85,Koblitz87]

• Integer Factorization [RivestShamirAdleman77,Rabin79]

• Error Correcting Codes [McEliece78,Alekhnovich03,Regev05]

• Lattices [AjtaiDwork96, Regev04]

Private-key crypto: unstructured, many candidates

• DES [Feistel+76]
• RC4 [Rivest87]
• Blowfish [Schneier93]
• AES [RijmenDaemen98]
• Serpent [AndersonBihamKnudsen98]
• MARS [Coppersmith+98]

“Beautiful structure” may lead to unforeseen attacks!

The ugly side of beauty: factorization of n-bit integers

• 300BC: Trial Division, ~exp(n/2)
• 1974: Pollard’s Alg, ~exp(n/4)
• 1975: Continued Fraction, ~exp(n^{1/2})
• 1977: RSA invented
• 1985: Quadratic Sieve, ~exp(n^{1/2})
• 1990: Number Field Sieve, ~exp(n^{1/3})
• 1994: Shor’s Alg (quantum), ~poly(n)


Cryptanalysis of DES

• 1976: DES invented; trivial 2^56 attack
• 1990: Differential Attack [Biham, Shamir], 2^47 time + examples
• 1993: Linear Attack [Matsui], 2^43 time + examples

Are there “ugly” public key cryptosystems?

Complexity Perspective

Our goals as complexity theorists are to prove that:

• NP ≠ P (e.g., clique is hard)
• NP is hard on average (clique is hard on average)
• ∃ one-way functions (planted clique is hard)
• ∃ public-key cryptography (factoring is hard)

• Goal: PKC based on more combinatorial problems
  - increase our confidence in PKC
  - a natural step on the road to the Ultimate Goal
  - understand average-case hardness and algorithmic aspects of natural problems

• This work: several constructions based on combinatorial problems

• Disclaimer: previous schemes are much better in many (most?) aspects
  - efficiency
  - factoring is old and well-studied
  - lattice problems are based on worst-case hardness (e.g., n^1.5-GapSVP)

What should be done? Ultimate Goal: a public-key cryptosystem from any one-way function.

Plan

• Background

• Our results

- assumptions & constructions

• Proof idea

• Conclusions and Open Problems

Assumption DUE (Decisional Unbalanced Expansion): it is hard to distinguish G from H, where

• G is a random (m, n, d) bipartite graph (m vertices on one side, each with d neighbors among the n vertices on the other side)
• H is a random (m, n, d) graph with a planted shrinking set: a set S of q vertices on the m-side whose neighborhood T on the n-side has size < q/3

• I.e., one cannot approximate vertex expansion in random unbalanced bipartite graphs.
• A well-studied problem, though not exactly in this setting (densest subgraph).


We prove:
• Thm. DUE cannot be broken by cycle-counting / spectral techniques.
• Thm. DUE is implied by variants of planted clique in random graphs.

Assumption DSF (Decisional Sparse Function):

Let G be a random (m, n, d) graph and let P be a (non-linear) predicate on d bits. Pick a random input x = (x_1, ..., x_n) and compute y = (y_1, ..., y_m), where y_i applies P to the d inputs that are the neighbors of the i-th output, e.g., y_i = P(x_2, x_3, x_6). Then y is pseudorandom, i.e., indistinguishable from a random string.

• Hard to solve random sparse (non-linear) equations.
• Conjectured to be a one-way function when m = n [Goldreich00].
• Thm: Hard for myopic algorithms, linear tests, and low-depth circuits (AC0), as long as P is “good” (e.g., 3-majority of XORs).
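To make the DSF mapping concrete, here is a minimal Python sketch of evaluating such a local function. The particular predicate (a majority of three XORs on d = 6 inputs) and the parameter values are illustrative assumptions, not necessarily the exact choices in the paper.

import random

def random_sparse_graph(m, n, d, rng=random):
    # Each of the m outputs is connected to d distinct inputs chosen at random.
    return [rng.sample(range(n), d) for _ in range(m)]

def maj3_of_xors(bits):
    # A hypothetical "good" predicate on d = 6 bits: the majority of three XORs.
    a = bits[0] ^ bits[1]
    b = bits[2] ^ bits[3]
    c = bits[4] ^ bits[5]
    return 1 if a + b + c >= 2 else 0

def dsf_output(graph, x, predicate):
    # y_i = P(x restricted to the neighbors of output i), as in Assumption DSF.
    return [predicate([x[j] for j in nbrs]) for nbrs in graph]

# Toy usage: n inputs, m > n outputs, locality d = 6 (illustrative parameters).
n, m, d = 1000, 2000, 6
G = random_sparse_graph(m, n, d)
x = [random.randint(0, 1) for _ in range(n)]
y = dsf_output(G, x, maj3_of_xors)  # DSF asserts that (G, y) looks like (G, random string)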

Assumption SLIN (Search LIN):

Let G be a random (m, n, d) graph. Pick a random input x; each output bit is a sparse parity of d inputs with noise of rate ε, e.g., y_i = x_2 + x_3 + x_6 + err, where each bit is flipped independently with probability ε. Goal: given G and y, find x. The assumption: this cannot be done efficiently.

• Hard to solve sparse “noisy” random linear equations.
• A well-studied hard problem; sparseness doesn’t seem to help.
• Thm: SLIN is hard for low-degree polynomials (via [Viola08]), low-depth circuits (via [MST03]+[Braverman]), and Ω(n)-order Lasserre SDPs [Schoen08].
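A minimal Python sketch of sampling an SLIN instance y = Mx + e with d-sparse rows and noise rate eps; the concrete parameter values below are illustrative.

import random

def sample_sparse_lin_instance(m, n, d, eps, rng=random):
    # Each equation (row) involves d distinct variables; each noise bit is 1 w.p. eps.
    rows = [rng.sample(range(n), d) for _ in range(m)]
    x = [rng.randint(0, 1) for _ in range(n)]
    e = [1 if rng.random() < eps else 0 for _ in range(m)]
    y = [(sum(x[j] for j in row) + e[i]) % 2 for i, row in enumerate(rows)]
    return rows, x, y  # SLIN: recovering x from (rows, y) is assumed to be hard

# Parameters in the spirit of Thm 2: d = 3, m ~ n^1.4, eps ~ n^-0.2.
n = 2000
rows, x, y = sample_sparse_lin_instance(m=int(n**1.4), n=n, d=3, eps=n**-0.2)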

Main Results. PKC from:

Thm 1: DUE(m, q = Θ(log n), d) + DSF(m, d), e.g., m = n^1.1 and d = O(1)
  - pro: “combinatorial / private-key” nature
  - con: only n^{log n} security

[Figure: the three assumptions side by side. DUE: the graph looks random (a planted set of q rows shrinking to < q/3 neighbors). DSF: the output y_i = P(x_2, x_3, x_6) looks random. dLIN: given noisy parities y_i = x_2 + x_3 + x_6 + err, can’t find x.]

Thm 2: SLIN(m = n^1.4, ε = n^-0.2, d = 3)

Thm 3: SLIN(m = n log n, ε, d) + DUE(m = 10000n, q = 1/ε, d)

3LIN vs. Related Schemes

                     Our scheme        [Alekhnovich03]    [Regev05]
#equations           O(n^1.4)          O(n)               O(n)
noise rate           1/n^0.2           1/√n               1/√n
degree (locality)    3                 n/2                n/2
field                binary            binary             large
evidence             resists SDPs,                        implied by n^1.5-SVP
                     related to                           [Regev05, Peikert09]
                     refuting 3SAT

Our intuition:
• 1/√n noise was a real barrier for PKC constructions

• 3LIN is more combinatorial (CSP)

• low-locality-noisy-parity is “universal” for low-locality

Plan

• Background

• Our results

- assumptions & constructions

• Proof idea

• Conclusions and Open Problems

S3LIN(m = n^1.4, ε = n^-0.2) ⟹ PKE

The instance: y = Mx + e over GF(2), where M is a random m×n 3-sparse matrix (three 1’s per row), x is a random n-bit input, and e is an error vector of rate ε; i.e., each y_i is a parity of three inputs plus a noisy bit, e.g., y_i = x_2 + x_3 + x_6 + err. Goal: find x.

Our Encryption Scheme

Params: m = 10000·n^1.4, ε = n^-0.2, |S| = 0.1·n^0.2.

Public key: a 3-sparse matrix M. Private key: a set S of rows such that M_m = Σ_{i∈S, i≠m} M_i; we include the last row m in S, so the rows indexed by S sum to zero mod 2.

Encrypt(b): choose a random x and noise e of rate ε, let y = Mx + e, and output z = (y_1, y_2, …, y_{m-1}, y_m + b).

Decrypt(z): output Σ_{i∈S} z_i.

• With probability (1-ε)^{|S|} ≈ e^{-ε|S|} = e^{-0.1} > 0.9 there is no noise in e_S; in that case Σ_{i∈S} y_i = 0, and therefore Σ_{i∈S} z_i = b.
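A toy-parameter Python sketch of the scheme as read from this slide. The way the key is planted, the placement of the message bit, and the parameter values are assumptions made for illustration, and decryption is only correct with probability roughly (1-ε)^|S|.

import random

def keygen(m, n, s_size, d=3, rng=random):
    # Public key: an m x n d-sparse matrix (rows given as index lists) whose last
    # row equals the mod-2 sum of s_size earlier rows.  Private key: that set of
    # rows together with the last row, so the rows indexed by S sum to zero.
    # (In the real scheme the planted dependency keeps every row d-sparse; here
    # the last row is simply the planted sum, which is a simplification.)
    rows = [rng.sample(range(n), d) for _ in range(m - 1)]
    S = rng.sample(range(m - 1), s_size)
    parity = {}
    for i in S:
        for j in rows[i]:
            parity[j] = parity.get(j, 0) ^ 1
    rows.append([j for j, bit in parity.items() if bit == 1])
    return rows, set(S) | {m - 1}

def encrypt(rows, b, eps, n, rng=random):
    # z = Mx + e over GF(2), with the message bit b added to the last coordinate.
    x = [rng.randint(0, 1) for _ in range(n)]
    z = [(sum(x[j] for j in row) + (rng.random() < eps)) % 2 for row in rows]
    z[-1] ^= b
    return z

def decrypt(z, S):
    # The Mx parts of the rows in S cancel, so this sum equals b plus the noise
    # bits of S, and those noise bits are all zero w.p. about (1 - eps)^|S|.
    return sum(z[i] for i in S) % 2

# Toy usage (the slide's parameters are m = 10000*n^1.4, eps = n^-0.2, |S| = 0.1*n^0.2).
n, m, eps, s_size = 200, 4000, 0.01, 5
pk, sk = keygen(m, n, s_size)
hits = sum(decrypt(encrypt(pk, 1, eps, n), sk) for _ in range(200))
print(hits, "of 200 encryptions of the bit 1 decrypt correctly")  # expect about (1-eps)^|S| of them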

Thm (security): If the public key M is at most 0.99-far (statistically) from uniform, then S3LIN(m, ε) hard ⟹ one cannot distinguish E(0) from E(1).

Proof outline: Search ⟹ Approximate Search ⟹ Prediction ⟹ Prediction over the planted distribution ⟹ security.

Search ⟹ Approximate Search

S3LIN(m, ε): Given M, y (where y = Mx + e, M a random m×n 3-sparse matrix, x a random n-bit vector, e an error vector of rate ε), find x whp.

AS3LIN(m, ε): Given M, y, find a w that agrees with x on a 0.9 fraction of the coordinates, whp.

Lemma: A solver A for AS3LIN(m, ε) can be used to solve S3LIN(m + 10n·lg n, ε).

Proof:
• Use A and the first m equations to obtain w.
• Use w and the remaining equations to recover x, as follows.
• Recovering x_1:
  - for each equation x_1 + x_i + x_k = y, compute the vote x_1 = w_i + w_k + y
  - take the majority of the votes

Analysis:
• Assume w_S = x_S for a set S of size 0.9n.
• A vote is good w/p >> 1/2, since it is correct whenever i ∈ S, k ∈ S, and the row is not noisy, and each of these events has probability well above 1/2.
• If x_1 appears in ≥ 2·log n distinct equations, the majority is correct w/p ≥ 1 - 1/n^2.
• Take a union bound over all the variables.
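A back-of-the-envelope check of the voting step (the 0.9 and 0.8 constants follow the slide; the exact vote count is an illustrative choice):

$$\Pr[\text{vote correct}] \;\ge\; \Pr[i\in S]\cdot\Pr[k\in S]\cdot\Pr[\text{row not noisy}] \;\approx\; 0.9\cdot 0.9\cdot(1-\varepsilon) \;\approx\; 0.8 \;\gg\; \tfrac12,$$

and with $N = c\log n$ roughly independent votes, each correct with probability about 0.8, Hoeffding's inequality gives

$$\Pr[\text{majority wrong}] \;\le\; e^{-2(0.3)^2 N} \;=\; n^{-\Omega(c)} \;\le\; 1/n^{2}$$

for a suitable constant c; a union bound over the n variables then recovers all of x whp.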

Approximate Search ⟹ Prediction

AS3LIN(m, ε): Given M, y, find a w that agrees with x on a 0.9 fraction of the coordinates, w/p 0.8.

P3LIN(m, ε): Given M, y and a triple (i, j, k), find x_i + x_j + x_k w/p 0.9.

Lemma: A solver A for P3LIN(m, ε) can be used to solve AS3LIN(m + 1000n, ε).

Proof:

Use the first m equations (T, z) as the predictor’s input, and the remaining 1000n equations to form queries; repeat 100n times, invoking the predictor A on each query.

Analysis:
• By Markov, whp (T, z) are good, i.e., Pr_{t,j,k}[A(T, z, (t, j, k)) = x_t + x_j + x_k] > 0.8.
• Conditioned on this, each prediction is good w/p >> 1/2 (its error combines the predictor’s 0.2 error with 2 noisy equation bits).
• Whp we will see 0.99 of the variables many times, and each prediction is independent.

Prediction over a Related Distribution

P3LIN(m, ε): Given M, y and r = (i, j, k), find x_i + x_j + x_k w/p 0.9.

Let D be a distribution over (M, r) which is at most 0.99-far from uniform.

Lemma: A solver A for P3LIN_D(m, ε) can be used to solve P3LIN_U(O(m), ε).
• Problem: A might be a bad predictor over the uniform distribution.
• Solution: test whether (M, r) is good for A with respect to a random x and random noise. If so, predict (a good prediction w/p ≥ 0.01); otherwise output “I don’t know”.

Lemma: A solver A for P3LIN_D(m, ε) can be used to solve P3LIN_U(O(m), ε).

Sketch: Partition (M, y) into many pieces (M_i, y_i), invoke A(M_i, y_i, r) on each, and take the majority.
• Problem: all invocations use the same r and x.
• Solution: re-randomization!
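One standard way to realize such a re-randomization (a sketch consistent with the slide; the paper’s exact transformation may differ): shift the hidden input by a fresh random $x'$, and rename the variables by a random permutation if the query must also be refreshed. Concretely, replace each piece $(M_i, y_i)$ by $(M_i, y'_i)$ with

$$y'_i \;=\; y_i + M_i x' \;=\; M_i(x + x') + e_i,$$

so the hidden input $x + x'$ is fresh and uniform; if the predictor returns $p \approx (x+x')_a + (x+x')_b + (x+x')_c$ for the query $r = (a, b, c)$, then $p + x'_a + x'_b + x'_c$ is the desired prediction of $x_a + x_b + x_c$.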

Distribution with a Short Linear Dependency

H_{q,n} = the uniform distribution over matrices with q rows, each with exactly 3 ones, and n columns, each with either 0 or 2 ones (so the rows sum to zero mod 2).

P^q_{m,n} = the (m, n, 3)-uniform distribution conditioned on the existence of a sub-matrix H ∈ H_{q,n} that touches the last row.

Lemma: Let m = n^1.4 and q = n^0.2. Then the (m, n, 3)-uniform distribution and P^q_{m,n} are at most 0.999 statistically far.

Proof: follows from [FKO06].

Plan

• Background

• Our results

- assumptions & constructions

• Proof idea

• Conclusions and Open Problems

Other Results

• Assumptions ⟹ Oblivious Transfer ⟹ general secure computation
• New construction of a PRG with large stretch and low locality
• Assumptions ⟹ learning k-juntas requires time n^Ω(k)

Conclusions

• New Cryptosystems with arguably “less structured” assumptions

Future Directions:

• Improve assumptions

- use random 3SAT ?

• Better theoretical understanding of public-key encryption

- can public-key cryptography be broken in NP ∩ coNP?

Thank You !