
Privacy-Preserving Trust Negotiations*

Mikhail Atallah

CERIAS and Department of Computer Sciences Purdue University

* Joint work with Keith Frikken and Jiangtao Li

Motivation (1)

• In an open environment, access control decisions are often based on requester attributes

– Access policy stated in terms of attributes

• Digital credentials, e.g.,
– Citizenship

– Age

– Physical condition (disabilities)

– Employment (government, healthcare, FEMA, etc)

– Credit status

– Membership in groups (AAA, AARP, etc)

– Security clearance

Motivation (2)

• Credentials and access policies can be sensitive (or are advantageously treated as such)

– Better individual privacy

– Hide business strategy (fewer unwelcome imitators)

– Incentives (less “gaming”)

– Better security

Model

• M = message; P = policy; C = credentials
– Credentials C are issued off-line

• Alice gets M iff C satisfies Bob’s policy P

• Protocol cannot use a trusted third party

[Diagram] Alice (Client) holds C = C1, C2, …, Cm and sends Bob (Server) a request for M; Bob holds M and P. They run the Alice-Bob protocol, and Alice receives M if C satisfies P.
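Stated as code, the goal is to emulate the following trusted-party behavior without any trusted party (a hypothetical Python sketch; the function name and signature are illustrative, not from the slides):

```python
def ideal_trust_negotiation(M, P, C):
    """What a trusted third party would do: take Bob's message M and
    policy P, and Alice's credentials C, and release M iff C satisfies P.
    The protocol must achieve this effect with no such party, with Bob
    learning nothing (not even the outcome) and Alice learning only M."""
    return M if P(C) else None
```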

Properties of our solution

• Bob does not learn whether Alice got access or not

• Bob does not learn anything about Alice’s credentials

• Alice learns neither Bob’s policy structure nor which credentials caused her to gain access

• Alice cannot probe off-line (by requesting an M once and then trying various subsets of her credentials)

• Policy need not be monotonic

Practical considerations

• Non-monotonic policy may not make sense
– Alice cannot be forced to use all her credentials; she could use a subset (those she thinks will help)
– Require a credential for the absence of an attribute? But it can be difficult to prove an absence to the CA

• Alice can carry out on-line probing attacks
– She can request the same M multiple times, each time trying a different subset of her credentials, and thereby gain information about the policy

– Note that Bob cannot do any probing by modifying the policy (because he does not learn whether Alice obtained access)

Related Work (1)

• Trust negotiations, hidden credentials
– [Holt-Bradshaw-Seamons-Orman, WPES 03]

– [Holt-Bradshaw-Seamons, CCS 04]

– [Seamons-Winslett-Yu, NDSS 01]*

– [Seamons-Winslett-Yu-Yu-Jarvis, WPET 02]

– [Winsborough-Li, WPES 02, Policy 02, S&P 04]

– [Yu-Winslett, S&P 03]*; [Yu-Winslett-Seamons, CCS 01]

– [Bonatti-Samarati, CCS 00]*

• Minimize disclosure of credentials, policies*
– Some disclosure does occur

Related Work (2)

• Secure function evaluation
– [Yao, FOCS 86]; [Naor-Nissim, STOC 01]

– General functions

• Secure private function evaluation
– [Canetti et al, PODC 01]

– Special cases

• In our case, Alice’s private input is not her attributes
– Rather, her private input is a third-party verification of her attributes (by a CA who is not online when the protocol is run)

Hidden Credentials [HBSO’03]

• Credentials are generated by CA, using Identity Based Encryption
– E.g., [Boneh-Franklin, Crypto 01]

• How CA issues Alice a student credential:
– Use Identity Based Encryption with ID = Alice||student
– Credential is the private key corresponding to above ID

• Simple example of hidden credential usage (see the toy sketch below):
– Bob sends Alice M encrypted with the public key for that ID
– Alice can decrypt only with a student credential
– Bob does not learn whether Alice is a student or not
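A toy Python sketch of this usage pattern. Loudly hedged assumption: ToyIBE below is a stand-in, not real IBE; its encrypt() needs the CA’s master secret, whereas genuine IBE (e.g., Boneh-Franklin) lets anyone encrypt to an ID from public parameters alone. It illustrates only the message flow.

```python
import hashlib, hmac, os

class ToyIBE:
    """Stand-in for identity-based encryption. NOT real IBE: encrypt()
    here needs the master secret, whereas Boneh-Franklin lets anyone
    encrypt to an ID from public parameters. Flow illustration only."""
    def __init__(self):
        self.master = os.urandom(32)            # CA's master secret

    def extract(self, identity: str) -> bytes:
        # CA issues a credential: the private key for an identity
        # string such as "Alice||student"
        return hmac.new(self.master, identity.encode(), hashlib.sha256).digest()

    def encrypt(self, identity: str, msg: bytes) -> bytes:
        pad = self.extract(identity)            # toy model; see docstring
        return bytes(m ^ p for m, p in zip(msg, pad))  # msg up to 32 bytes

    @staticmethod
    def decrypt(credential: bytes, ct: bytes) -> bytes:
        return bytes(c ^ p for c, p in zip(ct, credential))

ca = ToyIBE()
alice_cred = ca.extract("Alice||student")       # issued off-line by the CA
ct = ca.encrypt("Alice||student", b"secret message M")   # Bob's side
print(ToyIBE.decrypt(alice_cred, ct))           # readable iff Alice is a student
```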

Policy Definition

• A Boolean function p(x1, …, xn)
– xi corresponds to presence or absence of attribute attri

• Alice’s credential set C satisfies the policy iff
– p(x1, …, xn) = 1, where xi is 1 iff there is a credential in C for attribute attri

• Example (evaluated in the sketch below)
– Alice is a senior citizen and has low income
– Policy = (disability ∨ senior-citizen) ∧ low-income
= (x1 ∨ x2) ∧ x3
= (0 ∨ 1) ∧ 1 = 1
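A minimal Python sketch of this evaluation in the clear (attribute order and the example policy are taken from the slide; variable names are illustrative):

```python
# Attribute order and example policy follow the slide; xi = 1 iff Alice
# holds a credential for attr_i.
ATTRS = ["disability", "senior-citizen", "low-income"]

def policy(x):                         # p(x1, x2, x3) = (x1 OR x2) AND x3
    return (x[0] or x[1]) and x[2]

alice_creds = {"senior-citizen", "low-income"}
x = [attr in alice_creds for attr in ATTRS]     # [False, True, True]
print(policy(x))                       # True: (0 OR 1) AND 1 = 1
```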

Protocol Has Two Phases

• Phase 1: Credential and Attribute Hiding
– For each attri Bob generates 2 randoms ri[0], ri[1]
– Alice learns n values k1, k2, …, kn s.t. ki = ri[1] if she has a credential for attri, otherwise ki = ri[0]

• Phase 2: Blinded Policy Evaluation
– Alice’s inputs are the above k1, k2, …, kn

– Bob’s inputs are M, p, and the n pairs ri[0], ri[1]

– Alice receives M if and only if p(x1, …, xn) = 1

Phase 1: Hiding

• Input: Alice has m hidden credentials C1, C2, …, Cm; Bob has n pairs of randoms ri[0], ri[1] for i=1, …, n

• Output: Alice gets one element from each of Bob’s n pairs: ri[1] if she has a Cj s.t. Cj.attr = attri, else ri[0]

• Steps:

1. Bob generates n pairs ki[0], ki[1] for i=1, …, n

2. Bob sends Alice n items: EIBE(ki[0], Alice||attri)

3. Alice decrypts the items (using her respective Cj when she has it, else using a random), obtaining B1, …, Bn

4. Alice and Bob run a set intersection protocol on {ki[0]} and {Bi}; its outcome is post-processed s.t. if Bi is in {ki[0]} Alice obtains ki[1], else she gets garbage

5. Alice securely compares what she got with Bob’s {ki[1]}: if they are equal she gets ri[1], else ri[0] (a non-private simulation of this phase follows below)
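The end-to-end effect of Phase 1 can be shown with a short Python simulation. Hedge: everything below is computed in the clear, so it demonstrates only the invariant Alice should end up with (ki = ri[xi]), not the obliviousness that the IBE decryption, set intersection, and secure comparison steps provide.

```python
import os

attrs = ["disability", "senior-citizen", "low-income"]
alice_creds = {"senior-citizen", "low-income"}
n = len(attrs)

# Bob's side: two fresh randoms per attribute
r = [(os.urandom(16), os.urandom(16)) for _ in range(n)]

# Net effect of steps 1-5, computed here in the clear: Alice learns
# k_i = r_i[1] iff she holds a credential for attr_i, else k_i = r_i[0].
# In the real protocol this selection is oblivious: Bob never learns
# which element of each pair Alice ended up with.
k = [r[i][1] if attrs[i] in alice_creds else r[i][0] for i in range(n)]

# The invariant carried into Phase 2: x_i = (k_i == r_i[1])
print([k[i] == r[i][1] for i in range(n)])      # [False, True, True]
```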

Phase 2: Policy Evaluation

• Input: Alice has k1, …, kn where ki is in {ri[0], ri[1]}; Bob has policy p, message M, and n pairs of randoms ri[0], ri[1] for i=1, …, n

• Output: Alice learns M if and only if p(x1, …, xn) = 1, where xi = (ki = ri[1]) for i = 1, …, n

• Steps:

1. Bob sends Alice EK(M)

2. Bob builds a scrambled circuit [Yao86] that computes p(x1, …, xn) where the 1 encoding of the output wire is the decryption key K

3. Bob sends Alice the scrambled circuit

4. Alice evaluates the circuit and decrypts EK(M) using the value from the output wire

5. If Alice gets the 1 encoding, she can obtain M (a toy sketch of such a circuit follows below)
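A self-contained toy sketch of the scrambled-circuit idea for the example policy (x1 ∨ x2) ∧ x3, in Python. Hedges: the row-shuffling with a zero-padding check is a textbook simplification (real constructions use point-and-permute), the one-time-pad EK(M) is a stand-in for a proper encryption of M, and Alice would obtain her input-wire labels obliviously from Phase 1 rather than being handed them directly as here.

```python
import hashlib, os, random

def H(a: bytes, b: bytes) -> bytes:    # 32-byte keystream from two labels
    return hashlib.sha256(a + b).digest()

def xor(x: bytes, y: bytes) -> bytes:
    return bytes(p ^ q for p, q in zip(x, y))

def fresh_wire():                      # random 16-byte labels for 0 and 1
    return {0: os.urandom(16), 1: os.urandom(16)}

def garble_gate(in1, in2, out, g):
    # One row per input combination; each row encrypts the correct
    # output label, padded with 16 zero bytes so the evaluator can
    # recognize the single row its labels open.
    rows = [xor(H(in1[a], in2[b]), out[g(a, b)] + b"\x00" * 16)
            for a in (0, 1) for b in (0, 1)]
    random.shuffle(rows)               # hide which row is which
    return rows

def eval_gate(rows, la, lb):
    for row in rows:
        plain = xor(H(la, lb), row)
        if plain[16:] == b"\x00" * 16: # redundancy check hit
            return plain[:16]
    raise ValueError("no row decrypted")

# Bob garbles p(x1, x2, x3) = (x1 OR x2) AND x3
w1, w2, w3, w_or, w_out = (fresh_wire() for _ in range(5))
g_or = garble_gate(w1, w2, w_or, lambda a, b: a | b)
g_and = garble_gate(w_or, w3, w_out, lambda a, b: a & b)
K = w_out[1]                           # the 1-encoding of the output wire
ct_M = xor(K, b"secret message M")     # step 1's EK(M), as a one-time pad

# Alice evaluates with her input-wire labels (x = 0, 1, 1); she sees
# only labels, never the bits they encode
l_or = eval_gate(g_or, w1[0], w2[1])
l_out = eval_gate(g_and, l_or, w3[1])
print(xor(l_out, ct_M))                # recovers M, since p(0,1,1) = 1
```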

Other Protocols

• Approximate pattern matching
• Biological sequence comparisons
• Contract negotiations (fairness)
• Collaborative benchmarking, forecasting
• Location-dependent query processing
• Credit checking
• Supply chain negotiations
• Data mining (partitioned data)
• Electronic surveillance
• Intrusion detection
• Vulnerability assessment
• Biometric comparisons

Computational Outsourcing

• Bob has all the data but little computing power

• Alice has none of the data but has much computing power, hence Alice must do all of the heavy-duty computational work

• Bob’s data and answer are sensitive, so Bob must learn the answer without Alice learning anything about his data or the answer

• E.g., Bob = sensor (or smartcard or PDA), Alice = super-computing center

Acknowledgements

• Ph.D. Students
– Keith Frikken, Marina Bykova, Jiangtao Li

• Gov’t
– NSF, ONR, AFRL

• Industry
– Motorola, HP + the corporate sponsors of CERIAS

• Foundation
– Lilly Endowment

• Purdue
– CERIAS, Discovery Park