CMSC 414 Computer and Network Security Lecture 22 Jonathan Katz


Page 1: CMSC 414 Computer and Network Security Lecture 22

CMSC 414 Computer and Network Security

Lecture 22

Jonathan Katz

Page 2: CMSC 414 Computer and Network Security Lecture 22

Rest of the semester…
• Today: database privacy
• Next 2-3 lectures: PL security, buffer overflows, malicious software
  – Tuesday’s class will cover material relevant for HW4
  – Thursday’s class will be a guest lecture; the material will still be covered on the final
• Last 2-3 lectures: topics in network security
  – Network security in practice

Page 3: CMSC 414 Computer and Network Security Lecture 22

Database security

Page 4: CMSC 414 Computer and Network Security Lecture 22

Overview
• Want to be able to discern statistical trends without violating privacy
• Questions
  – How to obtain the raw data in the first place?
  – How to maintain privacy of the data while still making it useful for data mining?
• Serious real-world problem
  – Medical databases are protected by federal laws
  – Data mining on credit card transactions, web browsing
  – Netflix dataset

Page 5: CMSC 414 Computer and Network Security Lecture 22

Obtaining sensitive data
• How do you get people to give honest answers to sensitive questions?
  – Shall we try it?

Page 6: CMSC 414 Computer and Network Security Lecture 22

Randomized response
• Respondent flips a coin/rolls a die…
• …and answers the question incorrectly with some known probability q
• (The result of the coin flip/die roll is not known to the interviewer)
• Why does this preserve privacy?
  – What is Pr[yes | answer “yes”]?

Page 7: CMSC 414 Computer and Network Security Lecture 22

Analysis of randomized response
• Generating an estimate:
  – Say the fraction of “yes” in the population is p
  – Pr[“yes”] = p(1-q) + (1-p)q
  – Solve for p given q and Pr[“yes”]
  – E.g., q = 1/4 gives p = 2·Pr[“yes”] - 0.5
• Shall we try it…?
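A minimal simulation of this estimator (a sketch; the population size, lying probability, and function names are illustrative, not from the slides):

```python
import random

def randomized_response(truth: bool, q: float) -> bool:
    """Report the true answer with probability 1-q, lie with probability q."""
    return truth if random.random() >= q else not truth

def estimate_p(responses, q):
    """Invert Pr["yes"] = p(1-q) + (1-p)q to recover the true fraction p."""
    frac_yes = sum(responses) / len(responses)
    return (frac_yes - q) / (1 - 2 * q)   # undefined at q = 1/2 (answers carry no signal)

random.seed(1)
true_p, q = 0.30, 0.25   # 30% true "yes" rate; lie with probability 1/4
answers = [randomized_response(random.random() < true_p, q) for _ in range(100_000)]
print(round(estimate_p(answers, q), 3))   # close to 0.30, matching p = 2*Pr["yes"] - 0.5
```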

Page 8: CMSC 414 Computer and Network Security Lecture 22

Randomized response
• Other variants that provide better estimators
• Extensions to provide estimators of other statistics
  – E.g., correlations between responses

Page 9: CMSC 414 Computer and Network Security Lecture 22

“Privacy-preserving data mining”

Page 10: CMSC 414 Computer and Network Security Lecture 22

The setup…
• A user (or group of users) has authorized access to certain data in a database, but not to all data
  – E.g., user is allowed to learn certain entries only
  – E.g., user is allowed to learn aggregate data but not individual data (e.g., allowed to learn the average salary but not individual salaries)
  – E.g., allowed to learn trends (i.e., data mining) but not individual data
• Note: we are assuming that authentication/access control is taken care of already…

Page 11: CMSC 414 Computer and Network Security Lecture 22

Two models
• Non-interactive data disclosure
  – User given access to “all data” (possibly after the data is anonymized in some way)
• Interactive mechanisms
  – User given the ability to query the database
  – We will mostly focus on this model

Page 12: CMSC 414 Computer and Network Security Lecture 22

The problem
• A user may be able to learn unauthorized information via inference
  – Combining multiple pieces of authorized data
  – Combining authorized data with “out-of-band” knowledge
    • 87% of people are identified by ZIP code + gender + date of birth
    • Given ZIP + DOB, may be able to infer gender from other entries
• This is a (potentially) serious real-world problem
  – See the article by Sweeney for many examples

Page 13: CMSC 414 Computer and Network Security Lecture 22

Example
• Say we are not allowed to learn any individual’s salary

Name     UID  Years of service  Salary
Alice    001  12                $65,000
Bob      010  1                 $40,000
Charlie  011  20                $70,000
Debbie   100  30                $80,000
Evan     101  4                 $50,000
Frank    110  8                 $58,000

“Give me Alice’s salary” → Request denied!

Page 14: CMSC 414 Computer and Network Security Lecture 22

Example

Name     UID  Years of service  Salary
Alice    001  12                $65,000
Bob      010  1                 $40,000
Charlie  011  20                $70,000
Debbie   100  30                $80,000
Evan     101  4                 $50,000
Frank    110  8                 $58,000

“Give me the list of all names” → Alice, Bob, Charlie, Debbie, Evan, Frank
“Give me the list of all salaries” → $65,000, $40,000, $70,000, $80,000, $50,000, $58,000
• Both lists come back in table order, so matching positions links each name to its salary
• Solution: return data in an order that is independent of the table (e.g., random or sorted): $40,000, $50,000, $58,000, $65,000, $70,000, $80,000

Page 15: CMSC 414 Computer and Network Security Lecture 22

Example

Name     UID  Years of service  Salary
Alice    001  12                $65,000
Bob      010  1                 $40,000
Charlie  011  20                $70,000
Debbie   100  30                $80,000
Evan     101  4                 $50,000
Frank    110  8                 $58,000

“Give me all names and UIDs” → (Alice, 001), (Bob, 010), (Charlie, 011), (Debbie, 100), (Evan, 101), (Frank, 110)
“Give me all UIDs and salaries” → (001, $65,000), (010, $40,000), (011, $70,000), (100, $80,000), (101, $50,000), (110, $58,000)
• Joining the two answers on the shared UID column recovers every individual’s salary, even though neither query alone was disallowed (sketched below)
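A sketch of this linking attack in code, using the data from the table above:

```python
# Linking attack: the two releases share the UID column, so a simple
# join recovers every (name, salary) pair exactly.
names = [("Alice", "001"), ("Bob", "010"), ("Charlie", "011"),
         ("Debbie", "100"), ("Evan", "101"), ("Frank", "110")]
pay   = [("001", 65_000), ("010", 40_000), ("011", 70_000),
         ("100", 80_000), ("101", 50_000), ("110", 58_000)]

salary_by_uid = dict(pay)
print({name: salary_by_uid[uid] for name, uid in names})
# {'Alice': 65000, 'Bob': 40000, ..., 'Frank': 58000}
```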

Page 16: CMSC 414 Computer and Network Security Lecture 22

Example

Name     UID  Years of service  Salary
Alice    001  12                $65,000
Bob      010  1                 $40,000
Charlie  011  20                $70,000
Debbie   100  30                $80,000
Evan     101  4                 $50,000
Frank    110  8                 $58,000

“Give me all names with their years of service” → (Alice, 12), (Bob, 1), (Charlie, 20), (Debbie, 30), (Evan, 4), (Frank, 8)
“Give me the list of all salaries” → $40,000, $50,000, $58,000, $65,000, $70,000, $80,000
• External knowledge: more years → higher pay
• Sorting the names by years of service and matching them against the sorted salary list recovers each individual’s salary (sketched below)
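A sketch of the rank-matching attack on this table (the data is from the slide; the monotonicity assumption is exactly the stated external knowledge):

```python
# Rank-matching attack: pair the i-th person by seniority with the
# i-th smallest salary, using the external rule "more years -> higher pay".
years = [("Alice", 12), ("Bob", 1), ("Charlie", 20),
         ("Debbie", 30), ("Evan", 4), ("Frank", 8)]
salaries = [40_000, 50_000, 58_000, 65_000, 70_000, 80_000]  # released sorted

by_seniority = sorted(years, key=lambda nv: nv[1])
for (name, _), salary in zip(by_seniority, sorted(salaries)):
    print(name, salary)
# Bob 40000, Evan 50000, Frank 58000, Alice 65000, Charlie 70000, Debbie 80000
```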

Page 17: CMSC 414 Computer and Network Security Lecture 22

Some solutions
• In general, an unsolved (unsolvable?) problem
• Some techniques to mitigate the problem
  – Inference during database design
    • E.g., recognize dependencies between columns
    • Split data across several databases (next slide)
  – Inference detection at query time
    • Store the set of all queries asked by a particular user, and look for disallowed inferences before answering any query
    • Note: will not prevent collusion among multiple users
    • Can also store the set of all queries asked by anyone, and look for disallowed inferences there
• As always, there is a tradeoff between security and functionality

Page 18: CMSC 414 Computer and Network Security Lecture 22

Using several databases
• DB1 stores (name, address), accessible to all
• DB2 stores (UID, salary), accessible to all
• DB3 stores (name, UID), accessible only to the admin
• What if I want to add data for “start-date” (and make it accessible to all)?
  – Adding it to DB2 can be problematic (why?)
  – Adding it to DB1 seems OK (can we prove this?)

Page 19: CMSC 414 Computer and Network Security Lecture 22

Statistical databases
• Database that only provides data of a statistical nature (average, standard deviation, etc.)
  – Pure statistical database: only stores statistical data
  – Statistical access to an ordinary database: stores all data but only answers statistical queries
• Focus on the second type
  – Aim is to prevent inference about any particular piece of information
  – Two general approaches: query restriction and data/output perturbation

Page 20: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Basic idea: only allow queries that involve more than some threshold t of users
• Example: say we want to hide individual salaries
  – Only allow queries about the average salary of a set S of people, where |S| > 20 (say)

Page 21: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Query restriction itself may reveal information
• Example: say averages are released only if there are at least 2 data points being averaged
  – Request the average salary of all employees whose GPA is ≥ X
  – No response means that there are fewer than 2 employees with GPA ≥ X
  – If query(GPA ≥ X) is answered but query(GPA ≥ X+ε) is not, there is at least one employee whose GPA lies between X and X+ε

Page 22: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Query restriction alone doesn’t work when multiple queries are allowed, or when the database changes
  – Determine a person’s salary using two queries: ask for the average over a large set S ∪ {i} and over S alone, then take the difference (see the sketch below)
  – Determine a person’s salary after a raise: query the average over the same set before and after the database changes
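A sketch of the two-query differencing attack, reusing the salary table from the earlier slides (both query sets can be made larger than any fixed threshold t):

```python
# Differencing attack: two individually allowed average queries isolate one person.
salaries = {"Alice": 65_000, "Bob": 40_000, "Charlie": 70_000,
            "Debbie": 80_000, "Evan": 50_000, "Frank": 58_000}

def avg(names):   # the only query type the interface permits
    return sum(salaries[n] for n in names) / len(names)

S = ["Bob", "Charlie", "Debbie", "Evan", "Frank"]   # a large set without Alice
a1 = avg(S + ["Alice"])
a2 = avg(S)
print(a1 * (len(S) + 1) - a2 * len(S))   # 65000.0 -- Alice's exact salary
```

The post-raise variant is analogous: query avg(S) before and after the change, and scaling the difference by |S| reveals the size of the raise.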

Page 23: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Can use more complicated forms of query restriction based on the entire prior history
  – E.g., if a query for S was asked, do not allow a query for a set S′ if the symmetric difference |S′ Δ S| is “small”
• Drawbacks
  – Maintaining the entire query history is expensive
  – Difficult to specify what constitutes a privacy “breach”
  – NP-complete (in general) to determine whether a breach has occurred…
  – Does not address the adversary’s external information

Page 24: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Comparing queries pairwise may not be enough
• Example
  – Say you want information about user i
  – Let S, T be arbitrary sets (that are very different), not containing i
  – Ask for Avg(S), Avg(T), and Avg(S ∪ T ∪ {i})
• Inference can be very difficult to detect and prevent…
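A worked instance of this attack (a sketch with hypothetical values; S and T are taken disjoint with known sizes so the three averages combine cleanly):

```python
# No pair of these queries looks suspicious, yet together they isolate user i.
data = {1: 40, 2: 55, 3: 62, 4: 70, 5: 85, 6: 90}   # hypothetical values
avg = lambda ids: sum(data[j] for j in ids) / len(ids)

S, T, i = [1, 2], [3, 4], 5
x_i = ((len(S) + len(T) + 1) * avg(S + T + [i])
       - len(S) * avg(S) - len(T) * avg(T))
print(x_i)   # 85.0 -- user 5's exact value
```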

Page 25: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Apply query restriction globally, or per-user?
  – If the former, usability is limited
  – If the latter, security can be compromised by colluding users

Page 26: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• Example: say we do not want an adversary to learn any value exactly
• Consider the table with x = y = z = 1, where it is known that x, y, z ∈ {0, 1, 2}
• User requests sum(x, y, z), gets response 3
• User requests max(x, y, z)
  – If the user learns the answer, they can deduce that x = y = z = 1
  – But if the request is denied, the user can still deduce that x = y = z = 1 (!!)
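A small enumeration makes this concrete (a sketch; the auditor’s rule modeled here, deny iff answering would pin down some value, is the naive one the slide critiques):

```python
from itertools import product

# All tables consistent with the released answer sum(x, y, z) = 3,
# given the public knowledge x, y, z in {0, 1, 2}.
consistent = [t for t in product(range(3), repeat=3) if sum(t) == 3]

for m in sorted({max(t) for t in consistent}):
    worlds = [t for t in consistent if max(t) == m]
    # Answering "max = m" pins down a value iff all remaining worlds
    # agree on some coordinate; that happens only for m = 1, i.e. (1, 1, 1).
    reveals = any(len({t[i] for t in worlds}) == 1 for i in range(3))
    print(f"max = {m}: {len(worlds)} consistent tables, deny = {reveals}")

# Only max = 1 triggers a denial, so the denial itself tells the user
# that x = y = z = 1.
```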

Page 27: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• We can try to “look ahead”, and not respond to any query for which there is a subsequent query that will reveal information regardless of whether we respond or not

[Diagram: decision tree for the example above. Responding to sum(x, y, z) leads to a follow-up max(x, y, z) for which both “respond” and “deny” leak information, so the look-ahead denies sum(x, y, z).]

Page 28: CMSC 414 Computer and Network Security Lecture 22

Query restriction with “look-aheads”
• Problems
  – May need to look more than 1 level deep
  – Computationally infeasible, even if only looking 1 level deep
  – Does it even work?
    • Denying the request for sum(x, y, z) reveals that x = y = z
• Even if answers don’t uniquely reveal a value, they may leak lots of partial information
• What can we prove about it?

Page 29: CMSC 414 Computer and Network Security Lecture 22

Query restriction
• A different approach is to use “simulatable auditing”
  – Deny a query if there is some database for which that query would leak information
• This fixes the previous problem:
  – Learning sum(x, y, z) = 3 and then seeing that max(x, y, z) is denied no longer proves that x = y = z = 1
• Even more computationally expensive
• Restricts usability
• Again, can we claim that it even works?

Page 30: CMSC 414 Computer and Network Security Lecture 22

Perturbation
• Purposely add “noise”
  – Data perturbation: add noise to the entire table, then answer queries accordingly (or release the entire perturbed dataset)
  – Output perturbation: keep the table intact, but add noise to the answers
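A sketch of output perturbation (Laplace noise is one common choice, used for example in differential privacy; the noise scale below is arbitrary, not from the slides):

```python
import random

def laplace_noise(scale: float) -> float:
    """The difference of two exponentials is Laplace(0, scale)."""
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def noisy_average(values, scale):
    """Output perturbation: the true answer plus calibrated random noise."""
    return sum(values) / len(values) + laplace_noise(scale)

salaries = [65_000, 40_000, 70_000, 80_000, 50_000, 58_000]
print(noisy_average(salaries, scale=2_000))   # true average 60,500, +/- noise
```

A larger scale gives more privacy and less utility, which is exactly the trade-off the next slide names.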

Page 31: CMSC 414 Computer and Network Security Lecture 22

Perturbation
• Trade-off between privacy and utility!
• No randomization: bad privacy but perfect utility
• Complete randomization: perfect privacy but no utility

Page 32: CMSC 414 Computer and Network Security Lecture 22

Data perturbation
• One technique: data swapping
  – Substitute and/or swap values, while maintaining low-order statistics

Original table:        After swapping:
Sex  Major  GPA        Sex  Major  GPA
F    Bio    3.0        F    Bio    4.0
F    CS     3.0        F    CS     3.0
F    EE     4.0        F    EE     3.0
F    Psych  3.0        F    Psych  4.0
M    Bio    4.0        M    Bio    3.0
M    CS     3.0        M    CS     4.0
M    EE     3.0        M    EE     4.0
M    Psych  4.0        M    Psych  3.0

Page 33: CMSC 414 Computer and Network Security Lecture 22

Data perturbation
• Second technique: (re)generate the table based on a derived distribution
  – For each sensitive attribute, determine a probability distribution that best matches the recorded data
  – Generate fresh data according to the determined distribution
  – Populate the table with this fresh data
• Queries on the database can never “learn” more than what was learned initially
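A sketch of this second technique with a deliberately simple model (the slide does not prescribe a distribution; a normal fit to the salary column is just an illustration):

```python
import random
import statistics

real_salaries = [65_000, 40_000, 70_000, 80_000, 50_000, 58_000]

# Step 1: derive a distribution that best matches the recorded data
# (here, crudely, a normal distribution fit by mean and stdev).
mu = statistics.mean(real_salaries)
sigma = statistics.stdev(real_salaries)

# Step 2: populate the released table with fresh samples from that model;
# queries then reveal only the fitted distribution, never the real records.
synthetic = [round(random.gauss(mu, sigma)) for _ in real_salaries]
print(synthetic)
```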

Page 34: CMSC 414 Computer and Network Security Lecture 22

Data perturbation
• Data cleaning/scrubbing: remove sensitive data, or data that can be used to breach anonymity
• k-anonymity: ensure that any “identifying information” is shared by at least k members of the database
• Example…

Page 35: CMSC 414 Computer and Network Security Lecture 22

Example: 2-anonymity

Original table:
Race   ZIP    Smoke?  Cancer?
Asian  02138  Y       Y
Asian  02139  Y       N
Asian  02141  N       N
Asian  02142  Y       Y
Black  02138  N       N
Black  02139  N       Y
Black  02141  Y       Y
Black  02142  N       N
White  02138  Y       Y
White  02139  N       N
White  02141  Y       Y
White  02142  Y       Y

Option 1, suppress Race: each row keeps only its ZIP (02138, 02139, 02141, 02142, …), so every remaining quasi-identifier value is shared by 3 rows.

Option 2, generalize ZIP: replace each ZIP by its prefix, giving (Asian, 0213x), (Asian, 0213x), (Asian, 0214x), (Asian, 0214x), (Black, 0213x), (Black, 0213x), (Black, 0214x), (Black, 0214x), (White, 0213x), (White, 0213x), (White, 0214x), (White, 0214x); every (race, prefix) pair is shared by 2 rows.
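A quick check of the property on the generalized table (a sketch; the quasi-identifiers here are race and ZIP prefix):

```python
from collections import Counter

def is_k_anonymous(rows, quasi_ids, k):
    """Every combination of quasi-identifier values must occur >= k times."""
    counts = Counter(tuple(r[i] for i in quasi_ids) for r in rows)
    return all(c >= k for c in counts.values())

# Generalized table from the slide: (race, ZIP prefix, smoke?, cancer?)
table = [("Asian", "0213x", "Y", "Y"), ("Asian", "0213x", "Y", "N"),
         ("Asian", "0214x", "N", "N"), ("Asian", "0214x", "Y", "Y"),
         ("Black", "0213x", "N", "N"), ("Black", "0213x", "N", "Y"),
         ("Black", "0214x", "Y", "Y"), ("Black", "0214x", "N", "N"),
         ("White", "0213x", "Y", "Y"), ("White", "0213x", "N", "N"),
         ("White", "0214x", "Y", "Y"), ("White", "0214x", "Y", "Y")]

print(is_k_anonymous(table, quasi_ids=(0, 1), k=2))   # True
```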

Page 36: CMSC 414 Computer and Network Security Lecture 22

Problems with k-anonymity
• Hard to find the right balance between what is “scrubbed” and the utility of the data
• Not clear what security guarantees it provides
  – For example, what if I know that someone Asian in ZIP code 0214x smokes?
  – Again, does not deal with out-of-band information