CMU Usable Privacy and SecurityLaboratory
http://cups.cs.cmu.edu/
Hey, That’s Personal!
Lorrie Faith Cranor
28 July 2005
http://lorrie.cranor.org/
• CMU Usable Privacy and Security Laboratory • Lorrie Cranor • http://lorrie.cranor.org/ 2
Outline
Privacy risks from personalization
Reducing privacy risks
Personalizing privacy

Privacy risks from personalization
Unsolicited marketing
Desire to avoid unwanted marketing causes some people to avoid giving out personal information
My computer can “figure things out about me”
The little people inside my computer might know it’s me…
…and they might tell their friends
Inaccurate inferences
“My TiVo thinks I’m gay!”
Surprisingly accurate inferences
Everyone wants to be understood. No one wants to be known.
You thought that on the Internet nobody knew you were a dog…
…but then you started getting personalized ads for your favorite brand of dog food
Price discrimination
Concerns about being charged higher prices
Concerns about being treated differently
Revealing private information to other users of a computer
Revealing info to family members or co-workers
• Gift recipient learns about gifts in advance
• Co-workers learn about a medical condition
Revealing secrets that can unlock many accounts
• Passwords, answers to secret questions, etc.
The Cranor family’s 25 most frequent grocery purchases (sorted by nutritional value)!
Exposing secrets to criminals
Stalkers, identity thieves, etc.
People who break into an account may be able to access profile info
People may be able to probe recommender systems to learn profile information associated with other users
Subpoenas
Records are often subpoenaed in patent disputes, child custody cases, civil litigation, and criminal cases
Government surveillance
Governments are increasingly looking for personal records to mine in the name of fighting terrorism
People may be subject to investigation even if they have done nothing wrong
Risks may be magnified in the future
Wireless location tracking
Semantic web applications
Ubiquitous computing
If you’re not careful, you may violate data protection laws
Some jurisdictions have privacy laws that
• Restrict how data is collected and used
• Require that you give notice, get consent, or offer privacy-protective options
• Impose penalties if personal information is accidentally exposed
Reducing privacy risks
Axes of personalization

Axis                      Tends to be MORE privacy invasive   Tends to be LESS privacy invasive
Data collection method    Implicit                            Explicit
Duration                  Persistent (profile)                Transient (task or session)
User involvement          System initiated                    User initiated
Reliance on predictions   Prediction based                    Content based
A variety of approaches to reducing privacy risks
No single approach will always work
Two types of approaches:
• Reduce data collection and storage
• Put users in control
Collection limitation: Pseudonymous profiles
Useful for reducing risk and complying with privacy laws when ID is not needed for personalization
But a profile may become identifiable because of unique combinations of info, links with log data, unauthorized access to the user’s computer, etc.
Profile info should always be stored separately from web usage logs and transaction records that might contain IP addresses or PII
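The separation suggested above can be sketched in a few lines (the store and function names here are hypothetical): the profile is keyed only by a random pseudonym, and the request log, which may contain IP addresses, never records that pseudonym, so the two stores cannot be joined to re-identify a profile.

```python
import secrets

# Hypothetical in-memory stores; in practice these would be separate
# databases, ideally with separate access controls.
profiles = {}       # pseudonym -> interest profile (no PII)
access_logs = []    # request records (may contain IP addresses)

def new_pseudonym():
    """Issue a random identifier with no link to the user's identity."""
    return secrets.token_hex(16)

def record_interest(pseudonym, topic):
    profiles.setdefault(pseudonym, set()).add(topic)

def log_request(ip_address, path):
    # The log never records the pseudonym, so the two stores
    # cannot be joined to re-identify a profile.
    access_logs.append({"ip": ip_address, "path": path})

pid = new_pseudonym()
record_interest(pid, "gardening")
log_request("192.0.2.1", "/articles/roses")
```

The key design point is that nothing ever holds the pseudonym and an identifier like an IP address in the same record.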
Collection limitation: Client-side profiles
Useful for reducing risk and complying with laws
Risk of exposure to other users of the computer remains; storing encrypted profiles can help
Client-side profiles may be stored in cookies replayed to a server that discards them after use
Client-side scripting may allow personalization without ever sending personal info to the server
For some applications, there is no reason to send data to the server at all
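A minimal sketch of the cookie-replay idea, with an assumed JSON profile format (a real deployment would also sign or encrypt the cookie): the server reads the profile replayed in the request, uses it once to rank content, and writes nothing server-side.

```python
import json

def personalize_response(cookie_value, articles):
    """Read the profile replayed in a cookie, use it once, keep nothing.

    `cookie_value` holds a JSON profile the client stores; the server
    ranks articles by matching topics and then discards the profile.
    (Hypothetical format -- real systems would also authenticate it.)
    """
    profile = json.loads(cookie_value) if cookie_value else {"topics": []}
    liked = set(profile.get("topics", []))
    # Liked topics sort first; no server-side state is written, so the
    # profile lives only on the client.
    return sorted(articles, key=lambda a: a["topic"] not in liked)

articles = [{"title": "Tax tips", "topic": "finance"},
            {"title": "Rose care", "topic": "gardening"}]
print(personalize_response('{"topics": ["gardening"]}', articles))
```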
Collection limitation: Task-based personalization
Focus on data associated with the current session or task - no user profile need be stored anywhere
May allow for a simpler (and less expensive) system architecture too!
May eliminate the problem of the system making recommendations that are not relevant to the current task
Less “spooky” to users - the relationship between the current task and the resulting personalization is usually obvious
Putting users in control
Users should be able to control
• what information is stored in their profile
• how it may be used and disclosed
Developing a good user interface to do this is complicated
Setting preferences can be tedious
Creating overall rules that can be applied on the fly as new profile data is collected requires deep understanding and the ability to anticipate privacy concerns
Possible approaches
Provide reasonable default rules with the ability to add/change rules or specify preferences for handling of specific data
• Up front
• With each action
• After the fact
Explicit privacy preference prompts during the transaction process
Allow multiple personae
Example: Google Search History
Amazon.com privacy makeover
Streamline menu navigation for customization
Provide a way to set up default rules
Every time users make a new purchase that they want to rate or exclude, they have to edit profile info
There should be a way to set up default rules, such as:
• Exclude all purchases shipped to my work address
• Exclude all movie purchases
• Exclude all purchases I had gift wrapped
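Default rules like these amount to simple predicates over purchase records; the field names below are illustrative, not any retailer’s actual schema.

```python
# Hypothetical purchase records and default exclusion rules of the
# kind suggested above.
purchases = [
    {"item": "novel",    "ship_to": "home", "category": "books",   "gift_wrap": False},
    {"item": "DVD",      "ship_to": "home", "category": "movies",  "gift_wrap": False},
    {"item": "necklace", "ship_to": "home", "category": "jewelry", "gift_wrap": True},
    {"item": "stapler",  "ship_to": "work", "category": "office",  "gift_wrap": False},
]

rules = [
    lambda p: p["ship_to"] == "work",     # exclude purchases shipped to work
    lambda p: p["category"] == "movies",  # exclude all movie purchases
    lambda p: p["gift_wrap"],             # exclude gift-wrapped purchases
]

def profile_purchases(purchases, rules):
    """Keep only purchases that no default rule excludes from the profile."""
    return [p for p in purchases if not any(rule(p) for rule in rules)]

print([p["item"] for p in profile_purchases(purchases, rules)])  # ['novel']
```

Once rules exist, newly recorded purchases are filtered automatically instead of requiring a profile edit after every transaction.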
Remove excluded purchases from profile
Users should be able to remove items from their profile
If purchase records are needed for legal reasons, users should be able to request that they not be accessible online
Better: options for controlling recent history
Use personae
Amazon already allows users to store multiple credit cards and addresses
Why not allow users to create personae linked to each, with the option of keeping recommendations and history separate (this would allow an easy way to separate work/home/gift personae)?
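One possible shape for such personae (a hypothetical structure, not Amazon’s actual design): each persona keeps its own purchase history, so recommendations drawn from one persona never see another persona’s purchases.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A persona tied to one address, with its own purchase history."""
    name: str
    address: str
    history: list = field(default_factory=list)

class Account:
    def __init__(self):
        self.personae = {}

    def persona(self, name, address=""):
        """Fetch a persona, creating it on first use."""
        if name not in self.personae:
            self.personae[name] = Persona(name, address)
        return self.personae[name]

    def record_purchase(self, persona_name, item):
        self.persona(persona_name).history.append(item)

acct = Account()
acct.persona("home", "123 Elm St")
acct.persona("gifts", "123 Elm St")
acct.record_purchase("home", "cookbook")
acct.record_purchase("gifts", "toy robot")
# Recommendations drawn from "home" never see the gift purchase.
print(acct.persona("home").history)  # ['cookbook']
```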
Allow users to access all privacy-related options in one place
Currently, privacy-related options are found with the relevant features
Users have to be aware of the features to find the options
Put them all in one place
But also leave them with the relevant features
I didn’t buy it for myself
How about an “I didn’t buy it for myself” check-off box (perhaps automatically checked if gift wrapping is requested)?
Personalizing privacy
Can we apply user modeling expertise to privacy?
Personalized systems cause privacy concerns
But can we use personalization to help address these concerns?
What is privacy?
“the claim of individuals… to determine for themselves when, how, and to what extent information about them is communicated to others.”
- Alan Westin, 1967
Privacy as process
“Each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication….”
- Alan Westin, 1967
But individuals don’t always engage in the adjustment process
• Lack of knowledge about how info is used → Data collectors should inform users
• Lack of knowledge about how to exercise control → Data collectors should provide choices and controls
• Too difficult or inconvenient to exercise control → Sounds like a job for a user model!
Example: Managing privacy at web sites
Website privacy policies
• Many posted
• Few read
What if your browser could read them for you?
• Warn you not to shop at sites with bad policies
• Automatically block cookies at those sites
Platform for Privacy Preferences (P3P)
2002 W3C Recommendation
XML format for Web privacy policies
Protocol enables clients to locate and fetch policies from servers
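As a rough illustration, a client can pull out the purposes a policy declares with a standard XML parser. This uses a much-simplified policy fragment; real P3P policies live in the http://www.w3.org/2002/01/P3Pv1 namespace and use a far richer vocabulary.

```python
import xml.etree.ElementTree as ET

# A simplified P3P-style policy fragment (real policies are namespaced
# and contain DATA-GROUP, RECIPIENT, and other elements).
policy_xml = """
<POLICY name="sample">
  <STATEMENT>
    <PURPOSE><current/><telemarketing/></PURPOSE>
    <RETENTION><indefinitely/></RETENTION>
  </STATEMENT>
</POLICY>
"""

def declared_purposes(xml_text):
    """Collect the purpose elements a policy declares."""
    root = ET.fromstring(xml_text)
    return {elem.tag
            for purpose in root.iter("PURPOSE")
            for elem in purpose}

print(declared_purposes(policy_xml))  # a set such as {'current', 'telemarketing'}
```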
Privacy Bird
P3P user agent originally developed by AT&T
Free download and privacy search service at http://privacybird.com/
Compares user preferences with P3P policies
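A toy version of that comparison (Privacy Bird itself evaluates full P3P policies against the warning conditions a user selects; this sketch just flags declared purposes the user objects to):

```python
# Hypothetical preference table: practices the user wants warnings about.
user_prefs = {
    "telemarketing": "warn",  # warn if data may be used for telemarketing
    "indefinitely": "warn",   # warn if data is retained indefinitely
}

def evaluate(policy_purposes, prefs):
    """Return ('green', []) if no preference is triggered,
    else ('red', triggered practices)."""
    hits = [p for p in policy_purposes if prefs.get(p) == "warn"]
    return ("red", hits) if hits else ("green", [])

print(evaluate({"current", "telemarketing"}, user_prefs))
```

In the real tool, the red/green result drives the bird icon in the browser title bar.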
[Privacy Bird screenshots, including the link to a site’s opt-out page]
• CMU Usable Privacy and Security Laboratory • Lorrie Cranor • http://lorrie.cranor.org/ 47
I would like to give the bird some feedback
“I read this policy and actually I think it’s ok”
“I took advantage of the opt-out on this site so there is no problem”
“This site is a banking site and I want to be extra cautious when doing online banking”
Especially important if the bird takes automatic actions
Not critical when the bird is only informational
But if the bird blocks cookies, the wrong decision will get annoying
Can we learn a user’s privacy preferences over time?
“Bad bird!”
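One naive way such learning might work, sketched with hypothetical names: each “bad bird” report about a warning lowers the weight of the practice that triggered it, and a practice whose weight falls below a threshold stops producing warnings.

```python
# Hypothetical feedback loop for learning privacy preferences.
weights = {"telemarketing": 1.0, "indefinitely": 1.0}
THRESHOLD = 0.5

def report_bad_bird(practice, step=0.3):
    """The user said a warning about this practice was unwanted."""
    weights[practice] = max(0.0, weights[practice] - step)

def should_warn(practice):
    """Warn only while the practice's learned weight stays high enough."""
    return weights.get(practice, 0.0) >= THRESHOLD

report_bad_bird("indefinitely")
report_bad_bird("indefinitely")  # weight now below the threshold
print(should_warn("indefinitely"), should_warn("telemarketing"))  # False True
```

A real system would need to be more careful, e.g. asking for confirmation before silencing a warning category entirely.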
Other example applications for personalizing privacy
Buddy lists: when to reveal presence information and to whom
Friend finder services: when to reveal location information and at what level of detail
Personalized ecommerce sites: when to start and stop recording my actions, and which persona to use
Conclusions
Personalization often has real privacy risks
Address these risks by minimizing data collection and storage and by putting users in control
Challenge: Can we make it easier for users to be in control by personalizing privacy?
CMU Usable Privacy and Security Laboratory
http://cups.cs.cmu.edu/