1
03. Intro to Security;
Conceptualizing &
Measuring Privacy
Blase Ur, April 3rd, 2017
CMSC 23210 / 33210
4
Introduction to Computer Security
• Defining and motivating computer security
• Types of misuse
• Threats and attackers
• Basic security analysis
Goals: To learn…
• about the breadth of things that one needs to worry about
• how an attacker might think
• how to reason about the security of a system
5
What is computer security?
• Protecting information systems against misuse and interference
• “Building systems to remain dependable in the face of malice, error or mischance” (Ross Anderson)
6
Properties of a secure system (CIA)
• Confidentiality: information is protected
from unintended disclosure (secrecy,
privacy, access control)
• Integrity: system and data are maintained
in a correct and consistent condition
• Availability: systems and data are usable
when needed (includes timeliness)
7
Secrecy, Confidentiality, Privacy, Anonymity
Not exactly the same thing
• Secrecy
– Keep data hidden
– E.g., Alice kept the incriminating information secret
• Confidentiality
– Keep (someone else’s) data hidden from unauthorized entities
– E.g., banks keep much account information confidential
• Privacy
– Use/disclose a person’s data according to a set of rules
– E.g., to protect Alice’s privacy, company XYZ removed her name before disclosing information about her purchases
• Anonymity
– Keep the identity of a protocol participant secret
– E.g., to hide her identity from the web server, Alice uses The Onion Router (Tor) to communicate
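The privacy bullet above can be made concrete with a small sketch of de-identification, in the spirit of company XYZ removing Alice’s name before disclosing her purchases. This is an illustrative Python sketch with made-up field names, not a real anonymization scheme; removing direct identifiers alone does not guarantee privacy, since quasi-identifiers (e.g., ZIP code plus birthdate) can still re-identify people.

```python
# Hedged sketch: drop direct identifiers before disclosure.
# Field names and values are hypothetical.

DIRECT_IDENTIFIERS = {"name", "email"}

def deidentify(record):
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

purchase = {"name": "Alice", "email": "alice@example.com",
            "item": "toothpaste", "zip": "60637"}
released = deidentify(purchase)   # safe(r) to disclose than the original
```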
8
Integrity, Authentication
Not exactly the same thing
• Data integrity
– Ensure data is “correct” (i.e., correct syntax & unchanged)
– Prevents unauthorized or improper changes
– E.g., Trent always verifies the integrity of his database after restoring a backup, to ensure that no incorrect records exist
• Entity authentication or identification
– Verify the identity of another protocol participant
– E.g., Alice authenticates Bob each time they establish a secure connection
• Data authentication
– Ensure that data originates from the claimed sender
– E.g., for every message Bob sends, Alice authenticates it to ensure that it originates from Bob
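Data authentication as described above is commonly implemented with a message authentication code. Below is a minimal sketch using Python’s standard library, assuming Alice and Bob already share a secret key; the key and messages are made up for illustration.

```python
import hmac
import hashlib

# Hedged sketch of data authentication with an HMAC.
# Bob tags each message with the shared key; Alice recomputes the
# tag to check the message came from a key holder and is unchanged.

key = b"shared-secret-key"   # illustrative key, not from the slides

def tag(message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    # compare_digest avoids a timing side channel in the comparison
    return hmac.compare_digest(tag(message), received_tag)

msg = b"transfer $10 to Carol"
t = tag(msg)
```

Note that an HMAC gives data authentication and integrity, but not confidentiality: the message itself is not hidden.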
9
Attackers exploit bugs
• Software bugs
• Hardware bugs
• Humans (social engineering)
• Unintended characteristics (e.g., side
channels, poor sources of randomness)
10
Modeling the attacker
• What type of action will they take?
– Passive (only look)
– Active (look and inject messages)
• How sophisticated are they?
• How much do they care? What resources do they
have?
– How much time/money will they spend?
• How much do they already know?
– External / internal attacker?
11
Exploiting bugs for profit
• Credit card and financial account fraud
• Stealing intellectual property or
confidential information
• Ransom
• Extortion
• Stealing computing resources to sell
13
Pricelists
• $100-180 per 1000 installs (2011)
• $1-1,500 stolen bank account details (2009)
• $20-100+ US credit card (2013)
• $5-8 US citizen personal data (2009)
• $7-15 user accounts for paid online services (2009)
• $1000-2000 per month for botnet spam services (2009)
• $50-$$$ per day for botnet DDoS services (2009)
• $125,000 for zero-day browser exploit to private party (2012)
• $250,000 for zero-day iOS exploit to government (2012)
Sources:
• Andy Greenberg. Shopping For Zero-Days: A Price List For Hackers' Secret Software Exploits. Forbes, 23 Mar 2012.
• Juan Caballero, Chris Grier, Christian Kreibich, and Vern Paxson. Measuring pay-per-install: the commoditization of malware
distribution. In Proc. USENIX Security, 2011.
• Kaspersky reveals price list for botnet attacks. Computer Weekly, 23 Jul 2009.
• Stolen Target credit cards and the black market. Tripwire, 21 Dec 2013.
14
Types of Information System Misuse (1) [Neumann and Parker 1989]
• External
– Visual spying: observing keystrokes or screens
– Misrepresentation: deceiving operators and users
– Physical scavenging: “dumpster diving” for printouts
• Hardware misuse
– Logical scavenging: examining discarded/stolen media
– Eavesdropping: intercepting electronic or other data
– Interference: jamming, electronic or otherwise
– Physical attack: damaging or modifying equipment
– Physical removal: removing equipment & storage media
15
Types of Information System Misuse (2) [Neumann and Parker 1989]
• Masquerading
– Impersonation: using a false identity external to the computer
– Piggybacking: usurping workstations, communication
– Spoofing: using playback, creating bogus systems
– Network weaving: masking physical location or routing
• Pest programs
– Trojan horses: implanting malicious code
– Logic bombs: setting time or event bombs
– Malevolent worms: acquiring distributed resources
– Viruses: attaching to programs and replicating
• Bypasses
– Trapdoor attacks: utilizing existing flaws
– Authorization attacks: password cracking
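The password-cracking bullet can be sketched briefly. The sketch below runs an offline dictionary attack against a hypothetical leaked database of unsalted SHA-256 hashes; it is illustrative only, and it shows why real systems use salted, deliberately slow password hashes (e.g., bcrypt, scrypt, or Argon2), which defeat this precomputation.

```python
import hashlib

# Hedged sketch of an authorization attack: offline dictionary
# cracking of unsalted password hashes. All data is made up.

def weak_hash(pw: str) -> str:
    return hashlib.sha256(pw.encode()).hexdigest()

stolen_hashes = {"bob": weak_hash("letmein")}   # hypothetical leaked DB
dictionary = ["password", "123456", "letmein", "qwerty"]

def crack(hashes, guesses):
    """Recover any password whose hash matches a dictionary guess."""
    lookup = {weak_hash(g): g for g in guesses}  # precompute guess hashes
    return {user: lookup[h] for user, h in hashes.items() if h in lookup}

result = crack(stolen_hashes, dictionary)
```

Per-user salts break the shared `lookup` table, and slow hashes make each guess expensive, which is the point of modern password storage.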
16
Types of Information System Misuse (3) [Neumann and Parker 1989]
• Active misuse
– Basic: creating false data, modifying data
– Denials of service: saturation attacks
• Passive misuse
– Browsing: making random or selective searches
– Inference, aggregation: exploiting traffic analysis
– Covert channels: covert data leakage
• Inactive misuse: failing to perform expected duties
• Indirect misuse: breaking crypto keys
17
Basic security analysis
• How do you secure X? Is X secure?
1. What are we protecting?
2. Who is the adversary?
3. What are the security requirements?
4. What security approaches are effective?
18
1. What are we protecting?
• Enumerate assets and their value
• Understand the architecture of the system
• Useful questions to ask
– What is the operating value, i.e., how much would we lose per day/hour/minute if the resource stopped?
– What is the replacement cost? How long would it take to replace it?
19
2. Who is the adversary?
• Identify potential attackers
– How motivated are they?
• Estimate attacker resources
– Time and money
• Estimate number of attackers, probability
of attack
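These asset and adversary estimates feed standard quantitative risk formulas such as annualized loss expectancy (ALE), a widely used rule of thumb that is not from the slides themselves. All figures below are made up for illustration.

```python
# Hedged sketch: annualized loss expectancy (ALE), a standard risk
# formula. ALE = SLE * ARO, where SLE (single-loss expectancy) is
# asset value times the fraction lost per incident, and ARO is the
# expected number of incidents per year. All numbers are illustrative.

def ale(asset_value: float, exposure_factor: float, annual_rate: float) -> float:
    sle = asset_value * exposure_factor   # loss per incident ($)
    return sle * annual_rate              # expected loss per year ($)

# E.g., a $200k asset, 25% damaged per incident, ~2 incidents/year:
expected_loss = ale(200_000, 0.25, 2)     # $100,000 per year
```

A defense costing more per year than the ALE it eliminates is hard to justify, which connects directly to the threat-modeling discussion later in the deck.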
20
Common (abstract) adversaries
• Attacker action
– Passive attacker: eavesdropping
– Active attacker: eavesdropping + data injection
• Attacker sophistication
– Ranges from script kiddies to government-funded groups of professionals
• Attacker access
– External attacker: no knowledge of cryptographic information, no access to resources
– Internal attacker: complete knowledge of all cryptographic information, complete access
• Result of system compromise
21
3. What are the security requirements?
• Enumerate security requirements
– Confidentiality
– Integrity
– Authenticity
– Availability
– Auditability
– Access control
– Privacy
– …
22
Temporal properties
• Age
– Prove that data existed before a certain time
– Lower bound on the duration of existence
• Freshness
– Prove that data was created after an event
– Upper bound on the duration of existence
• Temporal order
– Verify the ordering of a sequence of events
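A freshness property can be enforced by bounding the age of a message’s timestamp. Below is a minimal sketch; the 30-second window is an arbitrary illustrative choice, and real protocols must also handle clock skew and replay within the window (e.g., with nonces).

```python
import time

# Hedged sketch of a freshness check: accept a message only if its
# timestamp is recent, bounding how long ago the data was created.

MAX_AGE_SECONDS = 30   # illustrative window, not from the slides

def is_fresh(msg_timestamp, now=None):
    """True iff the message is at most MAX_AGE_SECONDS old."""
    now = time.time() if now is None else now
    age = now - msg_timestamp
    return 0 <= age <= MAX_AGE_SECONDS   # reject stale or future-dated

now = 1_000_000.0                 # fixed clock for the example
recent = is_fresh(now - 5, now)   # 5 seconds old: accepted
replayed = is_fresh(now - 300, now)  # 5-minute-old replay: rejected
```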
23
Other properties
• Auditability
– Enable forensic activities after intrusions
– Prevent attacker from erasing or altering logging information
• Availability
– Provide access to the resource despite attacks
– Denial-of-Service (DoS) attacks attempt to prevent availability
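One common approach to the auditability goal of detecting erased or altered log entries is hash chaining, sketched below. This is an illustrative toy, not a production audit log; real systems also protect the chain head (e.g., by publishing it or anchoring it on write-once storage).

```python
import hashlib

# Hedged sketch of a tamper-evident log: each entry's hash covers
# the previous hash, so altering or deleting an earlier entry
# changes every hash that follows it.

def chain(entries):
    """Return the list of chained SHA-256 hashes for the log entries."""
    h = b"\x00" * 32                      # fixed genesis value
    hashes = []
    for entry in entries:
        h = hashlib.sha256(h + entry.encode()).digest()
        hashes.append(h)
    return hashes

log = ["alice logged in", "alice read file X", "alice logged out"]
hashes = chain(log)

# An attacker who rewrites an earlier entry changes the final hash:
tampered = ["alice logged in", "nothing happened", "alice logged out"]
```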
24
4. Approaches to achieve security
• No security
– Legal protection (deterrence)
– Innovative: patent the attack, get protection through patent law
• Build a strong security defense
– Use cryptographic mechanisms
– Perimeter defense (firewall), VPN
• Resilience to attack
– Multiple redundant systems (“hot spares”)
• Detection and recovery (& offense?)
– Intrusion detection system
– Redundancy, backups, etc.
– Counterstrike? (Legal issues?)
25
Threat models
• Can’t protect against everything
– Too expensive
– Too inconvenient
– Not worth the effort
• Identify most likely ways system will be attacked
– Identify likely attackers and their resources
• Dumpster diving or rogue nation?
– Identify consequences of possible attacks
• Mild embarrassment or bankruptcy?
– Design security measures accordingly
• Accept that they will not defend against all attacks
26
Think like an attacker
• The adversary is targeting assets, not defenses
• Will try to exploit the weakest part of the defenses
– E.g., bribe a human operator, use social engineering, physically steal the server with the data
27
[Image: from http://blogs.technet.com/b/rhalbheer/archive/2011/01/14/real-physical-security.aspx ]
28
[Image: from https://flic.kr/p/amsEr6 (Creative Commons). Copyright © 2014 Lujo Bauer ]
30
Case study
• Class discussion on security of a house
– What are we protecting?
– Who is the adversary?
– What are the security requirements?
– What security approaches are effective?
31
Takeaways
• Security: important but difficult
• Security is not absolute
– Attacker
– Properties
– Cost
• Security is about managing risk in the presence of an adversary
• Security is often an arms race
49
I have four children, two of which I share a
bedroom with. Privacy, to me, is to have a space
to yourself that no one is allowed in to keep
whatever it is you want to keep for yourself.
– Karin, age 26
50
I believe a fence is a
sign of privacy. This
picture shows enough
of my house to show
that I don’t mind some
people to see me, but I
prefer a barrier when it
comes to some things.
– Shanna, age 32
51
Privacy for me is like
a place with a one-
sided mirror. I can
see outside but no
one can see in unless
I open the door. Also
an extra wall on the
outside just in case.
– Kim, age 21
52
Unfortunately I think most of my
life has been digitized so true
privacy would be me alone in a
faraday cage, maybe napping.
– Maddy, age 20
54
Privacy to me
is sealing my
box of data
from web
applications
and services
which try to
collect it.
– Hana H.,
age 23
55
Privacy is about control – controlling
what is shared about your thoughts
and preferences, the things that
make you you.
– KRB, age 39
56
– Allison Lefrak, Senior Staff Attorney, Federal Trade Commission Division of Privacy and Identity Protection
59
Having the ability
to not be 'seen'
online, to not be
tracked or have all
of your
information
remembered…. I
think the
'incognito mode'
on a browser
helps.
– George, age 18
70
Privacy is Hard to Define
“Privacy is a value so complex, so entangled
in competing and contradictory dimensions,
so engorged with various and distinct
meanings, that I sometimes despair whether
it can be usefully addressed at all.”
Robert C. Post, Three Concepts of Privacy,
89 Geo. L.J. 2087 (2001).
71
Limited Access to Self
“the right to be let alone”
“to protect the privacy of the individual from invasion either by the too enterprising press, the photographer, or the possessor of any other modern device for recording or reproducing scenes or sounds”
– Samuel D. Warren and Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193 (1890)
72
Michael Wolf – The Transparent City
“Chicago has recently undergone a
surge of new construction…In early
2007, the Museum of Contemporary
Photography…invited Michael Wolf
as an artist-in-residence….Wolf
chose to photograph the central
downtown area, focusing on issues
of voyeurism and the contemporary
urban landscape….his details are
fragments of life—digitally distorted
and hyper-enlarged—snatched
surreptitiously via telephoto lenses”
http://aperture.org/shop/the-transparent-city/
73
Photography Laws
https://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people#The_right_of_publicity
74
Privacy as Control
“Privacy is the claim of individuals,
groups or institutions to determine for
themselves when, how, and to what
extent information about them is
communicated to others.”
“…each individual is continually
engaged in a personal adjustment
process in which he balances the
desire for privacy with the desire for
disclosure and communication….”
Alan Westin, Privacy and Freedom, 1967
75
Privacy as Boundary Regulation
Privacy regulation theory:
“a selective control of access to the
self or to one’s group”
(this is why sometimes you want to be
alone, and sometimes you don’t)
Irwin Altman, 1975
76
Privacy as Contextual Integrity
“Contextual integrity ties adequate
protection for privacy to norms of
specific contexts, demanding that
information gathering and
dissemination be appropriate to
that context.”
Helen Nissenbaum, 2004
77
Pluralistic conceptions of privacy
• Some data isn’t “sensitive,” but its
collection and use can invade privacy
– Impact power relationships
– Kafka
• Solove’s privacy taxonomy
– Information collection
– Information processing
– Information dissemination
– Invasion
78
Important terms
• Chilling effect: discouragement of
exercising a legitimate right
• Privacy paradox: behaviors are
inconsistent with concerns
• Privacy by design: consider privacy
throughout the lifecycle of a product
• Secondary use: those other than the
intended purpose
79
Issues of privacy
• Can conflict with free speech / security
• How do we quantify privacy harms?
• Can we measure chilling effects?
• How do we provide transparency?
• Distortion: false or misleading information
• Data mining future activities?
• Oversight and accountability
80
Right to be forgotten
• Should a person have the agency to cause
items from the past to be removed?
• Who owns information?
• Recognized in the EU (Google Spain v. AEPD, 2014)
81
How does each goal relate to privacy?
I want to have…
• Solitude, uninterrupted
• Intimacy
• Control
• Boundaries
• Identity
• Security
• Safety
I want to be…
• Unseen, unheard, unread
• Not talked about
• Not judged
• Not profiled, not targeted, not treated differently than others
• Not misjudged
• Free to try, practice, make mistakes, self-reflect
• Not surprised (contextual integrity)
• Not accountable
• Not required to reveal
• Unknown
• Forgotten
• Others?
82
Measuring privacy
• Why is privacy hard to measure?
• Why are attitudes about privacy hard to
measure?
• Why is the cost of privacy invasion hard to
measure?
83
How privacy is protected
• Laws, self regulation, technology
– Notice and access
– Control over collection, use, deletion, sharing
– Collection limitation
– Use limitation
– Security and accountability
84
Privacy laws around the world
• US has mostly sector-specific laws, minimal protections, often
referred to as “patchwork quilt”
– No explicit constitutional right to privacy or general privacy law
– But some privacy rights inferred from constitution
– Narrow regulations for health, financial, education, videos, children, etc.
– Federal Trade Commission jurisdiction over fraud and deceptive
practices
– Federal Communications Commission regulates telecommunications
– Some state and local laws
• Data Protection Directive - EU countries must adopt similar
comprehensive laws, recognize privacy as fundamental human right
– Privacy commissions in each country
85
OECD Fair Information Principles
• Collection limitation
• Data quality
• Purpose specification
• Use limitation
• Security safeguards
• Openness
• Individual participation
• Accountability
• http://www.privacyrights.org/ar/fairinfo.htm
86
US FTC’s Fair Information Practice
Principles (FIPPs)
• Notice / Awareness
• Choice / Consent
• Access / Participation
• Integrity / Security
• Enforcement / Redress
• https://en.wikipedia.org/wiki/FTC_Fair_Information_Practice
87
US government privacy reports
• U.S. FTC and White House
reports released in 2012
• U.S. Department of
Commerce
multi-stakeholder
process to develop
enforceable
codes of conduct
89
Notice and choice
Protect privacy by giving people control over their
information
Notice about data
collection and use
Choices about allowing their
data to be collected and
used in that way
90
Nobody wants to read privacy policies
“the notice-and-choice
model, as implemented,
has led to long,
incomprehensible privacy
policies that consumers
typically do not read, let
alone understand”
− Protecting Consumer Privacy in an
Era of Rapid Change. Preliminary
FTC Staff Report. December 2010.
91
Cost of reading privacy policies
• What would happen if everyone read the privacy
policy for each site they visited once per year?
• Time = 244 hours/year
• Cost = $3,534/year
• National opportunity cost for
time to read policies: $781 billion
A. McDonald and L. Cranor. The Cost of Reading Privacy Policies. I/S: A Journal of Law and Policy for the Information Society, 2008 Privacy Year in Review Issue. http://lorrie.cranor.org/pubs/readingPolicyCost-authorDraft.pdf
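The 244-hour figure is back-of-the-envelope arithmetic. The inputs below approximate the paper’s assumptions (unique sites visited per year, minutes to read one policy) rather than quoting its exact values, so treat this as an order-of-magnitude reconstruction.

```python
# Hedged sketch reproducing the rough magnitude of the McDonald &
# Cranor estimate. Inputs are approximate assumptions, not exact
# values from the paper.
sites_per_year = 1462      # unique sites visited per person per year (approx.)
minutes_per_policy = 10    # ~2,500-word policy at ~250 words/min (approx.)

hours_per_year = sites_per_year * minutes_per_policy / 60
# roughly 244 hours/year, matching the slide's figure
```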
94
Towards a privacy
“nutrition label”
• Standardized format
– People learn where to find answers
– Facilitates policy comparisons
• Standardized language
– People learn terminology
• Brief
– People find info quickly
• Linked to extended view
– Get more details if needed
95
Iterative design process
• Series of studies
– Focus groups
– Lab studies
– Online studies
• Metrics
– Reading-comprehension (accuracy)
– Time to find information
– Ease of policy comparison
– Subjective opinions, ease, fun, trust
P.G. Kelley, J. Bresee, L.F. Cranor, and R.W.
Reeder. A “Nutrition Label” for Privacy. SOUPS
2009.
P.G. Kelley, L.J. Cesca, J. Bresee, and L.F. Cranor.
Standardizing Privacy Notices: An Online Study
of the Nutrition Label Approach. CHI 2010.