
Class 13: Review

CIS 755: Advanced Computer Security, Spring 2015

Eugene Vasserman

http://www.cis.ksu.edu/~eyv/CIS755_S15/

Administrative stuff

• TEVAL offered – please fill it out :)
– Even/especially if you thought this class was horrible!

• No class or office hours after May 3rd

• Quiz this week
• Final exam on May 13th (2:00 – 3:50)
– Review document will be posted today
– Come to the front office on May 13th at 2 PM

The most important slide of the class

• What are the take-away messages?
– Think like an adversary
– Kerckhoffs’ principle and Shannon’s maxim
– Be able to search for solutions
– Read papers
– Reuse, reuse, reuse (correctly!)
– State assumptions (be sure they hold)
– Be able to admit “I don’t know” – not everyone can engineer every solution

I’m sure this is someone’s law…

• If a security system is too difficult to use, users will find a way to get around it

– Corollary: Getting the job done is more important than security
• It has more immediate, potentially bad outcomes

Things to remember

• I can be wrong; papers can be wrong; anyone can be wrong!

• If you don’t understand something, ask!
• What does “secure” mean?
• Who is the adversary, and why?
• There is such a thing as too much security
• If it is too hard to use, users will bypass security

• Attacks only get better

Some things to remember

• Theoretical to practical in ~10 years
– Chosen ciphertext attack
– HDMI
– CBC chosen plaintext attack

• Attacks only get better
– Look at the history of MD5
– Look at the history of SHA (e.g. SHA-0)

• Some things are a bad idea in the first place, e.g. “trusted” hardware

NEVER BUILD YOUR OWN WHEN A SOLUTION EXISTS!!!

NEVER COMPOSE YOUR OWN WHEN A LIBRARY EXISTS!!!

Safety vs. security

• Think like an adversary!
• Random → malicious faults
• Engineering for security: “What’s the worst that can happen?” Assume it will…

• Always, always, ALWAYS state your assumptions!

Security: Fundamental differences

• Real world: physical, intuitive
– Risk assessment
• People are not even good at this in the real world!
– Trusted vs. trustworthy
– Forensics, physical evidence
• Forgery
– Fail “evident,” e.g. theft
– Scale of failures

More basics

• Trusted vs. trustworthy
– e.g. the recent SSL Certificate Authority fiasco
• Risk, hazard, vulnerability
– Adversary, ROI, scale
• Assurance levels
– “Rainbow” book series, Common Criteria
• Method of returning to secure states
• Fail-closed/secure or fail-open/insecure?

Basic cryptographic primitives

• Confidentiality (encryption)
– Symmetric (e.g. AES)
– Asymmetric (e.g. RSA)
• Hash functions (e.g. SHA-1)
• Integrity and authentication
– Symmetric (message authentication codes)
– Asymmetric (signatures)
• Key agreement
• Random numbers (see the sketch below)
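On the last point, a minimal Python sketch (an illustration, not from the slides) of where security-grade randomness should come from; the key and nonce sizes are arbitrary examples:

import random   # general-purpose PRNG: fine for simulations, NOT for keys
import secrets  # CSPRNG backed by the OS: use for keys, nonces, tokens

# Predictable if the adversary learns or guesses the internal state/seed
weak_token = random.getrandbits(128)

# Suitable for security material
key = secrets.token_bytes(32)    # e.g. a 256-bit symmetric key
nonce = secrets.token_bytes(16)  # e.g. a 128-bit nonce/IV
print(len(key), len(nonce))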


Block cipher modes of operation

• ECB, CBC, CTR, OFB, CFB, GCM, XEX, XTS
• Differences, i.e. why do we care? (see the ECB vs. CBC sketch below)
– Some are parallelizable (GCM)
• Also provides authentication!
– Some are self-synchronizing (CFB)

• Trick question: Block ciphers vs. stream ciphers vs. pseudorandom number generators (PRNGs)?
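To make the difference concrete, here is a small sketch assuming the third-party Python “cryptography” package is available; the key, IV, and repeated-block plaintext are made-up examples. ECB leaks repeated plaintext blocks, CBC does not:

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)
iv = os.urandom(16)
plaintext = b"A" * 16 + b"A" * 16   # two identical 16-byte blocks

def encrypt(mode):
    enc = Cipher(algorithms.AES(key), mode).encryptor()
    return enc.update(plaintext) + enc.finalize()

ecb = encrypt(modes.ECB())
cbc = encrypt(modes.CBC(iv))

# ECB: identical plaintext blocks -> identical ciphertext blocks (pattern leak)
print(ecb[:16] == ecb[16:32])   # True
# CBC: chaining hides the repetition
print(cbc[:16] == cbc[16:32])   # False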


Security (strength)

• Key size*
– Commonly 2^256 for AES, 2^2048 for RSA
– What is a [good] key?

• Underlying cryptosystem/primitives

• Composition
– e.g. a MAC with a broken underlying hash function may not itself be broken

Modes of operation (ECB)

Images borrowed from Wikipedia :)

Modes of operation (CBC)

Images borrowed from Wikipedia :)

Recall: MACs

• “Keyed hash” (MAC from a cryptographically secure hash function)
– Or from a block cipher (in CBC or CFB mode) → MAC

• Hybrid modes, e.g. CBC-MAC
– Secrecy plus authenticity (2-party)

• Remember to use different keys for MAC and encryption… why?
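A minimal sketch of the separate-keys point using Python's standard hmac/hashlib modules; the placeholder ciphertext and key sizes are illustrative assumptions:

import hmac, hashlib, secrets

# Independent keys for encryption and authentication; reusing one key for
# both roles risks unwanted interactions between the two schemes.
enc_key = secrets.token_bytes(32)   # would be fed to the cipher (not shown)
mac_key = secrets.token_bytes(32)   # used only for the MAC

ciphertext = b"...AES-CTR/CBC output under enc_key goes here..."  # placeholder

# Encrypt-then-MAC: authenticate the ciphertext, then send (ciphertext, tag)
tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
print(tag.hex())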


Modes of operation (CFB)

Images borrowed from Wikipedia :)

Modes of operation (CTR)

Images borrowed from Wikipedia :)

VS. ECB


Giving, storing and wiping secrets

• Credentials
• Password security (see the sketch below)
• Storage security
• Input security
– Ctrl-Alt-Del
• Forgetfulness security
– Encryption?
– https://citp.princeton.edu/research/memory/
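On password storage, a minimal sketch using only Python's standard library; the salt length and iteration count are illustrative assumptions rather than recommendations from the slides:

import hashlib, hmac, os

def hash_password(password, salt=None):
    # Store the salt and the derived hash, never the plaintext password.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, expected):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)   # constant-time compare

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False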


Access control

• Authentication → access
• No authentication → no access

• What are we protecting?
• Who is our adversary?
– Threat model

• Who is trusted?
• Where does enforcement occur?
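A toy reference-monitor sketch of the enforcement question; the subjects, objects, and policy table are made up for illustration, and the check fails closed (default deny):

# Hypothetical access-control matrix: subject -> object -> set of rights
POLICY = {
    "alice": {"grades.db": {"read", "write"}},
    "bob":   {"grades.db": {"read"}},
}

def check_access(subject, obj, right):
    # Fail-closed: anything not explicitly granted is denied.
    return right in POLICY.get(subject, {}).get(obj, set())

print(check_access("bob", "grades.db", "read"))      # True
print(check_access("bob", "grades.db", "write"))     # False
print(check_access("mallory", "grades.db", "read"))  # False (default deny)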


Implementation considerations

• Kerckhoffs’ principle and Shannon’s maxim
– Especially tempting to violate in the case of “dirty” code – I’ve been there!

• Watch your (unstated) assumptions
– Example: Unsanitized (untrustworthy) input

• Adversaries
• Side-channels
• Performance


More considerations

• Correct tool for the job
– Requirements (before, not after) – spend time on this
• Correct usage of the tool
• Documentation!
• Weakest links
• Pay attention to potential non-cryptographic issues such as side/covert channels
– But you can never provably eliminate them
• Think / test like an adversary

Current state of symmetric encryption

• DES is too weak (56-bit key)
• 3DES is weak (168-bit keys but only 2^112 security – meet-in-the-middle attack; see the sketch below)
• Recent weaknesses in AES:
– AES-256 (2^254.4), AES-192 (2^189.7), AES-128 (2^126.1)
– http://research.microsoft.com/en-us/projects/cryptanalysis/aesbc.pdf
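To see why cascading a cipher buys less security than the combined key length suggests, here is a toy meet-in-the-middle sketch against double encryption; the 8-bit XOR “cipher” is deliberately fake and far too small, purely to show the table-lookup structure of the attack:

# Toy "block cipher": XOR with an 8-bit key (purely illustrative)
def E(k, block):
    return block ^ k

def D(k, block):
    return block ^ k

def meet_in_the_middle(P, C):
    # Double encryption C = E(k2, E(k1, P)) looks like a 16-bit keyspace,
    # but ~2 * 2^8 work finds candidate pairs: encrypt forward from P,
    # decrypt backward from C, and match in the middle.
    forward = {E(k1, P): k1 for k1 in range(256)}
    for k2 in range(256):
        mid = D(k2, C)
        if mid in forward:
            yield forward[mid], k2   # candidate (k1, k2)

# With a real cipher, a second known plaintext/ciphertext pair filters
# the false-positive candidates.
k1, k2, P = 0x3A, 0xC5, 0x42
C = E(k2, E(k1, P))
print((k1, k2) in set(meet_in_the_middle(P, C)))   # True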


Current state of hash functions

• MD5 is broken
– http://www.win.tue.nl/hashclash/

• SHA-1 is known to be weak
– http://theory.csail.mit.edu/~yiqun/shanote.pdf (2^69)
– http://eprint.iacr.org/2004/304 (2^106, generalizable)
– SHA-256 (variant) is even weaker

• SHA-3 currently in “development” (NIST)
– We have a winner: all hail Keccak (SHA-3)!
– http://csrc.nist.gov/groups/ST/hash/sha-3/


Problems: Side channels

• Side-channel attacks are VERY damaging
– Power
– Timing (see the sketch below)
– Error messages
• Different errors in SSH leak information (mismatch between implementation and specification of the CBC block cipher mode):
– http://portal.acm.org/citation.cfm?id=586112
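A minimal illustration of a timing side channel in plain Python; the insecure comparison is a made-up example of the kind of code that leaks, contrasted with the standard library's constant-time compare:

import hmac, hashlib, secrets

def insecure_equal(a, b):
    # Returns as soon as a byte differs: running time depends on how many
    # leading bytes match, which an attacker can measure and exploit.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

key = secrets.token_bytes(32)
tag = hmac.new(key, b"example message", hashlib.sha256).digest()
guess = bytes(32)

print(insecure_equal(tag, guess))        # timing leaks how close the guess is
print(hmac.compare_digest(tag, guess))   # constant-time comparison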


Distributed systems: Security

• Eliminating a single point of failure
– Denial of service protection (robustness)

• Eliminating a single point of trust
– What if your boss is malicious?

• If we want to reap the benefits of distributed system designs, we have to take care of the “maybes”

• How?

Distributed systems: Privacy

• Local system – local information
• Distributed system – more access to potentially private information
• Privacy vs. authentication
• Sometimes privacy is not a security requirement, sometimes it is
• Are there other potential security requirements related to privacy?

Source routing with capabilities

[Diagram: source routing with capabilities – a packet “B, data” from A carries the capability chain S1, S2, S3 toward B]

eCash

[Diagram: eCash parties – Client, Broker, Witness, Merchant]

Chaum Mixes

[Diagram: Alice’s messages pass through a mix on the way to Bob]

• Output in lexicographic order
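A purely illustrative sketch of the mixing step; the batching threshold and the stub “unwrap” function are assumptions standing in for real layered public-key decryption:

import secrets

BATCH_SIZE = 4   # illustrative threshold

def unwrap(onion):
    # Stand-in for removing one layer of public-key encryption.
    return onion[4:]   # pretend the first 4 bytes were "our" layer

def mix(batch):
    # 1. Wait for a full batch so individual arrival times are hidden.
    assert len(batch) >= BATCH_SIZE
    # 2. Strip our encryption layer from each message.
    unwrapped = [unwrap(m) for m in batch]
    # 3. Output in lexicographic order, destroying the correspondence
    #    between arrival order and departure order.
    return sorted(unwrapped)

incoming = [secrets.token_bytes(4) + f"msg{i}".encode() for i in range(BATCH_SIZE)]
print(mix(incoming))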

Global Adversary vs. Mix

[Diagram: a global adversary observes both Alice’s link into the mix and Bob’s link out of it]

Tor

[Diagram: Tor circuit from the client through relays A, B, C]

• TCP over TCP (UGH!)

Tor hidden services

[Diagram: Tor hidden service – client and service each build circuits through relays A–F]

Global adversary vs. Tor

[Diagram: a global adversary observes Alice, Bob, and the entire Tor network]

Tor network positioning attack

[Diagram: a malicious relay M positioned in the circuit through relays A, B, C]

Tor linkability attack

[Diagram: Tor circuit through relays A, B, C]

Tor selective DoS attack

[Diagram: Tor circuit through relays A, B, C]

Tor and bridges


Enumerating Freenet

• Run a Freenet node; wait for nodes to contact you

• Or just query random “locations”


Anonymity

[Diagram: clients behind ISPs and autonomous systems (AS1, AS2) communicate through an anonymizing network]

Censorship resistance

[Diagram: the same setting, with the anonymizing network replaced by a membership-concealing network]

Covert authentication

[Diagram: parties sharing a secret complete the “Hi?” / “Hi!” handshake; a party without the secret gets no useful response]

Steganographic embedding

Linux 2.6 TCP SYN packet header with embedded MAC


Adeona


Novel Ideas in OTR

• Off-the-record
– How is this different from what we’ve already discussed (e.g. signatures)?
– Threat model

• Why OTR?
• Theoretical issues
• Practical considerations
– More on this next week


Tools and Concepts

• Deniability
– Symmetric authentication
– Symmetric malleable encryption
– Key exposure

• Long-term keys
– Authentication

• Perfect forward secrecy
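On perfect forward secrecy, a toy sketch of ephemeral Diffie-Hellman key agreement; the modulus and generator are made-up, far-too-small parameters (real protocols use standardized groups or elliptic curves), and authentication of the public values is omitted:

import secrets

# Toy public parameters (hypothetical; never use sizes like this)
p, g = 4294967291, 5

def ephemeral_keypair():
    # A fresh secret per session, discarded afterwards: compromising a
    # long-term identity key later does not reveal past session keys.
    x = secrets.randbelow(p - 2) + 1
    return x, pow(g, x, p)

a_priv, a_pub = ephemeral_keypair()   # Alice
b_priv, b_pub = ephemeral_keypair()   # Bob

# Both sides derive the same shared secret; long-term keys would only be
# used to authenticate a_pub / b_pub (e.g. with signatures or MACs).
shared_a = pow(b_pub, a_priv, p)
shared_b = pow(a_pub, b_priv, p)
print(shared_a == shared_b)   # True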


Final Exam

• Significantly longer than exams I and II
• (10) True/False
• (5) Multiple choice
• (8) Fill-in-the-blank
• (7) Short answer
– But some include sub-questions
– Different point values depending on difficulty and importance