Application Assessment Techniques, OWASP Northern Virginia, August 6th, 2009


DESCRIPTION

This talk will review a number of application assessment techniques and discuss the types of security vulnerabilities they are best suited to identify, as well as how the different approaches can be used in combination to produce more thorough and insightful results. Code review will be compared to penetration testing, and the capabilities of automated tools will be compared to manual techniques. In addition, the role of threat modeling and architecture analysis will be examined. The goal is to illuminate assessment techniques that go beyond commodity point-and-click approaches to web application or code scanning. From the OWASP Northern Virginia meeting, August 6, 2009.


Page 1: Application Assessment Techniques

Application Assessment Techniques

OWASP Northern Virginia

August 6th, 2009

Page 2: Application Assessment Techniques

Agenda
• Background
• Common Pitfalls in Application Assessment
• Moving Beyond
– Threat Modeling
– Code Review
– Dynamic Testing
• Presenting Results
• Questions / Panel Discussion


Page 3: Application Assessment Techniques

Background
• Dan Cornell
– Principal at Denim Group, www.denimgroup.com
– Software Developer: MCSD, Java 2 Certified Programmer

– OWASP: Global Membership Committee, Open Review Project, SA Chapter Lead

• Denim Group
– Application Development
• Java and .NET
– Application Security

• Assessments, penetration tests, code reviews, training, process consulting


Page 4: Application Assessment Techniques

How Not To Do It


Page 5: Application Assessment Techniques

How Not To Do It
• Q: What are you all doing to address application security concerns in your organization?
• A: We bought “XYZ Scanner”
• Q: Okay… Are you actually using it?
• A: We ran some scans
• Q: And how did that go?
• A: Oh, we found some stuff…
• Q: How did you address those issues?
• A: I think we sent the report to the developers. Not sure what they did with them. I guess I ought to check in on that…


Page 6: Application Assessment Techniques

Goals of Application Assessment
• Vary by organization, by application and by assessment
• Determine the security state of an application
• Characterize risk to executives and decision makers
• Prove a point
• Set the stage for future efforts


Page 7: Application Assessment Techniques

Common Pitfalls in Application Assessment


Page 8: Application Assessment Techniques

Common Pitfalls in Application Assessment
• Ad hoc approach
– Non-repeatable, non-comprehensive
• Reliance on automated tools
– Can only find a subset of vulnerabilities – false negatives
– Even the good tools need tuning to reduce false positives
• Current commercial tools are biased
– Rulesets and capabilities typically over-focused on web applications
• Too focused on one approach
– Static and dynamic testing have different strengths
– Economic concerns constrain the amount of testing that can be performed – make the most of the time you have


Page 9: Application Assessment Techniques

Moving Beyond
• Automated versus Manual
• Threat Modeling
• Dynamic Testing
• Source Code Review


Page 10: Application Assessment Techniques

Automated Versus Manual


Page 11: Application Assessment Techniques

Automated Versus Manual
• Automated tools are great at:
– Consistency – not getting tired
– Data flow analysis
• Automated tools are terrible for:
– Understanding business context
• Manual testing is great at:
– Identifying business logic flaws

• Manual testing is terrible for:
– Consistency and exhaustive data flow analysis


Page 12: Application Assessment Techniques

Threat Modeling
• Provides high-level understanding of the system

– Useful for creating a structured test plan

• Provides application context
– Crucial for characterizing results

• Complementary with Abuse Cases


Page 13: Application Assessment Techniques

Threat Modeling Approach
• Establish scope and system boundaries
• Decompose the system into a Data Flow Diagram (DFD)
• Assign potential threats based on asset types


Page 14: Application Assessment Techniques

Threat Model Example


Page 15: Application Assessment Techniques

Mapping Threats to Asset Types

Threat Type                    External Interactor   Process   Data Flow   Data Store
S – Spoofing                   Yes                   Yes       –           –
T – Tampering                  –                     Yes       Yes         Yes
R – Repudiation                Yes                   Yes       –           Yes
I – Information Disclosure     –                     Yes       Yes         Yes
D – Denial of Service          –                     Yes       Yes         Yes
E – Elevation of Privilege     –                     Yes       –           –
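As a minimal sketch (not from the original slides), the mapping above can be expressed as data and used to enumerate candidate threats for each element of the DFD produced during decomposition; the DFD element names below are hypothetical examples.

```python
# Minimal sketch: enumerate candidate STRIDE threats for each DFD element
# using the threat-to-asset-type mapping above. Element names are hypothetical.

STRIDE_BY_ELEMENT_TYPE = {
    "external interactor": ["Spoofing", "Repudiation"],
    "process": ["Spoofing", "Tampering", "Repudiation", "Information Disclosure",
                "Denial of Service", "Elevation of Privilege"],
    "data flow": ["Tampering", "Information Disclosure", "Denial of Service"],
    "data store": ["Tampering", "Repudiation", "Information Disclosure",
                   "Denial of Service"],
}

# A toy decomposition of the system into DFD elements: (name, element type)
dfd_elements = [
    ("End user browser", "external interactor"),
    ("Web application", "process"),
    ("Login request", "data flow"),
    ("Customer database", "data store"),
]

for name, element_type in dfd_elements:
    for threat in STRIDE_BY_ELEMENT_TYPE[element_type]:
        print(f"{name}: consider {threat}")
```

The output is the structured, repeatable list of threats to check that the next slide refers to.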


Page 16: Application Assessment Techniques

Threat Modeling
• Result is a structured, repeatable list of threats to check

– Strength is to find known problems repeatably

• Augment with Abuse Cases
– “What could go wrong” scenarios
– More creative and unstructured


Page 17: Application Assessment Techniques

Dynamic, Static and Manual Testing

Page 18: Application Assessment Techniques

Source Code Review
• Advantages
• Disadvantages
• Approaches


Page 19: Application Assessment Techniques

Static Analysis Advantages
• Have access to the actual instructions the software will be executing

– No need to guess or interpret behavior
– Full access to all the software’s possible behaviors

• Remediation is easier because you know where the problems are

Page 20: Application Assessment Techniques

Static Analysis Disadvantages
• Require access to source code or at least binary code

– Typically need access to enough software artifacts to execute a build

• Typically require proficiency running software builds
• Will not find issues related to operational deployment environments

Page 21: Application Assessment Techniques

Approaches
• Run automated tools with default ruleset
– Provides a first-cut look at the security state of the application
– Identify “hot spots”
• Craft custom rules specific to the application
– 3rd-party code
– Break very large applications into manageable chunks
– Application-specific APIs – sources, sinks, filter functions (see the sketch below)
– Compliance-specific constructs
• This is an iterative process
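As a hedged illustration only (not part of the deck), application-specific sources, sinks, and filter functions can be approximated with a simple grep-style pass over the codebase to flag hot spots for manual review; the API names and directory below are hypothetical.

```python
# Minimal sketch: flag "hot spots" by searching source files for
# application-specific sources, sinks, and filter functions.
# The API names below are hypothetical placeholders.
import re
from pathlib import Path

SOURCES = ["request.getParameter", "readCustomerRecord"]   # attacker-influenced input
SINKS = ["executeQuery", "writeAuditLog"]                   # dangerous operations
FILTERS = ["encodeForHtml", "validateAccountId"]            # sanitizers / validators

PATTERN = re.compile("|".join(re.escape(name) for name in SOURCES + SINKS + FILTERS))

def find_hot_spots(root: str):
    """Yield (file, line number, line text) for every source/sink/filter hit."""
    for path in Path(root).rglob("*.java"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if PATTERN.search(line):
                yield path, lineno, line.strip()

if __name__ == "__main__":
    for path, lineno, line in find_hot_spots("src"):
        print(f"{path}:{lineno}: {line}")
```

A commercial static analysis tool performs real data flow analysis between these sources and sinks; the point of the sketch is only that application-specific APIs must be named before any tool or reviewer can reason about them.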


Page 22: Application Assessment Techniques

Approaches
• Auditing results from an automated scan
– Typically must sample for larger applications (or really bad ones)
– Many results tend to cluster on a per-application basis – coding idioms for error handling, resource lifecycle (see the grouping sketch below)
• Manual review
– Must typically focus the effort for economic reasons
– Hot spots from review of automated results
– Security-critical functions from review of automated results – encoding, canonicalization
– Security-critical areas
– Startup, shutdown
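As a rough sketch (not from the original slides), grouping raw scanner findings by file and category is one way to pick a representative sample to audit; the finding fields assumed here are hypothetical.

```python
# Minimal sketch: group scanner findings so an auditor can sample one
# representative result per (file, category) cluster instead of reading
# every finding individually. The finding structure is hypothetical.
from collections import defaultdict

findings = [
    {"file": "LoginServlet.java", "line": 42, "category": "SQL Injection"},
    {"file": "LoginServlet.java", "line": 88, "category": "SQL Injection"},
    {"file": "ReportPage.java", "line": 17, "category": "Cross-Site Scripting"},
    # ...typically thousands more from a real scan
]

clusters = defaultdict(list)
for finding in findings:
    clusters[(finding["file"], finding["category"])].append(finding)

for (filename, category), group in sorted(clusters.items()):
    sample = group[0]  # audit one representative, then spot-check the rest
    print(f"{category} in {filename}: {len(group)} findings, "
          f"audit line {sample['line']} first")
```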


Page 23: Application Assessment Techniques

Dynamic Testing
• Advantages
• Disadvantages
• Approaches


Page 24: Application Assessment Techniques

Dynamic Analysis Advantages
• Only requires a running system to perform a test
• No requirement to have access to source code or binary code
• No need to understand how to write software or execute builds
– Tools tend to be more “fire and forget”
• Tests a specific, operational deployment
– Can find infrastructure, configuration and patch errors that Static Analysis tools will miss (a sketch of such a check follows below)
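As a hedged example (not from the slides) of a deployment-specific check that static analysis cannot see: inspect a live deployment's response headers for version-revealing banners. The target hostname is hypothetical.

```python
# Minimal sketch: inspect a live deployment for configuration issues that
# never appear in source code, such as version-revealing banner headers.
# The target hostname is hypothetical.
import urllib.request

TARGET = "http://test.example.com/"

def banner_findings(url: str):
    """Return header values that disclose platform or version information."""
    with urllib.request.urlopen(url, timeout=10) as response:
        headers = {name.lower(): value for name, value in response.getheaders()}
    findings = []
    for header in ("server", "x-powered-by", "x-aspnet-version"):
        if header in headers:
            findings.append(f"{header} header discloses: {headers[header]}")
    return findings

if __name__ == "__main__":
    for finding in banner_findings(TARGET):
        print(finding)
```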

Page 25: Application Assessment Techniques

Dynamic Analysis Disadvantages
• Limited scope of what can be found
– Application must be footprinted to find the test area
– That can cause areas to be missed
– You can only test what you have found
• No access to actual instructions being executed
– Tool is exercising the application
– Pattern matching on requests and responses (a sketch follows below)
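As a rough sketch (not from the deck), this is the kind of request/response pattern matching a dynamic tool performs: send a probe value and look for tell-tale strings in the response. The URL, parameter name, and error signatures are hypothetical.

```python
# Minimal sketch of dynamic-testing pattern matching: send a probe value in a
# request parameter and look for error signatures in the HTTP response.
# The target URL, parameter name, and signatures are hypothetical.
import re
import urllib.parse
import urllib.request

TARGET = "http://test.example.com/search"
PARAM = "q"
PROBE = "'"  # a single quote often surfaces SQL syntax errors

ERROR_SIGNATURES = [
    r"SQL syntax",
    r"ORA-\d{5}",
    r"Unclosed quotation mark",
]

def probe_for_sql_errors() -> bool:
    url = f"{TARGET}?{urllib.parse.urlencode({PARAM: PROBE})}"
    with urllib.request.urlopen(url, timeout=10) as response:
        body = response.read().decode("utf-8", errors="ignore")
    return any(re.search(sig, body, re.IGNORECASE) for sig in ERROR_SIGNATURES)

if __name__ == "__main__":
    print("Possible SQL injection" if probe_for_sql_errors() else "No error signature seen")
```

The limitation the slide calls out is visible here: the test only knows what the response text looks like, not which instructions actually executed.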

Page 26: Application Assessment Techniques

Approaches
• Where possible/reasonable, confirm findings of the source code review
• Determine if mitigating factors impact severity
– WAFs, SSO, etc.
– Be careful with this
• Look at things easiest to test on a running application
– Macro error handling
– Authentication and authorization implementation (a sketch follows below)
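A minimal sketch (not from the original material) of the kind of authorization check that is easiest to perform against a running application: request pages that should require a session and verify that an unauthenticated client is turned away. The URLs are hypothetical.

```python
# Minimal sketch: verify that protected pages reject unauthenticated requests.
# The application URLs below are hypothetical.
import urllib.error
import urllib.request

PROTECTED_URLS = [
    "http://test.example.com/admin/users",
    "http://test.example.com/account/settings",
]

def is_protected(url: str) -> bool:
    """Return True if an unauthenticated GET is denied or redirected to login."""
    request = urllib.request.Request(url)
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            # urlopen follows redirects, so check where we actually ended up.
            return "login" in response.geturl().lower()
    except urllib.error.HTTPError as error:
        return error.code in (401, 403)

if __name__ == "__main__":
    for url in PROTECTED_URLS:
        status = "OK" if is_protected(url) else "FINDING: reachable without a session"
        print(f"{url}: {status}")
```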


Page 27: Application Assessment Techniques

Bringing Approaches Together
• These approaches feed one another

– Valuable to be able to re-run tools and iterate between static and dynamic testing

• Results must be communicated in the context of the Threat Model
– Severity, compliance implications, etc.


Page 28: Application Assessment Techniques

Presenting Results


Page 29: Application Assessment Techniques

Presenting Results
• Universal developer reaction:
– “That’s not exploitable”
– “That’s not the way it works in production”
• Demonstrations of attacks can inspire comprehension
– This can be a trap – often demonstrating exploitability of a vulnerability takes longer than fixing the vulnerability
• Properly characterize mitigating factors
– Often deployed incorrectly
– Code has a tendency to migrate from application to application

• Risk is important – so is the level of effort required to fix


Page 30: Application Assessment Techniques

Questions?
Dan Cornell
[email protected]
Twitter: @danielcornell

(210) 572-4400

Web: www.denimgroup.com
Blog: denimgroup.typepad.com
