Static Analysis for Dynamic Assessments

Greg Patton | September 2014

Agenda

• Introduction
• Background & observations
• Static analysis for dynamic assessments
  – RIPSA tool
• Takeaways

Introduction

Greg Patton
Mobile Delivery Manager, HP Fortify on Demand

• Work on Fortify on Demand team
• Web & mobile dynamic application testing
• Attended first OWASP meeting on June 5, 2007 (Houston, TX)

[email protected]

BACKGROUND & OBSERVATIONS

Great divides

• Security vs. Usability

• Builders vs. Breakers

• Dynamic vs. Static

Common dynamic challenges

• Lack of complete security assessments
  – Few conduct static and dynamic assessments in concert
• Client-side false negatives
  – Dynamic tools and tests miss vulnerabilities in client-side code
• “No source code available”
  – Dynamic testers rarely receive source code

A possible solution

Use static tools during dynamic assessments

Deeper analysis of JavaScript, HTML, XML, and other client-side files

STATIC ANALYSIS FOR DYNAMIC ASSESSMENTS

RIPSA

• Accepts XML from Burp
  – Target Site Map
  – Proxy History

• Parses and saves responses as individual files on tester’s machine

• Output files can be scanned with static tools and manually audited
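
A minimal sketch of the parse-and-save step described above, assuming Burp's "Save items" XML export with base64-encoded <response> elements under each <item>; the output directory and file-naming scheme are illustrative, not RIPSA's actual implementation.

# Parse a Burp XML export and save each response body as a local file.
# Assumes the "base64-encode requests and responses" export option was used.
import base64
import os
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

def save_responses(burp_xml_path, out_dir="ripsa_out"):
    os.makedirs(out_dir, exist_ok=True)
    tree = ET.parse(burp_xml_path)
    for i, item in enumerate(tree.getroot().findall("item")):
        resp = item.find("response")
        if resp is None or not resp.text:
            continue
        raw = base64.b64decode(resp.text) if resp.get("base64") == "true" else resp.text.encode()
        # Strip HTTP headers so the saved file contains only the body,
        # which static tools such as JSHint expect.
        body = raw.split(b"\r\n\r\n", 1)[-1]
        # Derive a file extension from the request URL so findings map back easily.
        path = urlparse(item.findtext("url", default="")).path
        ext = os.path.splitext(path)[1] or ".txt"
        with open(os.path.join(out_dir, f"response_{i:04d}{ext}"), "wb") as f:
            f.write(body)

save_responses("burp_sitemap.xml")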

Save Burp responses as XML

RIPSA

Evaluate XML
Save files locally

Statically analyze local files
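
One way the final step could look, assuming JSHint is installed from npm (npm install -g jshint) and the responses were saved under ripsa_out/ as in the earlier sketch.

# Run JSHint over every saved JavaScript file and print any findings.
import glob
import subprocess

for js_file in glob.glob("ripsa_out/**/*.js", recursive=True):
    # JSHint exits non-zero when it finds issues, so capture output
    # instead of raising on a failing exit code.
    result = subprocess.run(["jshint", js_file], capture_output=True, text=True)
    if result.stdout.strip():
        print(result.stdout)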

DEMO: RIPSA
RESPONSE INTERPRETATION AND PREPARATION FOR STATIC ANALYSIS

#Winning

Reduces potential false negatives by increasing breadth of dynamic web assessments

Utilizes information from Burp Suite that dynamic testers already collect

Pairs part of a static assessment with a full dynamic web assessment

#Winning

• Static tools
  – Fortify SCA, FxCop, JSHint, etc.
• JavaScript analysis
  – DOM-based XSS (see the sketch below)
• Silverlight analysis
• Gather and group files
  – .dll files for disassembly
  – .pdf files for strings analysis
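
A rough illustration of the DOM-based XSS triage idea: flag lines in the saved JavaScript that pair a common source with a common sink. The source and sink lists are assumptions and far from exhaustive.

# Grep-style pass over saved client-side files for DOM XSS source/sink pairings.
import glob
import re

SOURCES = re.compile(r"location\.(hash|search|href)|document\.(URL|referrer)|window\.name")
SINKS = re.compile(r"\.innerHTML|document\.write|eval\(|setTimeout\(|location\.assign")

for path in glob.glob("ripsa_out/**/*.js", recursive=True):
    with open(path, encoding="utf-8", errors="ignore") as f:
        for lineno, line in enumerate(f, 1):
            if SOURCES.search(line) and SINKS.search(line):
                print(f"{path}:{lineno}: possible DOM XSS source/sink pairing")
                print(f"    {line.strip()}")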

TAKEAWAYS

Takeaways

Embrace static

• Use static tools and techniques to dig deeper into client-side & DOM results
  – Use automated static tools
  – Disassemble and decompile Java, Silverlight, Flash, etc.

Takeaways

Embrace static

• Use static information to assist with content discovery
  – Map the application
  – Identify files and targets (see the sketch below)
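
One possible sketch of the content-discovery idea: harvest relative paths referenced in the saved client-side files to seed the application map. The regex and the ripsa_out/ directory name are illustrative assumptions, not part of RIPSA.

# Collect candidate paths and endpoints mentioned in saved HTML/JS responses.
import glob
import re

PATH_PATTERN = re.compile(r"""["'](/[A-Za-z0-9_\-./]+(?:\?[^"']*)?)["']""")

targets = set()
for path in glob.glob("ripsa_out/**/*", recursive=True):
    try:
        with open(path, encoding="utf-8", errors="ignore") as f:
            targets.update(PATH_PATTERN.findall(f.read()))
    except OSError:
        # Skip directories and unreadable files.
        continue

for target in sorted(targets):
    print(target)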

Call to the community

• ZAP extensions
  – Save responses as local files?
  – Static scanning signatures?

• Other ideas?

Special thanks

Special thanks to
Sam Denard
David Nester

Reach out

[email protected]