
A tale of two proxies


Presentation by Haroon Meer and Roelof Temmingh at Black Hat USA in 2006. This presentation is about Suru, the inline proxy tool developed by Roelof Temmingh. How it works and some of its features are discussed.


Page 1: A tale of two proxies
Page 2: A tale of two proxies

• From the makers of Wikto, Crowbar and BiDiBLAH, the producers of Hacking by Numbers, Setiri and Putting the Tea in CyberTerrorism, the directors of When the tables turn, several Syngress fairy tales and the inspiration of the Matrix trilogy (…right…) comes a presentation so powerful and compelling…

Introduction

Page 3: A tale of two proxies

We wanted something that:
• Does intelligent file and directory discovery (and Wikto was just not cutting it anymore).
• Does intelligent fuzzing of web applications (without trying to be too clever about it).

After looking at length at how people use other web application assessment tools, we found that:

• There is no ‘one-button’ web application assessment tool.
• Those who attempt to provide such tools mostly fail miserably.
• People who are good at web application assessments still want to be in control of every request that they make (that’s why the @stake webproxy rocked so much).
• While they still want to be in control, they perform some actions over and over (but with enough variation that it cannot be fully automated).
• They need something that can automate *some parts* of the assessment process effectively without taking away the flexibility or power of doing it manually.

• The lines between the application and web server are blurring…

Why *another* proxy??

Page 4: A tale of two proxies

We wanted something that works like Nikto, but won’t be fooled by friendly 404s

We created Wikto in 2004.
• Some people still don’t know how the AI option works.
The cleverness of Wikto sits in the content comparison algorithm.

We created Crowbar early in 2005.
• Most people don’t know how it works.
• Sadly, most people don’t know how to use it either…
With Crowbar we expanded the thinking – we wanted to create a generic brute forcer and ended up with something a lot more useful. Of all the tools up to this point, Crowbar was one of the most powerful – yet most people didn’t know how to use it properly.

We really wanted a proxy (for E-Or actually), so we took some proxy code and started mangling it early in 2006.

…it didn’t happen in one day

Page 5: A tale of two proxies

The content comparison algorithm basically compares two strings.

In Wikto it compares the response for a test file with that of a file that will never exist on the system. If the response differs we know that the file is there.

GET /scripts/moomoomoo.pl HTTP/1.0 [BRR]
GET /scripts/login.pl HTTP/1.0 [real test]

In Crowbar it compares the output of a test request with that of a ‘base response’. The user can choose the base response, and choose how she wants to construct the test request.

GET /scripts/login.pl?user=moo&pass=blah HTTP/1.0 [BRR]
GET /scripts/login.pl?user=admin&pass=aaa HTTP/1.0 [real test]

So…how DOES it work?

Page 6: A tale of two proxies

Step 1 – crop the header (if possible)
Step 2 – split string A and string B on \n, > and space => collectionA, collectionB
Step 3 – count the blank items in both A and B

foreach itemA in collectionA {
  foreach itemB in collectionB {
    if (itemA == itemB) {
      increment counter
      break
    }
  }
}
return counter x 2 / ((#collectionA + #collectionB) - blanks)

And what about the content compare?

Page 7: A tale of two proxies

See it in action:
String A: <b> I am testing this </b> <b> doedelsakdoek</b>
String B: <b> I am testing this </b> <b> kaaskrulletjies</b>

Becomes:
Collection A: I am testing this doedelsakdoek
Collection B: I am testing this kaaskrulletjies

Matching count = [I] [am] [testing] [this] = 4
Blank count = zero
#A + #B = 5 + 5 = 10

Return (4 x 2) / 10 = 0.8 or 80% match

String A: <b> I was testing </b>
String B: <b> I am testing them things </b>

Return (2 x 2)/8 = 0.5 or 50% match
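The algorithm described above can be sketched in a few lines of Python. This is an illustrative reimplementation from the slides’ description, not Wikto’s or Suru’s actual code; the function name is ours, and the usage comment feeds in the already-split collection strings from the worked example.

```python
import re

def similarity(a: str, b: str) -> float:
    """Fuzzy match ratio between two responses, following the steps
    on the previous page (sketch, not the original implementation)."""
    # Step 2: split both strings on newline, '>' and space
    coll_a = re.split(r"[\n> ]", a)
    coll_b = re.split(r"[\n> ]", b)
    # Step 3: count blank items in both collections
    blanks = coll_a.count("") + coll_b.count("")
    # Count items of A that also occur in B (each B item matched once)
    remaining = list(coll_b)
    counter = 0
    for item in coll_a:
        if item and item in remaining:
            counter += 1
            remaining.remove(item)
    return counter * 2 / (len(coll_a) + len(coll_b) - blanks)

# Using the collapsed collections from the worked example:
# similarity("I am testing this doedelsakdoek",
#            "I am testing this kaaskrulletjies")  -> 0.8
# similarity("I was testing",
#            "I am testing them things")           -> 0.5
```

Run over the two worked examples this reproduces the 0.8 and 0.5 ratios from the slides.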

And what about the content compare?

Page 8: A tale of two proxies

Crowbar also started to provide us with the ability to filter certain responses using a fuzzy logic trigger:

So…how DOES it work?

Page 9: A tale of two proxies

Crowbar also allowed us to do content extraction. For example, consider ‘mining’ information from Google regarding how many results exist for a certain item (a name in this case):

So…how DOES it work?

Page 10: A tale of two proxies

One of the most used features of Wikto is the ‘BackEnd miner’ used to discover directories and files.

What if the entire site is located behind /bigsite/ ? It fails to find anything because it is testing in the root (/).

That’s why we have a mirroring option in Wikto – to find directories and mine within the known directories.

But what if the site has form based login (or something similar)?

That’s why Wikto sucks - it wouldn’t test anything beyond the login screen…

What about finding /bigsite/strange_form.bak from /bigsite/strange_form.asp ? Or .backup or .zip ? What about /bigsite/cgi-bin/bigsite ?

That’s why Wikto sucks – it does not know anything about the site itself. Wikto is a blind chicken, pecking away at dirt.

Why Wikto sucks

Page 11: A tale of two proxies

Now, if we had a proxy we could see where the user is browsing to and adjust our recon process accordingly:

• If we see /bigsite/content.php
– Automatically start looking for other directories within /bigsite/

• If we see /bigsite/moo_form.asp
– Automatically start looking for moo_form.XX where XX is all other extensions (like .zip and .backup and .old etc.)

• If we see /scripts/abc_intranet/login.php
– Automatically start looking for /abc_intranet in other places

• And while we’re at it – why not check the indexability of every directory we visited and mined?
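The recon rules above can be sketched as a simple candidate generator. This is a hypothetical sketch: the function name and the extension list are ours for illustration, not Suru’s configuration.

```python
from urllib.parse import urlsplit

# Hypothetical backup-extension list; a real tool would make this configurable.
EXTENSIONS = [".bak", ".backup", ".old", ".zip"]

def recon_candidates(url: str) -> list:
    """From one observed URL, derive candidates to test: every parent
    directory (for mining and indexability checks) plus the same file
    name with likely backup extensions."""
    parts = urlsplit(url)
    base = f"{parts.scheme}://{parts.netloc}"
    segments = parts.path.strip("/").split("/")
    candidates = []
    # Every parent directory is worth mining within
    prefix = ""
    for seg in segments[:-1]:
        prefix += "/" + seg
        candidates.append(base + prefix + "/")
    # Swap the file's extension for likely backup extensions
    filename = segments[-1]
    if "." in filename:
        stem = filename.rsplit(".", 1)[0]
        for ext in EXTENSIONS:
            candidates.append(base + prefix + "/" + stem + ext)
    return candidates
```

Seeing /bigsite/moo_form.asp, for instance, would queue /bigsite/ for mining and /bigsite/moo_form.bak, .backup, .old and .zip for testing.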

Why Wikto sucks

Page 12: A tale of two proxies

Recon demo

Page 13: A tale of two proxies

• If we have a content comparison algorithm, then we can see if an application would react differently when we put ‘junk’ into it compared to ‘good’ data.

• In other words, we can send a whole lot of requests, see what different responses are generated, and see how the ‘good’ responses differ from the ‘bad’ responses.

• We can, in fact, group the responses by looking how they differ from a base response.

• In other words – when I send 1000 different requests to the application, modifying a single parameter, I might get back only 2 distinct responses.
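The grouping idea can be sketched as follows. Here difflib’s SequenceMatcher stands in for Suru’s own comparison algorithm, and the function name and tolerance default are ours – an illustration of the concept, not the tool’s code.

```python
from difflib import SequenceMatcher

def group_responses(responses: list, tolerance: float = 0.9) -> list:
    """Group responses by similarity: each response joins the first
    group whose representative it matches above `tolerance`, otherwise
    it starts a new group."""
    groups = []
    for resp in responses:
        for group in groups:
            # Compare against the group's first (representative) member
            if SequenceMatcher(None, group[0], resp).ratio() >= tolerance:
                group.append(resp)
                break
        else:
            groups.append([resp])
    return groups
```

A thousand fuzzed requests that only ever produce “login failed” or “welcome” pages would collapse into just two groups for the analyst to inspect.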

Fuzzing with Suru

Page 14: A tale of two proxies

• Having a proxy, we can thus parse the request, break it up nicely into name=value pairs and let the user decide what portion she wants to fuzz.

Fuzzing with Suru

Page 15: A tale of two proxies

• Of course, you can choose to fuzz ANYTHING in the HTTP request…

• We can also choose to extract anything from the reply…

• ..and group results automatically, with adjustable tolerance

Fuzzing with Suru (Demo)

Page 16: A tale of two proxies

Automatic relationship discovery
• Compares the MD5, SHA1, Base64-encoded and Base64-decoded forms of every parameter with all other parameters (incl. cookie values)

WHY?
• Example – after login the application uses the MD5 of your username to populate a cookie that’s used for session tracking (this is a very bad idea), or sends your password Base64 encoded in another parameter (also a bad idea).

• Search and replace on both incoming and outgoing streams with ability to also change binary data.
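The relationship-discovery pass above can be sketched like this. It is an illustrative sketch of the idea, not Suru’s code: the function name is ours, and the Base64-decode comparison is omitted for brevity.

```python
import base64
import hashlib

def find_relationships(params: dict) -> list:
    """Compare the MD5, SHA1 and Base64 encodings of every parameter
    value against every other parameter, and report any matches as
    (source_param, transform, matching_param) triples."""
    transforms = {
        "md5": lambda v: hashlib.md5(v.encode()).hexdigest(),
        "sha1": lambda v: hashlib.sha1(v.encode()).hexdigest(),
        "b64e": lambda v: base64.b64encode(v.encode()).decode(),
    }
    hits = []
    for src, value in params.items():
        for name, fn in transforms.items():
            encoded = fn(value)
            for dst, other in params.items():
                if dst != src and other == encoded:
                    hits.append((src, name, dst))
    return hits
```

For the bad-idea example above, a session cookie holding md5(username) is reported as a (username, md5, cookie) relationship.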

Other reasons why Suru is nice

Page 17: A tale of two proxies

Usability+++
• Uses an IE browser object to replay requests [no issues with authentication etc.]
• Change and replay requests instantly whilst keeping track of what you’ve done.
• Edited requests are marked – you don’t need to find them in a sea of requests.
• Handles XML (for web services) and multipart POSTs, and shows the verb and number of POST/GET parameters instantly (so you can choose the juicy requests quickly).
• Saving & loading of sessions.
• Instantly fuzz any variable (and edit your fuzz strings in the app).
• Free-form fuzz strings (supports commenting) – NO limitation – only your imagination – sorted by file name.
• Instant access to the raw HTTP request with automatic Content-Length recalculation.
• Raw replay or browsed replay.
• One-click file/directory mining from the recon tree.
• User-defined speed for recon (because you want to be able to still surf the app).
• Etc. etc. etc.

Other reasons why Suru is nice

Page 18: A tale of two proxies

And now for something completely different…

• Suru is a neat, well-packaged tool that addresses some unique needs

• LR is a collection of other people’s utilities (and some duct-tape^H^H Python)

• Almost everything achievable by SP_LR is available through other tools in existence today…

• What does this mean?

I have no future in sales or marketing..

Page 19: A tale of two proxies

What is it?
• (Someday) Suru for generic TCP connections
• (Today…) A simple, extensible method to alter packets (headers or payloads) within a TCP stream
• (Honestly) A collection of a few scripts around two much smarter open source projects
• Written in Python
– Because all the cool kids were doing it

Why?
• To free you from current tools…
• To get the juices flowing
• To demonstrate how easily it can be done
• To ponder some possibilities…

Page 20: A tale of two proxies

What about…
• Existing tools:
– ITR, ngrep, …
– Great when you are in a position to run the proxy on the machine doing the testing
– Generally modify payload or headers (seldom both)
– Are either closed source (or involve scary-looking packet-fu)

The goal…
• The ability to modify packets and payloads…
• The ability to do this within complex conversation sequences
• The ability to do this comfortably within a scripting environment
• The ability to do this quickly, leaving more time for minesweeper…

Page 21: A tale of two proxies

How it currently works…

• Installed on a gateway using Linux’s libipq or FreeBSD’s IPDIVERT.

• This moves packets from the kernel to a userspace program.

• Heavy lifting is then done by:
– Neale Pickett’s ipqueue
– Philippe Biondi’s scapy

Page 22: A tale of two proxies

A brief interlude…

• To pay homage to scapy…
• Available from http://www.secdev.org/projects/scapy by Philippe Biondi
• By far the easiest way to generate arbitrary packets
• #28 on Fyodor’s Top 100 Security Tools…
• Which means…
– The majority of people have yet to discover its coolness
• Some quick examples…

Page 23: A tale of two proxies

Scapy simpleness

Page 24: A tale of two proxies

So… SP_LR simply does…

• Get the packet through libipq
• Decode the packet using scapy
• Mangle the packet using scapy
• Accept or reject the packet through libipq
• s/foo/bar/
• There is a tiny bit more…
– What about checksums?
– The old sequence number chestnut.

Page 25: A tale of two proxies

Visio of payload increase + seq number

Page 26: A tale of two proxies

But hold on… this is classic MITM

• Once we alter the payload length:
– We no longer let sequence or ack numbers through without first modifying their values.
– Client and server are both kept happy.
– We need to do this till the end of the session (or till we adjust another payload to bring the delta to 0).
– s/foo/SensePost Does Las Vegas/
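The sequence/ack bookkeeping above can be sketched as a pair of modular adjustments. This is an illustrative sketch of the idea, not SP_LR’s actual code; the function and parameter names are ours.

```python
MOD = 2 ** 32  # TCP sequence numbers wrap at 32 bits

def fix_numbers(seq, ack, delta_out, delta_in):
    """Adjust the seq/ack of a segment being forwarded after payload
    rewrites. delta_out is the net number of bytes our rewrites added
    to this segment's direction of the stream; delta_in is the net
    bytes added to the opposite direction (which this segment's ack
    acknowledges, so it must be reduced before forwarding)."""
    return (seq + delta_out) % MOD, (ack - delta_in) % MOD
```

For s/foo/SensePost Does Las Vegas/ the server-to-client stream grows by 21 bytes, so later server segments are forwarded with seq shifted up by 21, while the client’s returning segments have their ack shifted down by 21, keeping both ends happy.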

Page 27: A tale of two proxies

Since we are inline…

• We are in a position to alter data to or from the client.
• Interesting for client fuzzing.
• Interesting for lame client-side security.
• Lame client-side security can be read as the VNC 4.1 authentication bypass…

Page 28: A tale of two proxies

And obviously header modification is trivial…

• FreeBSD ECE overloading
• Old bug:

“Overloading in the TCP reserved flags field causes ipfw/ip6fw to treat all TCP packets with the ECE flag set as being part of an established connection. A remote attacker can create a TCP packet with the ECE flag set to bypass security restrictions in the firewall.”

• We simply need to tag all our outgoing packets with the ECE flag.

• SensePost Exploit (2001) - 270 lines of C• SP_LR version (today) - X lines of python
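Setting the ECE flag on a raw TCP header really is just one bit-flip. A minimal sketch (not the original exploit or SP_LR code; the function name is ours, and checksum recalculation is left out as the deck handles it separately):

```python
ECE = 0x40  # bit 6 of the TCP flags byte (CWR ECE URG ACK PSH RST SYN FIN)

def set_ece(tcp_header: bytes) -> bytes:
    """Return a copy of a raw TCP header with the ECE flag switched on.
    The checksum must still be recomputed before the packet is re-sent."""
    FLAGS_OFFSET = 13  # the flags byte sits at offset 13 of the TCP header
    flags = tcp_header[FLAGS_OFFSET] | ECE
    return (tcp_header[:FLAGS_OFFSET]
            + bytes([flags])
            + tcp_header[FLAGS_OFFSET + 1:])
```

Hooked into the libipq/scapy loop described earlier, this tags every outgoing packet before it is accepted back into the kernel.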

Page 29: A tale of two proxies

Other Uses…

• Arbitrary DNS resolution
• Malware analysis
• …

What it needs?

• More fiddling…
• An int3
– The client timeout problem…
– An (untested) possible solution…

Page 30: A tale of two proxies

Window 0 and int3

• TCP window size
• Tar pits?
• Hmmm… what if? (visio of win0 int3)

• Watch this space…

Page 31: A tale of two proxies

So…

• You should have an easy-to-use, trivial-to-extend alternative to current packet mangling options.

• You should be in a position to mangle payloads and headers from the warm, cozy Python environment.

• Most importantly, you should have some ideas about stuff you would like to fiddle with..

• .tgz will be made available for download off http://www.sensepost.com/research

Page 32: A tale of two proxies

• Suru is a very nice new MITM web application proxy.
• Suru still allows the analyst the freedom of thought, but automates the mundane.
• Suru is a combination of a useful proxy and the best features of Wikto, Crowbar and E-Or.
• If you are new to web application assessment you should perhaps start off with another proxy – Suru is intense.
• Suru was written by people that do hundreds of web application assessments every year. In other words – a web application testing tool by people that do web application testing.

Suru lives at: http://www.sensepost.com/research/suru

Conclusion