Best Effort Security Testing for Mobile Applications - 2015 #ISC2CONGRESS


B.E.S.T.-Mobile: Best Effort Security Testing for Mobile Apps

Murat Lostar

About: Murat Lostar (@muratlostar)

• In IT since 1986
• Full-time security since 1998
• Based in Istanbul, Turkey
• ISC2 - CISSP, ISSMP, CSSLP, CCSP

Extracted Data from Apple AppStore

• 3 industries: Finance, eCommerce, Social Media
• 50 applications:
  • Finance: Akbank, Bloomberg, Chase, ING, PayPal…
  • eCommerce: AliBaba, Amazon, eBay, Kliksa, n11…
  • Social Media: Ask.fm, BIP, Facebook, Instagram, Tango…
• 1051 revisions
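As a rough illustration of how such App Store revision data can be summarized into the revision-count and update-period figures on the following slides (the tuple layout and sample rows are assumptions for illustration, not the actual dataset):

```python
from datetime import date
from statistics import mean

# Hypothetical sample rows: (app, industry, release date) per App Store revision.
revisions = [
    ("PayPal", "Finance", date(2015, 1, 5)),
    ("PayPal", "Finance", date(2015, 1, 28)),
    ("PayPal", "Finance", date(2015, 2, 20)),
    ("Amazon", "eCommerce", date(2015, 1, 10)),
    ("Amazon", "eCommerce", date(2015, 2, 14)),
]

# Group release dates per app.
by_app = {}
for app, _industry, released in revisions:
    by_app.setdefault(app, []).append(released)

# Average revision count per app (the talk reports 21.02 over 50 apps).
avg_revisions = mean(len(dates) for dates in by_app.values())

# Update period: days between consecutive releases of the same app.
periods = []
for dates in by_app.values():
    dates.sort()
    periods += [(b - a).days for a, b in zip(dates, dates[1:])]

print(f"Average revisions per app: {avg_revisions:.2f}")
print(f"Average update period: {mean(periods):.1f} days")
```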

[Chart: Revision Age of Mobile Apps – share of revisions (0%-35%) by age bucket in days (0-3, 4-7, 8-15, 16-31, 32-63, 64-127, 128-255, 256-511, 512+), broken out by eCommerce, Finance, and Social Media. Source: Apple, n=1051]

[Chart: Average App Revision Count = 21.02 – average total number of versions (0-30) by first version release year (2009-2015). Source: Apple, n=50 apps (1051 revisions)]

[Chart: App update period (days) – update periods on a log scale (2, 20, 200 days) against release dates from 2010 to 2015. Source: Apple, n=1051]

The Situation

• Mobile app development
  • Agile is the de-facto standard
  • New app releases are under pressure:
    • Competition
    • OS updates & new devices
• A new version every… 16-31 days (& decreasing)

The Requirement

• Detailed manual penetration testing for each version (before release)
• If possible, source code review

The Situation for Mobile App Security

• Testing (mobile) applications is:
  • Expensive
  • Time- and resource-intensive
• Testing is mostly performed periodically:
  • Once a year
  • 4 times a year

Problem

• Detailed penetration testing is NOT POSSIBLE (each time):
  • Not enough time
  • Not enough resources
  • Not needed!
    • Most of the app is the same
    • Is it?
• Each new version is subject to:
  • Developer mistakes
  • Security vulnerabilities (including re-appearing ones)
  • Time pressure:
    • Less user acceptance testing
    • Less developer & unit testing
    • Less code review

BEST-Mobile

• Best Effort Security Testing for Mobile Applications
• Continue periodic, detailed pen-testing…
• For each new version: perform mobile application security tests with best effort and best efficiency
• Use: automated tools & Security Testing as a Service (sketched below)
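A minimal sketch of what "automated tools & Security Testing as a Service" can look like per release, assuming a scanning service with a simple upload-and-poll REST API (the URL, routes, and response fields are illustrative assumptions, not a real product):

```python
import time
import requests

# Hypothetical security-testing-as-a-service endpoint; everything about
# this API (URL, routes, response fields) is an illustrative assumption.
SCAN_API = "https://scanner.example.com/api/v1"
AUTH = {"Authorization": "Bearer <token>"}

def best_scan(ipa_path: str) -> list[dict]:
    """Upload a mobile app binary, wait for the automated scan, return findings."""
    with open(ipa_path, "rb") as f:
        resp = requests.post(f"{SCAN_API}/scans", headers=AUTH, files={"app": f})
    resp.raise_for_status()
    scan_id = resp.json()["id"]

    # Poll until the scan finishes; each new app version goes through this gate.
    while True:
        status = requests.get(f"{SCAN_API}/scans/{scan_id}", headers=AUTH).json()
        if status["state"] == "done":
            return status["findings"]
        time.sleep(30)

findings = best_scan("MyApp-v2.3.1.ipa")
print(f"{len(findings)} automated findings for this release")
```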

Question

• Can we trust automated testing?
• Is there enough investment in automated mobile vulnerability testing?

[Chart: Tool / Research Staff Counts, 2010-2015 – two series: Staff Counts and Tool Counts, both rising year over year. Data gathered from projects' official websites and GitHub accounts.]

Data From Mobile App Testers

• Source: OWASP – WhiteHat, Pure Hacking, Lostar, BugCrowd, SecureNetwork, Metaintelli
• 2014 – detected and reported issues, n=1065
• Overall: 54% found by automated tools
• How about risk level?

Top 10 Mobile Risks (1-5)

#  Risk                                      Exploitability  Prevalence  Detectability  Impact    BEST Capability*
1  Weak Server Side Controls                 Easy            Common      Average        Severe    61%
2  Insecure Data Storage                     Easy            Common      Average        Severe    64%
3  Insufficient Transport Layer Protection   Difficult       Common      Easy           Moderate  91%
4  Unintended Data Leakage                   Easy            Common      Easy           Severe    24%
5  Poor Authorization and Authentication     Easy            Common      Easy           Severe    14%

Top 10 Mobile Risks (6-10)

#   Risk                                      Exploitability  Prevalence  Detectability  Impact    BEST Capability*
6   Broken Cryptography                       Easy            Common      Easy           Severe    90%
7   Client Side Injection                     Easy            Common      Easy           Moderate  No data
8   Security Decisions Via Untrusted Inputs   Easy            Common      Easy           Severe    No data
9   Improper Session Handling                 Easy            Common      Easy           Severe    16%
10  Lack of Binary Protections                Medium          Common      Easy           Severe    82%
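As a back-of-envelope check (not a calculation from the talk), averaging the eight risks that have BEST Capability data lands near the 54% automated-detection figure reported earlier:

```python
# BEST Capability figures from the two tables above (risks with data only).
capability = {
    "Weak Server Side Controls": 61,
    "Insecure Data Storage": 64,
    "Insufficient Transport Layer Protection": 91,
    "Unintended Data Leakage": 24,
    "Poor Authorization and Authentication": 14,
    "Broken Cryptography": 90,
    "Improper Session Handling": 16,
    "Lack of Binary Protections": 82,
}

average = sum(capability.values()) / len(capability)
print(f"Unweighted average BEST capability: {average:.1f}%")  # 55.2%
```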

Verdict

• Automated testing can NOT replace manual mobile penetration testing and code review.
• It can, however, be used to cover the basics (i.e., it is useful in the sense of the Pareto rule).

Proposed Testing Process

1. Classify Mobile Applications
   • High, Medium, Low

Proposed Testing Process

2. Use Classification for Pre-Release Security Testing

Class    Major Release                         Minor Release                          Regular Reviews
High     Pre: PenTest + Code Review            Pre: PenTest (Opt/Post: Code Review)   External (PenTest + Code Review)
Medium   Pre: PenTest (Opt/Post: Code Review)  Pre: BEST (Opt/Post: PenTest)          PenTest + Code Review
Low      Pre: BEST (Post: PenTest)             Pre/Post: BEST                         PenTest
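One way to make this matrix enforceable in a release pipeline is to encode it as a lookup; a minimal sketch, with the function name and release-type labels as illustrative assumptions:

```python
# Pre-release testing matrix from the table above, encoded as a lookup.
# Keys: (app class, release type); values: tests required for that release.
TEST_MATRIX = {
    ("High", "major"):   ["Pre: PenTest", "Pre: Code Review"],
    ("High", "minor"):   ["Pre: PenTest", "Opt/Post: Code Review"],
    ("Medium", "major"): ["Pre: PenTest", "Opt/Post: Code Review"],
    ("Medium", "minor"): ["Pre: BEST", "Opt/Post: PenTest"],
    ("Low", "major"):    ["Pre: BEST", "Post: PenTest"],
    ("Low", "minor"):    ["Pre/Post: BEST"],
}

def required_tests(app_class: str, release_type: str) -> list[str]:
    """Look up which security tests a release must pass before shipping."""
    return TEST_MATRIX[(app_class, release_type)]

print(required_tests("Medium", "minor"))  # ['Pre: BEST', 'Opt/Post: PenTest']
```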

Proposed Testing Process

3. Decision Matrix

App Class  Major Issue                        Minor Issue
High       Postpone to fix                    Postpone for Risk Advisory Board
Medium     Postpone for Risk Advisory Board   Postpone to verify and decide
Low        Issue verification                 Deploy, then fix
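The decision matrix can likewise be applied automatically to the findings an automated scan returns; a sketch, assuming each finding carries a "severity" field (an illustrative assumption):

```python
# Decision matrix from step 3, applied as a release gate.
# The actions are the table's cells verbatim; the gate function is illustrative.
DECISIONS = {
    ("High", "Major"):   "Postpone to fix",
    ("High", "Minor"):   "Postpone for Risk Advisory Board",
    ("Medium", "Major"): "Postpone for Risk Advisory Board",
    ("Medium", "Minor"): "Postpone to verify and decide",
    ("Low", "Major"):    "Issue verification",
    ("Low", "Minor"):    "Deploy, then fix",
}

def release_gate(app_class: str, findings: list[dict]) -> str:
    """Pick the worst finding and return the action the matrix dictates."""
    if not findings:
        return "Deploy"
    worst = "Major" if any(f["severity"] == "Major" for f in findings) else "Minor"
    return DECISIONS[(app_class, worst)]

print(release_gate("Medium", [{"severity": "Minor"}]))  # Postpone to verify and decide
```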

Thank You

@muratlostar

Next Step

• Put BEST into the mobile SDLC
• Strategies (see the sketch below):
  • Send the beta product to the BEST tool
  • Integrate the BEST tool with the Bug Tracking System
  • Create automated bug/issue entries for each finding
• When:
  • At the end of each sprint
  • Part of the testing process
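A sketch of the bug-tracking integration described above, assuming a generic REST-style tracker; the endpoint, payload fields, and helper name are illustrative assumptions:

```python
import requests

# Hypothetical bug tracker REST endpoint; URL, routes, and payload fields
# are assumptions for illustration (most trackers expose something similar).
TRACKER_API = "https://tracker.example.com/api/issues"
AUTH = {"Authorization": "Bearer <token>"}

def file_findings(findings: list[dict], sprint: str) -> None:
    """Create one tracker issue per automated BEST finding at sprint end."""
    for f in findings:
        issue = {
            "title": f"[BEST] {f['risk']}: {f['summary']}",
            "severity": f["severity"],
            "labels": ["security", "best-mobile", sprint],
            "description": f.get("detail", ""),
        }
        requests.post(TRACKER_API, headers=AUTH, json=issue).raise_for_status()

# Example: file the findings returned by the scan sketched earlier.
file_findings(
    [{"risk": "Insecure Data Storage", "summary": "Token stored in plist",
      "severity": "Major"}],
    sprint="2015-S19",
)
```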
