Page 1: SRS II Winter PI Meeting - December 18, 2007

SRA // RABA CENTER

Page 2: Agenda

• Background
• SRA's Infra-Red Team Approach
• Rules of Engagement
• What's Next?

Page 3: SRA SRS Phase II Quad Chart

Page 4: Background

• RABA was founded in 1994 as a boutique technology company
• Acquired by SRA International, Inc. in October 2006
• Related past performance:
  • 6 Red Team evaluations on SRS Phase I prototypes
    • AWDRAT (MIT), Dawson (GITI), Genesis (UVa)
    • LRTSS (MIT), QuickSilver (Cornell), Steward (Johns Hopkins University)
  • Red Team review of Maryland's Direct Recording Electronic (DRE) voting system
    • Only one of its kind in the nation
    • Recommendations were adopted, in part, by Maryland, Ohio, and California
  • Trusted contractor to the Intelligence Community for over 10 years, designing and performing Red Team exercises for national systems
    • Penetration testing in classified environments, both as government employees and as private contractors
    • Hardware and software systems
• Our assessment team:
  • Unique composition for each assessment, depending on the technology
  • All TS/SCI-cleared individuals
  • All have extensive experience in the US Government, DoD, and Intelligence Community
  • All have extensive experience in Information Warfare and Information Operations
  • All have extensive systems / software development experience

Page 5: SRA's Infra-Red Team Approach

(Process-flow diagram, START to COMPLETE, spanning Planning, Learning, and Assessment activities: Initial TIM, Schedule Events, Rules of Engagement, Assessment Plan; Study Documentation, Interviews, Pre-Test Visit, Study Code; Assessment Exercise, Assessment Exercise Outbrief, Data Analysis, Outbrief, Assessment Report. Effort labels on the diagram: 85%, 5%, and 10%.)

Page 6: Value and Mindset of the Red Team Approach

(Diagram contrasting the Red Team's perspective with the existing system's perspective.)

Page 7: Keys to Success

• Tailor each assessment completely
  • to the technology under test and with respect to DARPA's goals
• Unearth the assumptions
  • of the designers / developers, to clearly describe the current state of the technology
• Focus
  • on the most important and innovative aspects
  • on the goal of providing a thorough evaluation of the technology
• Add value
  • to the development, through a collaborative relationship with the Blue Team
  • by not gaming the assessment in an effort to rack up points
• Extensively study the technologies
  • in-depth study of performers' briefings, technical papers, source code, and relevant publications on the state of the art

Page 8: What We Have Done…

• Reviewed all four programs
  • Performers' briefings and related publications provided by the Blue Teams
  • Researched the state of the art in each technology
  • Performed additional research into your specific solutions and innovations
    • Publications by PIs available on the Internet
• Developed DRAFT RoEs for your review and feedback
  • Delivered 11/30/2007
  • Participated in 3 conference calls to resolve disputes, and exchanged several emails toward that end

Page 9: Rules of Engagement

• Collaborating with Blue / White Teams to produce a mutually agreeable RoE:
  • Based on lessons learned from prior IPTO Red Teams; there is great benefit in thinking through all the potential issues and scoring early rather than during the exercise itself.
• 6 Specific Areas of Interest (a minimal sketch follows this slide):
  • Red Team Start Position
  • Allowed and Disallowed Red Team Activities
  • Victory Conditions
  • Events Counted for Score
  • Test Script
• Common concerns voiced by Blue Teams:
  • "No fair beating my dead horse!"
  • "That's outside my scope; you can't test that!"
  • Scored vs. In-Scope vs. Out-of-Scope
  • "My system doesn't fit the metrics exactly, so the scoring diagram doesn't work for me."
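As a purely illustrative aside, the areas of interest above can be captured as a simple structured record. The sketch below is hypothetical: the field names, example prototype, and example entries are assumptions for illustration, not taken from any actual SRS Phase II RoE.

```python
from dataclasses import dataclass, field

@dataclass
class RulesOfEngagement:
    """Illustrative container for the RoE areas of interest listed above.
    Field names and example values are hypothetical."""
    prototype: str
    start_position: str                                        # Red Team's assumed initial access and knowledge
    allowed_activities: list[str] = field(default_factory=list)
    disallowed_activities: list[str] = field(default_factory=list)
    victory_conditions: list[str] = field(default_factory=list)
    scored_events: list[str] = field(default_factory=list)     # events counted for score
    test_script: list[str] = field(default_factory=list)       # ordered steps for the exercise

# Hypothetical example, for illustration only.
roe = RulesOfEngagement(
    prototype="ExamplePrototype",
    start_position="Authenticated insider on one application node",
    allowed_activities=["network scanning", "malformed inputs", "process tampering"],
    disallowed_activities=["physical attacks", "attacks on the test harness"],
    victory_conditions=["undetected corruption of protected state"],
    scored_events=["detection", "effective response", "correct thwart", "attribution"],
    test_script=["baseline run", "attack injection", "recovery observation"],
)
```

Writing the RoE down in an explicit form like this is one way to surface disagreements about start position, scored events, and scope before the exercise rather than during it.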

Page 10: Attack Distribution

• Uniqueness
  • Concerns about finding one attack that works, then performing that attack multiple times to score more points
  • Our goal is not to score points. Our goal is to fully and rigorously evaluate the prototype.
• Irrelevant attacks
  • Concerns that attacks will not do something malicious with respect to the system under test
  • Carefully define each attack so that it affects the system in a relevant way

Page 11: Attack Distribution Cont.

• Issues with not-so-random sampling and percentage metrics
  • Concerns that the attacks selected represent not the entire population of attacks, but the subset the prototype is not capable of protecting against
  • A biased sample produces biased calculations (a toy illustration follows this slide)
• Validation, Conceded, and Indeterminate test sets
  • Validation set: evens out the population a bit and serves as verification that the prototype meets the Blue Team's claims
  • Conceded set: prevents expending resources developing attacks where the outcome is known
  • Indeterminate set: addresses those attacks where the outcome is not known
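A toy numerical illustration of the sampling concern above, using invented numbers: if scored attacks are drawn only from the subset the prototype is weak against, the computed thwart percentage understates the prototype's real performance. Nothing here reflects actual SRS attack counts or scores.

```python
import random

random.seed(0)

# Hypothetical attack population: True means the prototype thwarts the attack.
# Assume the prototype genuinely thwarts 90% of all relevant attacks.
population = [True] * 90 + [False] * 10

def thwart_rate(attacks):
    """Fraction of attacks in the sample that the prototype thwarts."""
    return sum(attacks) / len(attacks)

# Unbiased estimate: sample uniformly from the whole population.
uniform_sample = random.sample(population, 20)

# Biased estimate: effort is spent only where the prototype is suspected
# to be weak, so failures are heavily over-represented in the sample.
biased_sample = [False] * 10 + random.sample([True] * 90, 10)

print(f"true rate     : {thwart_rate(population):.0%}")
print(f"uniform sample: {thwart_rate(uniform_sample):.0%}")
print(f"biased sample : {thwart_rate(biased_sample):.0%}")  # far below the true rate
```

This is the role the validation and conceded sets play above: they even out the population and credit known outcomes without expending resources re-developing attacks whose results are already established.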

Page 12: Scored vs. In-Scope vs. Out-of-Scope

Page 13: Metrics and Scoring Diagrams

• Multiple interpretations of the written-word metrics
  • compound sentences and distribution of percentages (see the sketch after this slide)
• Various definitions for the key performance indicators
  • detection, effective response, correct thwart, attribution
• In some cases, only partial compliance with the metrics as stated in the BAA
  • As DARPA's advocates, we look to verify that the prototypes meet the criteria to the satisfaction of the PM
  • Prototypes are somewhat specialized, and Blue Team research is often narrower than what the metrics measure
  • Struggling to find a compromise that is fair to the Blue Team and still a satisfactory application of the metrics
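One way to reduce ambiguity in compound, written-word metrics is to score each key performance indicator named above separately. The sketch below is a hypothetical scoring helper with invented outcome records; it is not the actual SRS scoring diagram or the BAA metric.

```python
# Hypothetical per-attack outcome records; field names are illustrative only.
outcomes = [
    {"detected": True,  "effective_response": True,  "correct_thwart": True,  "attributed": False},
    {"detected": True,  "effective_response": False, "correct_thwart": False, "attributed": False},
    {"detected": False, "effective_response": False, "correct_thwart": False, "attributed": False},
    {"detected": True,  "effective_response": True,  "correct_thwart": True,  "attributed": True},
]

def rate(records, indicator):
    """Fraction of scored attacks for which the given indicator was achieved."""
    return sum(r[indicator] for r in records) / len(records)

for indicator in ("detected", "effective_response", "correct_thwart", "attributed"):
    print(f"{indicator:18s}: {rate(outcomes, indicator):.0%}")
```

Per-indicator rates like these make explicit which clause of a compound metric each percentage applies to, which is typically where the interpretation disputes start.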

Page 14: What's Next?

• Continue to study your technology
• Visit your site
  • Get hands-on experience with your prototype (when possible)
  • Engage in white-board discussions
  • Collaborate and communicate attack strategies
    • Until the Go Adversarial Date
• Obtain the baseline version of your source code
• Develop the Assessment Plan
• Perform the assessments
  • Day 1: Logistics, set-up, and Assessment Plan review
  • Day 2: Test execution event
  • Day 3: Execute remaining tests
  • Give the assessment out-brief (last hour of the event)
• Analyze the assessment results and develop the Assessment Report
• Close out the program with a Final Report

Page 15: Questions?