
ITEA SoS Conference ATEC


Page 1: ITEA SoS Conference ATEC

Mr. David Jimenez, ETD

US Army Test & Evaluation Center

For All Audiences

April 2015

U.S. Army Test and Evaluation Command

Supporting a Leaner, More Capable Expeditionary 2025 Army

Page 2: ITEA SoS Conference ATEC

Mission, Vision, End State

Mission: ATEC plans, integrates, and conducts experiments, developmental testing, independent operational testing, and independent evaluations and assessments to provide essential information to acquisition decision makers and commanders.

Vision: To be a team of highly skilled test and evaluation professionals focused on informing equipping decisions for today's and tomorrow's warfighter.

End State: The Army possesses a versatile mix of adaptable Soldiers and units equipped with proven materiel, based on decisions informed by independent evaluations of technically proficient and versatile experimental, developmental, and operational tests.


Page 3: ITEA SoS Conference ATEC

Where We Are

Map locations: Ft Greely (Cold Regions Test Center); Ft Huachuca; Ft Hood; Ft Sill; Yuma Proving Ground; Aberdeen Proving Ground; HQ ATEC; White Sands Missile Range; Panama (Tropic Test Site); AEC; Ft Bliss; Dugway Proving Ground; Electronic Proving Ground; Redstone Arsenal; Ft Bragg; OTC.

Legend: Headquarters; Test Center; Operational Test Directorate.


FY15 Assigned Personnel (8,279)*

504 Military

3,664 Civilians

4,111 Contractors

* As of 28 Feb 15. AEC – Army Evaluation Center; OTC – Operational Test Command.

Page 4: ITEA SoS Conference ATEC

(Highlighted sites = ATEC)

Nevada Test and Training Range

Utah Test and Training Range

West Desert Test Center

Cold Regions Test Center

NAWC-WD China Lake

Electronic Proving Ground

30th Space Wing

Keyport

NAWC-WD Point Mugu

412th Test Wing

Yuma Test Center

Tropic Regions Test Center (various locations)

US Army Kwajalein Atoll / Reagan Test Site

PMRF

DISA JITC

White Sands Test Center

96th Test Wing (includes 96th Test Group)

Atlantic Undersea Test and Evaluation Center

NAWC-AD Patuxent River

Arnold Engineering Development Complex

45th Space Wing

Aberdeen Test Center

DISA

Legend:

Army Navy Air Force Defense Agency

Major Range and Test Facility Base (MRTFB*)

23 Sites – Army 8, Navy 6, Air Force 7, Defense Agency 2

*DoDD 3200.11 and DoDI 3200.18

Page 5: ITEA SoS Conference ATEC

ATEC Organizational Chart

ATEC: Army Test Evaluation Command

OTC: Operational Test Command

AEC: Army Evaluation Center

OTA: Operational Test Agency

WDTC: West Desert Test Center

WSTC: White Sands Test Center

ATC: Aberdeen Test Center

DPG: Dugway Proving Ground

EPG: Electronic Proving Ground

RTC: Redstone Test Center

WSMR: White Sands Missile Range

YPG: Yuma Proving Ground

BMD OTA: Ballistic Missile Defense Operational Test Agency

CRTC: Cold Regions Test Center

TRTC: Tropic Regions Test Center

YTC: Yuma Test Center

IEWTD: Intelligence Electronic Warfare Test Directorate

CG/Staff

• BMD OTA (Redstone)

• AEC

• ATC

• RTC

• YPG (CRTC, TRTC, YTC)

• DPG (WDTC)

• WSMR (WSTC)

• EPG (IEWTD)

• OTC (*O/O IEWTD, OPCON)

Chart legend: General Officer; Senior Executive Service.

*On order, IEWTD is OPCON to OTC for operational test missions.

Page 6: ITEA SoS Conference ATEC

Changes to Test and Evaluation Policies

Stemming from DODI 5000.02 and affecting future Army programs and 2025 execution.

Page 7: ITEA SoS Conference ATEC

ATEC: Lead OTA and Lead DT&E Organization

DODD 5000.01: "Each Military Department shall establish an independent OTA reporting directly to the Service Chief to plan and conduct operational tests, report results, and provide evaluations of effectiveness and suitability."

DODI 5000.02 and Title 10 (duties of the lead DT&E organization):

(A) providing technical expertise on testing and evaluation issues to the chief developmental tester for the program;

(B) conducting developmental testing and evaluation activities for the program, as directed by the chief developmental tester; and

(C) assisting the chief developmental tester in providing oversight of contractors under the program and in reaching technically informed, objective judgments about contractor developmental test and evaluation results under the program.

ATEC has two roles:

• Lead OTA: independent from the PM, supporting decision makers.

• Lead DT&E organization: supporting materiel development and maturation.

Page 8: ITEA SoS Conference ATEC

DODI 5000.02 Changes Impacting ATEC

Change in DODI 5000.02 → Change to ATEC Process/Product

• Shift Left: emphasis on Developmental Test and Evaluation (DT&E) → System Evaluation Plans and Test & Evaluation Master Plans (TEMP) will include both a Developmental Evaluation Framework (DEF) and an Operational Evaluation Framework (OEF).

• TEMP required at Milestone A → TEMP updated at each acquisition milestone; the Early Strategy Review / Concept In-Process Review is replaced with the Test & Evaluation Strategy Review (TESR); TEMP inputs approved at each milestone (TESR A, TESR B, etc.).

• Lead DT&E organization reports to the Chief Developmental Tester at the PM → The Continuous Evaluation publication will be used as the reporting mechanism for DT&E.

• Operational Test Agency assessment of Concept of Operations T&E implications → New ATEC input into the TEMP at Milestone A.

• T&E reports, supporting data, and metadata to DTIC → New requirement to provide supporting data and metadata in addition to the T&E reports (a sketch of such a record follows this table).

• Addition of a Development RFP Release milestone decision for Engineering and Manufacturing Development (EMD), and often for Low-Rate Initial Production (LRIP) → Will drive additional and more frequent input to the PM for acquisition strategy, test schedule, cost estimates, etc. (TESR, SEP-U, and the CE publication will help).
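The DTIC change above is essentially a data-packaging requirement: each T&E report now travels with its supporting data and descriptive metadata. Below is a minimal sketch of what such a submission record might look like as a machine-readable structure; every field name and value is invented for illustration and is not an official DTIC schema.

```python
import json

# Hypothetical metadata record accompanying a T&E report submission.
# Field names and values are illustrative only, not a DTIC schema.
submission = {
    "report_title": "System XX Developmental Test Report",
    "program": "System XX",
    "milestone_supported": "Milestone B",
    "test_events": ["DT Event 1", "DT Event 2"],
    "data_files": [
        {"name": "reliability_runs.csv", "format": "CSV",
         "classification": "UNCLASSIFIED"},
        {"name": "telemetry.hdf5", "format": "HDF5",
         "classification": "UNCLASSIFIED"},
    ],
    "collection_dates": ["2015-01-12", "2015-02-20"],
    "instrumentation": "range radar, onboard telemetry",
}

# Serialize for submission alongside the report itself.
print(json.dumps(submission, indent=2))
```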

Page 9: ITEA SoS Conference ATEC


Impacts to ATEC:

• DEF required for ACAT I programs and recommended for all others.

• AEC provides major input to the DEF.

• Separate table/section in the TEMP.

• SEP format change required to maximize use of the DEF tool.

• Advantage of providing the TCs clearer DT guidance.

• DEF working efforts included for WIN-T Inc 3 and AIAMD.

Page 10: ITEA SoS Conference ATEC

Developmental Evaluation Framework

The Test and Evaluation Master Plan (TEMP) includes a Developmental Evaluation Framework (a "T&E Roadmap").

"Knowledge gained from testing provides information for technical, programmatic, and acquisition decisions." (DoDI 5000.02, Interim)

The Developmental Evaluation Framework:

– Identifies key data that contribute to assessing progress on:

− Key Performance Parameters

− Critical Technical Parameters

− Key System Attributes

− Interoperability requirements

− Cybersecurity requirements

− Reliability growth

− Maintainability attributes

− Developmental test objectives

− Others as needed

– Shows the correlation/mapping between test events, key resources, and the decision supported (see the sketch below).
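One way to read the DEF is as a crosswalk: each evaluation parameter maps to the test events that produce its data, the key resources those events need, and the decision the data supports. The sketch below encodes that reading as a simple table that can be queried by decision point; all entries are invented for illustration.

```python
# Minimal sketch of a DEF as a crosswalk from evaluation parameters to
# test events, key resources, and the decision supported. All entries
# are invented for illustration.
def_matrix = [
    {"parameter": "KPP: Range (m)",
     "test_events": ["Contractor DT-1", "Government DT-2"],
     "key_resources": ["instrumented range"],
     "decision_supported": "Milestone B"},
    {"parameter": "Reliability growth (MRBEFF)",
     "test_events": ["Reliability growth test"],
     "key_resources": ["4 test articles"],
     "decision_supported": "Dev RFP Release"},
    {"parameter": "Cybersecurity requirements",
     "test_events": ["Cooperative vulnerability assessment"],
     "key_resources": ["blue team"],
     "decision_supported": "Milestone C"},
]

# Which parameters and test events feed a given decision?
for row in def_matrix:
    if row["decision_supported"] == "Milestone B":
        print(row["parameter"], "<-", ", ".join(row["test_events"]))
```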

Page 11: ITEA SoS Conference ATEC

Increased T&E Activity at Early Stages in the

Life Cycle Management System

Acquisition life cycle (DoDI 5000.02): Materiel Development Decision; Milestone A; CDD Validation; Dev RFP Release; Milestone B; Milestone C / Low-Rate Initial Production (LRIP); Initial Operational Capability (IOC); Full-Rate Production (FRP) Decision; Full Operational Capability (FOC).

Phases: Materiel Solution Analysis; Technology Maturation & Risk Reduction; Engineering & Manufacturing Development; Production and Deployment (including OT&E); Sustainment; Disposal.

T&E activity across the life cycle:

• Capability Based Assessment (CBA) and Initial Capability Document up front, supported by AEC requirements / operational concept analysis and CoE experiment support.

• T&E WIPT convenes early; TEMP at Milestone A, with updates (TEMP-U) at each subsequent milestone.

• Developmental test planning/execution & evaluation, then operational test planning & evaluation (EOA, OA, OA).

• Continuous data collection throughout.

• DEF and OEF maintained across the cycle; TESR A, TESR B, TESR C, and TESR FRP at the decision points; SEP, then SEP-U updates.

EOA = Early Operational Assessment

OA = Operational Assessment

T&E WIPT = Test & Evaluation Working Integrated Product Team

CoE = Center of Excellence

TEMP/U = Test & Evaluation Master Plan / Update

[Chart: T&E effort (man-years, 0–12) by program month (1–50), pre-Nov 13 policy vs. current; deltas by phase: +370%, +20%, +11%, +6%, +1%.]

Page 12: ITEA SoS Conference ATEC

ATEC: Reducing Risk in 2025

through Test & Evaluation

Optimizing T&E Data

• Obtaining and Retaining Information

• Rapid Analysis

• Greater Breadth of Analysis

• Advanced Cybersecurity T&E


Gartner's 2015 Trends, #4 – Advanced, Pervasive, and Invisible Analytics.

Page 13: ITEA SoS Conference ATEC

Effectiveness, Suitability & Survivability (ESS) Assessment and Reporting Policy

Overall two-tier effectiveness, suitability, and survivability rating:

[Figure: Key Parameter Summary for XX System. Continuous scales compare the legacy system, the assessed XX System value, and the requirement for each parameter, grouped under Effectiveness and Suitability: KPP 1 Range (m); KPP 2 Accuracy (m); KPP 3 Operational Availability (%); KSA 1 Lethality (see classified report); KSA 2 Reliability (MRBEFF, hrs, 80% LCB); KSA 3 Fire Mission Time (sec). Range requirement is "greater than YY system."]

Builds upon AoA analysis conducted by AMSAA & TRAC.

The intent is to show decision makers how key system parameters compare to the requirement and to legacy systems. Continuous scales convey greater information than a binary met / not met.

Scales need to be adjusted so that less capability is always on the left and greater capability on the right, and 0 should almost always be included on each scale to limit the likelihood of "lying with statistics" (see the sketch below).

Depending on the program, assessed numbers should be stated with confidence (e.g., an 80% LCB) as much as possible, for both legacy and new systems.
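The scale rules above can be made mechanical. A minimal sketch, assuming each parameter sits on a zero-anchored scale and that lower-is-better parameters are flipped so more capability always plots to the right; the numbers echo the example chart but are illustrative.

```python
# Place a value on a 0-to-1 capability axis so that more capability is
# always to the right, per the slide's scale rules. Scales are anchored
# at 0; lower-is-better parameters (miss distance, fire mission time)
# are flipped. All values are illustrative.
def capability_position(value, scale_max, lower_is_better=False):
    frac = value / scale_max  # position on the zero-anchored scale
    return 1.0 - frac if lower_is_better else frac

print(capability_position(7390, 8500))                     # range (m), higher is better
print(capability_position(15, 20, lower_is_better=True))   # accuracy (m), lower is better
print(capability_position(60, 180, lower_is_better=True))  # fire mission time (sec)
```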

Page 14: ITEA SoS Conference ATEC

Operational Impact Rating Provides

Further Analysis of Test Result Impact

This is a "reasonableness" evaluation. For example:

• Minimum speed: test result = 53 mph; requirement = 55 mph.

• Requirement rating: could be yellow or red, based on the full context of the evaluation.

• Operational impact: could be green, based on the system CONOPS, current system performance, etc.

Rating: Operational Impact Rating Definition

• Similar or Enhanced Capability: The system evaluation finding indicates that the operational capability is similar to current capabilities, provides an improved capability, or provides a new capability, relative to the operational task.

• Reduced Capability: The system evaluation finding indicates that the system may result in decreased mission capability, relative to the operational task.

• Significantly Degraded Capability: The system evaluation finding indicates that the system may have a significant, negative impact on mission capability, relative to the operational task.

• Unknown: The system impact on mission operations is not known and cannot be determined from the information and data available.
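To make the separation concrete, the sketch below scores the same test result twice, once against the written requirement and once against the mission need, using the slide's 53 vs. 55 mph example. The 5% tolerance and the CONOPS threshold are invented; in practice both ratings are evaluator judgments, not formulas.

```python
# Requirement rating and operational impact rating are separate
# judgments of the same result. The thresholds here are invented to
# illustrate the slide's 53 vs. 55 mph example; real ratings come from
# evaluators weighing full context, not from a formula.
def requirement_rating(result, requirement):
    if result >= requirement:
        return "green"
    return "yellow" if result >= 0.95 * requirement else "red"

def operational_impact(result, conops_need):
    # "Reasonableness" check: does the shortfall matter to the mission?
    if result >= conops_need:
        return "Similar or Enhanced Capability"
    return "Reduced Capability"

print(requirement_rating(53, 55))  # yellow: narrowly misses the spec
print(operational_impact(53, 50))  # mission only needs 50 mph
```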

Page 15: ITEA SoS Conference ATEC


Summary of Cyber Security Risk Metrics and Mapping to AEC Documents

(Columns: LR; DHS capability; entities to examine; condition exploited by attacker; input to operational metric.)

LR-1 HWAM. Entities: subnets and removable storage. Condition exploited: unauthorized devices. Metric input: authorized devices on each subnet and authorized removable storage on devices; list of first- and last-seen times of unauthorized devices; attachment logs for removable storage.

LR-2 SWAM. Entities: software on devices. Condition exploited: unauthorized software. Metric input: un/authorized software on each device; list of first- and last-seen times for each unauthorized software package on each device; anti-malware findings.

LR-3 VULN. Entities: vulnerabilities on devices. Condition exploited: known vulnerabilities. Metric input: vulnerabilities for which we need to scan; list of first- and last-seen times and score for each vulnerability on each device.

LR-4 CSM. Entities: configurations on devices. Condition exploited: misconfigurations. Metric input: tests (including expected results) on each device; list of first- and last-seen times and score for each failed test on each device.

LR-5 TRUST. Entities: role assignments to users. Condition exploited: roles assigned to an untrustworthy user. Metric input: role assignments to users providing access to assets; asset theft model; vetting history of users.

LR-6 CRED. Entities: role assignments to accounts. Condition exploited: roles assigned to compromised accounts. Metric input: role assignments to accounts providing access to assets; asset theft model; authentication policy for accounts.

LR-7 PRIV. Entities: permissions and role assignments to accounts. Condition exploited: permissions assigned to accounts and risk-incurring roles. Metric input: permissions that should be assigned to an account for each role assigned to it; permissions actually assigned to each account; risky roles for each account.

LR-8 BEHAVE. Entities: user education. Condition exploited: users lacking training or education. Metric input: required education for each role a user holds; education history for each user relative to assigned roles.

LR-9 BOUND-N. Entities: network topology and filtering. Condition exploited: overly permissive reachability. Metric input: network topology and filtering, plus an exploitation model for protocols.

LR-10 BOUND-P. Entities: access to physical spaces, by user and mechanism. Condition exploited: physical access to cyber assets. Metric input: mapping of device resources from LR-5, LR-6, and LR-7; mapping of devices to physical areas; account usage per device; access control policies and assignments for physical areas.

LR-11 BOUND-V. Entities: encryption of resources on devices. Condition exploited: lack of encryption enabling resource theft. Metric input: the inputs of LR-10 together with the encryption status of resources and devices.
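Several rows above (HWAM, SWAM, VULN, CSM) feed the operational metric the same way: first-seen/last-seen records become exposure windows that can be scored. A minimal sketch under that reading; the record layout and the risk weighting are invented for illustration.

```python
from datetime import datetime

# First-seen / last-seen records, as in the HWAM/SWAM/VULN/CSM rows.
# Layout and severity scores are invented for illustration.
findings = [
    {"device": "host-01", "finding": "unauthorized removable storage",
     "first_seen": "2015-03-01T08:00", "last_seen": "2015-03-01T17:30",
     "score": 3},
    {"device": "host-07", "finding": "known vulnerability, unpatched",
     "first_seen": "2015-02-10T00:00", "last_seen": "2015-03-02T00:00",
     "score": 8},
]

FMT = "%Y-%m-%dT%H:%M"

def exposure_hours(rec):
    """Hours between first and last observation of the finding."""
    first = datetime.strptime(rec["first_seen"], FMT)
    last = datetime.strptime(rec["last_seen"], FMT)
    return (last - first).total_seconds() / 3600

# Risk-weighted exposure: hours open times severity score.
for rec in findings:
    print(rec["device"], round(exposure_hours(rec) * rec["score"], 1))
```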

Page 16: ITEA SoS Conference ATEC

2025 T&E and Big Data

Goals:

• Utilize knowledge, information, and data to achieve core mission and business objectives.

• Faster, more accurate decision making.

• Cost optimization.

• Quicker responses to requests for information.

• More holistic test and evaluation.

• Automated tracking of items and status.

• Make useful big data capabilities available to everyone, but tailored to specific needs.

Common core requirements:

• Sustainment of data for long-term use (archival).

• Discoverability of and access to data.

• Analytics of historical and current information.

• Deriving context to inform decision making.

2025 T&E enablers: leveraging historical data; faster, more sophisticated analytical tools; modeling & simulation; design of experiments; cloud computing (a small design-of-experiments sketch follows).
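Of those enablers, Design of Experiments is the easiest to show concretely: a factorial design enumerates the test matrix over the factors an event varies. A minimal full-factorial sketch with invented factors; real T&E designs would typically use fractional or optimal designs to keep run counts affordable.

```python
from itertools import product

# Full-factorial test matrix over invented factors a test event might
# vary: 3 x 2 x 2 = 12 runs.
factors = {
    "terrain": ["desert", "tropic", "arctic"],
    "payload": ["light", "heavy"],
    "emcon": ["radios on", "radios off"],
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs), "runs")
for run in runs[:3]:  # first few runs
    print(run)
```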

Page 17: ITEA SoS Conference ATEC

Conclusion

• Emerging DoD policies require changes in test and evaluation planning.

• Shift Left surfaces major risks earlier and increases the likelihood of successful program execution.

• Two-tier (ESS) evaluation criteria provide better insight to decision makers.

• Big data will be leveraged to the greatest extent possible to provide content-based analytic and design insights.