

1

CAOC Performance Assessment System “CPAS”

Information Age Metrics Working Group
26 October 2005

JOHNS HOPKINS UNIVERSITY

Applied Physics Laboratory

Precision Engagement - Operational Command & Control

Technical Manager: Glenn Conrad – (240) 228-5031 – [email protected]
Project Manager: F.T. Case – (240) 228-8740 – [email protected]


2

Background

The CAOC Performance Assessment System (CPAS) project evolved from DeCAIF, an FY05 C2ISRC-sponsored project.

505th OS leadership identified the need for an “ACMI-like” capability for the CAOC-N instructor cadre. The system should be capable of:
• Tracking targets through the F2T2EA “kill chain”
• Highlighting procedural errors that occur during execution
• Automatically generating the CAOC-N’s debriefing “shot sheet”

C2 Battle Lab leadership approved APL’s proposed CPAS project as an initiative in Jan 05.
• APL was tasked to develop and demonstrate a prototype CAOC “instrumentation” capability that would support TST operations at the CAOC-N and the TBONE initiative.

APL worked with CAOC-N subject matter experts (SMEs) on the spiral development of CPAS from Feb 05 to Aug 05 to establish the required information items and the look and feel of how CPAS would display them.

APL engineers demonstrated the functional CPAS X-1 prototype at CAOC-N during RED FLAG 05-4.2 (mid-Aug 05) and the X-2 prototype at the C2 Transformation Center during TBONE LOE#5 (JEFX 06 Spiral 0) in Sep 05.


3


CAOC PAS: Demonstrates a prototype “ACMI-like” instrumentation capability for the Falconer instructor cadre. Enables near-real-time, end-to-end performance assessment and reconstruction of TST operations.

CAOC PERFORMANCE ASSESSMENT SYSTEM (CPAS)

CAOC PAS provides a means to collect and store time-stamped information in near real-time during TST operations.

Develops and refines filters and algorithms that provide key information items required for Falconer crewmember situational awareness and performance assessment

Provides an end-user-defined, near-real-time presentation of time-lined activity information to individuals and groups during post-mission operations

[TST Cell organization chart (roles shown): TST Cell Chief; Dep TST Chief; Dep SIDO; C2 Duty Officer; Surf Track Coord; SOLE FIRES N/S and W; Attack Coord N, S, W; Targeteer N, S, W (incl. UK Targeteer W); Rerole Coord N, S, W; DCO; RPTS BDA Targeteer; DTRA; US, UK, and RAAF SOF and Coalition SOF; Navy, Marine, UK, and AF liaisons]


4

Project Objective

Demonstrate a capability to reconstruct and display events and decisions that occur within a TST cell to support post-mission assessment.

GOALS
• Develop a “non-intrusive” data capture (collection and archiving) capability for TST-related information at CAOC-N
• Develop a reconstruction capability that will portray event and decision activities that occurred during the execution of TST ops
• Develop an information injection capability that enables an instructor or assessor to input comments/observations into the data archive
• Implement the CAOC Assessment System capability at CAOC-N and demonstrate it during Red Flag 05-4.2 (Sep 2005)
• Develop initial capability to work with the C2B TBONE initiative and CAOC processes using TBONE


5

CPAS Operational Architecture for CAOC-N

[Operational architecture diagram elements: Nellis Range operations; CAOC-N TST Cell (same organization chart as above); CPAS DR (data repository); CAOC-N network; O-O-D-A; CPAS Algorithms; CPAS Presentation Display; Group After Action Review (AAR); Individual Presentation; Instructor Injection Capability]


6

CPAS Prototype Software

Developed in the Microsoft Visual Studio IDE using the Microsoft .NET Framework

• ~20K lines of C# and C++ code; runtime executable ~2 MB

• Built on top of MS Access (DBMS)

• 3 logical constructs of software: collectors, algorithms, user interface (illustrated in the sketch after the diagram below)
Layered software architecture for both CAOC-N and TBONE

[Layered architecture diagrams:
CAOC-N (X-1): IMTDS/Link 16 Collector and ADOCS Collector (fed from the ADOCS Server over the CAOC-N LAN); Data Layer (database); CPAS Algorithms; Presentation Layer.
TBONE LOE #5 (X-2): Jabber Chat Collector (Jabber Server) and WEEMC Collector (exported WEEMC Msn History File); Data Layer (database); CPAS Algorithms; Presentation Layer.]
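Note: the three logical constructs above (collectors, algorithms, and a presentation layer over a data layer) can be illustrated with a minimal C# sketch. All type and member names below are hypothetical and chosen for illustration only; the briefing does not show the actual CPAS code or API, and the MS Access data layer is stood in for by an in-memory list.

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical illustration of the collector / algorithm / data-layer split; not the actual CPAS classes.
public record CpasEvent(DateTime TimeStamp, string Source, string TargetId, string Milestone);

// A collector pulls time-stamped events from one source (e.g., ADOCS, Link 16, Jabber chat).
public interface ICollector
{
    IEnumerable<CpasEvent> Collect();
}

// The data layer archives collected events (the prototype sat on MS Access; an in-memory list stands in here).
public class DataLayer
{
    private readonly List<CpasEvent> _archive = new();
    public void Store(IEnumerable<CpasEvent> events) => _archive.AddRange(events);
    public IEnumerable<CpasEvent> EventsForTarget(string targetId) =>
        _archive.Where(e => e.TargetId == targetId).OrderBy(e => e.TimeStamp);
}

// The algorithm layer turns archived events into items for the presentation (user interface) layer.
public static class CpasAlgorithms
{
    // Latest recorded milestone for a target, e.g., for a status display.
    public static string? LatestMilestone(IEnumerable<CpasEvent> targetEvents) =>
        targetEvents.OrderBy(e => e.TimeStamp).LastOrDefault()?.Milestone;
}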


7

CPAS Functionality
• Creates and Exports Summary of Target Activity
• “Drill-down” to F2T2EA Details for any Target
• Summary of Procedural Errors
• ADOCS “Status Light” Slide Bar
• ADOCS “Remarks” Search Engine
• Maintains Status Monitor of Targets in Near Real-Time (see the sketch following this list)
• Calculates TST measures and metrics of interest
• Elementary search engine for IM chat dialogue
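As a concrete illustration of the near-real-time status monitor, the hypothetical C# sketch below reduces each target's archived milestones to its current F2T2EA stage. The stage names follow the F2T2EA kill chain (Find, Fix, Track, Target, Engage, Assess); the types and method are assumptions for illustration, not the CPAS implementation.

using System;
using System.Collections.Generic;
using System.Linq;

// F2T2EA kill-chain stages in execution order.
public enum KillChainStage { Find, Fix, Track, Target, Engage, Assess }

public record StageEvent(string TargetId, KillChainStage Stage, DateTime TimeStamp);

public static class StatusMonitor
{
    // Current stage of each target = the most recently recorded stage event for it.
    public static IDictionary<string, KillChainStage> CurrentStatus(IEnumerable<StageEvent> events) =>
        events.GroupBy(e => e.TargetId)
              .ToDictionary(g => g.Key,
                            g => g.OrderBy(e => e.TimeStamp).Last().Stage);
}

A display refreshed from the data archive every few seconds would provide the near-real-time target status the slide refers to.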


8

CPAS X-1 Demo

[Demo diagram: ADOCS Client, ADOCS Server, CPAS]


9

Next Steps in CPAS Evolution

APL Focus

• Develop additional “collectors” and “correlation/fusion algorithms” for integration into CPAS X-1 and/or X-2 framework(s)

CAOC-N Focus

• Employ CPAS X-1 at CAOC-N during AWC-sponsored exercises and training events
• Develop and deliver CPAS V-1.0 capability to CAOC-N and evolve V-1 as required

AFRL/HE Focus

• Incorporate the CPAS X-1 “data collection instrument” with AFRL/HE’s (Mesa) subjective evaluation capability (Performance Measurement System) to provide an enhanced AOC Performance Measurement System

AFEO Focus

• Evolve and employ CPAS X-2 during JEFX Spiral 2/3/MainEx in support of AFEO assessment team

• Develop and deliver CPAS V-2.0 capability for use by AFEO assessment team during JEFX 2006 MainEx

• Work with JFIIT to integrate CPAS V-2.0 with WAM to provide an enhanced data collection and presentation capability for use during JEFX 2006 MainEx


BACKUP SLIDES


11

Why a CAOC Assessment System?

The Problem: There is no efficient means to collect, store, correlate, and retrieve key information items needed to reconstruct a time history of activities and events that occur during real-time CAOC operations to answer the question “What happened?”

The Need: (C2-C-N-06-05 & C2-A-N-02-03)

• A means to capture and archive time-stamped information from multiple CAOC operational sources,

• Algorithms to support correlation of events, and

• A means to visually portray both activities performed and decisions made for both operators and instructors alike.

The Payoff:

• Enable experts to assess what happened during training and operational sessions,

• Provide the information required to correct process shortfalls, identify training needs, and highlight the things that are being done right

• Support archiving for future after-action review (AAR)


12

CPAS Spiral Development

[Spiral development timeline, Feb 05 – Sep 05. Spirals: KA with CAOC-N instructor cadre; KA on presentation prototype; refine presentation prototype; CAOC-N integration test #2 (SP2A); CAOC-N functional demo; TBONE interface research; TBONE functional demo. Associated products: prototype screens; updated screens, prototype DR; updated DR, refined presentation, initial algorithms; completed DR, refined presentation; robust DR, final presentation, procedural algorithms; CAOC-N DR, CAOC-N algorithms, CAOC-N presentation, WEEMC data]


13

TST Process @ CAOC-N

1. White Force (instructors) provides initial input to the TST cell on a potential target
2. Potential target info is entered into the ADOCS TDN manager and Intel develops the target
3. Target is classified as a TST and entered into the ADOCS TDL list
4. Targeting data is developed for the target folder by Intel and Targeting
5. Attack Coordinator searches the ATO for available weapon platforms to be tasked to engage the TST
6. Nominations are provided to the TST cell chief, who approves the pairing
7. Positive target ID is accomplished by the Intel cell and the SODO approves the tasking order
8. Tasking order (15-line text msg) is drafted by the C2DO & GTC and transmitted via Link-16 or voice to the controller (AWACS)
9. AWACS acknowledges receipt and passes the info to the weapon platform, which accepts or rejects the tasking
10. Acknowledgment is provided to the C2DO with an estimated TOT from the weapon platform
11. Target is attacked
12. Weapon platform (fighter) provides a MISREP to AWACS with actual TOT and BDA (if available)
13. Target kill chain is complete

[TST process diagram (roles shown): White Force; TST Cell organization chart (as on slide 3); TST Ch; C2DO/GTC; SIDO; SODO; SADO; IDO; ISR SCIF; SIGINT; MASINT; IMINT; ACF Analysis; Coord Fusion; CCO; Target Duty Officer (TDO); ATK COORD; Wpn’eer/Tgt’eer; SOLE; NALE; MARLO; BCD]
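The procedural-error summary mentioned earlier can be thought of as an ordering check against this process: if a target's recorded milestones appear out of sequence, an error is flagged for the debrief. The C# sketch below illustrates the idea with a simplified, hypothetical milestone list; it is not the actual CPAS rule set.

using System;
using System.Collections.Generic;

public static class ProceduralChecks
{
    // Simplified, hypothetical milestone order distilled from the TST process steps above.
    private static readonly string[] ExpectedOrder =
    {
        "TargetNominated", "ClassifiedTST", "WeaponPaired", "TaskingApproved",
        "TaskingTransmitted", "TaskingAcknowledged", "TargetAttacked", "MisrepReceived"
    };

    // Flags any milestone whose recorded time precedes a milestone that should have come before it.
    public static IEnumerable<string> OutOfOrderMilestones(IReadOnlyDictionary<string, DateTime> recordedTimes)
    {
        DateTime lastSeen = DateTime.MinValue;
        foreach (var milestone in ExpectedOrder)
        {
            if (!recordedTimes.TryGetValue(milestone, out var t))
                continue;                   // milestone not (yet) recorded for this target
            if (t < lastSeen)
                yield return milestone;     // recorded earlier than a preceding step
            else
                lastSeen = t;
        }
    }
}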


14

Reactive Targeting Coordination Card


15

CPAS Near Real Time Status Display

[Screenshots: ADOCS display and CPAS near-real-time status display]


16

Generates Summary of Process Errors


17

“Drill-down” to Kill-Chain Details for any Target


18

Create “Shot Sheet” Summary of Target Activity and Export to Excel Spreadsheet
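A minimal sketch of the export step, assuming a simple comma-separated file that Excel opens directly; the briefing does not specify the actual CPAS export format or code, and the row fields below are illustrative.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public record ShotSheetRow(string TargetId, int Priority, DateTime Nominated, DateTime? Attacked, string Result);

public static class ShotSheetExport
{
    // Writes the shot-sheet summary as CSV (fields assumed not to contain commas).
    public static void WriteCsv(IEnumerable<ShotSheetRow> rows, string path)
    {
        var lines = new List<string> { "TargetId,Priority,Nominated,Attacked,Result" };
        lines.AddRange(rows.Select(r =>
            $"{r.TargetId},{r.Priority},{r.Nominated:HH:mm:ss},{r.Attacked?.ToString("HH:mm:ss")},{r.Result}"));
        File.WriteAllLines(path, lines);
    }
}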


19

IM Chat Correlation and Search

Correlates milestone events with IM chat dialogue
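One plausible approach for this correlation is a time-window match: pair each milestone with the chat messages logged close to it. The C# sketch below uses a caller-supplied window; the types, the window, and the substring search are assumptions for illustration, not the CPAS implementation.

using System;
using System.Collections.Generic;
using System.Linq;

public record ChatMessage(DateTime TimeStamp, string Room, string Sender, string Text);
public record Milestone(DateTime TimeStamp, string TargetId, string Name);

public static class ChatCorrelation
{
    // For each milestone, returns the chat messages logged within the given window around it.
    public static IEnumerable<(Milestone Milestone, List<ChatMessage> Chat)> Correlate(
        IEnumerable<Milestone> milestones, IReadOnlyList<ChatMessage> chat, TimeSpan window)
    {
        foreach (var m in milestones)
        {
            var nearby = chat.Where(c => (c.TimeStamp - m.TimeStamp).Duration() <= window)
                             .OrderBy(c => c.TimeStamp)
                             .ToList();
            yield return (m, nearby);
        }
    }

    // Elementary keyword search over the chat log (case-insensitive substring match).
    public static IEnumerable<ChatMessage> Search(IEnumerable<ChatMessage> chat, string keyword) =>
        chat.Where(c => c.Text.Contains(keyword, StringComparison.OrdinalIgnoreCase));
}

Calling Correlate(milestones, chatLog, TimeSpan.FromMinutes(2)) would support the drill-down from a kill-chain event to the chat traffic around it.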


20

Chat Log Search and Catalogue


21

Measures & Metrics

These were measures of interest for CAOC-N instructors at RED FLAG 05-4.2:
• Number of targets prosecuted, listed by their priority number

• Total number of targets prosecuted

• Average time of target prosecution

• Average time a target stayed in each kill chain segment
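For the two time-based measures above, a minimal C# sketch under the assumption that per-target milestone timestamps are available from the data archive (names and types are illustrative, not the CPAS code):

using System;
using System.Collections.Generic;
using System.Linq;

public static class TstMetrics
{
    // Average target prosecution time: mean of (last milestone - first milestone) across targets.
    public static TimeSpan AverageProsecutionTime(IEnumerable<IReadOnlyList<DateTime>> perTargetMilestones)
    {
        var durations = perTargetMilestones
            .Where(m => m.Count >= 2)
            .Select(m => (m.Max() - m.Min()).TotalSeconds)
            .ToList();
        return durations.Count == 0 ? TimeSpan.Zero : TimeSpan.FromSeconds(durations.Average());
    }

    // Average time spent in one kill-chain segment, given each target's entry and exit times for that segment.
    public static TimeSpan AverageSegmentTime(IEnumerable<(DateTime Entered, DateTime Exited)> segmentSpans)
    {
        var seconds = segmentSpans.Select(s => (s.Exited - s.Entered).TotalSeconds).ToList();
        return seconds.Count == 0 ? TimeSpan.Zero : TimeSpan.FromSeconds(seconds.Average());
    }
}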


22

TST Measures of Interest*

CPAS X-1 can calculate 12 of these 35 measures of interest (about 34%)

● Green – can do now (12)

● Yellow – could do (7)

● White – harder to do - need additional collectors (15)

● Red – can’t do (1)

*e-mail correspondence - Bill Euker to F.T. Case - Friday, September 16, 2005 4:33 PM

TST Measures of Interest
1. Detected entities correctly identified (%)
2. Detected entities incorrectly identified (%)
3. Detected entities reported without sufficient ID (%)
4. Source of detection and ID (entity name or node)
5. Source of ID error (entity name or node)
6. Detection to ID (elapsed time)
7. ID to target nominated (elapsed time)
8. Target nominated to target approved (elapsed time)
9. Target approved to weapon tasked (elapsed time)
10. Positional accuracy of reported target coordinates (TLE)
11. ISR-reported targets on Track Data Nominator (quantity)
12. ISR-reported targets on ITM Dynamic Target List (quantity)
13. Completeness of ISR-reported targets (%)
14. Targets tasked to shooters (quantity)
15. Targets tasked to shooters via data link (quantity)
16. Targets tasked to shooters via voice (quantity)
17. Targets tasked to shooters via both data link and voice (quantity)
18. Target location provided verbally to shooters via coordinates or bullseye (quantity)
19. Target location provided verbally to shooters via talk-on from geographical features (quantity)
20. Target location provided via digital cueing (e.g., track number and SPOI) (quantity)
21. Targets successfully attacked (quantity)
22. Targets unsuccessfully attacked (quantity)
23. Targets assigned (quantity)
24. Time from target tasked to target attacked (elapsed time)
25. Targets attacked (quantity)
26. Targets not attacked (quantity)
27. Time from target detected to target successfully attacked (elapsed time)
28. Unsuccessful attacks: attacked wrong enemy target (quantity)
29. Unsuccessful attacks: attacked friendly (quantity)
30. Unsuccessful attacks: other (quantity and characterization)
31. Cause of unsuccessful attacks: C2 error (quantity)
32. Cause of unsuccessful attacks: aircrew error (quantity)
33. Cause of unsuccessful attacks: environmental (quantity)
34. Cause of unsuccessful attacks: other (quantity)
35. Timeliness of DT processing (time)?


23

CPAS Calculation of the TST MOI

[Slide graphic references TST MOI numbers 6, 8, 9, 14, 21, 22, 23, 24, 25, 26, and 35]


24

CPAS Data Collection Sources

* From AFC2TIG, JEFX 02 TIME CRITICAL TARGETING DATA MANAGEMENT AND ANALYSIS PLAN, July 2002, para 4.2, pg 11.

During JEFX 2002, the following systems in the TCT section, on the Combat Ops floor, and on the Nellis ranges had the capability to automatically record time-tagged information for post-processing: *

• ADSI. Records link data received at the ADSI. Data is written to JAZZ disks for replay on ADSI.

• Sheltered JTIDS System (SJS). Records link data received and transmitted at the SJS. Data will be translated into ASCII format and provided on a CD-ROM on a daily basis.

• Joint Services Workstation (JSWS). Records data received and displayed on the JSWS. Data is written to 8mm DAT tapes for replay on JSWS.

• Information Workspace (IWS).
• IWS voice will be recorded using VCRS connected to primary TCT nodes on the AOC floor.

• IWS text chat will be logged and downloaded at the end of each day.

• IWS peg-board activity will be manually captured using screen capture and stored for post-event analysis.

• Digital Collection Analysis and Review System (DCARS). DCARS will record all simulation traffic including ground game target locations.

• Tip OFF. A laptop loaded with Tip Off software will be connected to the TRS in the JICO cell and will log all incoming TIBS and TDDS traffic received at that node.

• GALE. The GALE system used by the Threat Analyst (SIGINT Tracker) will log all incoming SIGINT data and provide post event hard and soft copy detection and convolution data.

• WebTAS. WebTAS can be interfaced with the TBMCS SAA system and record all track information received and displayed on the SAA. Data is translated into ASCII format for inclusion into the master JEFX 02 TCT assessment database.


25

Future Focus Areas

• “Automate” scenario selection/tailored training
  – Scenarios designed to address specific individual and team objectives based on performance
  – Warehoused vignettes based on desired knowledge, skills, and experiences
• Make real-time interventions during a training event via coaching, tutoring, and scenario changes
• Part-Task Training capability for planning
• AOC Brief/Debrief System


26

AOC Performance Measurement

[Concept diagram elements: Training Objectives; Competencies, Knowledge and Skills; Tasks; Simulated Training Scenario; “Stimulated or Trained by...”; “Improve by X%”; “Put together into vignettes...”; Individual / Team / Team-of-Teams; Data Collection Instruments; Measurement Challenges; Training Development Challenges.
Measurement questions: Performance Measures by Task (How well did learners perform?); KSA Assessment (What KSAs do learners have/lack? Diagnose individuals’ needs for additional training); Success in Meeting Training Objectives (How well are training objectives met? Success of the training program)]


27

AOC Performance Measurement

• Research objective:

– Design and validate measurement methods, criteria, and tools

• Importance to the warfighter:

– Enables performance-based personnel readiness assessment – not “box-checking”

– Quantitative & Qualitative assessment of training effectiveness

• Status:

– Testing at CAOC-N (Jun 05)

– Approved Defense Technology Objective (HS.52)