CARE ASAS Validation Framework System Performance Metrics 10th October 2002 M F (Mike) Sharples


Page 1:

CARE ASAS Validation Framework

System Performance Metrics

10th October 2002

M F (Mike) Sharples

Page 2:

Content

• Aims

• Approach

• Analysis

Page 3:

Aims

• Using recognised metrics is fundamental to measuring system performance

• The ASAS Validation Framework requires consistent metrics to provide comparable results

• The ‘System Performance Metrics’ work demonstrates a method for identifying existing metrics for new scenarios

Page 4:

Approach

• Considerable existing work in this area

– PRS

– C/AFT

– TORCH

– INTEGRA

• Collating these required a consistent hierarchy & taxonomy

Page 5:

Hierarchy

OBJECTIVES → PERFORMANCE AREAS → METRICS

Page 6:

Hierarchy

• OBJECTIVES

– Tie in with ATM 2000+ Strategy

– High level & therefore no direct measure

• PERFORMANCE AREAS

– Tie in with PRC (as this gives greatest commonality)

– Lower level & therefore easier to measure

• METRICS

– The measurements that can be made
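The three-level hierarchy above can be sketched as a small data model. This is an illustrative Python sketch only (the deck's actual demonstrator was a Microsoft Access database); the class and field names are hypothetical:

```python
from dataclasses import dataclass, field

# Sketch of the hierarchy: OBJECTIVES (high level, not directly measurable)
# contain PERFORMANCE AREAS (lower level, easier to measure), which contain
# METRICS (the measurements that can actually be made).

@dataclass
class Metric:
    name: str  # e.g. "Actual vs planned fuel burn"

@dataclass
class PerformanceArea:
    name: str  # PRC-aligned area name, e.g. "Flight Efficiency"
    metrics: list = field(default_factory=list)

@dataclass
class Objective:
    name: str  # ATM 2000+ Strategy objective, e.g. "Economics"
    areas: list = field(default_factory=list)

economics = Objective("Economics", areas=[
    PerformanceArea("Flight Efficiency",
                    metrics=[Metric("Actual vs planned fuel burn")]),
])

# Walk the hierarchy: every metric is reached via an objective -> area path,
# so a high-level objective is only ever measured through its areas' metrics.
paths = [(o.name, a.name, m.name)
         for o in [economics] for a in o.areas for m in a.metrics]
print(paths)
```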

Page 7:

Taxonomy

• Terminology: METRICS in the CARE-ASAS Validation Framework correspond to INDICATORS in PRC, C/AFT and TORCH

• METRIC DEFINITION: a MEASURE, i.e. an event, ratio or unit that is quantifiable

Page 8:

Linkage (many to many)

• OBJECTIVES: Economics; Environment; Security / Defence

• PERFORMANCE AREAS: Delay (not capacity); Cost effectiveness; Flight Efficiency; Environment regulation; Military Co-operation; Military Access; Air transport security

• Objectives link many-to-many to Performance Areas, which in turn link to Metrics
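The many-to-many linkage can be sketched as a set of (objective, area) pairs. The pairings below are illustrative only, not the deck's actual linkage table:

```python
# Hypothetical sketch of the many-to-many linkage between Objectives and
# Performance Areas; one area may serve several objectives and vice versa.
links = {
    ("ECONOMICS", "Delay (not capacity)"),
    ("ECONOMICS", "Cost effectiveness"),
    ("ECONOMICS", "Flight Efficiency"),
    ("ENVIRONMENT", "Flight Efficiency"),      # shared area: many-to-many
    ("ENVIRONMENT", "Environment regulation"),
    ("SECURITY / DEFENCE", "Military Access"),
}

def areas_for(objective):
    """Performance Areas linked to a given Objective."""
    return sorted(a for (o, a) in links if o == objective)

def objectives_for(area):
    """Objectives served by a given Performance Area (may be several)."""
    return sorted(o for (o, a) in links if a == area)

print(areas_for("ECONOMICS"))
print(objectives_for("Flight Efficiency"))  # two objectives: many-to-many
```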

Page 9:

Further breakdown

• PERFORMANCE AREAS broken down into ASPECTS where appropriate

• Example: ACCESS (PERFORMANCE AREA) breaks down into the ASPECTS:

– Airports

– Sectors

– Routes

• Assists use in scenarios that examine specific airspace

Page 10:

Perspectives

• Different views (perspectives) can be applied to the selection of metrics:

– Airline perspective as in C/AFT

– ATM perspective as in PRC

– Validation technique

• Permits further breakdown and filtering beyond the purely hierarchical

Page 11:

Example of perspective

• Performance Area: Flight efficiency

– Airline perspective

• Actual fuel burn vs. planned fuel burn

– ATM perspective

• Efficiency of route structure
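The perspective example above amounts to a filter over the metrics catalogue. A minimal Python sketch, with illustrative records standing in for the database (the actual tool was a Microsoft Access prototype):

```python
# Illustrative metric records: the same Performance Area yields different
# metrics depending on the perspective applied to the selection.
METRICS = [
    {"area": "Flight efficiency", "perspective": "Airline",
     "name": "Actual fuel burn vs planned fuel burn"},
    {"area": "Flight efficiency", "perspective": "ATM",
     "name": "Efficiency of route structure"},
]

def select(area, perspective):
    """Return metric names matching both the area and the chosen perspective."""
    return [m["name"] for m in METRICS
            if m["area"] == area and m["perspective"] == perspective]

print(select("Flight efficiency", "Airline"))
print(select("Flight efficiency", "ATM"))
```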

Page 12:

Characteristics

• Further criteria for selecting metric suitability:

– Objectivity: Objective / Subjective

– Intrusiveness: High / Low

– Cost: High / Low

– Reliability: High / Low

– Validity: High / Low

– Utility: High / Low

– Expertise: High / Low

– Resource: High / Low
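These High/Low characteristics support a further filtering pass. A hypothetical sketch, with invented candidate metrics and ratings for illustration:

```python
# Illustrative candidates: each metric carries characteristic ratings, so a
# study can filter for, say, low-cost, high-reliability metrics.
CANDIDATES = [
    {"name": "Sector throughput", "cost": "Low",
     "reliability": "High", "intrusiveness": "Low"},
    {"name": "Controller workload rating", "cost": "High",
     "reliability": "Low", "intrusiveness": "High"},
]

def suitable(metrics, **required):
    """Keep metrics whose characteristics match every required rating."""
    return [m["name"] for m in metrics
            if all(m.get(k) == v for k, v in required.items())]

print(suitable(CANDIDATES, cost="Low", reliability="High"))
```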

Page 13:

Analysis

• To illustrate the feasibility of the approach, a ‘demonstrator’ database was created

• 230 System Performance Metrics stored in the database

• Derived from recognised sources

• Preliminary metric classification

• Perspectives available

– ATS provider / Operator / ASAS / Analysis Type (or any combination of these)

Page 14:

Metrics storage

Page 15:

Cross link queries

Page 16:

Flexible output

Page 17:

ASAS case studies

• Time based sequencing in approach

• Airborne self-separation in en-route airspace

Page 18:

Metrics selection criteria

• Time based sequencing in approach

– Selected Objectives: Safety; Capacity; Economics

– Selected Performance Areas: Safety; Delay; Cost Effectiveness; Flexibility; Flight Efficiency

– Methodology, each of:

• 1 Analytic or fast-time simulation

• 2 Real-time simulation

– Airspace: TMA / Airport

– Perspective: ASAS & each of:

• 1 Operator

• 2 Service provider
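The automated filtering step that applies such criteria can be sketched as a query over metric records. The records and field names below are illustrative stand-ins for the Access database described later, not its actual contents:

```python
# Hypothetical metric records, each tagged with the classification fields
# used by the selection criteria above.
DB = [
    {"name": "Arrival delay per flight", "objective": "Capacity",
     "area": "Delay", "methods": {"fast-time", "real-time"},
     "airspace": {"TMA", "Airport"},
     "perspectives": {"ASAS", "Operator"}},
    {"name": "Conflict rate", "objective": "Safety",
     "area": "Safety", "methods": {"fast-time"},
     "airspace": {"En-route"},
     "perspectives": {"ASAS", "Service provider"}},
]

def query(objectives, areas, method, airspace, perspective):
    """Return metrics matching every selection criterion at once."""
    return [m["name"] for m in DB
            if m["objective"] in objectives and m["area"] in areas
            and method in m["methods"] and airspace in m["airspace"]
            and perspective in m["perspectives"]]

# Time-based sequencing in approach, real-time simulation, Operator view:
print(query({"Safety", "Capacity", "Economics"},
            {"Safety", "Delay", "Cost Effectiveness", "Flexibility",
             "Flight Efficiency"},
            "real-time", "TMA", "Operator"))
```

As on the following slides, such a query only guides the choice; the resulting list is not definitive or restrictive.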

Page 19:

Metrics selection criteria

• Airborne self-separation in en-route airspace

– Selected Objectives: Safety; Capacity; Economics

– Selected Performance Areas: Safety; Delay; Cost Effectiveness; Predictability; Flexibility; Flight Efficiency; Equity

– Methodology, each of:

• 1 Analytic or fast-time simulation

• 2 Real-time simulation

– Airspace: En-route

– Perspective: ASAS & each of:

• 1 Operator

• 2 Service provider

Page 20:

Metrics selection

• A Microsoft Access prototype was developed to demonstrate the filtering and selection process

• The automated selection process provides guidance

– It identifies metrics used in previous work

– The list is not definitive or restrictive

• Once the automatic selection process is complete, a manual review can select the most appropriate metrics

Page 21:

Conclusions

• System performance metrics can be linked to the strategic objectives of ATM (and ASAS)

• The work has successfully consolidated metrics from a number of sources

• Effective filtering requires effective classification - this will necessarily be an ongoing and iterative process

• Selection process provides guidance - it is not definitive or restrictive
