
Contract No. TREN/04/FP6AE/SI2.374991/503192

Project funded by the European Commission, DG TREN
The Sixth Framework Programme - Strengthening the competitiveness

Contract No. TREN/04/FP6AE/SI2.374991/503192

Project Manager M. Röder

Deutsches Zentrum für Luft- und Raumfahrt, Lilienthalplatz 7, D-38108 Braunschweig, Germany

Phone: +49 (0) 531 295 3026, Fax: +49 (0) 531 295 2180 email: [email protected]

Web page: http://www.dlr.de/emma

© 2007, - All rights reserved - EMMA Project Partners. The reproduction, distribution and utilization of this document, as well as the communication of its contents to others without explicit authorization, is prohibited. This document and the information contained herein are the property of Deutsches Zentrum für Luft- und Raumfahrt and the EMMA project partners. Offenders will be held liable for the payment of damages. All rights reserved in the event of the grant of a patent, utility model or design. The results and findings described in this document have been elaborated under a contract awarded by the European Commission.

Verification and Validation Analysis Report

J. Teutsch et al.

NLR

Document No: D6.7.1
Version No. 1.0

Classification: Public
Number of pages: 211

EMMA

Verification and Validation Analysis Report

Save date: 2007-06-29 Public Page 2 File Name: D671_Analysis_V1.0.doc Version: 1.0

Distribution List

Web:
- Internet: http://www.dlr.de/emma (distributed 2007-06-29)
- Intranet: https://extsites.dlr.de/fl/emma (distributed 2007-06-29)

Contractors (No., Name, POC)1:
1 DLR Joern Jakobi
2 AENA Mario Parra
3 AIF Patrick Lelievre
4 AMS Giuliano D'Auria
5 ANS CR Miroslav Tykal
6 BAES Stephen Broatch
7 STAR Jens Olthoff
8 DSNA Nicolas Marcou
9 ENAV Antonio Nuzzo
10 NLR Juergen Teutsch
11 PAS Alan Gilbert
12 TATM Stephane Paul
13 THAV Alain Tabard
14 AHA David Gleave
15 AUEB Konstantinos G. Zografos
16 CSL Libor Kurzweil
17 DAV Rolf Schroeder
18 DFS Klaus-Ruediger Täglich
19 EEC Stephane Dubuisson
20 ERA Jan Hrabanek
21 ETG Thomas Wittig
22 MD Julia Payne
23 SICTA Salvatore Carotenuto
24 TUD Christoph Vernaleken

Sub-Contractors:
CSA Karel Muendel
N.N.

Customer:
EC Morten Jensen

Additional:
EUROCONTROL Paul Adamson

1 Please insert an X when the PoC of a company receives this document. Do not use the date of issue!

EMMA

Verification and Validation Analysis Report

Save date: 2007-06-29 Public Page 3 File Name: D671_Analysis_V1.0.doc Version: 1.0

Document Control Sheet

Project Manager: Michael Röder

Responsible Author: J. Teutsch et al. (NLR)

Additional Authors:
L.J.J. de Nijs (NLR)
J. Jakobi (DLR)
S. Loth (DLR)
S. Paul (TATM)
K.G. Zografos (AUEB)
S. Carotenuto (SICTA)

Subject / Title of Document: Verification and Validation Analysis Report

Related Task(s): WP6.7

Deliverable No. D6.7.1

Save Date of File: 2007-06-29

Document Version: 1.0

Reference / File Name: D671_Analysis_V1.0.doc

Number of Pages: 211

Dissemination Level: Public

Target Date: 29-Jun-2007

Change Control List (Change Log)

Date Release Changed Items/Chapters Comment

2006-04-06 0.01 ALL Initial draft (template)

2006-04-10 0.02 6 Template for requirement analysis

2006-04-28 0.03 2.2 TATM contribution (Toulouse results)

2006-05-04 0.04 2, 3, 4, Appendices DLR contribution (Prague results)

2006-10-27 0.05 ALL AUEB contribution (Milan results)

2006-11-24 0.06 ALL Layout

2006-12-22 0.07 5.1 Missing partner contributions

2007-02-01 0.08 2.3, 3.3, 4.3, Appendices SICTA contribution (Milan results)

2007-06-29 0.09 ALL European Commission comments

2007-06-29 1.0


Table of Contents

Distribution List 2
Document Control Sheet 3
Change Control List (Change Log) 3
Table of Contents 4
1 Introduction 6
2 Analysis and Description of Technical Performance 7
  2.1 Prague 7
    2.1.1 Introduction 7
    2.1.2 Results 11
  2.2 Toulouse 12
    2.2.1 Analysis of Latency in the Surveillance Processing Chain 12
    2.2.2 Analysis of Track / Flight Plan Correlation Mechanisms 19
  2.3 Malpensa 23
    2.3.1 Introduction 23
    2.3.2 Methods and Tools for Data Collection and Analysis 23
    2.3.3 Results 24
    2.3.4 Conclusions 28
  2.4 Combined Analysis 30
3 Analysis and Description of Operational Feasibility 32
  3.1 Prague 32
    3.1.1 Introduction 32
    3.1.2 Results 32
  3.2 Toulouse 47
    3.2.1 General 47
    3.2.2 Surveillance 47
    3.2.3 Routing 47
    3.2.4 Control 48
    3.2.5 Human Machine Interface 48
    3.2.6 Conclusion 48
  3.3 Malpensa 49
    3.3.1 Malpensa Real-time Simulations 49
    3.3.2 Shadow-mode 53
    3.3.3 Conclusions 57
4 Analysis and Description of Operational Improvements 59
  4.1 Prague 59
    4.1.1 Introduction 59
    4.1.2 Safety 61
    4.1.3 Efficiency/Capacity 64
    4.1.4 Human Factors 71
    4.1.5 Conclusions 78
  4.2 Toulouse 81
    4.2.1 Safety 81
    4.2.2 Efficiency 81
    4.2.3 Capacity 81
    4.2.4 Human Factors 82
    4.2.5 Conclusion 82
  4.3 Malpensa 83
    4.3.1 Safety 83
    4.3.2 Efficiency 85
    4.3.3 Capacity 97
    4.3.4 Human Factors 111
    4.3.5 Conclusions 112
5 Analysis of Results with regard to EMMA Requirements 113
  5.1 Consequences for Technical Requirements 113
  5.2 Consequences for Operational Requirements 114
  5.3 Impact of V&V Results on A-SMGCS Concept 115
6 References 116
7 Abbreviations 119
8 Figures and Tables 123
  8.1 List of Figures 123
  8.2 List of Tables 123
Appendix A Detailed Results for Technical Requirements 125
Appendix B Detailed Results for Operational Requirements 161
Appendix C Additional Results for Operational Requirements 206


1 Introduction

This document provides a summary of all high-level objectives addressed in the EMMA project. Results are given per airport, and a combined analysis identifies commonalities in the results across the different airports.


2 Analysis and Description of Technical Performance

2.1 Prague

2.1.1 Introduction

2.1.1.1 EMMA Test-Bed at Prague Ruzyně Tower

The following figure shows the architecture of the EMMA test-bed system used for the technical tests at Prague.

[Figure: system block diagram. Recoverable labels: LAN switch/router; RANC; SDS; RPS; AUX; CWP-1 to CWP-4 (CDD, GEC, TEC, TPC); KVM switch; TECAMS; MLAT RCMS; VSDF; local area network of ANS CR; Companels and RS-IP/IP-RS converters; sensors and information sources: SMR, MLAT/ADS-B, ASR-E2000, MVP (gap-filler), AGL-AMS.2, FDPS/ESUP, TIME-NTP.]

Figure 2-1: EMMA Test-Bed Set-up at Prague

The EMMA test-bed system at Prague-Ruzyně airport consists of a combination of hardware and software components provided specifically for the EMMA project, together with the pre-existing infrastructure. This infrastructure includes the surveillance sensors (SMR, MLAT, and ASR-E2000), the Flight Data Processing System (FDPS-ESUP), the Aerodrome Ground Lighting (AGL) system, and the local area network (LAN). Components provided specifically for the EMMA test-bed comprise the following items:
• Surveillance Data Server (SDS)
• Technical Control and Monitoring System (TECAMS)
• Recording and Playback System (RPS) with Auxiliary Mass Storage Unit (AUX)


• Keyboard/Video/Mouse (KVM) switch
• Controller Working Positions (CWP) denoted CDD, GEC, TEC and TPC
• SMR Extractor (RANC)
• Gap-Filler System, including Machine Vision Processor (MVP) sensors, communication panels (Companels) and RS-485 to Internet converters, and Video Sensor Data Fusion (VSDF)
• MLAT/ADS-B Processing System, including Remote Control and Monitoring System (RCMS)

In addition, eighty vehicles belonging to ANS CR and the Prague Airport Company were equipped with Mode S squitter beacons (SQB).

2.1.1.2 Indicators and Measurement Instruments

The definition of the indicators to be measured can be found in the document D6.1.2 Test Plan - Prague [7]. Only the key words and abbreviations are repeated here. The most important technical performance requirements were to be assessed by 18 verification indicators. Their relation to the TRD [4], ORD [3], ICAO [18], and EUROCAE (MASPS) [17] technical requirements can be seen in the table below. The verification tests aim primarily at assessing the long-term quality of the surveillance and conflict detection performance. These long-term measurements were to be performed by the recording and analysis tool MOGADOR, which is described in document D1.1.2 CDG A-SMGCS Data Analysis [1]. Long-term measurements were conducted by DLR in January 2006. Other measurement instruments were the Matrices of Detection and Identification, described in the data analysis section below. In addition, short-term tests were to be performed prior to the long-term technical and operational test period in order to assess the readiness of the test-bed system and to verify, by visual observation, the system's compliance with the technical requirements in D3.1.1 [5]. The following table summarises the indicators and measurement instruments associated with the verification of the technical performance requirements.

VE-1 Coverage Volume (CV). Requirement: Approaches, Manoeuvring Area, Apron taxi lines. References: TRD Tech_Surv_01, 02; ORD Op_Serv-07; ICAO 4.1.1.4; MASPS 3.1.3. Instruments: Recording, Observations, MOGADOR.

VE-2 Probability of Detection (PD). Requirement: ≥ 99.9%. References: TRD Tech_Surv_35; ORD Op_Perf-01; ICAO 3.4.1.4.a; MASPS 3.2.3. Instruments: Recording, Observations, MOGADOR, Matrix of Detection.

VE-3 Probability of False Detection (PFD). Requirement: < 10E-3 per Reported Target. References: TRD Tech_Surv_36; ORD Op_Perf-02; ICAO 3.4.1.4.b; MASPS 3.2.3. Instruments: Recording, Observations, MOGADOR, Matrix of Detection.

VE-4 Reference Point (RP). Requirement: Not defined. References: TRD Tech_Gen_45; ORD None; ICAO 3.5.7, 4.2.2; MASPS 3.2.1.2. Instruments: Recording, Observations.


VE-5 Reported Position Accuracy (RPA). Requirement: ≤ 7.5 m at a confidence level of 95%. References: TRD Tech_Surv_26; ORD Op_Perf-05, 15; ICAO 4.2.3; MASPS 3.2.3. Instruments: Recording, Observations.

VE-6 Reported Position Resolution (RPR). Requirement: ≤ 1 m. References: TRD Tech_Surv_27; ORD Op_Perf-06; ICAO None; MASPS 3.2.3. Instruments: Recording, Observations.

VE-7 Reported Position Discrimination (RPD). Requirement: Not defined. References: None. Instruments: Recording, Observations.

VE-8 Reported Velocity Accuracy (RVA). Requirement: Speed ≤ 5 m/s, Direction ≤ 10°, at a confidence level of 95%. References: TRD Tech_Surv_28, 29; ORD Op_Perf-16; ICAO 4.1.1.8, 4.1.1.10; MASPS 3.2.3. Instruments: Recording, Observations.

VE-9 Probability of Identification (PID). Requirement: ≥ 99.9% for identifiable Targets. References: TRD Tech_Surv_37; ORD Op_Perf-03; ICAO 3.4.1.4.c; MASPS 3.2.3. Instruments: Recording, Observations, MOGADOR.

VE-10 Probability of False Identification (PFID). Requirement: < 10E-3 per Reported Target. References: TRD Tech_Surv_38; ORD Op_Perf-04; ICAO 3.4.1.4.d; MASPS 3.2.3. Instruments: Recording, Observations, MOGADOR.

VE-11 Target Report Update Rate (TRUR). Requirement: ≤ 1 s. References: TRD Tech_Surv_34; ORD Op_Perf-08; ICAO 4.2.4; MASPS 3.2.3. Instruments: Recording, Observations.

VE-12 Probability of Detection of an Alert Situation (PDAS). Requirement: ≥ 99.9%. References: TRD Tech_Cont_11; ORD None; ICAO 4.5.1; MASPS 3.3.3. Instruments: Recording, Observations.

VE-13 Probability of False Alert (PFA). Requirement: < 10E-3 per Alert. References: TRD Tech_Cont_12; ORD Op_Perf-20; ICAO 4.5.1; MASPS 3.3.3. Instruments: Recording, Observations.

VE-14 Alert Response Time (ART). Requirement: ≤ 0.5 s. References: TRD Tech_Cont_13; ORD None; ICAO 4.5.2; MASPS 3.3.3. Instruments: Observations.

VE-15 Routing Process Time (RPT). Requirement: < 10 s. References: TRD None; ORD None; ICAO 4.3.2. Instruments: Not applicable for Prague.

VE-16 Probability of Continuous Track (PCT). Requirement: Not specified. References: None. Instruments: Recording, MOGADOR.


VE-17 Matrix of Detection (MOD). Requirement: Not specified. References: None. Instruments: Recording, MOGADOR.

VE-18 Matrix of Identification (MOI). Requirement: Not specified. References: None. Instruments: Recording, MOGADOR.

Table 2-1: Technical Verification Indicators

Raw data was gathered during Site Acceptance Testing (SAT) of the EMMA test-bed system carried out at Prague Ruzyně airport in the period 14-18 March 2005. Site acceptance testing concentrated mainly on the specific items provided for the EMMA test-bed. However, to prepare the way for the operational verification and validation exercises in SP6, the SAT also included basic technical performance verification tests of the overall A-SMGCS, including the existing surveillance sensors. The objectives of the SAT were to verify the correct function of the EMMA test-bed system and to demonstrate that the technical requirements defined in deliverable document D3.1.1 Ground System Requirements - Prague [5] had been fulfilled. The SAT was performed by Park Air Systems personnel with the assistance of ANS CR and witnessed by ANS CR. Supplementary tests were performed in the period 8-11 November 2005.

Testing consisted mainly of visual observation of the traffic situation displays at the controller working positions in the EMMA test room (old TWR) while observing the live traffic through the window. In addition, a follow-me vehicle equipped with a 1090ES squitter beacon (SQB) was directed to perform various manoeuvres in order to gather data for the measurement of specific verification indicators. All relevant data was continuously recorded throughout the trial period for later analysis. The data collected consisted of recordings in Park Air proprietary format and included:
1. Target reports from all surveillance sensor systems
2. Flight plan data
3. Target reports from the surveillance data fusion process of the SDS
4. Operator actions at the CWPs
5. Alerts
6. Airport context data

During a replay session, this recorded information is sufficient to permit the full reconstruction of all information displayed at any CWP. The archive medium is Advanced Intelligent Tape (Sony AIT™).


2.1.2 Results

Except for the performance indicator results derived in this section, the full list of technical requirements for the Prague test-bed, with the related acceptance tests and the results obtained, is given in document D3.6.1 Site Acceptance Test Report - Prague [6] and is compared with the results of the other EMMA test sites within this document (cf. §5.1 and Appendix A). The main objective of the site acceptance tests was to ensure that the performance of the EMMA test-bed system was adequate to permit the system to be used for operational tests. Some requirements were verified by visual observation, others by analysing recorded data to obtain quantitative results. These tests and the results obtained are described below.

The MOGADOR tool was used to perform automatic long-term observations of the system surveillance performance. Data were compiled and analysed over a period of 4 weeks. The tool can locate blind spots and output maps of blind spots for the different conditions. The data are analysed by taking into account different independent variables:
• Different traffic objects that operate on the airport (aircraft, vehicles, unknown)
• Different weather conditions (no snow or precipitation vs. snow or precipitation)
• Different zones of the aerodrome (Runway, Obstacle Free Zone [OFZ], Taxiways)

The detailed test descriptions and results can be found in the D6.3.1 Test Report for Prague [12]. The table below summarises the results for all measured individual verification metrics.

VE-1 Coverage Volume (CV). Requirement: Approaches, Manoeuvring Area, Apron taxi lines. Short-term: √ (Approaches), √ (Manoeuvring Area), √ (Apron taxi lines). Long-term: N/A.

VE-2 Probability of Detection (PD). Requirement: ≥ 99.9%. Short-term: 99.65%. Long-term: 97.1-99.4%.

VE-3 Probability of False Detection (PFD). Requirement: < 10E-3 per Reported Target. Short-term: 0.07%. Long-term: 0.04-0.16%.

VE-4 Reference Point (RP). Requirement: Not defined. Short-term: 2-20 m. Long-term: N/A.

VE-5 Reported Position Accuracy (RPA). Requirement: ≤ 7.5 m at a confidence level of 95%. Short-term: 3.2 m (static). Long-term: N/A.

VE-6 Reported Position Resolution (RPR). Requirement: ≤ 1 m. Short-term: 0.1 m. Long-term: N/A.

VE-7 Reported Position Discrimination (RPD). Requirement: Not defined. Short-term: Not tested. Long-term: N/A.

VE-8 Reported Velocity Accuracy (RVA). Requirement: Speed ≤ 5 m/s, Direction ≤ 10°, at a confidence level of 95%. Short-term: 1.2 m/s, 7.9°. Long-term: N/A.

VE-9 Probability of Identification (PID). Requirement: ≥ 99.9% for identifiable Targets. Short-term: 99.72%. Long-term: 78.8-94.1%.

VE-10 Probability of False Identification (PFID). Requirement: < 10E-3 per Reported Target. Short-term: 0.00%. Long-term: 3.2-19.7%.


VE-11 Target Report Update Rate (TRUR). Requirement: ≤ 1 s. Short-term: 0.47 s. Long-term: N/A.

VE-12 Probability of Detection of an Alert Situation (PDAS). Requirement: ≥ 99.9%. Short-term: 100%. Long-term: N/A.

VE-13 Probability of False Alert (PFA). Requirement: < 10E-3 per Alert. Short-term: Insufficient data. Long-term: N/A.

VE-14 Alert Response Time (ART). Requirement: ≤ 0.5 s. Short-term: < 0.5 s. Long-term: N/A.

VE-16 Probability of Continuous Track (PCT). Requirement: Not specified. Short-term: N/A. Long-term: See [12].

VE-17 Matrix of Detection (MOD). Requirement: Not specified. Short-term: N/A. Long-term: See [12].

VE-18 Matrix of Identification (MOI). Requirement: Not specified. Short-term: N/A. Long-term: See [12].

Table 2-2: Summary of Technical Verification Results
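As an illustration of how detection statistics such as PD and PFD can be derived from recorded target reports, the sketch below computes them from hypothetical counts. The counting rules actually used by MOGADOR and the Matrix of Detection are described in D1.1.2 and are not reproduced here; the function names and counts are invented for illustration only.

```python
# Hedged sketch: computing surveillance detection statistics from report
# counts. The counting conventions (what constitutes an "expected" report)
# follow the usual surveillance-performance definitions, not MOGADOR's
# exact implementation.

def probability_of_detection(detected_reports: int, expected_reports: int) -> float:
    """PD: fraction of expected target reports actually produced."""
    return detected_reports / expected_reports

def probability_of_false_detection(false_reports: int, total_reports: int) -> float:
    """PFD: fraction of reported targets not corresponding to a real target."""
    return false_reports / total_reports

# Hypothetical counts for one aerodrome zone over a measurement period:
pd = probability_of_detection(detected_reports=99_650, expected_reports=100_000)
pfd = probability_of_false_detection(false_reports=70, total_reports=100_000)

print(f"PD  = {pd:.2%} (requirement: >= 99.9%)")
print(f"PFD = {pfd:.4f} per reported target (requirement: < 1e-3)")
```

With these invented counts, PD evaluates to 99.65% (just below the ≥ 99.9% requirement, like the short-term Prague result) and PFD to 0.0007 per reported target (within the < 10E-3 requirement).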

2.2 Toulouse

Based on the Toulouse-Blagnac A-SMGCS verification results, as described in D6.4.1 (Ref. [13]), two topics were deemed worth further in-depth analysis because the collected results were not quite in line with expectations. These topics are:
- latency in the surveillance processing chain, in particular when tracking vehicles (cf. §2.2.1);
- identification, i.e. track / flight plan correlation (cf. §2.2.2).

The following sections provide architectural information that helps explain the obtained results (cf. [13]). The main goal of the analysis provided below was to determine whether (long-term) improvements were possible and where.

2.2.1 Analysis of Latency in the Surveillance Processing Chain

As shown in §2 of D6.4.1, significant latency was observed in the A-SMGCS surveillance processing chain, in particular when tracking vehicles using automatic dependent surveillance - broadcast (ADS-B) technology. The surveillance processing chain needed to produce a dependent target report on a controller working position (CWP) is very long (cf. Figure 2-2). Following the unsatisfactory measurements performed by DSNA/DTI (cf. [13]), Thales ATM investigated three sources affecting position accuracy due to latency. These sources were located:
- at the ADS-B vehicle equipment level;
- at the ADS-B ground station (GS) level;
- at the ADS-B gateway (GTW) level within the sensor data fusion (SDF).

The results of the analysis are provided below.


[Figure: block diagram. Recoverable labels: on-board GPS RX (GPS position fix, NMEA) feeding a 1090ES TX; 1090ES position and identity messages received by five 1090ES RX ground stations (AS-680); target reports (position and identity) forwarded as ASX 021 to the ADS-B gateway (ADS-B GTW SURV); sensor data fusion (SDF) combining these with SMR, MLAT, and ASR target reports and flight plans; ASX 062 target reports output to SCA and CWP.]

Figure 2-2: Surveillance Processing Chain when Tracking MOSQUITO-equipped Vehicles at TLS

2.2.1.1 Latencies at Vehicle Equipment Level (MOSQUITO)

At Toulouse-Blagnac the ADS-B vehicle equipment consisted of a Mode S squitter generator called MOSQUITO. The MOSQUITO is composed of a global positioning system (GPS) receiver (RX) and a 1090 extended squitter (ES) transmitter (TX), cf. Figure 2-2. The GPS signals, and potentially EGNOS signals, are received by the GPS RX, which in turn outputs National Marine Electronics Association (NMEA) strings containing, amongst other items, the mobile position. The GPS RX output follows a one-second update rate synchronised to GPS time. Its exact latency is unknown, but is estimated to lie within [500, 1000] ms. A typical MOSQUITO NMEA output is provided in Figure 2-3, where the coloured cells indicate information that is processed and broadcast by the 1090ES TX.

Figure 2-3: Mosquito NMEA String Output
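NMEA position strings of the kind emitted by the GPS RX can be decoded with a few lines of code. As an illustration (the exact sentence type and field layout used by MOSQUITO are those shown in Figure 2-3 and are not reproduced here), a minimal parser for a standard $GPGGA sentence:

```python
# Hedged sketch: decoding a position from a generic NMEA GGA sentence.
# This is standard NMEA 0183 parsing, not MOSQUITO's specific processing.

def nmea_to_degrees(value: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm notation to signed decimal degrees."""
    dot = value.index(".")
    degrees = int(value[: dot - 2])    # everything before the minutes field
    minutes = float(value[dot - 2 :])  # mm.mmmm
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str) -> tuple[float, float]:
    """Extract (latitude, longitude) in decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    assert fields[0] == "$GPGGA", "not a GGA sentence"
    return (
        nmea_to_degrees(fields[2], fields[3]),
        nmea_to_degrees(fields[4], fields[5]),
    )

# Widely used example sentence (48°07.038' N, 011°31.000' E):
lat, lon = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print(round(lat, 4), round(lon, 4))  # 48.1173 11.5167
```

A production decoder would additionally validate the trailing `*hh` checksum and the fix-quality field before trusting the position.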

MOSQUITO supports the emission of three types of 1090 messages, which can be individually enabled / disabled by users:


- ADS-B surface position extended squitters,
- ADS-B identity extended squitters,
- short squitters.

Surface positions, as provided by the GPS RX, undergo odd and even encoding by the 1090ES TX (cf. Figure 2-4). The emission rate is either high, i.e. one odd and one even per second, or low, i.e. once every 5 seconds (odd and even alternating).

[Figure: timing diagram. Recoverable labels: within MOSQUITO, each GPS position fix (state vector and time, one update per second) feeds the surface-position EVEN and ODD encodings (one update of each per second) and the identity encoding (0.2 updates per second).]

Figure 2-4: ADS-B 1090ES Surface Pos. and Ident. Processing by MOSQUITO (High Update Rate)

Identities processed by the 1090ES TX also follow either a high emission rate, i.e. once per 5 seconds, or a low emission rate, i.e. once every 10 seconds. Short squitters are transmitted at a once-per-second rate. Each emission, whether extended squitter or short squitter, is performed with a jitter of ± 100 ms (cf. Figure 2-5). However, only the ADS-B surface position extended squitters are sensitive to latency.

[Figure: timeline over 1.5 s. Recoverable labels: GPS position fix at T_mob (true time of the mobile at the GPS reported position) plus an unknown GPS latency; even surface-position message emission at 200 ms, identification message emission at 450 ms, odd surface-position message emission at 700 ms, and short squitter message emission at 900 ms after the fix, each with ± 100 ms jitter; next GPS position fix at 1000 ms; even latency and even-to-odd latency indicated.]

Figure 2-5: MOSQUITO Position and Identity Internal Processing Time

EMMA

Verification and Validation Analysis Report

Save date: 2007-06-29 Public Page 15 File Name: D671_Analysis_V1.0.doc Version: 1.0

From the above, it is possible to compute that the different latencies in the processing of the 1090ES even (resp. odd) positions, up to the 1090ES message broadcast, add up to a total latency value within [600, 1300] ms (resp. [1100, 1800] ms). During the verification tests at Toulouse-Blagnac, MOSQUITO was configured (cf. Figure 2-5):
- to emit even (resp. odd) encoded 1090ES position messages 200 ms (resp. 700 ms) after the GPS position fix;
- to apply a fixed 1 s (resp. 1.5 s) position extrapolation for 1090ES even (resp. odd) position encoding.

In doing so, the GPS latency is only estimated and the jitter latency is not compensated.
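The latency bounds quoted above can be reproduced arithmetically. The sketch below is illustrative only: the 500-1000 ms GPS receiver latency range is an assumption chosen to be consistent with the quoted totals, and the constant names are ours, not MOSQUITO configuration parameters.

```python
# Latency budget for MOSQUITO 1090ES position message emission (illustrative).
GPS_LATENCY_MS = (500, 1000)                   # assumed GPS RX latency range
JITTER_MS = 100                                # +/- 100 ms emission jitter
EMISSION_DELAY_MS = {"even": 200, "odd": 700}  # delay after the GPS position fix

def total_latency_range(parity):
    """Return the [min, max] end-to-end latency in ms for one message parity."""
    delay = EMISSION_DELAY_MS[parity]
    lo = GPS_LATENCY_MS[0] + delay - JITTER_MS
    hi = GPS_LATENCY_MS[1] + delay + JITTER_MS
    return lo, hi

print(total_latency_range("even"))  # (600, 1300)
print(total_latency_range("odd"))   # (1100, 1800)
```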

Figure 2-6: Quantification of Position Error Related to Latency (position error related to latency [m] versus ES emission delay w.r.t. GPS fix [ms], for even and odd ES; mobile velocity = 72 km/h, GPS RX latency = 800 ms, prediction even = 1 s, odd = 1.5 s)

For a mobile moving at 72 km/h, and supposing a GPS RX latency of 800 ms, the position errors related to even and odd latencies are reported in Figure 2-6. It can be seen that the best accuracy is obtained if the even (resp. odd) 1090ES message is emitted exactly 200 ms (resp. 700 ms) after the GPS position fix. The uncertainty created by the jitter is pictured by the yellow boxes, i.e. ± 2 m. The curves also allow us to read the position error if our assumption of 800 ms for the GPS latency is wrong. For example, if the GPS latency is only 600 ms in reality, then the yellow boxes should be shifted by 200 ms to the left, following the curves: a 4 m bias would then be added to the ± 2 m jitter uncertainty.

It is important to note that, for the above processing, the 1090ES TX must be perfectly synchronised to GPS time (cf. Figure 2-7). If not, the odd position message may be emitted just after the GPS update (instead of just before), thus giving the impression that the mobile backtracks. Indeed, the next even position message will use the same GPS data but with a smaller extrapolation value.

The de-synchronisation of the internal emission cycle from GPS can also be understood by moving the yellow boxes in Figure 2-6 by 400 ms to the right. When passing the GPS position fix update, the yellow box of the odd 1090ES position report will undergo a large and sudden jump forward, whilst the quality of the even value will slowly degrade.
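The curves of Figure 2-6 follow from a simple relation: the along-track error is the mobile speed multiplied by the difference between the fixed extrapolation value and the true age of the position fix at emission time. A minimal sketch, with hypothetical function names and the 800 ms GPS latency assumption from the text:

```python
V = 20.0                                   # mobile speed: 72 km/h = 20 m/s
GPS_LATENCY_S = 0.8                        # assumed GPS RX latency (as in Figure 2-6)
PREDICTION_S = {"even": 1.0, "odd": 1.5}   # fixed position extrapolation values

def position_error(parity, emission_delay_s, gps_latency_s=GPS_LATENCY_S):
    """Signed along-track error [m]: extrapolation minus true age of the fix."""
    true_age_s = gps_latency_s + emission_delay_s
    return V * (PREDICTION_S[parity] - true_age_s)

print(position_error("even", 0.2))                 # 0.0: best accuracy at 200 ms
print(round(position_error("even", 0.3), 6))       # -2.0: one jitter extreme
print(round(position_error("even", 0.2, 0.6), 6))  # 4.0 m bias if GPS latency is 600 ms
```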


In conclusion of the above analysis, Thales has specified a MOSQUITO firmware release that precisely compensates latency due to jitter, allows for user configuration of the GPS latency value, and allows for user enabling/disabling of overall latency compensation.

Figure 2-7: Example of De-synchronisation of the Internal 1090ES TX Emission Cycle from the GPS Cycle

2.2.1.2 Latencies at ADS-B Ground Station Level (AS-680)

At Toulouse-Blagnac the ADS-B ground stations consisted of Thales ATM AS-680 ground stations. The processing chain, from 1090ES message reception to ASTERIX category 021 message generation, is pictured in Figure 2-8. The AS-680 ground station supports two ASTERIX message generation modes:
- the pipeline mode;
- the buffered mode.

In the pipeline mode, an ASTERIX report is emitted each time a new target position update is received, i.e. the time of emission is target dependent and not synchronised to universal time co-ordinated (UTC). For ADS-B MOSQUITO messages, up to 2 ASTERIX reports per second may be generated.

In the buffered mode, an ASTERIX report is emitted at a user-specified interval, i.e. the time of emission is target independent.

In both modes:
- the ground station performs neither extrapolation nor coasting;
- the time of applicability of the target position is the time of reception of the last position update.

Due to the above time-stamping of the target reports by the ADS-B ground stations, it is important that the ADS-B GS are fully synchronised to the sensor data fusion (SDF) or UTC using the NTP protocol.

At Toulouse-Blagnac, during the verification tests, the ADS-B ground stations were usually configured in pipeline mode.
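The two generation modes can be contrasted with a small sketch (illustrative Python; the report structure, call signs, and function names are assumptions, not the AS-680 implementation):

```python
def pipeline_reports(updates):
    """Pipeline mode: one ASTERIX report per target update, emitted at the
    moment the update is received (target-dependent, not UTC-synchronised)."""
    return list(updates)

def buffered_reports(updates, interval=1.0):
    """Buffered mode: reports are emitted at a fixed, target-independent interval;
    only the latest update per target survives each cycle. In both modes the
    time of applicability stays the reception time of the last position update."""
    out, pending, tick = [], {}, interval
    for t_rx, target in sorted(updates):
        while t_rx >= tick:                      # flush buffer at each boundary
            out += [(tick, tg, toa) for tg, toa in sorted(pending.items())]
            pending.clear()
            tick += interval
        pending[target] = t_rx                   # keep the newest update per target
    out += [(tick, tg, toa) for tg, toa in sorted(pending.items())]
    return out

updates = [(0.2, "AFR1234"), (0.7, "VEH42"), (0.9, "AFR1234")]
print(pipeline_reports(updates))   # three reports, one per update
print(buffered_reports(updates))   # [(1.0, 'AFR1234', 0.9), (1.0, 'VEH42', 0.7)]
```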


The total latency between the reception of a radio frequency (RF) signal and the output of the corresponding ADS-B report on the network (in ADS-B pipeline mode) was measured to be between 6 and 35 ms (cf. [13]).

Figure 2-8: AS-680 ASTERIX Report Generation

2.2.1.3 Latencies at ADS-B Gateway Level (within the Sensor Data Fusion)

At Toulouse-Blagnac there were 5 ADS-B ground stations whose ASTERIX category 021 outputs all converge to an ADS-B gateway (GTW) before being fused with tracks from non-ADS-B sources (cf. Figure 2-2). The ADS-B GTW performs (cf. Figure 2-9):
- ASTERIX category 021 message decoding and filtering;
- sensor combination;
- tracking;
- ASTERIX category 062 message encoding and transmission.

Target reports are filtered against:
- a volume of interest, typically a cylinder centred on the airport reference point (ARP);
- an emitter category;
- a figure of merit / position accuracy (FOM/PA) value.

The remaining target reports are buffered. Every second, the buffered target reports are sent to a sensor combiner function and the buffer is emptied. The sensor combiner:
- eliminates duplicate tracks (as coming from multiple ground stations), keeping the target report with the time of applicability closest to the current UTC time;
- resets sensor system identification codes (SIC) and system area codes (SAC), so that the sensor data fusion (SDF) sees the set of ADS-B ground stations as a single sensor;


- provides the result to the ADS-B tracker.
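The duplicate-elimination step of the sensor combiner can be sketched as follows (illustrative Python; the record fields, the choice of the mode S address as duplicate key, and the reset SIC/SAC values are assumptions):

```python
def combine(reports, now_utc):
    """Sensor combiner sketch: eliminate duplicate reports coming from multiple
    ground stations, keeping the one whose time of applicability (toa) is
    closest to the current UTC time, then reset SIC/SAC so the SDF sees the
    whole set of ADS-B ground stations as a single sensor."""
    best = {}
    for rep in reports:
        key = rep["mode_s"]    # assumed duplicate key: 24-bit mode S address
        if key not in best or abs(now_utc - rep["toa"]) < abs(now_utc - best[key]["toa"]):
            best[key] = rep
    for rep in best.values():
        rep["sic_sac"] = (1, 1)   # assumed single-sensor SIC/SAC values
    return list(best.values())

reports = [
    {"mode_s": 0x3C6544, "toa": 10.2, "gs": "GS1"},
    {"mode_s": 0x3C6544, "toa": 10.6, "gs": "GS3"},   # closest to now -> kept
    {"mode_s": 0x4CA123, "toa": 10.4, "gs": "GS2"},
]
out = combine(reports, now_utc=10.7)
print([r["gs"] for r in out])   # ['GS3', 'GS2']
```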

Figure 2-9: ADS-B GTW Internal Processing

The tracking function, supported by a track store (cf. Figure 2-9):
- ensures that valid ADS-B target reports are always sent to the SDF with a one-second update rate, irrespective of the message reception rate, in particular for stationary targets;
- ensures target velocities are always sent to the SDF, computing a velocity when (and only when) necessary.

The ADS-B GTW performs neither position extrapolation nor modification of the time of applicability.

Finally, the encoding function maps the ASTERIX category 021 information to ASTERIX category 062 data fields. During the encoding phase, some additional filtering (called data item forward inhibition) may be applied to inhibit the forwarding of some messages corresponding to configurable aircraft / unknown object call signs.

In conclusion, the latency introduced by the ADS-B gateway is mainly related to the buffering, so it will differ from one target report to another, ranging from a few milliseconds to one second. Since the time of applicability is unchanged, the latency will be very precisely compensated by adequate extrapolation within the SURV process of the sensor data fusion (cf. Figure 2-2).
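Because the time of applicability is preserved, the SURV-side compensation reduces to simple dead reckoning. A minimal sketch under this assumption (function and parameter names are ours):

```python
def surv_extrapolate(pos_m, vel_ms, time_of_applicability, now):
    """SURV-side latency compensation sketch: since the ADS-B gateway leaves the
    time of applicability untouched, the buffering latency (a few ms to 1 s)
    can be compensated exactly by extrapolating the reported position along
    the reported velocity to the current time."""
    dt = now - time_of_applicability
    return (pos_m[0] + vel_ms[0] * dt, pos_m[1] + vel_ms[1] * dt)

# A target at (100 m, 50 m) moving 20 m/s east, whose report is 0.5 s old:
print(surv_extrapolate((100.0, 50.0), (20.0, 0.0), 12.0, 12.5))   # (110.0, 50.0)
```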

2.2.1.4 General Conclusion on Latency in the Surveillance Processing Chain

Results for static accuracy, as documented in the Toulouse-Blagnac test results document (Ref. [13]) or the Prague-Ruzynĕ test results document (Ref. [12]), are highly satisfactory. Objective measurements of the dynamic accuracy could not be performed at Prague-Ruzynĕ (cf. [12]). During the Toulouse-Blagnac site acceptance tests and verification tests, latency in the surveillance processing chain, and its compensation through adequate extrapolation, appeared as one of the key issues for automatic


dependent surveillance. Indeed, fully satisfactory results could not be obtained for the dynamic accuracy tests when the target was co-operative. The in-depth analysis above (cf. §2.2.1.1 to §2.2.1.3) showed that:
- hypothesis ver.sur.6² (as documented in the Toulouse-Blagnac verification and validation test plan [8]) is not technically feasible;
- small errors in the computation (and/or estimation) of the latency can result in highly detrimental losses of dynamic reported position accuracy (RPA), e.g. a 2 m bias for each 100 ms error on the latency value for a vehicle moving at 72 km/h;
- the time synchronisation of the different elements of the surveillance processing chain is key to the RPA.

Time synchronisation (typically using NTP) should be closely monitored due to its dramatic effects on accuracy. Requirements setting the maximum time drift per day of surveillance sub-elements in case of synchronisation failure have a direct impact on reported position accuracy, and should therefore be (more) carefully considered. It is indeed highly unlikely that the commonly agreed 7.5 m RPA at a confidence level of 95% can be maintained in case of sub-system de-synchronisation. Time synchronisation loss therefore means that the A-SMGCS enters a degraded system state.

2.2.2 Analysis of Track / Flight Plan Correlation Mechanisms

During verification tests, a number of track / flight plan correlation issues were observed. Some typical cases included:
- false identification, with two track reports bearing the same call sign;
- track duplication, with a single target represented on the CWP by two tracks, one co-operative and one non co-operative, which are never fused and appear to be very close to each other;
- missing identification in case of flight plan data processing system (FDPS) failure, even though automatic dependent surveillance broadcast was active, etc.

During the verification test period, the general identification performance level was not sufficient to decide to run the tests as originally scheduled in the Toulouse-Blagnac verification and validation test plan ([8]). In compensation, it was decided to perform an in-depth analysis of the reasons for the poor identification performance. The main goal of the analysis (provided below) was to verify if (long-term) improvements were possible, and where.

The false identification issue, with two track reports bearing the same call sign, was quickly identified as being a configuration issue related to the Toulouse-Blagnac operational system. Indeed, the Toulouse-Blagnac operational system is only fed by approach secondary surveillance radar and surface movement radar. Thus, within the sensor data fusion (SDF), the system tracks maintain their identification, independently of the identification renewal rate. The same configuration was reproduced, by human error, within the EMMA SDF. An update of the SDF will be performed within the scope of EMMA-2.

The track duplication issue (cf. §2.2.2.1) and the missing identification in case of FDPS failure issue (cf. §2.2.2.2) are more interesting, and the analysis is provided below.

2.2.2.1 Analysis of the Track Duplication Issue

2.2.2.1.1 Problem Statement

During the verification and validation tests at Toulouse-Blagnac, in the presence of co-operative and unstable non co-operative tracks, a single target was sometimes represented on the controller working position (CWP) by two tracks, one co-operative and one non co-operative. Once created, the two tracks never fused until one of the two system tracks died, e.g. at target take-off or parking. Naturally, this track duplication also generated nuisance runway incursion alarms. The following case study provides an example of conditions leading to track duplication.

² It is recalled that hypothesis ver.sur.6 on latency states that: "The latency […] of surveillance position data for aircraft and vehicles should not exceed 1 second. The latency […] of identification data for aircraft and vehicles should not exceed 3 seconds."

2.2.2.1.2 Case Study

Let us suppose a co-operative aircraft having performed its pushback from the apron directly onto a taxiway, and a non co-operative vehicle (e.g. baggage or fuel) crossing the taxiway just in front of the aircraft (cf. Figure 2-10). In EMMA, the SMR coverage setting is identical to the SMR setting in the Toulouse-Blagnac operational system, i.e. tracking is disabled on aprons except on apron taxi lanes and service roads crossing apron taxi lanes. Thus, within the EMMA A-SMGCS (cf. Figure 2-11):
- the aircraft is represented by one system track (pictured by a green disk), result of the fusion of one co-operative (MLAT or ADS-B) input track (pictured by a yellow diamond located at the aircraft nose) and one non co-operative (SMR) input track (pictured by a blue square centred on the aircraft's reference point);

Figure 2-10: Track Duplication Scenario Example

Figure 2-11: Track Duplication (Step 1)

- the vehicle is represented by one system track, result of one unique non co-operative (SMR) input track (pictured by a blue square centred on the vehicle's centre of gravity).

As the vehicle gets closer to the aircraft, the SMR is unable to resolve the two plots. In Toulouse-Blagnac this phenomenon is favoured by the frequent alignment of the SMR, vehicle and aircraft.

There is a 50% chance that the SMR will output a track with the same track ID number as the previous aircraft track, and a 50% chance that the SMR will output a track with the same track ID number as the previous vehicle track. In the first case, the vehicle and aircraft system tracks are merged (cf. Figure 2-12).

Figure 2-12: Track Duplication (Step 2)


Figure 2-13: Track Duplication (Step 3)

As the vehicle continues its course, the SMR is once again in a position to resolve the aircraft and vehicle plots. Two non co-operative tracks are delivered. There is a 50% chance that the vehicle track ID is the same as the previous vehicle and aircraft common track ID, whilst a new non co-operative track is created for the aircraft. In this case, the vehicle grabs the co-operative (MLAT or ADS-B) input track (cf. Figure 2-13).

Then, as the vehicle exits the service road crossing the apron taxi lane, the vehicle's non co-operative input track dies again. The aircraft's co-operative and non co-operative input tracks are stable. As long as this condition remains true, the two associated system tracks are never fused.

Figure 2-14: Track Duplication (Step 4)

Other situations may lead to the same result. In Toulouse-Blagnac, similar effects were observed when a very large moving aircraft (e.g. A380, Beluga) is split by the SMR into two tracks. The generation of two non co-operative input tracks generates two system tracks, one composed of only one non co-operative input track, the other composed of one non co-operative and one co-operative input track. When the two SMR tracks are merged again, there is a 50% chance that the track that dies is the one that was fused with the co-operative input track, as in the scenario above. In that case, the two generated system tracks are kept distinct as long as their respective input tracks are stable.

2.2.2.1.3 Analysis and Conclusion

From the above, it can be seen that the input track association process, based on the input track ID to system track ID logical relationship, does not always work. Specifically, there is no guarantee that stable non co-operative input tracks always represent the same target. This behaviour is intrinsically related to the physical nature of primary surveillance based on SMR, and is maximised when the following conditions occur:
- vehicles are not co-operative;
- plot extraction is performed in areas where there is a high probability of having many targets frequently close to one another.

The process currently implemented in the sensor data fusion (SDF) allows for fast track fusion. It was introduced into the SDF for historical reasons, when the hardware platforms had less processing performance than today. The following proposal is made for an improved input track association process:
- enable suppression of the identification test based on input track ID to system track ID association;
- introduce, in addition, the following tests for input to system track association:
  - the input track mode S address equals the system track one,
  - the input track aircraft registration equals the system track one,
  - the input track call sign equals the system track one,


  - the input track SSR code equals the system track one;
- all tests, including the track-ID key, shall be user-configurable for enabling/disabling;
- if enabled, association tests shall be executed following the above indicated key order.

The improvement shall be implemented within EMMA-2.
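The proposed ordered, user-configurable association tests could look as follows (a sketch only; the field names are hypothetical and the proposal is not yet implemented):

```python
# Hypothetical ordered key list for input-to-system track association.
ASSOCIATION_KEYS = ["track_id", "mode_s_address", "registration", "call_sign", "ssr_code"]

def associate(input_track, system_tracks, enabled=frozenset(ASSOCIATION_KEYS)):
    """Return the first system track matched by the highest-priority enabled key,
    or None if no test succeeds (a new system track would then be created)."""
    for key in ASSOCIATION_KEYS:            # tests run in the indicated key order
        if key not in enabled:              # each test is user-configurable
            continue
        value = input_track.get(key)
        if value is None:
            continue
        for st in system_tracks:
            if st.get(key) == value:
                return st
    return None

system_tracks = [{"track_id": 7, "call_sign": "AFR1234", "mode_s_address": 0x3C6544}]
smr_track = {"track_id": 9}                             # unstable SMR track ID only
mlat_track = {"track_id": 9, "mode_s_address": 0x3C6544}
print(associate(smr_track, system_tracks))   # None: track ID alone does not match
print(associate(mlat_track, system_tracks))  # matched via the mode S address key
```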

2.2.2.2 Analysis of the Missing Identification in Case of FDPS Failure Issue

Identification is the assignment of a call sign to a system track for the purpose of correctly labelling the system track symbol on the controller traffic situation display. Identification can usually be performed automatically by the A-SMGCS equipment, or manually by a controller. Automatic identification can be obtained directly from sensors, through a correlation process between a system track and a flight plan, or from an adjacent centre (e.g. APP) via external system tracks. Controller manual identification can be obtained by drag & drop of a flight plan onto a system track or by direct labelling of the system track.

Automatic correlation between a system track and a flight plan is the preferred identification process. Multiple parameters are used to support the correlation process between a system track and a flight plan: track ID, SSR code, mode S address, aircraft registration, track status (alive or dead), sensor IDs of children input tracks, flight plan ID, flight plan correlation status, flight plan correlation origin, etc.

2.2.2.2.1 Problem Statement

During the installation of the surveillance system in Toulouse-Blagnac, multiple cases of inconsistent call signs were noticed between the values contained in the ASR tracks, the ADS-B/MLAT tracks and the flight plans, even though the SSR codes were consistent.

2.2.2.2.2 Analysis

To solve those issues, it was decided:
- to systematically report the ASR call signs if and when available;
- to report the flight plan call sign if and when the system track to flight plan correlation was successful;
- to systematically drop the ADS-B and MLAT aircraft call signs.

The advantage of the above was that automatic identification always meant successful system track to flight plan correlation, whether locally at the A-SMGCS level or at the APP level (when the call sign is provided by the ASR). The counterpart is that an A-SMGCS FDPS failure results in non-identified outbound traffic, even though the correct call sign may be known at MLAT or ADS-B level.

Within EMMA-2, the current EMMA flight plan management gateway will be replaced by a Terminal Co-ordination System (TECOS™). TECOS™ is the standard Thales product used for electronic flight strip management. Its server part includes an identification process based on track to flight plan correlation, which will go far beyond the current EMMA process. In particular, TECOS™ has:
- the capability to extend flight plan correlation using also the aircraft registration / mode S identity as key;
- the capability to allow users to configure identity conflict resolution.

However, it is our feeling that the complete identification process has been insufficiently specified at operational requirements level, and that EMMA-2 should devote some specific efforts in that direction.

2.2.2.2.3 Conclusion

Automatic identification within A-SMGCS is generally performed (with a wide community consensus) using a correlation process between a target identification key (typically the SSR code), as provided by a co-operative sensor, and a flight plan. With level-2 A-SMGCS deployment including mode S


transponder-equipped vehicles, call signs can be directly retrieved from transponders, through active interrogations using MLAT, or through ADS-B³ in a passive way. In that case the call signs are the ones set by the pilots. Having multiple sources for call sign assignment can create:
- additional problems of false identification, due to unintentional and sometimes intentional wrong call sign setting by the pilot;
- a need for label conflict detection by automated systems.

For the former, pilot call sign setting procedures and controller call sign verification procedures should be clearly defined. For the latter, further studies should specify:
- how label conflicts should be detected;
- how label conflicts should be notified to the controller;
- how label conflicts should be resolved.

Label conflict resolution may be handled:
- manually by the controller and/or the pilot;
- automatically by the system.

In the latter case, further studies should specify assignment priorities or suggest user-configurable settings. Additionally, further functional hazard assessments should consider this specific issue.

2.3 Malpensa

2.3.1 Introduction

The Verification phase, aiming to test the technical performance of new systems and functionalities implemented at Malpensa in the context of EMMA phase 1, was conducted in compliance with the list of indicators reported in document D6.2.2 (see Ref. [11]). The activity aimed to prove that the advanced system is compliant with operational and performance requirements, as defined in the ICAO manual on A-SMGCS. All trials were organised in such a way that as many indicators as possible could be evaluated. All data were acquired using two computers connected to the ENAV LAN. The whole set of acquired data can be divided into two big categories:
• Short-term data: all data are acquired in a short period (less than a day)
• Long-term data: all data are acquired in a long period (5 days)

All short-term data were generated by driving testing vehicles on the airport surface, in accordance with agreed ad-hoc testing procedures. All long-term data were acquired using traffic of opportunity.

2.3.2 Methods and Tools for Data Collection and Analysis

All data were acquired using LAN sniffer software (ETHEREAL to acquire ASTERIX 62 data and SNIFFER PRO to acquire SELEX-SI radar data format). ARTES-RTD is a software tool developed by SELEX-SI for live recordings of surveillance data. It is able to show the surveillance data to the operator while the recording is in progress, so as to make a preliminary data analysis possible.

³ Mode S ADS-B does not provide SSR codes. In case ADS-B is the only co-operative sensor, correlation with the flight plan using the SSR code cannot be performed.


Short-term results were analysed using the ARTES-AES tool as far as possible, or using Microsoft Excel if ARTES-AES could not produce a valid result. ARTES-AES is a SELEX-SI tool to analyse a record of surveillance data. Long-term results were analysed using the software MOGADOR (Ref. D1.1.2 CDG A-SMGCS Data Analysis [1]).

The assessment of the Verification indicators was based on the statistical analysis of relevant data collected through a set of pre-planned on-site experiments. In particular, the Verification indicators referring to the performance of the surveillance function of the system (i.e. VE-2 'Probability of Detection', VE-3 'Probability of False Detection', VE-9 'Probability of Identification', VE-10 'Probability of False Identification', VE-12 'Probability of Detection of an Alert Situation', and VE-13 'Probability of False Alert') were measured by performing binomial tests on the mean value of the probabilities of the corresponding indicators. The objective of these binomial tests was to verify if the mean values of the relevant probabilities (in VE-2, VE-3, VE-9, VE-10, VE-12, and VE-13) were above the corresponding minimum acceptable level.
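Such a one-sided binomial test can be sketched as follows. This is illustrative only: the exact test statistic and acceptance region used in the Malpensa analysis are not specified here, so numerical verdicts may differ from those reported in §2.3.4.

```python
from math import comb

def p_at_least(failures, n, q):
    """P(F >= failures) for F ~ Binomial(n, q): the probability of observing at
    least this many missed detections if the true per-report failure rate is q.
    Computed on the failure side to keep the combinatorial terms small."""
    return 1.0 - sum(comb(n, j) * q**j * (1 - q)**(n - j) for j in range(failures))

def hypothesis_i_rejected(successes, n, threshold=0.999, alpha=0.05):
    """Reject hypothesis (i) 'p >= threshold' when the observed number of
    failures would be improbable under it (one-sided exact binomial test)."""
    return p_at_least(n - successes, n, 1.0 - threshold) < alpha

print(hypothesis_i_rejected(990, 1000))    # True: 10 failures where ~1 is expected
print(hypothesis_i_rejected(1000, 1000))   # False: no failures observed
```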

2.3.3 Results

The following table summarises all the verification results collected during the Technical Verification phase at Malpensa airport. For each indicator, the table contains the ID, the acronym, the requirement that the obtained value has to be compliant with, and additional information used to produce the final result.

VE-1  Coverage Volume (CV)
  Requirement: Approaches; Manoeuvring Area; Apron taxi lines
  Measured value: see D6.5.1 ([14]), Paragraph 2.3.1.1
  Additional information: Information indicating the size of the whole extension of the Movement Area was not available at the moment of this result production (under investigation). The amount (in percentage) of the 'covered' area could be added as soon as this data becomes available.

VE-2.1  Probability of Detection (PD)
  Requirement: ≥ 99.9%
  Measured value: 99.92% (this value does not include the Terminal 2 area)
  Additional information: Received targets: 9786. Expected targets: 9794.

VE-2.2  Probability of Detection, long-term (PD)
  Requirement: ≥ 99.9%
  Measured value: 99.91% (this value does not include the Terminal 2 area)
  Additional information: Total reports: 1,512,483. Identified: 1,513,852.

VE-3.1  Probability of False Detection (PFD)
  Requirement: < 10⁻³ per report
  Measured value: 0 (this value does not include the Terminal 2 area)
  Additional information: Position recordings: 9794. Unsuccessful recordings: 0.

VE-3.2  Probability of False Detection, long-term (PFD)
  Requirement: < 10⁻³ per report
  Measured value: not estimated
  Additional information: The amount of recorded data was not adequate to calculate this indicator.

VE-4  Reference Point (RP)
  Requirement: not defined
  Measured value: 109 cm
  Additional information: For this test only a bus (COBUS) was used. Values were measured in 3 different positions.

VE-5.1  Reported Position Accuracy, static (RPA)
  Requirement: ≤ 750 cm at a confidence level of 95%
  Measured value: 720 cm
  Additional information: Position reports in 3 points: Point 1: 1206 reports, 718 cm; Point 2: 1194 reports, 707 cm; Point 3: 1198 reports, 720 cm. Total reports: 3,598. Worst case: 720 cm. A confidence level of 95% was considered.

VE-5.2  Reported Position Accuracy, dynamic (RPA)
  Requirement: ≤ 750 cm at a confidence level of 95%
  Measured value: 720 cm (only for the test car in the manoeuvring area)
  Additional information: Reports were acquired moving the test car on the runways. Values were estimated in 2 different conditions: constant speed (20 km/h) on the runways and taxiways: 5621 reports, 718 cm; increasing and decreasing speed on the runways (from 0 km/h to 80 km/h and braking to 0 km/h): 2348 reports, 720 cm. Total reports: 7969. Worst case: 720 cm. A confidence level of 95% was considered.

VE-6  Reported Position Resolution (RPR)
  Requirement: ≤ 100 cm
  Measured value: 95 cm
  Additional information: 3 positions were considered: 1) 85 cm, 2) 95 cm, 3) 88 cm. Worst case: 95 cm.

VE-7  Reported Position Discrimination (RPD)
  Requirement: not defined
  Measured value: 165 cm
  Additional information: All measurements were estimated on 2 positions. Considered scenarios: C-C (2 co-operative vehicles): 158 cm, 154 cm; C-NC (a co-operative moving vehicle and a non co-operative stopped vehicle): 157 cm, 153 cm; NC-C (a non co-operative moving vehicle and a co-operative stopped vehicle): 165 cm, 165 cm; NC-NC (2 non co-operative vehicles): 186 cm, no data. This result is provided ignoring NC-NC and considering the worst case: 165 cm.

VE-8  Reported Velocity Accuracy (RVA)
  Requirement: < 5 m/s at a confidence level of 95%
  Measured value: 3.1 m/s
  Additional information: 2 speed values were analysed: 40 km/h: 1814 reports, 2.8 m/s; 60 km/h: 1631 reports, 3.1 m/s. The confidence level considered is 95%. Worst case: 3.1 m/s.

VE-9.1  Probability of Identification (PID)
  Requirement: ≥ 99.9% for identifiable targets
  Measured value: 99.9%
  Additional information: Total reports: 9794. Identified: 9785.

VE-9.2  Probability of Identification, long-term (PID)
  Requirement: ≥ 99.9% for identifiable targets
  Measured value: 99.9%
  Additional information: Total reports: 1,512,261. Identified: 1,513,852.

VE-10.1  Probability of False Identification (PFID)
  Requirement: ≤ 10⁻³ per report
  Measured value: 3·10⁻⁴ per report
  Additional information: Identified: 9785. Incorrect identifications: 3.

VE-10.2  Probability of False Identification, long-term (PFID)
  Requirement: ≤ 10⁻³ per report
  Measured value: not estimated
  Additional information: The results appeared too good to be reported as they stand; a recalculation is preferred.

VE-11  Target Report Update Rate (TRUR)
  Requirement: ≤ 1 s
  Measured value: 1 s

VE-12  Probability of Detection of an Alert Situation (PDAS)
  Requirement: ≥ 99.9%
  Measured value: 99.9%
  Additional information: Runway incursion (alerts + alarms): expected 2913, detected 2912. Opposite direction: expected 823, detected 823. Closed runway: expected 935, detected 935. Closed area: expected 57, detected 57. Total expected: 4728. Total detected: 4727.

VE-13  Probability of False Alert (PFA)
  Requirement: < 10⁻³ per alert
  Measured value: 2·10⁻⁴ per alert
  Additional information: see VE-12.

VE-14  Alert Response Time (ART)
  Requirement: < 0.5 s
  Measured value: not estimated
  Additional information: Not estimated because it was not possible to appreciate times of less than 1 s. ART is less than 1 s, but this is not a valid estimation.

VE-15, VE-16, VE-17
  Measured value: not estimated
  Additional information: Not estimated due to the inadequate duration of data recordings, induced by frequent tuning and upgrading activities of the system.

Table 2-3: Malpensa Verification Results

2.3.4 Conclusions

As stated above, Verification indicators were analysed on a statistical basis to verify if the mean values were above the corresponding minimum acceptable level. For those indicators expressing the probability of detection or identification (i.e. VE-2, VE-9, and VE-12), this objective was achieved by investigating the following two alternative hypotheses:
i) the value of the corresponding probability is equal to or above the 99.9% established for the acceptance of the performance of the surveillance function, vs.
ii) the value of the probability lies below the threshold value.


For indicators expressing the probability of false detection or identification, the corresponding hypotheses can be stated as follows: i) the value of the corresponding probability is equal to or below 0.1%, vs. ii) the value of the probability lies above the threshold value. In this context, if hypothesis (i) was rejected for any of these indicators, then the corresponding probability was overestimated (or underestimated for VE-10 and VE-13) by the corresponding threshold value. If hypothesis (i) was not rejected, then the performance of the surveillance function under the specific indicator was acceptable. The indicators VE-3.2 'Probability of False Detection (long-term)' and VE-10.2 'Probability of False Identification (long-term)' could not be measured due to lack of data. For the remaining indicators, performing this type of hypothesis test led to the following conclusions:

• The threshold value 99.9% (or 0.999) was achieved for VE-2.2 'Probability of Detection (long-term)', VE-9.1 'Probability of Identification (short-term)', and VE-12 'Probability of Detection of an Alert Situation'. Hypothesis (i) was rejected for the indicators VE-2.1 'Probability of Detection (short-term)' and VE-9.2 'Probability of Identification (long-term)'. However, hypothesis (i) was not rejected when the corresponding threshold values were lowered to 99.3% (or 0.993) and 99.8% (or 0.998), respectively.

• The threshold value 0.1% (or 0.001) was achieved for VE-3.1: Probability of False Detection (short-term), VE-10.1: Probability of false identification (short-term), and VE-13: Probability of False Alert.
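The type of one-sided test described here can be sketched as follows; this is a minimal illustration, not part of the EMMA tool chain, and the counts used below are invented for the example rather than EMMA measurements:

```python
# Hedged sketch of the one-sided hypothesis test for a probability-of-
# detection indicator. The counts are illustrative only; the 99.9%
# threshold and the usual 5% significance level follow the text.
from math import comb

def detection_test(detected, expected, threshold=0.999, alpha=0.05):
    """Test H0: the true probability of detection is >= threshold.

    Under H0 the miss rate is at most (1 - threshold); the p-value is the
    exact binomial probability of observing at least this many missed
    detections at the boundary rate. Rejecting H0 (small p-value) means
    the measured detection rate is significantly below the requirement."""
    misses = expected - detected
    q0 = 1 - threshold
    # P(X >= misses) = 1 - P(X <= misses - 1) for X ~ Binomial(expected, q0)
    p_value = 1 - sum(comb(expected, i) * q0**i * (1 - q0)**(expected - i)
                      for i in range(misses))
    return p_value, p_value < alpha

# Illustrative: 9990 detections out of 10000 expected target reports
p_value, reject = detection_test(9990, 10000)
```

With 10 misses in 10 000 reports the observed miss rate sits exactly at the 0.1% boundary, so hypothesis (i) is not rejected; a markedly lower detection count would reject it.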

With the exception of VE-2.1 'Probability of Detection (short-term)' and VE-9.2 'Probability of Identification (long-term)', there is no statistically significant evidence for rejecting the hypothesis that the expected value of the aforementioned probabilities exceeds the corresponding lowest acceptable levels. Even for VE-2.1 and VE-9.2, the probability values for which this hypothesis could not be rejected were very close to the threshold values. This conclusion implies that the reliability of the EMMA surveillance and alerting functions complies with the pre-specified standards and threshold values. For those verification indicators that involve continuous measures (i.e. VE-4 'Reference Point', VE-5.1 'Reported Position Accuracy', VE-6 'Reported Position Resolution', VE-7 'Reported Position Discrimination', VE-8 'Reported Speed Accuracy', and VE-11 'Target Report Update Rate'), point estimation at a 95% confidence level was used to determine the mean value of the corresponding measures. The estimated mean values for the indicators VE-5.1, VE-6, VE-8, and VE-11 were lower than the corresponding predefined threshold values. Based on the estimated values for these indicators, the performance of the EMMA surveillance function (implemented at the MXP site) in terms of accuracy complies with the pre-specified threshold values and expectations.
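The point estimation used for the continuous indicators can be sketched as below; this is a sketch under the normal approximation, and the sample values are illustrative rather than EMMA measurements:

```python
# Sketch of a 95% confidence interval for the mean of a continuous
# indicator such as reported position accuracy. The sample values are
# invented for illustration; the 7.5 m threshold is the VE-5 requirement.
import math
import statistics

def mean_ci95(samples):
    """Return (mean, half_width) of a 95% confidence interval for the
    mean, using the normal approximation (adequate for large samples)."""
    n = len(samples)
    mean = statistics.mean(samples)
    se = statistics.stdev(samples) / math.sqrt(n)   # standard error
    z = statistics.NormalDist().inv_cdf(0.975)      # ~1.96
    return mean, z * se

# Illustrative position errors in metres for a tracked target
errors = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 3.2, 2.7, 3.4, 3.1]
mean, half_width = mean_ci95(errors)
# The indicator is met when the whole interval lies below 7.5 m:
indicator_met = mean + half_width <= 7.5
```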


2.4 Combined Analysis
This section presents the combined analysis of the verification results of Prague and Malpensa airports, per verification indicator:

ID | Indicator (Acronym) | Requirement | Measured Value PRG (short-term / long-term) | Measured Value MXP (short-term / long-term)

VE-1 | Coverage Volume (CV) | Approaches, manoeuvring area, apron taxi lines | √ √ √ / N/A | √ √ √ / N/A
VE-2 | Probability of Detection (PD) | ≥ 99.9% | 99.65% / 97.1–99.4% | 99.2% / 99.96%
VE-3 | Probability of False Detection (PFD) | < 10^-3 per reported target | 0.07% / 0.04–0.16% | 0.001 / no data available
VE-4 | Reference Point (RP) | not defined | 2–20 m / N/A | 1.09 m / no data available
VE-5 | Reported Position Accuracy (RPA) | ≤ 7.5 m at a confidence level of 95% | 3.2 m (static) / N/A | 7.2 m / 720 cm (only for test car in manoeuvring area)
VE-6 | Reported Position Resolution (RPR) | ≤ 1 m | 0.1 m / N/A | 0.95 m / no data available
VE-7 | Reported Position Discrimination (RPD) | not defined | not tested / N/A | 1.65 m / no data available
VE-8 | Reported Velocity Accuracy (RVA) | speed ≤ 5 m/s, direction ≤ 10° at a confidence level of 95% | 1.2 m/s, 7.9° / N/A | 2.8 m/s (for 40 km/h), 3.1 m/s (for 60 km/h) / no data available
VE-9 | Probability of Identification (PID) | ≥ 99.9% for identifiable targets | 99.72% / 78.8–94.1% | 99.9% / 99.9%


VE-10 | Probability of False Identification (PFID) | < 10^-3 per reported target | 0.00% / 3.2–19.7% | 3·10^-4 / no data available
VE-11 | Target Report Update Rate (TRUR) | ≤ 1 s | 0.47 s / N/A | 1 s / no data available
VE-12 | Probability of Detection of an Alert Situation (PDAS) | ≥ 99.9% | 100% / N/A | 99.9% / no data available
VE-13 | Probability of False Alert (PFA) | < 10^-3 per alert | insufficient data / N/A | 2·10^-4 / no data available
VE-14 | Alert Response Time (ART) | ≤ 0.5 s | < 0.5 s / N/A | not measurable (less than 1 s) / no data available
VE-15 | Routing Process Time (RPT) | < 10 s | N/A / N/A | no data available / no data available
VE-16 | Probability of Continuous Track (PCT) | not specified | N/A / see [12] | no data available / no data available
VE-17 | Matrix of Detection (MOD) | not specified | N/A / see [12] | no data available / no data available
VE-18 | Matrix of Identification (MOI) | not specified | N/A / see [12] | no data available / no data available

Table 2-4: Combined Analysis for Verification Results of Prague and Malpensa

TLS results unfortunately did not match the setup of this table. For a complete overview of TLS results the reader is referred to D6.4.1 (Ref. [13]).


3 Analysis and Description of Operational Feasibility

3.1 Prague

3.1.1 Introduction
With the ANS CR controllers, the operational feasibility of the installed A-SMGCS was proven on two test platforms: in a real-time simulation (RTS) environment and on-site at the Prague Ruzynĕ control tower (field trials). The real-time simulation focused mainly on the design of the HMI and on working with it, whereas the on-site field trials focused on the actual technical performance the controllers were confronted with. For the RTS trials, a 30-item acceptance questionnaire was given to each of the 11 controllers at the end of the trials. For the operational field trials, a 144-item questionnaire was given to the ANS CR controllers after their regular shifts, by which time they had already worked with the installed A-SMGCS fully operationally for more than half a year. The 144-item questionnaire comprised questions on surveillance and alerting performance, HMI, and procedure aspects. There were also questions on the 'operational improvements', which are reported in §4.1. Further on-site operational feasibility tests were performed with respect to the monitoring and alerting service. Special flight tests were performed to assess the controllers' acceptance of the function that detects conflict situations involving simultaneous traffic on crossing runways and alerts the controller in an appropriate way.

3.1.2 Results

3.1.2.1 RTS Results
Each of the 11 ANS CR controllers in the RTS trials was given a 30-item acceptance questionnaire after finishing all test runs. They were asked to give their opinion on the use of the A-SMGCS. The answering scale ranged from 1 'Strongly disagree' to 10 'Strongly agree'. The following general hypothesis was set up to describe the expectation regarding the controllers' answers:

Identifier Hypothesis

OF-H0 The controllers' opinion does not agree with the 'operational feasibility' aspects of a specific item.

OF-H1 The controllers' opinion agrees with the 'operational feasibility' aspects of a specific item.

Each item was tested for statistical significance with a one-sample t-test. With answers ranging from 1 through 10, the critical test value was 5.5. Table 3-1 shows the respective results. A p-value marked with a star (*) indicates statistical significance.

Test Value = 5.5

Item T df p (1-sided)

ITEM01 14.087 10 .000*

ITEM02 -10.241 10 .000*

ITEM03 4.787 10 .001*

ITEM04 10.338 10 .000*

ITEM05 -12.618 10 .000*

EMMA

Verification and Validation Analysis Report

Save date: 2007-06-29 Public Page 33 File Name: D671_Analysis_V1.0.doc Version: 1.0

ITEM06 -30.016 10 .000*

ITEM07 12.594 10 .000*

ITEM08 -8.953 10 .000

ITEM09 20.618 10 .000*

ITEM10 14.986 10 .000*

ITEM11 2.096 10 .031*

ITEM12 11.423 10 .000*

ITEM13 18.764 10 .000*

ITEM14 -15.932 10 .000*

ITEM15 10.178 10 .000*

ITEM16 11.847 10 .000*

ITEM17 10.338 10 .000*

ITEM18 18.764 10 .000*

ITEM19 10.510 10 .000*

ITEM20 7.113 10 .000*

ITEM21 6.550 10 .000*

ITEM22 7.832 10 .000*

ITEM23 14.252 10 .000*

ITEM24 19.341 10 .000*

ITEM25 -3.594 10 .003*

ITEM26 20.618 10 .000*

ITEM27 7.262 10 .000*

ITEM28 5.590 10 .000*

ITEM29 -2.096 10 .031

ITEM30 10.338 10 .000*

Table 3-1: t-test for 30 Items of the Acceptance Questionnaire
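The t statistics in Table 3-1 come from a standard one-sample t-test against the scale midpoint; the following is a minimal sketch with invented ratings, since the raw questionnaire answers are not reproduced here:

```python
# Minimal sketch of the one-sample t-test applied to each acceptance item.
# The ratings below are invented for illustration; the real data are the
# 11 controllers' answers on the 1..10 agreement scale.
import math
import statistics

def one_sample_t(ratings, test_value=5.5):
    """Return the t statistic for testing whether the mean rating
    differs from the scale midpoint (df = len(ratings) - 1)."""
    n = len(ratings)
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)        # sample SD (n - 1 denominator)
    return (mean - test_value) / (sd / math.sqrt(n))

# Illustrative ratings from 11 controllers for one item
ratings = [8, 9, 7, 8, 9, 8, 7, 9, 8, 8, 9]
t = one_sample_t(ratings)   # compare |t| against the t(df=10) critical value
```

A positive t with p below 0.05 corresponds to the starred entries in the table; negatively phrased items are expected to yield negative t values.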

The bar chart in Figure 3-1 gives a good overview of the answers to each item. The test value of 5.5 is represented by the blue line. Except for items 08 and 29, all statements were answered towards the expected end of the scale. The p-values are represented by the scale of the lower horizontal axis, the yellow 0.05 line, and the yellow bars, which are superimposed on the red bars. No yellow bar exceeds the critical 0.05 line, which expresses the statistical significance of all items, except for items 08 and 29, which were answered in the non-expected direction.

3.1.2.1.1 Concluding Results
28 of the 30 acceptance items were answered by the 11 controllers significantly in the expected direction. Therefore, it can be stated that the use of the A-SMGCS in the two RTS phases was of high operational feasibility.


The 30 statements rated in the figure were:

A01 - I experienced the level of safety by using the A-SMGCS as very high.
A02 - EMMA enabled you to handle more traffic.
A03 - EMMA enabled you to provide the pilots a better level of service.
A04 - EMMA enabled you to execute your tasks more efficiently.
A05 - The introduction of EMMA will increase the potential of human error.
A06 - The types of human error associated with EMMA are different than those associated with normal work.
A07 - The A-SMGCS DISPLAY is easy to handle.
A08 - The A-SMGCS DISPLAY provides an active, involved role for me.
A09 - The A-SMGCS DISPLAY gives me support I miss with the current systems.
A10 - The use of the different windows is clear to me.
A11 - Called windows appear at the expected place and size.
A12 - The layout of the windows on the screen is good, i.e. the windows are conveniently arranged.
A13 - I experienced textual representation as appropriate.
A14 - In general, automated features within the A-SMGCS DISPLAY behave in ways that are consistent with my expectations.
A15 - I experienced the mouse and the keyboard as well-suited input devices for the A-SMGCS DISPLAY.
A16 - All information I need to accomplish ATC instructions is available.
A17 - The display colours chosen in the A-SMGCS DISPLAY are satisfying.
A18 - The contrast between the windows and their background is sufficient.
A19 - The layout of the A-SMGCS DISPLAY is good, i.e. the information is conveniently arranged and the amount of information is not too large.
A20 - The different information is easy to find.
A21 - Visual coding techniques help me maintain productive scanning.
A22 - Different colour codes are easy to interpret.
A23 - The used symbols are easy to interpret.
A24 - Symbols can easily be read under different angles of view.
A25 - Labels, terms and abbreviations chosen in the A-SMGCS DISPLAY are easy to interpret.
A26 - The height and width of characters are sufficient.
A27 - The A-SMGCS DISPLAY provides me with the right information at the right time.
A28 - Sometimes information was displayed which I did not need.
A29 - The number of keystrokes (or other control actions) necessary to interact with the system is kept to a minimum.
A30 - I experienced the level of safety by using the A-SMGCS DISPLAY as high.

Figure 3-1: Bar Chart for Means, SD, and p-values for 30 Items of RTS Acceptance Questionnaire


3.1.2.2 Field Trials Results
The operational feasibility tests aim at assessing the user's acceptance of the operational procedures and requirements of the EMMA ORD [3]. It was expected that the operational feasibility of the system would be confirmed, for each set of visibility conditions, using defined procedures derived from the EMMA Operational Requirements Document (ORD). The following general hypothesis has been used to decide upon the test results:

High-level Objective 1

EMMA A-SMGCS shows the operational feasibility of the operational procedures and requirements expressed in the initial EMMA ORD [3] for each set of conditions.

To prove the operational feasibility of the installed A-SMGCS, three main exercises were conducted:
1. Debriefing questionnaires and interviews
2. Long-term alert performance assessment
3. Flight tests with test aircraft and test vehicles
The following sections give details and results for each exercise.

3.1.2.2.1 Debriefing Questionnaire (Operational Feasibility)
A total of 15 ANS CR controllers filled out the debriefing questionnaire during the EMMA operational field trials. All 15 controllers had worked with the A-SMGCS for 7 months at the time of the investigation. The table below shows the distribution of age, gender, and ATC experience:

Category N

Age 20-29 2

30-39 8

40-49 3

>50 2

Gender male 12

female 3

ATCo Experience (years) <5 3

6-10 3

11-15 7

16-20 0

21-25 1

> 26 1

Table 3-2: Socio-Demographic Data of the Sample


A 144-item debriefing questionnaire was given to 15 ANS CR controllers after their regular shift. The items that refer to the 'operational feasibility' questions/statements fall into five areas:
- General usability,
- Surveillance service,
- Control service,
- HMI design, and
- New or potential procedures.
The following general hypothesis was set up to describe the expectation regarding the controllers' answers:

Identifier Hypothesis

OF-H0 The controllers' opinion does not agree with the 'operational feasibility' aspects of a specific item.

OF-H1 The controllers' opinion agrees with the 'operational feasibility' aspects of a specific item.

Ratings for a statement could be given from 1 (strongly disagree) up to 6 (strongly agree). A one-sample t-test was applied to test the data of all 144 items for statistical significance:
- One-sample t-test
- Expected mean value = 3.5
- Answers from 1 (disagreement) through 6 (agreement)
- N = 15
- α = 0.05
- p-value is single-sided because a directed hypothesis is used
Results referring to the 'operational feasibility' items are reported in sections 3.1.2.2.1.1 'General', 3.1.2.2.1.2 'Surveillance', 3.1.2.2.1.3 'Control', 3.1.2.2.1.4 'HMI', and 3.1.2.2.1.5 'Procedures'. Results for the 'operational improvement' items can be found in section 4.1. A star (*) next to the p-value means that an item has been answered significantly, because the p-value is equal to or less than the critical error probability α, which is 0.05. When the controllers significantly express their acceptance of a single service or procedure item, it can be assumed that operational feasibility is proven for this area of interest. Items written in italics could not be answered meaningfully because the controllers had limited or no operational experience with the topic (e.g. except in the case of 'lit stop bar crossing', no system alerts have been used operationally by the ATCOs). When controller comments were given for an item, they are reported directly below the statement. In addition, the sources of each item are reported. Sources are requirements or procedures reported in the ORD [3] and TRD [4].

3.1.2.2.1.1 General
VA-Id. Questionnaire Item N Mean SD p

78 I used the A-SMGCS frequently. 15 5.3 0.6 0.00*
Comment by ATCo E: During LVP operated

80 The A-SMGCS is highly relevant for my work. 15 5.0 0.7 0.00*
82 I feel very confident using the A-SMGCS. 15 4.9 0.6 0.00*


85 Under visibility 1 / good visibility conditions A-SMGCS provides no additional information. 15 2.9 1.2 0.08
Comment by ATCo C: … but I still use it.

86 It is helpful to use A-SMGCS when visual reference is impaired. 15 5.1 0.8 0.00*
87 I find the A-SMGCS unnecessarily complex. 15 2.4 1.1 0.00*
95 The A-SMGCS display gives me information which I missed before. 15 5.0 1.1 0.00*
117 I experienced the level of safety by using the A-SMGCS as very high. 15 4.9 0.7 0.00*

Comment by ATCo O: Slightly disagree. I am especially referring to the indication of blocked RWY. The mouse is not always at hand’s reach and, especially in busy hours, it is difficult to operate this function. It’s happened often that we executed DEP/ARR without switching off the indication.

135 The A-SMGCS display makes it easier to detect potentially problematic situations. 14 5.0 1.0 0.00*

140 It is easy to learn to work with A-SMGCS. 15 4.9 0.8 0.00*
141 I would imagine that most operational personnel would learn to use A-SMGCS very quickly. 15 4.9 0.8 0.00*

142 I needed to learn a lot of things before I could get going with the A-SMGCS. 15 2.5 1.1 0.00*

143 There was enough training on the display, its rules, and its mechanisms. 15 4.6 0.6 0.00*
144 There was enough training on how to control traffic with the use of the A-SMGCS. 14 4.4 1.2 0.02*

Comment by ATCo C: It was really easy for me; I needn’t any special training on. Comment by ATCo G: There was none, do we need any?

Table 3-3: Debriefing Questionnaire - Means, SD, and p-value for ‘General’ OF Items

3.1.2.2.1.2 Surveillance

VA-Id. Questionnaire Item ORD / D1.3.6 HMI / TRD N Mean SD p

1 When visual reference is not possible, the displayed position of the aircraft in the runway sensitive area is accurate enough to exercise control in a safe and efficient way.
OP_Perf-05 OP_Serv-11
Tech_Surv_26
15 5.1 0.5 0.00*

2 When visual reference is not possible, the displayed position of vehicles in the runway sensitive area is accurate enough to exercise control in a safe and efficient way.
OP_Perf-05 OP_Serv-11
Tech_Surv_26
15 4.7 0.9 0.00*

3 When visual reference is not possible, the displayed position of the aircraft on the taxiways is accurate enough to exercise control in a safe and efficient way.
OP_Perf-05 OP_Serv-11
Tech_Surv_26
15 5.4 0.5 0.00*


4 When visual reference is not possible, a missing label is not a problem to exercise control in a safe and efficient way.
OP_Serv-04 OP_Perf-12 OP_Perf-11
Tech_Gen_28 Tech_Surv_03
15 2.9 1.1 0.07

5 When visual reference is not possible, a missing position report is not a problem to exercise control in a safe and efficient way.
OP_Serv-11 OP_Serv-04 OP_Perf-12 OP_Perf-11
Tech_Gen_28 Tech_Surv_03
14 3.4 1.3 0.85

Comment by ATCo M: hasn't happened.

6 When visual reference is not possible, a wrong label is not a problem to exercise control in a safe and efficient way.
OP_Serv-04 OP_Perf-11
Tech_Surv_03 15 1.9 1.1 0.00*

7 Very frequently I experienced track swapping.

Tech_Gen_28 15 3.4 1.2 0.75

8 When visual reference is not possible, track swapping prevents me from exercising control in a safe and efficient way.
OP_Perf-11 OP_Perf-13
Tech_Gen_35 Tech_Gen_36
15 4.3 0.9 0.00*

15 I think manual labelling is useful. HMI_REQ 3.1.1 #6 + #19

Tech_HMI_07 14 4.5 1.0 0.00*

Comment by ATCo H: I haven’t used it yet. Comment by ATCo O: It takes some time and label is often lost.

16 I think that the A-SMGCS surveillance display could be used to determine that an aircraft has vacated the runway.
OP_Serv-11 Tech_Supp_03
15 5.3 0.5 0.00*

17 I think that the A-SMGCS surveillance display could be used to determine that an aircraft has crossed a holding position.
OP_Serv-11 Tech_Supp_03
15 4.5 1.3 0.01*

Comment by ATCo F: The holding position is much more accurate (???) position then vacating RWY therefore I slightly agree.

35 I think that the A-SMGCS surveillance display could be used to determine that an aircraft is on stand or has left the stand.
OP_Perf-05 OP_Serv-11
Tech_Surv_26
15 3.8 1.3 0.37

Comment by ATCo F: it depends on quality of surveillance Comment by ATCo O: During LVO yes. Otherwise, I still prefer to look out of the window.

89 I think there is too much inconsistency between A-SMGCS and real traffic.
OP_Serv-01 OP_Serv-03
Tech_Surv_34 15 2.5 1.0 0.00*

Comment by ATCo K: But sometimes false targets.

111 The A-SMGCS display gives me sufficient information about airborne traffic in the vicinity of the airport.
OP_Perf-07 OP_Serv-13
Tech_Surv_32 Tech_Surv_20
15 4.5 1.0 0.00*

Comment by ATCo E: I rely more on E 2000.

Table 3-4: Debriefing Questionnaire - Means, SD, and p-value for ‘Surveillance’ OF Items

3.1.2.2.1.3 Control
VA-Id. Questionnaire Item ORD TRD N Mean SD p

25 A-SMGCS helps to issue traffic information.

Tech_Surv_05 Tech_Gen_02

15 5.1 1.0 0.00*

26 A-SMGCS makes it easier to detect pilot errors.

Tech_Surv_05

15 5.2 0.8 0.00*

27 When visual reference is not possible, A-SMGCS facilitates giving traffic information to pilots so that they can avoid other traffic.

OP_Perf-5

Tech_Surv_26 Tech_Gen_02

15 4.9 0.5 0.00*

40 A-SMGCS display gives me better means to expedite or slow down an aircraft’s taxi speed.

Tech_Surv_28 15 4.1 1.2 0.06

Comment by ATCo D: I don’t do it very often without (with) A-SMGCS.

64 Information alerts are often popping up too late to solve the situation before an alarm comes up.

Tech_Cont_13

7 3.0 1.2 0.30

Comment by ATCo L: In test – not used in real traffic.

65 Too many unnecessary information alerts were popping up.

Op_Perf-20 Tech_Cont_08 Tech_HMI_15

7 3.4 1.4 0.90

Comment by ATCo L: In test – not used in real traffic. Comment by ATCo O: in case of false targets.

66 I think that all Runway Incursion Alerts are triggered at the right moment.

Op_Perf-20 Tech_Cont_13 7 4.1 1.2 0.21

Comment by ATCo L: In test – not used in real traffic.

67 I think that the Runway Incursion monitoring and alert function helps me to react in an expeditious and safe manner.

OP_Serv-16 Tech_Cont_13 Tech_Cont_03

8 4.4 1.3 0.10

Comment by ATCo K: The problem is that A-SMGCS display is not the ATCo’s primary display. Comment by ATCo L: In test – not used in real traffic.

68 I experienced too many false alerts to work in a safe and efficient way.

OP_Perf-20 OP_Perf-21

Tech_Cont_12

8 3.1 0.8 0.24

Comment by ATCo L: In test – not used in real traffic.

69 There were cases where an alarm was missing.

Tech_Cont_02

8 2.9 1.4 0.23


Comment by ATCo L: In test – not used in real traffic.

77 Issuing clearances to aircraft is supported well by the A-SMGCS.

OP_Serv-14 Tech_Gen_02 15 4.5 1.0 0.00*

79 The information displayed in the A-SMGCS is helpful for avoiding conflicts.

OP_Serv-21 OP_Serv-30 OP_DS-6

Tech_Surv_05 15 5.1 0.7 0.00*

123 The A-SMGCS enables me to provide the pilots a better level of service.

OP_Serv-14 Tech_Gen_02 15 4.4 1.5 0.03*

Comment by ATCo E: Not in normal condition. Within LVP traffic information are given.

Table 3-5: Debriefing Questionnaire - Means, SD, and p-value for ‘Control’ OF Items

3.1.2.2.1.4 HMI

VA-Id. Questionnaire Item ORD / D136_HMI N Mean SD p

75 The A-SMGCS provides the right information at the right time.

Op_Serv-30 15 5.1 0.6 0.00*

81 Improvements in the A-SMGCS display would be desirable.

15 3.7 1.0 0.36

83 The display enables me to recognize a degrading accuracy of surveillance.

13 3.6 1.2 0.73

84 The display layout is easy to customize to my own preferences.

REQ 3.1.1 #2 + #18

15 4.7 0.7 0.00*

88 I think the A-SMGCS is easy to use. Op_If-1 15 4.9 0.7 0.00*
90 I find the A-SMGCS very difficult to use. Op_If-1 15 1.9 1.0 0.00*
91 The use of the different windows on the A-SMGCS display is clear to me. Op_If-1 15 4.8 0.6 0.00*
92 Too much interaction with the A-SMGCS is needed. Op_If-1 15 2.9 0.8 0.01*
93 The A-SMGCS display is easy to understand. Op_If-1 15 5.0 0.4 0.00*
94 The A-SMGCS display provides an active, involved role for me. Op_If-1 15 4.7 0.6 0.00*

96 Information is conveniently arranged in the A-SMGCS display.

Op_If-1 15 4.7 0.5 0.00*

97 The amount of information in the A-SMGCS display is not too large.

Op_If-1 15 4.4 1.2 0.01*

98 Symbols can easily be read under different angles of view in the A-SMGCS display.

Op_If-1 15 5.1 0.6 0.00*

99 Labels, signs, and symbols in the A-SMGCS display are easy to interpret.
REQ 3.1.1 #15 + #16 + #17, REQ 3.2.4 #4, Op_If-1
15 5.0 0.5 0.00*

100 The height and width of characters in the A-SMGCS display is sufficient.

Op_If-1 15 5.0 0.7 0.00*

101 The A-SMGCS display layout in general should not be changed.

Op_If-1 15 4.5 1.1 0.00*

102 The A-SMGCS display size is appropriate for daily work.

Op_If-1 15 5.1 0.5 0.00*

103 All text in the display is easy to read. Op_If-1 15 4.7 1.0 0.00*
Comment by ATCo E: ARR + DEP windows are difficult to read (not often used in real traffic).

Comment by ATCo G: When yellow (?) alert is on the colour of box and text inside the box is not very well com-bined,

104 There is too much information in the A-SMGCS display which is not needed.

Op_If-1 15 2.5 0.7 0.00*

Comment by ATCo E: Can be set up at personal feelings.

105 Some relevant information is frequently missing in the A-SMGCS display.

Op_If-1 15 2.7 1.3 0.03*

Comment by ATCo D: Labels on the end of screen. Comment by ATCo G: Designation of temporary maps window, when you open the window you don’t know which of the maps is used. Comment by ATCo K: Departing aircraft in DEP window Comment by ATCo L: Some missing aircraft in departure list while aircraft is ready to go -> manual labelling impossible.

106 The display colours chosen in the A-SMGCS display are appropriate.

REQ 3.1.1#13 Op_If-1

15 4.9 0.5 0.00*

107 Pop-up windows appear at the expected place and size. Op_If-1 15 4.1 1.4 0.15
Comment by ATCo K: Pop-up window referring to time to threshold is not visible in certain situations (THD (?) is close to window edge).
Comment by ATCo O: When RWY 13 is in use (not often) -> when a pop-up window (arrival) appears, it hides a label of departure that holds short of RWY 13 on RWY 24. It is necessary to open secondary window to get the information.

108 The windows on the A-SMGCS display are conveniently arranged.
Op_If-1 14 4.6 0.7 0.00*

Comment by ATCo O: (except for the pop-up window of arrival on RWY 13 – see item 107)

109 Aircraft that should have been visible are sometimes obscured by pop-up windows.

REQ 3.1.1 #14

15 3.5 1.1 0.91

Comment by ATCo O: (except for the pop-up window of arrival on RWY 13 – see item 107)

110 The contrast between the windows and their back-ground is sufficient.

15 5.0 0.4 0.00*

130 The A-SMGCS display is detracting too much attention.
Op_If-1 15 2.7 1.1 0.01*

Comment by ATCo O: not the display itself; it is sometimes forgotten to operate the function of blocked RWY, especially in heavy traffic (we are used to a different indication) and it can lead to a situation when reality is different from what is indicated on display.


131 The A-SMGCS display helps to have a better understanding of the situation.
REQ 3.1.1 #9 + #15
5 5.0 0.7 0.01*

132 Important events on the A-SMGCS were difficult to recognize.

REQ 3.1.1 #13 + #14 + #17 + #23 + #27 Op_If-1

15 2.3 0.6 0.00*

133 Sometimes information is displayed which I don't need.
REQ 3.1.1 #12 + #14 + #15 ? Op_If-1
15 3.1 0.8 0.05

134 Different colour codes on the A-SMGCS display are easy to interpret.

(REQ 3.1.1 #13) Op_If-1

15 5.0 0.4 0.00*

Table 3-6: Debriefing Questionnaire - Means, SD, and p-value for ‘HMI’ OF Items

3.1.2.2.1.5 Procedures
VA-Id. Procedure ORD sections N Mean SD p

18 Contingency A-SMGCS surveillance identification procedures
I think that when the SMR completely fails but MLAT remains, the A-SMGCS display cannot be used as a primary means for identification anymore.
7.2.2 12 3.5 1.0 1.00

Comment by ATCo F: Depends on how many aircraft and vehicles are equipped with transponder. Comment by ATCo O: I have no experience with that

19 When the direct recognition of aircraft/vehicle IDs through the label is no longer possible, due to a ground MLAT failure, the surveillance display should be downgraded to a lower level of surveillance, such as SMGCS surveillance display (e.g. labelled SMR) or SMR display only.

7.2.2 13 4.4 0.7 0.00*

Comment by ATCo O: I have no experience with that

20 I think an individual aircraft’s failure to comply with A-SMGCS procedures (e.g. MODE-S transponder failure) requires returning completely to SMGCS procedures for all aircraft.

7.2.2 15 2.8 1.4 0.07

21 I think procedures in case of A-SMGCS failure are defined clearly enough.
7.2.2 15 3.9 1.2 0.25


22 Transponder Operating Procedures
I experienced that aircraft have failed to comply with the transponder operating procedures.
4. 15 4.7 0.9 0.00*

23 I think it is appropriate that pilots switch on the transponder before requesting pushback (or taxiing or whatever is earlier).
4.1.2 15 5.4 0.6 0.00*

24 I experienced that pilots have failed to turn the transponder on just prior to requesting push back (or taxiing or whatever is earlier).
4.1.2 15 4.7 0.9 0.00*

30 Start-Up clearance delivery
The A-SMGCS surveillance display enables me to establish a more efficient start-up sequence in visibility 1 conditions.
7.3.1
14 2.8 1.3 0.06

Comment by ATCo G: There is no such description in the system.

31 The A-SMGCS surveillance display enables me to establish a more efficient start-up sequence in visibility 2 conditions.
7.3.1
14 2.8 1.1 0.03*

Comment by ATCo G: There is no such description in the system.

32 The A-SMGCS surveillance display enables me to establish a more efficient start-up sequence in visibility 3 conditions.
7.3.1
14 3.0 1.4 0.19

Comment by ATCo G: There is no such description in the system.

33 Push-back clearances
When gates are not visible, push-back clearances based on A-SMGCS traffic information can be given in a safe way.
7.3.2.2 15 4.2 0.8 0.00*

34 I think that traffic information on the A-SMGCS surveillance display helps me to decide whether a push-back clearance should be delayed.
7.3.2.2 15 4.1 0.8 0.02*

36 Taxi clearances
I can rely on A-SMGCS when giving taxi clearances, even when visual reference is not possible.
7.3.3 15 4.9 0.7 0.00*

Comment by ATCo K: slightly disagree due to false targets

37 Longitudinal spacing on taxiways is easier to survey with A-SMGCS even when visual reference is not possible.

15 4.9 0.6 0.00*

38 When visual reference is not possible I think longitudinal spacing on taxiways can be reduced with A-SMGCS.

15 4.7 1.3 0.00*

Comment by ATCo L: … if approved by our authority, it would be great.

44 Taxiing on the runway
ICAO Doc 4444 states that, for the purpose of expediting air traffic, aircraft may be permitted to taxi on the runway-in-use. I think the use of A-SMGCS could allow this even when visual reference is not possible.
7.3.6 15 4.5 1.4 0.02*

48 Line-up procedures When an intersection is not visible, line-up from this intersection could be applied in a safe way when using A-SMGCS.

7.3.7.2.2 15 5.1 0.5 0.00*

49 I think it could be practicable to make multiple line-ups using A-SMGCS when visual reference is not possible.

7.3.7.3 15 4.0 1.7 0.28

Comment by ATCo G: Multiple line-ups when no visual are nonsense. Comment by ATCo L: … if approved by our authority, it would be great.

54 Take-off clearance I think that the A-SMGCS surveillance display could be used to determine when to issue a take-off clearance.

7.3.8 15 4.5 1.4 0.02*

55 Landing clearances When visual reference is not possible, I think the A-SMGCS surveillance display can be used to determine whether the runway is clear before issuing a landing clearance.

7.3.10.2 15 5.3 0.6 0.00*

56 Conditional clearances Under good visibility conditions, I think A-SMGCS surveillance data helps me to give conditional clearances in a safe and efficient way.

7.3.9 15 4.0 1.6 0.23

Comment by ATCo O: Not only the A-SMGCS.

57 When visual reference is not possible, I think A-SMGCS surveillance data helps me to give conditional clearances in a safe and efficient way.

7.3.9 7.3.9.3

15 4.1 1.5 0.13

Comment by ATCo G: When no visual reference = no conditional clearances Comment by ATCo O: Not only the A-SMGCS.

60 Visibility Transition With A-SMGCS, it would make sense to redefine the visibility limits for the transition to low visibility operations. (If yes, please indicate your suggestions.)

5. 12 2.3 0.9 0.00*

Comment by ATCo C: The time between arrivals and departures would be shorter, we shouldn’t wait for the ‘runway vacated’ report, and the distance between two arrivals could be shorter. Comment by ATCo F: A-SMGCS is only one part of a process to redefine visibility limits. Comment by ATCo G: Visibility limits are for pilots.

63 A-SMGCS level 2 procedures I think A-SMGCS can help me to detect lit stop bar crossings.

7.3 14 4.9 0.7 0.00*

63a I think A-SMGCS can help me to detect runway incursions.

70 A-SMGCS level I & II phraseology Existing phraseology can be maintained without change while using A-SMGCS.

6.

14 4.7 0.6 0.00*


71 I have experienced situations where existing phraseology should have been changed while using A-SMGCS.

6. 14 2.8 1.2 0.04*

Comment by ATCo F: e.g. squawk assigned code – some pilots do not understand

Table 3-7: Debriefing Questionnaire - Means, SD, and p-value for ‘Procedure’ OF Items

3.1.2.2.2 Long Term Alerting Performance Assessment The objective of this test was to assess the operational feasibility of the alerting function. Technically, the function’s performance has been verified (cf. §2.1.2), but the controllers’ acceptance had not yet been assessed in the field. For this purpose, the monitoring and alerting function was switched on at the active CWP for more than two weeks in January 2006. However, the service was not used fully operationally4; it was only monitored by the controllers in case of a conflict situation. To assess the operational performance, the controllers were requested to report each conflict situation and to compare it with the alerts shown on the A-SMGCS display. They were asked to report the date and UTC time and to assess whether the alert was right (wanted), false (due to a false target), unwanted, or missed. Information (stage 1) alerts were not assessed, in order to reduce the additional workload of the controllers. The reporting sheet was developed with the support of an ANS CR controller and translated into Czech to make it easier for the controllers to accept this additional work. Concluding Results As it happened, the template was not filled out by the controllers. This had several reasons: first, the controllers did not accept the additional workload or simply forgot to report an observed conflict situation; secondly, the A-SMGCS display is not the primary display observed by the TEC (that is the E2000), and alerts displayed on the A-SMGCS display are not supported by an audible signal and could thus easily escape the controller’s attention. In conclusion, no results were gained from this test.

3.1.2.2.3 Flight Tests - Case Studies for Testing the Alert Performance of ‘Crossing Runway Alerts’ These trials were performed on five days during the field trials (cf. the protocols in the Prague Test Report D6.3.1 [12]). Each trial lasted approximately one hour and included five to twelve conflict situations. ‘Case studies’ means that, during regular traffic (at times of very low traffic volume), a test vehicle or test aircraft created safety-critical scenarios in order to trigger system alerts. The controllers who actively controlled the traffic were presented with these alerts and were asked afterwards for their views on the operational feasibility. The detailed conflict scenarios can be found in the annex of the Prague Test Report D6.3.1 (Ref. [12]). Four different runway crossing scenarios were tested: - departure – departure, - departure – arrival, - departure – crossing, and - approach – approach conflicts

4 With visibility conditions lower than 3000 m, the service had to be switched off.


The first three (departure) conflict scenarios could be tested using a test car and the regular approaching or departing traffic. For the tests of the approach – approach conflicts, test aircraft had to be used: two CAA aircraft, a BE 400 and an L 410.

3.1.2.2.3.1 Results In total, 50 conflict situations were tested, with very satisfying results (cf. Table 3-8). There were only 4% unwanted alerts and 10% missed alerts. This may seem slightly too high to be assessed as operationally acceptable, but the results do not reflect the full operational alert performance, because the tests were also used to tune the alert parameter settings. The best settings were found at the end of the trials, so it can be assumed that further tests would increase the percentage of right alerts compared to unwanted and missed alerts.

Stage 2 alert (red)

Conflict      too early   right   too late   missed   unwanted   Total
DEP - DEP          0        10        0         0         2        12
DEP - APP          0        11        0         0         0        11
DEP - CROSS        0         7        0         3         0        10
APP - APP          1        13        1         2         0        17
Total              1        41        1         5         2        50
                  2%       82%       2%       10%        4%      100%

Table 3-8: Alert Performance Results with Crossing Runway Conflicts
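The percentages in Table 3-8 are simple shares of each outcome over the 50 tested conflicts. As a cross-check, the tally can be reproduced as follows (counts taken from the table):

```python
from collections import Counter

# Outcome counts for the 50 tested runway-crossing conflicts (Table 3-8)
outcomes = Counter(too_early=1, right=41, too_late=1, missed=5, unwanted=2)

total = sum(outcomes.values())  # 50 conflict situations in total
shares = {name: 100 * n / total for name, n in outcomes.items()}

print(total)             # 50
print(shares["right"])   # 82.0 -> 82% right alerts
print(shares["missed"])  # 10.0 -> 10% missed alerts
```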


3.2 Toulouse

3.2.1 General The findings in terms of general issues of the operational feasibility of A-SMGCS at Toulouse-Blagnac are summarised in this section. The general technical characteristics of the A-SMGCS are well accepted by Toulouse-Blagnac controllers. In particular, the modularity, the limits of the manoeuvring and movement areas defined locally prior to the experimentation sessions, and the level of automation are accepted by the controllers. The operational implementation of the A-SMGCS at Toulouse-Blagnac should affect only the share of responsibilities between the controllers and the A-SMGCS. In particular, the increased automation of the detection and identification functions should result in a transfer of part of the responsibilities currently held by the controllers to a validated A-SMGCS. Since the A-SMGCS has been implemented for shadow-mode trials only, no A-SMGCS procedures have been defined. Even though the controllers consider that the A-SMGCS may help them perform surveillance of the traffic and deliver most of the clearances on the ground, they are conservative about the modification of the procedures. In particular, they consider that the implementation of A-SMGCS is not sufficient to modify the visibility transition procedures. However, they expect to reduce the number of pilot position reports on taxiways in low visibility conditions.

3.2.2 Surveillance The operational feasibility of the surveillance function of A-SMGCS was evaluated during the shadow-mode trials. The main findings are recapped in this section. The feeling of Toulouse-Blagnac controllers about the surveillance function of the A-SMGCS is mixed. On the one hand, they have accepted the detection coverage, report delay and target position accuracy of the A-SMGCS. It was noted that accuracy margins are still to be defined and procedures should be adapted to take these accuracy margins into account. On the other hand, the poor quality of the detection and identification functions led them to reject the probability of detection, probability of false detection and probability of false identification of the A-SMGCS. Several features of the surveillance function could not be evaluated during the validation sessions. Since the mobiles’ heading and velocity had not been implemented in the A-SMGCS HMI, as requested by Toulouse-Blagnac controllers, their acceptance could not be tested. The poor quality of the identification function prevented the controllers from assessing the identification coverage and probability of identification. Since the position discrimination verification test had not been performed when the validation sessions took place, the controllers could not evaluate the discrimination accuracy.

3.2.3 Routing The routing function has not been implemented on the A-SMGCS HMI installed in the tower. The routing system installed could only be used for technical testing, with the objective of further developing it during the EMMA 2 project.


3.2.4 Control The operational feasibility of the control function of the A-SMGCS was evaluated during the shadow-mode trials, focussing on conflict detection and alerting services. The controllers could not take a stand on the conflict detection and alerting systems implemented at Toulouse-Blagnac because of their lack of experience with them (i.e. the very limited number of potential conflict situations observed). However, they noted that too many false alerts were triggered, mainly due to false detections, for the current system to be efficient. They expressed their interest in an efficient conflict detection and alerting system.

3.2.5 Human Machine Interface The A-SMGCS Human Machine Interface (HMI) installed in the Toulouse-Blagnac control tower was evaluated during the shadow-mode trials. The controllers are positive about the HMI. They consider that the presentation provided through the A-SMGCS display is adapted to their needs, that the information displayed is useful for operations, and that they can interact efficiently with the HMI. However, they noted that the manual label modification and automatic label anti-overlapping functions need to be improved.

3.2.6 Conclusion The operational requirements defined in the ICAO manual on A-SMGCS and by local end-users have been partly validated at Toulouse-Blagnac. While the HMI is well accepted, the primary control and surveillance functions still need to be improved for the controllers to consider the A-SMGCS a control tool and not only an additional information source. In particular, the quality of the detection, identification and conflict prediction services is insufficient. As a consequence, the controllers are ready to accept only minor adaptations to the procedures in order to use the A-SMGCS in operations. However, they are convinced of the usefulness of such a system at Toulouse-Blagnac and are ready to delegate some responsibilities to the A-SMGCS provided that its quality and reliability are proven.


3.3 Malpensa The validation process for Malpensa airport, in the context of the EMMA project, was based on both a set of Real-time Simulation (RTS) runs at the NLR NARSIM-Tower simulator and shadow-mode trials performed on the test site. According to what is stated about operational feasibility in the V&V Methodology for A-SMGCS (D6.2.1, [10]), issues concerning the analysis of acceptability and usability results have been considered appropriate for this section. Questionnaires were the main tool used for collecting data from the RTS and shadow-mode sessions.

3.3.1 Malpensa Real-time Simulations The human factors assessment of the EMMA system through the RTS runs was based on collecting the qualitative judgements of the traffic controllers involved in the runs, using ad-hoc questionnaires (see also Ref. [9]). The questionnaires focused on the following ‘key performance indicators’: situation awareness, controller workload, and system acceptability and usability. They were to be completed at several moments throughout a simulation session: • before the validation experiments: pre-experiment questionnaire, • after each experimental run: after-each-run questionnaires (A-SMGCS ON/OFF), • at the end of the validation experiments: post-experiment questionnaire. For the scope of this section, we consider only the questionnaires named ‘After each Run Questionnaire’. They were to be completed by the controllers taking the role of runway controller (TWR1 and TWR2) in an experimental run. There were two different kinds of questionnaires, namely questionnaires for runs with the A-SMGCS switched on and questionnaires for experimental runs with the A-SMGCS switched off. The questionnaires consisted of questions regarding the Human Machine Interface (HMI), situation awareness and workload. Questions regarding the HMI consisted of the so-called System Usability Scale (SUS). As written at the beginning of this section, only acceptability and usability are linked to the definition of operational feasibility; therefore, both are addressed here. The main results obtained through the RTS exercise for these two categories of indicators are presented in the following sections.

3.3.1.1 Results

3.3.1.1.1 Acceptability It was expected that comfort, satisfaction and ease of use of the system would increase when using A-SMGCS Level I and II. In the post-experiment questionnaire the controllers were to rate these aspects, as well as others, on a 6-point scale. Concerning comfort, satisfaction, ease of use and performance improvement, the controllers rated the system 5 or higher (on the 6-point scale). A-SMGCS Level I (the labels) was perceived as more comfortable and easier to use than A-SMGCS Level II (the alerts). The statement ‘I want A-SMGCS at Malpensa’ was rated slightly higher for Level I than for Level II. Controllers mentioned that the parameters needed better tuning before the introduction of Level II.


[Bar chart ‘Acceptability’: mean ratings on a 0.00–6.00 scale for ‘comfortable’, ‘satisfaction’, ‘easy’, ‘improves performance’ and ‘I want A-SMGCS’, plotted for A-SMGCS Level I and Level II.]

Figure 3-2: Acceptability Ratings for the Use of A-SMGCS

3.3.1.1.2 HMI Usability For the HMI evaluation, the System Usability Scale (SUS) was used, together with questions concerning participant attitude. The SU scale consists of 10 statements concerning usability5. The participants indicate their level of agreement with these statements on a Likert scale. An overall usability score can be derived per completed SUS; this score is a number between 0 and 100. The score in itself is not meaningful, but it allows for comparison of the baseline HMI and the HMI with A-SMGCS. Scores per question can be compared for the different conditions, as well as the overall score.
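The report does not spell out the SUS scoring rule. For reference, a sketch of the canonical computation (assuming the standard 10-item, 5-point SUS): each odd-numbered item contributes (response - 1), each even-numbered item contributes (5 - response), and the sum is scaled by 2.5 to obtain the 0–100 score:

```python
def sus_score(responses):
    """SUS score (0-100) from ten 1-5 Likert responses, standard scoring:
    odd items contribute (response - 1), even items (5 - response)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects 10 responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

print(sus_score([3] * 10))  # 50.0 (all-neutral answers give the midpoint)
```

As the text notes, such a score is only meaningful in comparison, e.g. baseline versus A-SMGCS condition.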

3.3.1.1.2.1 Impact of A-SMGCS on SU between Visibility Conditions Considering all runs (nominal and non-nominal), the differences in HMI usability with and without A-SMGCS were analysed per visibility condition. Under visibility condition 1 (bright daylight), the difference with and without A-SMGCS was not significant: average SU scores were 69.3 without the system and 74.9 with it (t-test). The outcome under visibility condition 2 (darkness) is very similar: with averages of 70.6 for the baseline and 74.6 with A-SMGCS, the results did not yield significant differences. However, some individual statements did reveal significant differences (‘like to use system’, ‘would need support’, ‘too much inconsistency’, ‘learn to use very quickly’) in favour of the A-SMGCS system.

5 The SU scale was originally constructed from a pool of 50 items, of which the 10 statements that evoked the most consistent and polarised responses were selected.


[Bar chart ‘SU (System Usability)’: SU scores (axis 62–80) for visibility conditions 1 and 2, A-SMGCS off vs. A-SMGCS on.]

Figure 3-3: A-SMGCS Impact on System Usability between Visibility Conditions (All Runs)

H5, H5.2 (HF05) Usability of A-SMGCS Level I; H5.2.1 HMI Usability Index (see appendix, Ref. [9]).

The overall SU score in nominal runs was 69 without A-SMGCS compared to 76 with A-SMGCS. This difference is significant (t-test: t = -3.165, df = 22, p = 0.004). In particular, the statements ‘I think that I would like to use this system frequently’ and ‘I found the various functions in this system were well integrated’ were rated significantly higher with A-SMGCS compared to the baseline condition.
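For illustration, comparisons of SU scores such as the one above can be reproduced with a two-sample t statistic. The sketch below implements Welch's variant; the report does not state which t-test variant or pairing was actually used, so this is an assumption:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and degrees of freedom
    (unequal variances assumed); an illustrative sketch only."""
    va, vb = variance(a), variance(b)   # sample variances
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb             # squared standard error of the difference
    t = (mean(a) - mean(b)) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical SU scores, only to show the call signature
t, df = welch_t([1, 2, 3, 4], [2, 3, 4, 5])
```

A p-value would then be obtained from the t-distribution with df degrees of freedom (e.g. via a statistics library).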

3.3.1.1.2.2 Impact of A-SMGCS on SU between Visibility Conditions (Nominal Runs) Significant differences in SU score were measured within visibility conditions for nominal runs. Under VIS-1 (bright daylight), the average was 76 with A-SMGCS and 69 for runs without the system (t = -3.841, df = 7.527, p = 0.006). Under dark conditions the differences were very similar but non-significant (averages 69.5 and 75.7). It was expected that the Level I system would be especially useful under low visibility conditions, which cannot be concluded from this study. Individual SUS statements that yielded significant differences between the baseline and the labels under visibility 2 conditions were 4, 6, 7, 8 and 10 (‘would need support’, ‘too much inconsistency’, ‘learn to use very quickly’, ‘cumbersome to use’, ‘needed to learn a lot’). ‘Cumbersome to use’ was rated higher for A-SMGCS Level I compared to the baseline, whereas all other statements yielding significant differences were rated in favour of A-SMGCS Level I.


[Bar chart ‘SU (system usability), nominal conditions’: SU scores (axis 60–82) for visibility conditions 1 and 2, A-SMGCS off vs. A-SMGCS on.]

Figure 3-4: A-SMGCS Impact on SU between Visibility Conditions (Nominal Runs)

3.3.1.1.2.3 Impact of A-SMGCS on SU between Controller Positions (Nominal Runs) No significant differences in SU scores between controller positions were found for the system with aircraft labels (A-SMGCS Level I).

H6, H6.2 Usability of A-SMGCS Level II; H6.2.1 HMI Usability Index (see appendix, Ref. [9]).

For non-nominal runs the difference in overall SU score was negative (72 with A-SMGCS off and 71 with A-SMGCS on). Although this is not a significant difference, it is remarkable, as in nominal runs the overall SU score was rated significantly higher with the A-SMGCS system than without. In the post-experiment questionnaires it was indicated that the parameter setting of the alerting system was not considered appropriate, which may explain the negative influence of the runway incursion alert on the rating.

3.3.1.1.2.4 Impact of A-SMGCS on SU between Visibility Conditions (Non-nominal Runs) No significant differences in SU score were measured between visibility conditions for non-nominal runs. For visibility condition VIS-1 (normal daylight) the difference between A-SMGCS on and off is almost zero (averages of 70 for A-SMGCS off and 70.5 for A-SMGCS on). Under visibility condition VIS-2 the SU scores for both the baseline and the advanced system were rated higher, which can be explained by the necessity of a system in low visibility. The difference between the experimental runs with the alerting system on or off was larger, and negative for the alerting system: the SU score was on average 71.5 with A-SMGCS on and 74 with A-SMGCS off. This difference is not significant, considering the limited number of non-nominal runs.


[Bar chart ‘SUS (system usability), non-nominal conditions’: SU scores (axis 64–76) for visibility conditions 1 and 2, A-SMGCS off vs. A-SMGCS on.]

Figure 3-5: A-SMGCS Impact on SU between Visibility Conditions (Non-nominal Runs)

3.3.1.1.2.5 Impact of A-SMGCS on SU between Controller Positions (Non-nominal Runs) No significant differences in SU scores between controller positions were found considering the non-nominal runs only.

3.3.1.1.2.6 A-SMGCS Usability Issues The controllers stressed that they would also want to see labels of aircraft that are on the apron and just departing. The labels of aircraft that have just arrived at a gate should automatically disappear, to avoid cluttering the display. On the approach display the labels should not be visible, also because of clutter. Controllers believed that A-SMGCS Level I would increase their capacity under low visibility conditions. The controllers mentioned that they needed to take care not to be head-down too much.

3.3.2 Shadow-mode The shadow-mode trials performed at the Malpensa test site involved the subjective measurement of the impacts of the EMMA A-SMGCS on safety, capacity, efficiency, and human factors. This type of assessment was based on the collection of the controllers’ subjective judgements regarding the system performance. The qualitative measurements were carried out by submitting five ad-hoc debriefing questionnaires to the ATCos involved in the sessions, one for each measured indicator. This allowed a systematic collection of the associated controllers’ judgements. Specifically, the indicators measured were grouped in the following key areas: • Safety, • Capacity, • Efficiency, and


• Human Factors: - Acceptance, and - Usability.

The safety and human factors questionnaires aimed to determine the level of agreement or disagreement of the participating controllers with respect to a set of statements focused on the performance of the system. The scales used ranged from 1-6 or 1-5. The analysis of this type of data was based on non-parametric statistical tests performed on the median of the level of agreement/disagreement provided by the controllers. According to the definition of operational feasibility, human factors is the only key area considered in this section. The human factors impacts examined at the Malpensa site were grouped into two major categories: i) user acceptance, and ii) system usability. The main results obtained from this testing activity, for both classes of indicators, are presented in the following sections.

3.3.2.1 Results

3.3.2.1.1 Acceptance The user acceptance questionnaire comprised 65 questions (see Ref. [14], Appendix A.4.2) assessing the performance of the system with respect to the following major features: i) the improvement of the operations performed by the controllers, ii) the clarity and presentation quality of the information displayed by the A-SMGCS, iii) the user-friendliness of the system, and iv) the sufficiency of the information provided by the system. Sign tests were performed on the median of the scores collected for each question in order to assess the corresponding performance of the system.
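A sign test of this kind compares each score against a hypothesized median and applies a binomial test to the counts above and below it. A minimal sketch follows; the report does not detail the exact variant used, so the tie handling and two-sided tail here are assumptions:

```python
from math import comb

def sign_test_p(scores, median0):
    """Two-sided sign test p-value against a hypothesized median median0.
    Ties (scores equal to median0) are dropped, as is conventional."""
    pos = sum(s > median0 for s in scores)
    neg = sum(s < median0 for s in scores)
    n, k = pos + neg, min(pos, neg)
    # two-sided tail of Binomial(n, 0.5): 2 * P(X <= k)
    p = sum(comb(n, i) for i in range(k + 1)) / 2 ** (n - 1)
    return min(p, 1.0)

# e.g. five ratings all above a hypothesized median of 3
print(sign_test_p([4, 5, 5, 6, 4], 3))  # 0.0625
```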

ANSWERS (1 = Strongly Disagree, 2 = Disagree, 3 = Slightly Disagree, 4 = Slightly Agree, 5 = Agree, 6 = Strongly Agree)

Question   Answer counts   M         SD
1    4 14 14    3.3125    0.68179451
2    10 13 9    1.96875   0.76991781
3    5 18 9     3.125     0.64951905
4    10 13 9    2.96875   0.76991781
5    4 23 5     2.03125   0.52940857
6    4 22 4 2   3.125     0.69597055
7    1 12 14 5  3.71875   0.75970286
8    4 15 13    4.28125   0.67241984
9    3 21 8     3.15625   0.56509817
10   18 14      2.4375    0.49607837
11   12 10 10   2.9375    0.82679728
12   4 18 10    4.1875    0.63430572


13   11 12 9    3.9375    0.78809501
14   11 11 10   1.96875   0.80948962
15   28 4       4.125     0.33071891
16   10 22      2.6875    0.46351241
17   12 20      1.625     0.48412292
18   17 15      3.46875   0.49902248
19   11 21      1.65625   0.47495888
20   9 23       4.71875   0.44960921
21   14 18      4.5625    0.49607837
22   6 14 12    4.1875    0.72618438
23   9 10 13    4.125     0.81967982
24   10 22      4.6875    0.46351241
25   8 24       4.75      0.4330127
26   18 14      3.4375    0.49607837
27   13 19      4.59375   0.4911323
28   17 15      5.46875   0.49902248
29   8 24       4.75      0.4330127
30   22 10      4.3125    0.46351241
31   24 8       2.25      0.4330127
32   16 8 8     2.75      0.8291562
33   14 18      4.5625    0.49607837
34   8 10 14    3.1875    0.807678
35   17 15      5.46875   0.49902248
36   14 16 2    1.625     0.59947894
37   5 13 14    4.28125   0.71739002
38   12 20      4.625     0.48412292
39   15 8 9     3.8125    0.84548433
40   13 19      4.59375   0.4911323
41   13 19      4.59375   0.4911323
42   21 11      4.34375   0.47495888
43   22 10      2.3125    0.46351241
44   12 12 8    3.875     0.78062475
45   5 20 7     2.0625    0.60917465
46   16 8 8     3.75      0.8291562
47   14 12 6    3.75      0.75
48   12 20      3.625     0.48412292
49   14 16 2    3.625     0.59947894
50   2 20 10    4.25      0.55901699
51   8 20 4     3.875     0.59947894
52   18 14      4.4375    0.49607837
53   21 11      2.34375   0.47495888
54   9 15 8     1.96875   0.72819876
55   12 20      3.625     0.48412292
56   18 14      2.4375    0.49607837
57   8 24       3.75      0.4330127
58   8 12 12    4.125     0.78062475
59   5 8 19     4.4375    0.7473913
60   11 12 9    3.9375    0.78809501
61   1 22 9     2.25      0.5
62   14 18      4.5625    0.49607837
63   6 18 8     5.0625    0.65847836
64   10 14 8    1.9375    0.7473913


65 7 7 18 4.34375 0.81430089

Table 3-9: Answer Distribution for Acceptance Questionnaire

[Bar chart ‘ACCEPTANCE’: mean agreement (scale 0–6) per question number, 1–65.]

Figure 3-6: Answer Trend for the Acceptance Questionnaire

3.3.2.1.2 Usability The system usability questionnaire included 10 questions (see Ref. [14], Appendix A.4.1) that referred to the performance of the system in relation to the difficulty of using it and the training required for learning how to use it. On the basis of these questions, the controllers tend to believe that the system is easy to use, and they found no difficulty in learning how to use it.

ANSWERS (1 = Strongly Disagree, 5 = Strongly Agree)

Question   Answer counts   M         SD
1    4 25 3     3.96875   0.466662016
2    8 20 4     1.875     0.59947894
3    7 25       3.78125   0.413398642
4    24 8       1.25      0.433012702
5    3 19 10    4.21875   0.598663877
6    2 22 8     2.1875    0.526634361
7    4 26 2     3.9375    0.428478413
8    3 25 4     2.03125   0.466662016
9    8 20 4     2.875     0.59947894
10   2 16 14    2.375     0.59947894

Table 3-10: Answer Distribution for Usability Questionnaire
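The M and SD columns in Tables 3-9 and 3-10 are consistent with a population mean and standard deviation computed directly from the answer counts; for instance, question 4 of Table 3-10 (24 answers of ‘1’ and 8 of ‘2’) gives M = 1.25 and SD ≈ 0.433. A sketch of that computation:

```python
def mean_sd(counts):
    """Mean and population standard deviation of Likert answers,
    given counts per category (index 0 = scale value 1)."""
    n = sum(counts)
    m = sum((i + 1) * c for i, c in enumerate(counts)) / n
    var = sum(c * ((i + 1) - m) ** 2 for i, c in enumerate(counts)) / n
    return m, var ** 0.5

m, sd = mean_sd([24, 8, 0, 0, 0])  # question 4 of Table 3-10
print(m)   # 1.25
print(sd)  # 0.4330127018922193
```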


[Bar chart ‘USABILITY’: mean agreement (scale 0–4.5) per question number, 1–10.]

Figure 3-7: Answer Trend for the Usability Questionnaire

3.3.3 Conclusions The main conclusions concerning the human-factors-related results provided by the EMMA RTS exercise for Milan Malpensa airport, particularly for the acceptability and usability indicators, are as follows: Concerning the acceptability of the EMMA system (as expressed by comfort, satisfaction, ease of use, and performance improvement), the controllers rated the system with 5 on a 6-point scale (1 denotes the lowest score and 6 the highest). A similar qualitative scale was used for the assessment of the usability of the EMMA system based on the controllers’ judgement, which covered 10 main aspects. A t-test of the ratings revealed no significant difference concerning the usability of the system between the runs with and without the EMMA system. Each of the MPX RTS runs was followed by a debriefing session to collect the overall judgements and remarks of the participating controllers regarding the EMMA system. The following issues were raised by the controllers during the corresponding debriefing sessions: i) labels were proposed to appear for aircraft on the apron that are just departing, ii) the labels should not appear on the approach display, iii) the labels of aircraft just arriving at a gate should disappear, iv) the need not to be head-down too much was stressed, and v) it was believed that the EMMA Level I system would definitely increase capacity under low visibility conditions.

On the basis of the answers provided by the operational controllers involved in the shadow-mode trials at the Malpensa test site, the conclusion is that they tend to believe that the system is easy to use, and they found no difficulty in learning how to use it.


Concerning the improvement of operations, the controllers believe that the system is not expected to decrease the level of safety of the controllers’ operations. In addition, they anticipated that the system would be helpful in performing their tasks. It should be pointed out, though, that the controllers also believe that the introduction of the system may be associated with new types of human error. Regarding the presentation quality of the information provided by the system, the controllers tend to believe that the system’s display is user-friendly and that the information it provides is easy to understand. Concerning the user-friendliness of the system, the controllers identified no difficulty in utilising the services provided by the system. Finally, the controllers tend to believe that the system provides consistent and sufficient information for performing their tasks.


4 Analysis and Description of Operational Improvements

4.1 Prague

4.1.1 Introduction

4.1.1.1 Participants of RTS and On-site Trials A total of 11 ANS CR controllers in four groups participated in the two phases of the EMMA real-time simulations: five controllers in the first phase and six in the second RTS phase. Controllers of the first phase were confronted with conflict scenarios to test the operational feasibility and operational improvements of the A-SMGCS conflict alert service. Controllers of the second RTS phase also used the conflict alert service, but the traffic scenarios ran without evoked conflicts. Separately from the experimental RTS 2 exercises, controllers of the third and fourth group (RTS2) were requested to perform a test run using electronic flight strips and a departure manager and to comment on this new A-SMGCS service. All participants were male. The table below outlines the distribution of the controllers over the RTS phases.

Subject Sex RTS phase Conflict scenarios Groups

C1 M 1 X

C2 M 1 X

C3 M 1 X

1st group

C4 M 1 X

C5 M 1 X 2nd group

C6 M 2

C7 M 2

C8 M 2

3rd group

C9 M 2

C10 M 2

C11 M 2

4th group

Table 4-1: Allocation of Controllers to the Groups and RTS Phases

In the operational field trials, a total of 15 ANS CR controllers filled out the debriefing questionnaire during EMMA. All 15 had worked with the A-SMGCS for seven months at the time of the investigation. The 144-item debriefing questionnaire was given to the 15 ANS CR controllers after their regular shift.

4.1.1.2 Experimental Design of RTS and On-site Trials

4.1.1.2.1 Real-time Simulation Trials

The RTS experimental design is based on real experiments: the same scenarios are used for the Baseline set-up and the A-SMGCS set-up in order to achieve ceteris paribus conditions. In this way, results from the Baseline system can be directly compared with the A-SMGCS within a traffic scenario. However, traffic characteristics cannot be compared across the traffic scenarios, because the amount of traffic and the runway configurations differ: Prague regulations allow only low or medium traffic and runway 24 operations under CAT II/III conditions in low visibility, whereas scenarios A and B, operated under VIS-2 and VIS-1 conditions respectively, use other runway configurations and higher traffic volumes. In conclusion, the following matrix shows the experiment set-up in terms of experimental factors:

Visibility (scenario) | SYS 1 SMGCS (Baseline) | SYS 2 A-SMGCS Level II
VIS-1 (scenario B) | X | X
VIS-2 (scenario A) | X | X
VIS-3 (scenario C) | X | X

Table 4-2: Combination of Experimental Factors

As already mentioned above, emphasis was put on realistic traffic scenarios to measure the potential influence of using an A-SMGCS at Prague Airport. Realistic scenarios come with realistic traffic amounts in accordance with the visibility conditions, which comply with the local CAT II/III regulations. This coupling of traffic amount and visibility prevents an objective comparison of operational improvements between visibility conditions. However, the comparisons between A-SMGCS and Baseline are the most relevant, and they reveal significant operational improvements. For more details see the Prague Test Report D6.3.1 [12]. With the Prague Test Plan (D6.1.2, [7]), high-level and low-level V&V objectives were translated into measurable indicators and measurement instruments. The following table gives an overview of the operational improvements that were intended to be measured with the real-time simulation exercises:

High-level Objective | Low-level Objective | Indicator | Measurement Instruments | Objective/Subjective
Safety | Reduced number of incidents and accidents | 1. Number of incidents and accidents | Observations | Objective
Safety | Faster identification and mitigation of safety hazards | 2. Time for conflict detection, identification, and resolution | Observations | Objective
Efficiency/Capacity | Higher maximum number of aircraft handled | 1. Number of aircraft handled (6) | Recordings | Objective
Efficiency/Capacity | Lower holding time per aircraft | 2. Holding time (7) | Recordings | Objective
Efficiency/Capacity | Lower taxi time for in- and outbound traffic | 3. Taxi time | Recordings | Objective
Efficiency/Capacity | Lower duration of radio communications | 4. Duration of radio communications (R/T load) | Recordings | Objective
Efficiency/Capacity | Lower number of requests to the pilot to report her/his position | 5. Number of requests to the pilot to report her/his position (8) | Observations | Objective
Human Factors | Higher situation awareness | 1. Situational awareness | SASHA_Q, SASHA on-line | Subjective
Human Factors | Convenient level of workload | 2. Workload | ISA | Subjective

Table 4-3: Low-level Objectives, Indicators, and Measurement Instruments for OIs in the RTS

(6) The "number of aircraft handled by the controllers" was given by the traffic scenario itself. Differences in terms of efficiency can be seen with the "taxi time".
(7) The "holding time" of aircraft during taxiing could not be recorded.

4.1.1.2.2 Field Trials

In the field trials, the 144-item debriefing questionnaire was given to each controller after finishing her/his regular shift. The ANS CR controllers had worked with the fully operational A-SMGCS for more than 7 months at the time of the field trials. The questions/statements could be rated from 1 (strongly disagree) up to 6 (strongly agree). With the operational feasibility tests (see chapter 3.1.2.2.1), full sets of performance/operational requirements and procedures had been tested for their operational feasibility. To fully validate a system, it must also be shown that new services and procedures contribute to an operational improvement. The items of the debriefing questionnaire referred to four areas of interest for measuring these operational improvements:
- Safety
- Capacity (in terms of throughput)
- Efficiency
- Human Factors aspects

4.1.2 Safety

4.1.2.1 Number of Incidents and Accidents

Identifier | Hypothesis
OI-SAF1-H0 | There is no difference in terms of number of incidents between the Baseline and the A-SMGCS Level II.
OI-SAF1-H1 | The number of incidents decreases as an effect of introducing the A-SMGCS application and the related procedures.

No accidents were observed during the RTS. Incidents occurred, but they were caused by the pseudo-pilots and thus were not human errors in terms of controller mistakes. In general, controller errors are very rare and thus hard to assess in test trials. The H0 hypothesis OI-SAF1-H0 could not be rejected with the experimental design used.

(8) As the surveillance service worked with a 100% performance, there was no need for the controller to request a pilot to report her/his position. In the baseline condition in VIS3, procedural control was applied. Differences in terms of efficiency can be seen in the "R/T load".


4.1.2.2 Reaction Time for Conflict Detection

Identifier | Hypothesis
OI-SAF2-H0 | There is no difference in terms of time between the start of a conflict and its resolution by the controllers between the Baseline and the A-SMGCS Level II.
OI-SAF2-H1 | The time between the start of a conflict and its resolution by the controllers decreases as an effect of introducing the A-SMGCS application and the related procedures.

The reaction time was measured by an observer as the time between the initiation of a conflict and the reaction of a controller, where the controller's reaction was defined as the moment the controller contacted the pilots to resolve the conflict. Pilots in the simulation were not real pilots but pseudo-pilots. They were instructed to cause the conflict situations outlined in [7]. The kind and number of conflict situations were adapted to the dynamics of each traffic scenario, which is why they vary slightly between test runs and scenarios. Therefore, the reaction times were pooled over scenarios and controllers, but analysed separately with respect to the controller working positions TEC and GEC. The TPC was not affected by conflict situations and therefore was not analysed. The following tables outline the statistical values and the statistical test results. A t-test for paired differences was conducted to test the data for statistical significance:

CWP | Condition | Mean | N | SD | SE
TEC | A-SMGCS | 5.3077 | 13 | 4.32791 | 1.20035
TEC | Baseline | 6.0000 | 13 | 2.94392 | 0.81650

Table 4-4: Means, SD, and SE for the TEC's Reaction Time (RTS 1 only)

CWP | Condition | Mean | N | SD | SE
GEC | A-SMGCS | 3.8571 | 14 | 3.57033 | 0.95421
GEC | Baseline | 3.8571 | 14 | 1.40642 | 0.37588

Table 4-5: Means, SD, and SE for the GEC's Reaction Time (RTS 1 only)

Figure 4-1: Bar Charts of the Mean Reaction Time for TEC and GEC Position (RTS 1 only) [TEC: Baseline 6.00 s vs. A-SMGCS 5.31 s; GEC: 3.86 s in both conditions]


Pair | M | SD | SE | T | df | p-value
TEC: A-SMGCS - Baseline | -0.69231 | 4.46065 | 1.23716 | -0.560 | 12 | 0.30
GEC: A-SMGCS - Baseline | 0.00000 | 3.48624 | 0.93174 | 0.00 | 13 | 1.000

Table 4-6: t-tests for Paired Differences: Reaction Time of TEC and GEC Position (RTS 1 only)

4.1.2.2.1 Concluding Results

The results show no significant differences in reaction time between A-SMGCS and the Baseline condition, neither for the TEC (M = -0.69 seconds, T(12) = -0.560, p > .05) nor for the GEC position (M = 0.00 seconds, T(13) = 0.00, p > .05). For the TEC position there is a trend that controllers react faster in the A-SMGCS condition, but the effect is apparently too small to be proven with only 13 pairs of conflict situations. For the GEC position, no difference was measured. In addition, the test observer reported that reaction times are hard to measure. In particular, assessing the moment when a conflict is initiated, or when it can be identified as a potential conflict situation, is a rather subjective estimation by the observer. Additional error variance can be assumed from the fact that conflict situations always differ slightly, even when they happen at the same time in the same traffic scenario. Consequently, the sample of conflict situations in an RTS would have to be very large to randomise these side effects and to show significant differences. However, the greater the number of conflict situations, the less natural the traffic scenario. The H0 hypothesis OI-SAF2-H0 cannot be rejected.
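The paired-differences test used here can be sketched as follows. This is an illustrative Python sketch, not the project's analysis code; the sample values are invented, and only the t statistic and degrees of freedom are computed (the p-value is then read from the t distribution with the returned df):

```python
import math
from statistics import mean, stdev

def paired_t_test(a_smgcs, baseline):
    """Paired-differences t-test: returns (mean difference, t, df).

    a_smgcs and baseline are equal-length lists of reaction times (s)
    for the same conflict situations under both conditions."""
    diffs = [x - y for x, y in zip(a_smgcs, baseline)]
    n = len(diffs)
    d_mean = mean(diffs)              # mean paired difference
    se = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    return d_mean, d_mean / se, n - 1

# Illustrative values only (not the recorded RTS data):
d, t, df = paired_t_test([4.0, 5.5, 6.0], [5.0, 6.5, 8.0])
# d ≈ -1.33, t = -4.0, df = 2
```

The negative mean difference corresponds to the direction of the hypothesis OI-SAF2-H1 (shorter reaction times with A-SMGCS).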

4.1.2.3 Debriefing Questionnaire (Field Trials)

The following general hypotheses describe the expectations for the controllers' answers with respect to the 'operational improvement':

Identifier | Hypothesis
OI-SAF3-H0 | The controllers' opinion does not agree with the 'safety' aspects expressed by a specific safety item.
OI-SAF3-H1 | The controllers' opinion agrees with the 'safety' aspects expressed by a specific safety item.

A star (*) attached to the p-value means that a questionnaire item has been answered significantly, because the p-value is equal to or less than the critical error probability α = 0.05. When the controllers significantly express their acceptance of a single service or procedure item, it can be assumed that the operational improvement with respect to this statement is proven. Items written in italics could not be answered meaningfully because the controllers had limited or no operational experience with the topic (e.g. except in the case of lit stop bar crossing, no system alerts have been used operationally by the ATCos). When controller comments were given on an item, they are reported directly below the statement. A one-sample t-test has been applied to test the data for statistical significance for all 144 items:


- One-sample t-test
- Expected mean value = 3.5
- Answers from 1 (disagreement) through 6 (agreement)
- N = 15
- α = 0.05
- p-value is single-sided because of the use of a directed hypothesis

VA-Id. | Questionnaire Item | N | Mean | SD | p
28 | When procedures for LVO are put into action, A-SMGCS helps me to operate safer. | 15 | 5.4 | 0.5 | 0.00*
50 | A-SMGCS is helpful for better monitoring aircraft commencing its take-off roll. | 15 | 5.0 | 1.3 | 0.00*
61 | I think A-SMGCS can help me to detect or prevent runway incursions. | 13 | 4.9 | 1.0 | 0.00*
62 | I think A-SMGCS can help me to detect or prevent incursions into restricted areas. | 13 | 5.0 | 0.8 | 0.00*
120 | The use of A-SMGCS endangers safety at the airport. | 10 | 2.1 | 1.2 | 0.00*
129 | There is a risk of focusing too much on a single problem when using A-SMGCS. | 10 | 2.9 | 1.1 | 0.12

Table 4-7: Debriefing Questionnaire - Means, SD, and p-value for 'Safety' OI Items
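The one-sample t-test against the expected mean of 3.5 can be sketched as follows; this is an illustrative Python sketch with invented ratings (not the recorded answers). Only the t statistic is computed; the single-sided p-value is then read from the t distribution with df = n - 1:

```python
import math
from statistics import mean, stdev

def one_sample_t(ratings, expected_mean=3.5):
    """One-sample t statistic for questionnaire ratings (scale 1..6).

    Tests the directed hypothesis that the mean rating exceeds the
    scale midpoint of 3.5; returns (t, df)."""
    n = len(ratings)
    se = stdev(ratings) / math.sqrt(n)  # standard error of the mean
    return (mean(ratings) - expected_mean) / se, n - 1

# Illustrative ratings only (not the recorded answers):
t, df = one_sample_t([5, 6, 5, 6, 5, 6])
# t ≈ 8.94, df = 5
```

A large positive t with a single-sided p ≤ 0.05 corresponds to a starred item in the tables above.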

4.1.3 Efficiency/Capacity

4.1.3.1 Taxi Time

Identifier | Hypothesis
OI-EFF1-H0 | There is no difference in terms of global taxiing time between the Baseline and the A-SMGCS Level II.
OI-EFF1-H1 | The global taxiing time is reduced as an effect of introducing the A-SMGCS Level II application and related procedures.

The taxi time was measured automatically for each aircraft. For outbound movements, it started from the gate (velocity > 0 kts) and ended when the wheels left the ground (take-off); for inbound movements, it started when the wheels touched the ground (touch-down) and ended when the velocity was 0 at the gate or stand. Since identical traffic scenarios were used for the A-SMGCS and Baseline trials (except that the call-signs were changed to alleviate recall effects with controllers), pairs of identical taxiing aircraft within identical traffic scenarios could be obtained. This guaranteed that measured differences could be attributed to the better efficiency of A-SMGCS in reducing the average taxi times.
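The outbound measurement rule above can be sketched as follows; the trajectory record format used here is a hypothetical assumption for illustration and may differ from the actual recordings:

```python
def outbound_taxi_time(track):
    """Taxi time (s) for an outbound movement, following the report's
    definition: from the first sample with ground speed > 0 kts until
    the first airborne (wheels-off) sample.

    `track` is a hypothetical list of (time_s, speed_kts, airborne)
    samples ordered by time; the real recordings may differ."""
    start = next(t for t, v, air in track if v > 0)   # leaves the gate
    end = next(t for t, v, air in track if air)       # wheels off
    return end - start

# Illustrative track: push at t=10 s, airborne at t=430 s -> 420 s taxi time
track = [(0, 0, False), (10, 3, False), (200, 15, False),
         (425, 140, False), (430, 155, True)]
```

The inbound case is symmetric: from the touch-down sample until the first sample with velocity 0 at the stand.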

Scenario | In-/Outbound | A-SMGCS Avg Taxi Time (sec) | Baseline Avg Taxi Time (sec) | Difference (sec) | df | t | p-value
A | In | 398 | 417 | -19 | 43 | -0.786 | 0.22
A | Out | no valid data | no valid data | | | |
B | In | 414 | 451 | -37 | 64 | -1.541 | 0.07
B | Out | 683 | 820 | -137 | 65 | -6.370 | 0.00*
C | In | 431 | 460 | -29 | 59 | -1.941 | 0.03*
C | Out | 532 | 674 | -142 | 69 | -6.351 | 0.00*
Total | | 500 | 579 | -79 | 304 | -7.728 | 0.00*

Table 4-8: Taxi Time Results (RTS 1)

Scenario | In-/Outbound | A-SMGCS Avg Taxi Time (sec) | Baseline Avg Taxi Time (sec) | Difference (sec) | df | t | p-value
A | In | 444 | 423 | 21 | 35 | 0.820 | 0.21
A | Out | 659 | 629 | 30 | 30 | 0.795 | 0.22
B | In | 422 | 497 | -75 | 34 | -1.883 | 0.03*
B | Out | 625 | 778 | -153 | 30 | -4.782 | 0.00*
C | In | 426 | 457 | -31 | 23 | -0.734 | 0.24
C | Out | 478 | 430 | 48 | 21 | 1.732 | 0.05
Total | | 510 | 540 | -30 | 178 | -1.973 | 0.03*

Table 4-9: Taxi Time Results (RTS 2)


Figure 4-2: Total Average Taxi Times for RTS 1 and RTS 2 [sec] (RTS 1: Baseline 579, A-SMGCS 500; RTS 2: Baseline 540, A-SMGCS 510)

4.1.3.1.1 Concluding Results

The results show significant differences in taxi times between A-SMGCS and the Baseline condition for both RTS phases: for RTS 1 (MTotal = -79 seconds, T(304) = -7.728, p < .05) and for RTS 2 (MTotal = -30 seconds, T(178) = -1.973, p < .05). It has to be considered that pseudo-pilots are not affected by reduced visibility conditions and that the speed of the aircraft they control stays at a constant level; measured differences can therefore only be interpreted as more efficient control by the controllers using A-SMGCS. As the patterns of Table 4-8 and Table 4-9 show, the differences are particularly large in scenario B, where the visibility is good but the traffic volume is the highest. Furthermore, the RTS 2 results should be more reliable than the RTS 1 results, because many movements in RTS 1 were affected by evoked conflict situations that were not applied in RTS 2. However, even in RTS 1, taxi times are significantly lower with A-SMGCS compared with the Baseline condition. The H0 hypothesis OI-EFF1-H0 can be rejected and the alternative hypothesis OI-EFF1-H1 can be assumed to be valid: A-SMGCS reduces taxi times.

4.1.3.2 Radio Communication Load

Identifier | Hypothesis
OI-EFF4-H0 | There is no difference in terms of duration of radio communications between the Baseline and the A-SMGCS Level II.
OI-EFF4-H1 | The total duration of radio communications is reduced as an effect of introducing the A-SMGCS Level II application and related procedures.

In both phases of the RTS, the duration of radio communication was measured for each controller working position. The duration of a test run was one hour (3600 sec). However, if a test run lasted longer than 3600 seconds, the recording file was cut after 3600 seconds. Therefore, all R/T durations presented here refer to an overall test time of 3600 seconds.

CWP | Condition | Mean M | Standard Deviation SD | Sample Size N
TPC | A-SMGCS | 522.0 | 74.6 | 6
TPC | Baseline | 576.0 | 55.7 | 6
TEC | A-SMGCS | 1413.6 | 151.8 | 15
TEC | Baseline | 1764.0 | 106.2 | 15
GEC | A-SMGCS | 1363.2 | 217.2 | 15
GEC | Baseline | 1560.0 | 385.9 | 15
Total | A-SMGCS | 1244.0 | 369.6 | 36
Total | Baseline | 1481.0 | 491.8 | 36

Table 4-10: R/T Load Means, SD, and Sample Size (RTS 1)

Figure 4-3: Means of R/T Load between A-SMGCS and Baseline for each CWP [sec/hr] (RTS 1) (TPC: Baseline 576 vs. A-SMGCS 522; TEC: 1764 vs. 1413; GEC: 1560 vs. 1363)

CWP | Condition | Mean M | Standard Deviation SD | Sample Size N
TPC | A-SMGCS | 582 | 76.9 | 6
TPC | Baseline | 720 | 109.1 | 6
TEC | A-SMGCS | 1626 | 322.3 | 6
TEC | Baseline | 1710 | 275.8 | 6
GEC | A-SMGCS | 1668 | 222.6 | 6
GEC | Baseline | 1932 | 371.8 | 6
Total | A-SMGCS | 1292 | 560.4 | 18
Total | Baseline | 1454 | 600.3 | 18

Table 4-11: R/T Load Means, SD, and Sample Size (RTS 2)

Figure 4-4: Means of R/T Load between A-SMGCS and Baseline for each CWP [sec/hr] (RTS 2) (TPC: Baseline 720 vs. A-SMGCS 582; TEC: 1710 vs. 1626; GEC: 1932 vs. 1668)

Source | Sum of Squares | df | F | p-value
CWP | 9772362 | 2 | 98.606 | 0.000*
A-SMGCS | 602402 | 1 | 12.157 | 0.001*
CWP x A-SMGCS | 209034 | 2 | 2.109 | 0.129
Error | 3270456 | 66 | |
Total | 147924144 | 72 | |

Table 4-12: R/T Load Test for Significance (two-way ANOVA [F-test]) (RTS 1)

Source | Sum of Squares | df | F | p-value
CWP | 9487656.000 | 2 | 73.806 | 0.000*
A-SMGCS | 236196.000 | 1 | 3.675 | 0.065
CWP x A-SMGCS | 51192.000 | 2 | 0.398 | 0.675
Error | 1928232.000 | 30 | |
Total | 79567920.000 | 36 | |

Table 4-13: R/T Load Test for Significance (two-way ANOVA [F-test]) (RTS 2)

4.1.3.2.1 Concluding Results

The two-way 2x3 ANOVA shows a significant effect of A-SMGCS in RTS 1, with a mean difference of 237 seconds per hour less R/T load (F(1,66) = 12.2, p < .05). In RTS 2, a difference of 162 seconds between A-SMGCS and Baseline was measured; this shows a positive trend in favour of the H1 hypothesis but did not reach significance (F(1,30) = 3.7, p > .05). However, a p-value of 0.065 is rather close to significance, and with a greater sample size the effect might well be proven. When interpreting the results, it has to be taken into account that in RTS 1 the controllers were frequently interrupted by evoked conflict situations, which did not occur in RTS 2. However, as the impact of conflicts was equal in both conditions (A-SMGCS and Baseline), a systematic effect on the variance can be excluded. OI-EFF4-H0 can be rejected and the alternative H1 can be assumed: A-SMGCS reduces the R/T communication load.
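The sums of squares behind such a two-way ANOVA can be sketched as follows for a balanced design. This is an illustration of the principle only, not the project's analysis code (the RTS cell sizes are in fact not fully balanced); F values then follow as mean-square ratios against the error term:

```python
from statistics import mean

def two_way_ss(cells):
    """Sums of squares for a balanced two-factor design (e.g. CWP x system).

    cells maps (factor_a_level, factor_b_level) -> equally sized lists of
    observations. Returns (ss_a, ss_b, ss_ab, ss_error)."""
    a_levels = sorted({a for a, _ in cells})
    b_levels = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))        # observations per cell
    grand = mean([x for obs in cells.values() for x in obs])
    a_mean = {a: mean([x for b in b_levels for x in cells[(a, b)]])
              for a in a_levels}
    b_mean = {b: mean([x for a in a_levels for x in cells[(a, b)]])
              for b in b_levels}
    cell_mean = {k: mean(v) for k, v in cells.items()}
    # Main effects, interaction, and residual error:
    ss_a = n * len(b_levels) * sum((m - grand) ** 2 for m in a_mean.values())
    ss_b = n * len(a_levels) * sum((m - grand) ** 2 for m in b_mean.values())
    ss_ab = n * sum((cell_mean[(a, b)] - a_mean[a] - b_mean[b] + grand) ** 2
                    for a in a_levels for b in b_levels)
    ss_err = sum((x - cell_mean[k]) ** 2 for k, v in cells.items() for x in v)
    return ss_a, ss_b, ss_ab, ss_err

# Tiny invented example: factor A (CWP) shifts the mean, factor B does not.
ss_a, ss_b, ss_ab, ss_err = two_way_ss({
    ("TPC", "A-SMGCS"): [1, 1], ("TPC", "Baseline"): [1, 1],
    ("TEC", "A-SMGCS"): [3, 3], ("TEC", "Baseline"): [3, 3]})
# ss_a = 8.0; ss_b, ss_ab, ss_err = 0.0
```

For the unbalanced RTS data, a statistics package would be used instead, but the decomposition into main effects, interaction, and error is the same one reported in Tables 4-12 and 4-13.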

4.1.3.3 Debriefing Questionnaire (Field Trials)

The following general hypotheses describe the expectations for the controllers' answers with respect to the 'operational improvement':

Identifier | Hypothesis
OI-EFF6-H0 | The controllers' opinion does not agree with the 'efficiency/capacity' aspects expressed by a specific efficiency/capacity item.
OI-EFF6-H1 | The controllers' opinion agrees with the 'efficiency/capacity' aspects expressed by a specific efficiency/capacity item.

A star (*) attached to the p-value means that a questionnaire item has been answered significantly, because the p-value is equal to or less than the critical error probability α = 0.05. When the controllers significantly express their acceptance of a single service or procedure item, it can be assumed that the operational improvement with respect to this statement is proven. Items written in italics could not be answered meaningfully because the controllers had limited or no operational experience with the topic (e.g. except in the case of lit stop bar crossing, no system alerts have been used operationally by the ATCos). When controller comments were given on an item, they are reported directly below the statement. A one-sample t-test has been applied to test the data for statistical significance for all 144 items:
- One-sample t-test
- Expected mean value = 3.5
- Answers from 1 (disagreement) through 6 (agreement)
- N = 15
- α = 0.05
- p-value is single-sided because of the use of a directed hypothesis


VA-Id. | Questionnaire Item | N | Mean | SD | p
9 | When visual reference is not possible, I think identifying an aircraft or vehicle is more efficient when using the surveillance display. | 15 | 5.2 | 1.3 | 0.00*
10 | I think, also in good visibility conditions, identifying an aircraft or vehicle is even more efficient when using the surveillance display. | 15 | 5.2 | 0.6 | 0.00*
11 | Recognition of the aircraft type is more efficient with A-SMGCS. | 15 | 5.0 | 1.1 | 0.00*
Comment by ATCo F: depends on information in a label. When I have a type in a label then I agree.
Comment by ATCo K: If real type is identified with flight plan one.
29 | When procedures for LVO are put into action, A-SMGCS helps me to operate more efficiently. | 15 | 5.2 | 0.8 | 0.00*
38 | When visual reference is not possible I think longitudinal spacing on taxiways can be reduced with A-SMGCS. | 15 | 4.7 | 1.3 | 0.00*
39 | Without visual reference but using A-SMGCS, it would no longer be necessary to make records of vehicles on the manoeuvring area. | 15 | 3.8 | 1.2 | 0.35
41 | Coordination between involved control positions is more efficient with A-SMGCS. | 10 | 4.5 | 1.1 | 0.00*
42 | With A-SMGCS, hand-over processes between different control positions are more efficient. | 15 | 4.6 | 1.3 | 0.03*
Comment by ATCo F: We have no hand-over procedures, so?
Comment by ATCo H: hand-over procedures not applicable.
43 | The number of position reports will be reduced when using A-SMGCS (e.g. aircraft vacating runway-in-use). | 15 | 4.9 | 0.7 | 0.00*
45 | In good visibility, line-up at the runway threshold is easier to control with A-SMGCS. | 15 | 4.7 | 1.0 | 0.00*
46 | When the runway threshold is not visible, line-up is easier to control with A-SMGCS. | 15 | 5.3 | 0.5 | 0.00*
47 | In good visibility, line-up from intersection is easier to control with A-SMGCS. | 14 | 4.7 | 1.3 | 0.00*
51 | With A-SMGCS, a clearance for a rolled take-off can be issued more frequently. | 15 | 4.6 | 0.9 | 0.00*
52 | In good visibility, take-offs from intersection are easier to control with A-SMGCS. | 15 | 4.5 | 1.1 | 0.00*
53 | When an intersection is not visible, take-offs from the intersection are easier to control with A-SMGCS. | 15 | 5.1 | 0.6 | 0.00*
58 | The transition from normal operations to low visibility operations is easier with A-SMGCS. | 15 | 4.9 | 0.9 | 0.00*
72 | The control of aircraft with the A-SMGCS is very efficient. | 15 | 5.1 | 0.8 | 0.00*
74 | A-SMGCS reduces waiting times for aircraft at the airport. | 15 | 3.9 | 1.5 | 0.35
112 | With A-SMGCS, it is easier to separate aircraft safely. | 15 | 4.9 | 1.0 | 0.00*
Comment by ATCo E: Not in the air.
113 | With A-SMGCS, it is easier to detect runway incursions. | 14 | 4.9 | 1.2 | 0.00*
Comment by ATCo E: Not if warnings are not on (not used in Prague).
114 | With A-SMGCS, it is easier to detect incursions into closed taxiways. | 13 | 4.8 | 1.3 | 0.00*
Comment by ATCo E: Not if warnings are not on (not used in Prague).
Comment by ATCo F: Not applicable.
115 | With A-SMGCS, it is easier to detect incursions into protected areas. | 15 | 4.8 | 1.3 | 0.00*
Comment by ATCo E: Not if warnings are not on (not used in Prague).
116 | With A-SMGCS, it is easier to detect aircraft on the apron. | 15 | 4.5 | 1.1 | 0.00*
Comment by ATCo E: Apron in common settings is suppressed.
121 | I think that the A-SMGCS increases traffic throughput at the airport. | 15 | 4.0 | 1.1 | 0.22
121a | When the traffic demand is higher than the current capacity, I think the traffic throughput can be increased with A-SMGCS. | 5 | 3.4 | 1.5 | 0.89
122 | The A-SMGCS enables me to handle more traffic when visual reference is not possible. | 15 | 4.3 | 1.1 | 0.01*
124 | The A-SMGCS enables me to execute my tasks more efficiently. | 14 | 4.5 | 1.1 | 0.00*
128 | There are less frequent unexpected calls of A/C and vehicles with A-SMGCS. | 12 | 3.4 | 1.3 | 0.84
137 | The use of A-SMGCS facilitates information gathering and interpretation. | 15 | 4.6 | 0.7 | 0.00*

Table 4-14: Debriefing Questionnaire - Means, SD, and p-value for 'Efficiency/Capacity' OI Items

4.1.4 Human Factors

4.1.4.1 Situation Awareness

Identifier | Hypothesis
OI-HF1-H0 | The ATCos' situational awareness in the Baseline condition is higher or at least equal compared to the A-SMGCS Level II test condition.
OI-HF1-H1 | The ATCos' situational awareness is improved as an effect of introducing the A-SMGCS Level II application and the related procedures.

4.1.4.1.1 SASHA Questionnaire Results

After each test run, the controllers' situation awareness was measured with the SASHA questionnaire. This questionnaire was developed within the project 'Solutions for Human-Automation Partnerships in European ATM' (SHAPE) conducted by EUROCONTROL (2003)9. The questionnaire uses a five-point scale and contains 12 questions: eight address generic subjective aspects of SA referring to the work of an ATCo, three address aspects of specific tools, and one addresses SA globally. Each ATCo completed the questionnaire at the end of a test run.

9 http://www.eurocontrol.int/humanfactors/gallery/content/public/docs/DELIVERABLES/HF35-HRS-HSP-005-REP-01withsig.pdf


These ratings have been merged into two scores per controller: one for the EMMA A-SMGCS condition and one for the Baseline condition. The following overview shows the mean values for each of the 12 SASHA questionnaire items. A star with a p-value marks a significant item, meaning that the controllers commonly saw differences between the A-SMGCS and Baseline conditions in the direction of the alternative hypothesis (H1).
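The merging step can be sketched as follows; the flat record format (controller, condition, value) is a hypothetical assumption for illustration, not the actual data layout:

```python
from collections import defaultdict
from statistics import mean

def merge_scores(ratings):
    """Merge per-run SASHA ratings into one mean score per controller
    and test condition.

    `ratings` is a hypothetical list of (controller, condition, value)
    records, one per answered item and test run."""
    grouped = defaultdict(list)
    for controller, condition, value in ratings:
        grouped[(controller, condition)].append(value)
    return {key: mean(vals) for key, vals in grouped.items()}

# Illustrative records only (not the recorded answers):
scores = merge_scores([("C1", "A-SMGCS", 5), ("C1", "A-SMGCS", 4),
                       ("C1", "Baseline", 4), ("C1", "Baseline", 3)])
# scores[("C1", "A-SMGCS")] = 4.5, scores[("C1", "Baseline")] = 3.5
```

The two resulting scores per controller are then compared pairwise across conditions.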

1. Did you have the feeling that you were ahead of the traffic, able to predict the evolution of the traffic? (Baseline 4.66, A-SMGCS 4.99; p = 0.04*)
2. Did you have the feeling that you were able to plan and organise your work as you wanted? (Baseline 4.72, A-SMGCS 4.85; p = 0.133)
3. Have you been surprised by an aircraft (or vehicle) call that you were not expecting? (Baseline 1.62, A-SMGCS 1.59; p = 0.44)
4. Did you have the feeling of starting to focus too much on a single problem and/or traffic area under your control? (Baseline 1.52, A-SMGCS 1.23; p = 0.05)


5. Did you forget to transfer any aircraft? (Baseline 1.30, A-SMGCS 1.20; p = 0.24)
6. Did you have any difficulty finding an item of information? (Baseline 1.77, A-SMGCS 1.38; p = 0.03*)
7. Do you think the A-SMGCS / SMR display provided you with useful information? (Baseline 3.82, A-SMGCS 4.36; p = 0.02*)
8. Were you paying too much attention to the A-SMGCS / SMR display? (Baseline 2.36, A-SMGCS 2.59; p = 0.19)
9. Did the A-SMGCS / SMR display help you to have a better understanding of the situation? (Baseline 3.73, A-SMGCS 4.55; p = 0.03*)
10. Do you think the RWY incursion alert function provided you with useful information? (A-SMGCS test runs only: 3.62; p = 0.00*)


11. Did the RWY incursion alert function help you to have a better understanding of the situation? (A-SMGCS test runs only: 2.67; mean is lower than 3)
12. How would you rate your overall Situation Awareness during this exercise? (Baseline 4.08, A-SMGCS 4.59; p = 0.01*)

4.1.4.1.1.1 Concluding Results

Six of the 12 questionnaire items were answered significantly in the expected direction, five showed the right trend but without significance, and item 11 was answered in the non-expected direction. However, the main situation awareness item 12 was answered significantly, supporting the hypothesis OI-HF1-H1, which expects a higher SA with A-SMGCS use. With this result, OI-HF1-H0 can be rejected and H1 can be assumed as the valid alternative: A-SMGCS increases the controllers' situation awareness.

4.1.4.1.2 SASHA On-line Questionnaire Results

This technique is based on the Situation Present Assessment Method (SPAM). The five ATCos of RTS 1 were asked three questions by a subject matter expert (SME) via their intercom within each test run. This was done while the simulation was still running, i.e. the simulation was not frozen as in the classical SAGAT query technique. The following questions were asked:
1. Where is flight x?
2. Is flight y under your control?
3. Which flight has to be transferred next?
The SA of the ATCos is usually so high that they rarely give wrong answers. The following table shows the wrong answers, summed over the controllers and controller working positions:

Scenario | A-SMGCS | Baseline
A | 1 | 1
B | 1 | 3
C | 0 | 1
Total | 2 | 5

Table 4-15: Wrong Answers with the SASHA On-line Query (RTS 1 only)

4.1.4.1.2.1 Concluding Results

In total, 180 questions (3 per test run and CWP) were asked, but only 7 wrong answers were given. Of these 7 wrong answers, 5 were given when A-SMGCS was not used. This result is far from statistical significance, but it further supports the hypothesis OI-HF1-H1, which expects a higher SA with A-SMGCS use.

4.1.4.2 Workload

Identifier | Hypothesis
OI-HF3-H0 | When workload is at a non-convenient level, the controllers' workload in the A-SMGCS Level II test condition is not lower compared to the Baseline test condition.
OI-HF3-H1 | When workload is at a non-convenient level in the Baseline condition, the workload with use of an A-SMGCS is reduced in the same scenario.

In every test run, every controller was asked to give a perceived workload rating every 10 minutes. The controller could choose one of five ISA workload categories:
1 = underutilised
2 = relaxed
3 = comfortable
4 = high
5 = excessive
For the analysis, the ISA mid-run workload scores were summed up over each controller for each test run, and the respective mean scores were calculated.

Figure 4-5: Total Means for ISA Workload between A-SMGCS and Baseline Test Conditions (A-SMGCS 2.285, Baseline 2.276, on a scale from 1 to 5)

Factors | df | F-value | p-value
A-SMGCS | 1 | 0.019 | 0.89
error | 10 | |
Traffic scenario | 2 | 4.540 | 0.02*
error | 20 | |
A-SMGCS x scenario | 2 | 0.869 | 0.44
error | 20 | |

Table 4-16: ANOVA with Repeated Measurements for ISA Workload

The ISA workload means (cf. [12] for the means of all six cells) were analysed in a 2 x 3 (A-SMGCS x Scenario) analysis of variance (ANOVA) with repeated measurements on all independent factors. The ANOVA revealed no significant main effect of A-SMGCS (F(1,10) = 0.019; p = 0.89), with a mean of M = 2.285 compared to the Baseline mean of M = 2.276 on a scale from 1 to 5. A significant main effect of the traffic scenario was found (F(2,20) = 4.540; p = 0.02), with traffic scenario C reaching the smallest workload mean (MA = 2.40, MB = 2.30, MC = 2.15); in scenario C, the visibility is the lowest but the traffic volume is also only half that of scenarios A and B.

4.1.4.2.1 Concluding Results Most of the time, the controllers felt relaxed and comfortable in the simulation runs, independent of the test condition (A-SMGCS or Baseline). The traffic scenarios were not demanding enough to stress the controllers. Therefore, A-SMGCS had no chance to show a workload improvement compared to a high or even excessive workload in the Baseline condition. Since a ‘non-convenient workload level’ as stated by OI-HF3-H0 could not be reached, A-SMGCS could not show its benefits in terms of workload reduction. Hence, OI-HF3-H0 cannot be rejected with the experimental design used.

4.1.4.3 Debriefing Questionnaire (Field Trials)

The following general hypotheses had been set up to describe the expectation regarding the controllers’ answers with respect to the ‘operational improvement’:

Identifier  Hypothesis
OI-HF5-H0   The controllers’ opinion does not agree with the ‘Human Factors’ aspects expressed by a specific safety item.
OI-HF5-H1   The controllers’ opinion agrees with the ‘Human Factors’ aspects expressed by a specific safety item.

A star (*) attached to a p-value means that a questionnaire item has been answered significantly, because the p-value is equal to or less than the critical error probability α, which is 0.05. When the controllers significantly express their acceptance of a single service or procedure item, it can be assumed that the operational improvement with respect to this statement is proven. Items written in italics could not be answered meaningfully because the controllers had limited or no operational experience with the topic (e.g. except in the case of lit stop bar crossing, no system alerts have been used operationally by the ATCos). When controller comments were given on an item, they are reported directly below the statement.


A one-sample t-test has been applied to test the data of all 144 items for statistical significance:
- One-sample t-test
- Expected mean value = 3.5
- Answers from 1 (disagreement) through 6 (agreement)
- N = 15
- α = 0.05
- p-value is single-sided because of the use of a directed hypothesis
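The test reduces to comparing the t statistic against a one-sided critical value; a minimal sketch of the computation (the answer sample below is synthetic, not actual questionnaire data):

```python
import math
import statistics

def one_sample_t(sample, mu0=3.5):
    """t statistic and degrees of freedom of a one-sample t-test against the
    expected mean mu0 (here 3.5, the midpoint of the 1-6 agreement scale)."""
    n = len(sample)
    mean = statistics.fmean(sample)
    sd = statistics.stdev(sample)            # sample SD (n - 1 denominator)
    t = (mean - mu0) / (sd / math.sqrt(n))
    return t, n - 1

# Synthetic item answers from 15 controllers on the 1-6 scale
answers = [5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6]
t, df = one_sample_t(answers)

# One-sided critical value for df = 14 at alpha = 0.05 is about 1.761;
# t above this value means significant agreement (marked '*' in the tables)
significant = t > 1.761
```

Items with a mean well below 3.5 (e.g. the workload items phrased negatively) yield a large negative t and are tested against the lower tail instead.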

4.1.4.3.1 Situation Awareness

VA-Id.  Questionnaire Item                                                       N    Mean   SD    p
12      The A-SMGCS display gives me a better position situational awareness
        (where is the traffic).                                                  15   5.4    0.5   0.00*
13      The A-SMGCS display gives me a better identification situational
        awareness (who is who).                                                  15   5.3    0.6   0.00*
125     The A-SMGCS helps me to maintain good situation awareness.               15   5.2    0.4   0.00*
126     ‘Maintaining the Picture’ is supported well by the A-SMGCS.              15   4.9    0.5   0.00*
127     I feel that A-SMGCS enables me to predict better the evolution of the
        traffic (to be ahead of the traffic).                                    15   4.4    1.0   0.00*
131     The A-SMGCS display helps to have a better understanding of the
        situation.                                                               15   5.2    0.4   0.00*

Table 4-17: Debriefing Questionnaire - Means, SD, and p-value for ‘Situation Awareness’ OI Items

4.1.4.3.2 Workload

VA-Id.  Questionnaire Item                                                       N    Mean   SD    p
14      I think identifying the traffic using A-SMGCS increases workload.        15   2.1    1.3   0.00*
59      When procedures for LVO are put into action, A-SMGCS helps me to
        reduce my workload.                                                      15   5.2    0.6   0.00*
73      The use of A-SMGCS makes the controller’s job more difficult.            15   1.7    0.5   0.00*
76      The use of A-SMGCS has a negative effect on job satisfaction.            15   1.5    0.5   0.00*
138     The use of A-SMGCS increases mental effort for checking information
        sources.                                                                 14   2.4    0.9   0.00*
139     The use of A-SMGCS decreases workload for anticipating future traffic
        situations.                                                              15   4.7    0.6   0.00*

Table 4-18: Debriefing Questionnaire - Means, SD, and p-value for ‘Workload’ OI Items

4.1.4.3.3 Human Error

VA-Id.  Questionnaire Item                                                       N    Mean   SD    p
118     The introduction of the A-SMGCS decreases the potential of human
        error.                                                                   14   2.7    1.4   0.05
119     The introduction of the A-SMGCS is associated with new types of
        human error.                                                             13   3.1    1.1   0.20
        Comment by ATCo M: Aircraft at holding point with mixed-up labels
        can lead to calling the wrong aircraft.
        Comment by ATCo O: see item 107
136     The A-SMGCS is useful for reducing mental workload.                      15   4.8    0.7   0.00*

Table 4-19: Debriefing Questionnaire - Means, SD, and p-value for ‘Human Error’ OI Items

4.1.5 Conclusions

The A-SMGCS V&V activities for Prague Ruzyně Airport were carried out on two validation platforms:
- Real-time Simulation (RTS) and
- On-site trials at Prague Ruzyně Airport.

Three levels of V&V activities were performed on both platforms:
- Technical Tests (Verification),
- Operational Feasibility (Validation), and
- Operational Improvements (Validation).

Different objectives were pursued with the different test platforms and levels of testing. The technical tests checked whether the installed A-SMGCS in the simulation or in the Tower environment fulfilled all technical requirements to enable the operational use of the system and to perform the validation activities. The technical tests answered the question: ‘Did we set up the system right?’

RTS and on-site trials focussed on validation activities, but with two different levels of testing. Real-time simulations usually offer a good opportunity to measure operational improvements in terms of objective traffic data (e.g. taxi times, R/T load, etc.). They were also used to investigate safety-critical situations like low visibility conditions or conflict situations without any danger. On-site trials were mainly needed to test the system in the real environment in terms of its technical performance and its operational feasibility. The controllers who worked with the A-SMGCS fully operationally (in all visibility conditions) were asked whether they accepted the A-SMGCS design, performance, and the new operational procedures. All platforms and levels of testing were needed to fully validate the A-SMGCS.


The A-SMGCS installed at Prague Ruzyně Airport has been validated. All technical and operational results affirm the overall main question: ‘Did we build the right system?’

All main technical and operational requirements could be verified (cf. chapter 5.3, Appendix A and Appendix B). For this purpose, technical short- and long-term measurements were conducted. For some requirements the system performance could not fully meet the standards (e.g. ‘Op_Perf-01 - Probability of Detection’ should be 99.9%, but only 99.65% was measured), but the controllers’ acceptance of this performance finally validated the lower value.

For the long-term system performance measurements, the MOGADOR tool was used. MOGADOR is a tool developed in EMMA that fully automatically analyses specific performance parameters from a long-term recorded data pool of the regular airport traffic. This tool revealed interesting results that can also be used to tune and adapt the A-SMGCS to meet the operational needs. However, the Prague long-term results analysed with MOGADOR still lack maturity, because there was not sufficient time to fully adapt the MOGADOR algorithm to the specific Prague airport characteristics, which is always needed to measure the real system performance automatically.

The operational on-site trials (field tests) revealed that the controllers, who had worked with the A-SMGCS fully operationally for 7 months, accepted the A-SMGCS and thus approved its ‘operational feasibility’. Statements like:
- ‘When visual reference is not possible, the displayed position of the aircraft on the taxiways is accurate enough to exercise control in a safe and efficient way.’,
- ‘I think that the A-SMGCS surveillance display could be used to determine that an aircraft has vacated the runway.’,
- ‘The information displayed in the A-SMGCS is helpful for avoiding conflicts.’,
- ‘The A-SMGCS provides the right information at the right time.’, and
- ‘When visual reference is not possible I think the A-SMGCS surveillance display can be used to determine if the runway is cleared to issue a landing clearance.’
have been significantly confirmed by 15 ANS CR controllers. The statements mainly refer to the surveillance service of the A-SMGCS, because the ANS CR controllers have not yet used the full scope of the monitoring and alerting function, but only the ‘stop bar crossing’ alerts as a first step. However, flight tests, which were used to evoke additional conflict situations at crossing runways, showed that the performance of the other monitoring alerts was also accepted by the controllers.

To fully validate a system, it must also show its operational improvements. This was mainly done in real-time simulations, because these simulations can provide real experimental conditions. The most important result of the RTS was that A-SMGCS is able to reduce the average taxi time. In the two simulation phases, the average taxi time was reduced by 13.6% and 7.1% respectively. Both results are highly significant with 968 total movements. Furthermore, A-SMGCS reduces the load of the R/T communication: in RTS1 a reduction of 16.0% and in RTS2 a reduction of 11.1% was measured, although only the RTS1 result was statistically significant. A further operational improvement can be assumed for the ‘controller’s reaction time in case of a conflict situation’: 5.3 seconds instead of 6.0 seconds without A-SMGCS showed an interesting trend but did not become significant. However, with a bigger sample size this small effect could well become significant.

These objective operational improvements, which were measured on the real-time simulation test platform, could also be confirmed by controllers’ subjective statements in the field. Controllers were asked to estimate their perceived safety and efficiency when working with A-SMGCS compared to earlier times when they did not use an A-SMGCS. The following main results were gained:


- ‘When procedures for LVO are put into action, A-SMGCS helps me to operate safer.’,
- ‘I think A-SMGCS can help me to detect or prevent runway incursions.’,
- ‘When visual reference is not possible, I think identifying an aircraft or vehicle is more efficient when using the surveillance display.’,
- ‘I think, also in good visibility conditions, identifying an aircraft or vehicle is even more efficient when using the surveillance display.’,
- ‘The A-SMGCS enables me to execute my tasks more efficiently.’,
- ‘The number of position reports will be reduced when using A-SMGCS (e.g. aircraft vacating runway-in-use).’,
- ‘The A-SMGCS enables me to handle more traffic when visual reference is not possible.’,
- ‘The A-SMGCS display gives me a better situational awareness.’, and
- ‘When procedures for LVO are put into action, A-SMGCS helps me to reduce my workload.’

These examples, which were all answered positively by the controllers, further support the hypothesis that A-SMGCS provides significant operational improvements that will result in operational benefits for all stakeholders of an A-SMGCS.


4.2 Toulouse

4.2.1 Safety

This section gives an overview of the conclusions drawn after shadow-mode trials were run to evaluate the operational improvements brought by the A-SMGCS at Toulouse-Blagnac in terms of safety.

The feelings of the Toulouse-Blagnac controllers are mixed regarding the potential safety improvements brought by the A-SMGCS. The controllers accept the detected amounts of incorrect display, loss of display, and incorrect identifier events and would use the A-SMGCS, but, again, as an additional information source and not as a control tool. The controllers do not accept the detected level of lacking identifiers.

Many incorrect display and lack-of-identifier events were observed during the shadow-mode trials. These events are the consequence of the poor reliability of the detection and identification functions. They have a very negative impact on the controllers’ trust in the system and on their acceptance of the A-SMGCS as a control tool.

The influence of the A-SMGCS on the number of control errors could not be assessed, as the system was implemented for shadow-mode trials only.

4.2.2 Efficiency

The shadow-mode trials and fast-time simulations led to the following conclusions on the operational improvements in efficiency brought by the A-SMGCS at Toulouse-Blagnac.

The A-SMGCS should increase the overall efficiency of the airport in low visibility conditions, but mainly in high traffic situations. This observation, made thanks to the fast-time simulations, was confirmed by the controllers, whose expectations are limited in terms of gain in efficiency at Toulouse-Blagnac. Their judgment can be explained by the current traffic, which, most of the time, can be managed without impacting the airport efficiency. The controllers noted that, in addition to the A-SMGCS, the use of the second runway in low visibility conditions should optimise the airport efficiency. However, this would require substantial work to adapt the ground lighting of the second runway for low visibility operations.

The shadow-mode trials and fast-time simulations revealed that the number of pilot reports should decrease in low visibility conditions. The departure taxi time should decrease in low visibility conditions, whatever the traffic density. In low visibility conditions, the arrival taxi time should be maintained when traffic density is high and should decrease when traffic density is intermediate. Delays due to ATC ground operations should decrease in low visibility conditions, in particular departure delays, and especially if both runways are used.

4.2.3 Capacity

The operational improvements brought by the A-SMGCS in terms of capacity were evaluated during the shadow-mode trials and fast-time simulations at Toulouse-Blagnac. The conclusions of the validation sessions are recapped in this section.

Similarly to the efficiency improvements, the controllers expressed limited expectations for the potential capacity gains brought by the A-SMGCS at Toulouse-Blagnac. Their judgment can be explained by the current traffic, which, most of the time, can be managed without impacting the airport capacity. Moreover, as long as the A-SMGCS is not considered a control tool by the Toulouse-Blagnac controllers, the procedures will not be adapted, and the gain in runway and taxiway capacity should be limited.


The fast-time simulations showed that a gain in departure runway throughput is expected in low visibility conditions and high traffic situations, while throughput should be maintained in intermediate traffic situations. The arrival runway throughput should be maintained. The A-SMGCS does not have any significant impact on the apron and gate capacity. The use of the second runway in low visibility conditions should again increase the overall airport capacity.

4.2.4 Human Factors

The shadow-mode trials led to the following conclusions on the operational improvements in human factors brought by the A-SMGCS at Toulouse-Blagnac.

The situational awareness of the controllers should be improved when using the A-SMGCS. In particular, the controllers should have a better awareness of the position and identification of each mobile, a better understanding of the traffic situation, and a better anticipation of the evolution of the traffic. The A-SMGCS HMI helps the controllers reduce their mental workload. The controllers consider the global usability of the A-SMGCS HMI to be satisfactory.

4.2.5 Conclusion

The A-SMGCS should bring benefits to controllers in terms of human factors and safety. These should be particularly significant if the current system is improved in terms of quality of information and reliability so that it can be used as a control tool.

The benefits brought by the A-SMGCS in terms of capacity and efficiency should be more limited. Here again, improvements are affected by the restriction of the A-SMGCS to an additional information source, with procedures remaining quite similar to the current ones. The amount of traffic at Toulouse-Blagnac also has an influence on the capacity and efficiency improvements: it has been measured that the most important improvements brought by the A-SMGCS should take place during low visibility conditions and high traffic situations, and these situations are limited in time and frequency at Toulouse-Blagnac.

If it is judged to be worthwhile, work could be undertaken to allow the second runway to be used during adverse conditions in order to significantly increase the airport capacity and efficiency with the A-SMGCS.


4.3 Malpensa

As defined in the EMMA V&V Methodology for A-SMGCS (D6.2.1, [10]), the stage named ‘Operational Improvements’ constitutes an intermediate step towards the assessment of the direct impacts of the proposed system. It is part of the Validation phase and refers to the measurement of the potential improvements of the airport system performance in terms of Safety, Capacity, Efficiency, and Human Factors, caused by the use of the EMMA A-SMGCS. This section summarises the main conclusions drawn from the results obtained for Malpensa from both RTS and shadow-mode trials in view of the Operational Improvements definition. An overview of the obtained results is reported here as well.

4.3.1 Safety

Safety measurements were performed through both real-time simulations and shadow-mode trials. Although the amount of collected data was not sufficient for a statistically significant analysis, the RTS was mainly oriented towards a quantitative assessment of safety-related indicators; the on-site shadow-mode trials, on the other hand, allowed the Validation team to make a qualitative evaluation only in terms of safety. The main results obtained from the validation techniques and the conclusions are reported below.

An estimation of the Detection (S2.1/S4.1/S5.1), Resolution (S2.2/S4.2/S5.2), and Duration (S2.3/S4.3/S5.3) periods was extracted by a careful post-processing of each complete recording of the R/T communication between controllers and pseudo-pilots involved in the simulation sessions. Results were produced according to what is described in the RTS V&V Test Plan for Malpensa (D6.1.4, [9]).
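The three periods relate to the R/T timeline of each triggered event as follows; a minimal sketch with hypothetical timestamps (the event, detection, and resolution times are read from the recordings):

```python
def safety_periods(t_event, t_detected, t_resolved):
    """Detection, resolution, and duration periods (in seconds) for one
    triggered non-nominal event, from R/T recording timestamps."""
    detection = t_detected - t_event      # S2.1/4.1/5.1
    resolution = t_resolved - t_detected  # S2.2/4.2/5.2
    duration = t_resolved - t_event       # S2.3/4.3/5.3 (detection + resolution)
    return detection, resolution, duration
```

For example, an event occurring at t = 0 s, detected at t = 24 s, and resolved at t = 29 s yields periods of 24 s, 5 s, and 29 s, matching the pattern of the VIS-1/B mean values in Table 4-20.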

             S2.1/4.1/5.1         S2.2/4.2/5.2          S2.3/4.3/5.3
             (DETECTION PERIOD)   (RESOLUTION PERIOD)   (DURATION PERIOD)
             VIS-1/B   VIS-1/A    VIS-1/B   VIS-1/A     VIS-1/B   VIS-1/A
             0.00.12   0.00.56    0.00.02   0.00.04     0.00.14   0.01.00
             0.00.21   0.01.23    0.00.08   0.00.08     0.00.29   0.01.31
             0.00.07   0.00.53    0.00.03   0.00.08     0.00.10   0.01.01
             0.00.31   0.00.46    0.00.08   0.00.04     0.00.35   0.00.50
             0.00.01   0.00.56    0.00.04   0.00.04     0.00.08   0.01.00
             0.00.57   0.00.36    0.00.04   0.00.03     0.01.01   0.00.39
             0.00.12   0.00.26    0.00.08   0.00.04     0.00.20   0.00.30
             0.00.08   0.01.07    0.00.04   0.00.05     0.00.12   0.01.12
             0.00.23   0.00.21    0.00.03   0.00.05     0.00.26   0.00.26
             0.01.04   0.00.11    0.00.06   0.00.03     0.01.10   0.00.14
             -         0.00.55    -         0.00.04     -         0.00.59
             0.00.07   -          0.00.02   -           0.00.09   -
MEAN Values  0.00.24   0.00.43    0.00.05   0.00.05     0.00.29   0.00.48

Table 4-20: Safety Results under VIS-1 Conditions (time in [h.mm.ss])


             S2.1/4.1/5.1         S2.2/4.2/5.2          S2.3/4.3/5.3
             (DETECTION PERIOD)   (RESOLUTION PERIOD)   (DURATION PERIOD)
             VIS-2/B   VIS-2/A    VIS-2/B   VIS-2/A     VIS-2/B   VIS-2/A
             0.01.14   0.00.24    0.00.05   0.00.07     0.00.19   0.00.31
             0.01.08   0.00.28    0.00.05   0.00.06     0.01.13   0.00.34
             0.00.38   0.00.36    0.00.02   0.00.03     0.00.40   0.00.39
             0.00.59   0.00.53    0.00.06   0.00.15     0.01.05   0.01.08
             -         0.00.07    0.00.08   0.00.05     -         0.00.12
MEAN Values  0.01.00   0.00.30    0.00.05   0.00.07     0.00.49   0.00.37

Table 4-21: Safety Results under VIS-2 Conditions (time in [h.mm.ss])

Concerning the assessment of the safety-related operational improvements, the following indicators were used: SA01-1: Safety Critical Situation Occurrence, SA01-3: Response Period for Detection of Pilot and Driver Error, SA01-4: Response Period for Resolution of Pilot and Driver Error, and SA01-5: Duration of Conflict Situation. Basically, the objective of the safety impact assessment through the Malpensa RTS was focused on determining the differences in detection, resolution, and total duration times (of the triggered non-nominal events) with and without the EMMA system under visibility conditions 1 and 2.

The analysis of the measurements that emerged from the simulation runs was based on the comparison of the mean values of the corresponding times. However, no statistical test was performed to investigate the statistical significance of the difference between the two mean values (Ref. [14]). This type of analysis led to the following conclusions:

i) the expected benefit of the EMMA Level II system regarding the decrease of the detection and total duration times was reached only under visibility condition 2, i.e. low visibility conditions, and
ii) no significant impacts of the EMMA system have been identified on the resolution time for either visibility condition.

In the following table, the controllers’ feedback to the safety questionnaire filled in during the shadow-mode trials at Malpensa is shown:

                                    ANSWERS
Question   Strongly   Disagree   Slightly   Slightly   Agree   Strongly
Number     Disagree              Disagree   Agree              Agree
              1          2          3          4         5        6        M        SD
1                                             10        22                4.6875   0.4635
2                                   4         14        14                4.3125   0.6818
3                                                       22       10       5.3125   0.4635
4                                   4         10        18                4.4375   0.7043

Table 4-22: Answer Distribution for Safety Questionnaire
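The reported M and SD columns follow directly from the answer counts; a sketch of the computation (the placement of the counts on the agreement scale is inferred from the reported means, e.g. question 1 with 10 ‘Slightly Agree’ and 22 ‘Agree’ answers):

```python
import math

def dist_stats(counts):
    """Mean and (population) standard deviation of an answer distribution,
    given the counts per 1-6 agreement scale value as in Table 4-22."""
    n = sum(counts.values())
    mean = sum(v * c for v, c in counts.items()) / n
    var = sum(c * (v - mean) ** 2 for v, c in counts.items()) / n
    return mean, math.sqrt(var)

# Question 1: 10 answers at 'Slightly Agree' (4) and 22 at 'Agree' (5)
m, sd = dist_stats({4: 10, 5: 22})   # m = 4.6875, sd ~ 0.4635
```

Reproducing the reported values this way also confirms that the listed SDs use the population (n) rather than the sample (n - 1) denominator.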


[Figure: bar chart ‘SAFETY’ showing the mean agreement value (scale 0-6) for each of the four safety questionnaire items; y-axis: Agreement Scale (Mean value), x-axis: Question Number.]

Figure 4-6: Answer Trend for the Safety Questionnaire

Based on the analysis of the safety questionnaires, the controllers who participated in the shadow-mode trials of the EMMA A-SMGCS at Malpensa tend to believe that the EMMA system will help them to operate more safely (medium agreement) and that it would facilitate the process of monitoring the traffic from the gate till take-off (medium agreement). Furthermore, it can be concluded that the system will contribute to the detection of runway incursions (strong agreement) and could assist the controllers in detecting conflicts in the manoeuvring area (medium agreement). Thus, an overall positive feedback was received from the controllers regarding the safety impacts of the system.

4.3.2 Efficiency

Efficiency of operations was looked at from two different points of view, namely punctuality, in terms of taxi and departure delays, and efficient use of resources, i.e. runways and taxiways. First, the results concerning delays are looked at.

ID: E1
Low-level Objective: Punctuality of flights shall be improved when A-SMGCS Level I functionality is in use.
Hypothesis: With the use of EMMA A-SMGCS Level I, the punctuality of flights (in terms of taxi delays and departure delays) will improve as compared to a situation without A-SMGCS Level I.

An important indicator for delays at the airport is taxiing delay. In this simulation, taxiing delay was determined by comparing nominal and actual taxi periods. The nominal taxi period was determined by dividing the nominal taxi distance by a nominal taxi speed of 15 knots. The nominal taxi distance was defined as the distance moved in forward direction while the speed does not exceed 40 knots.

Indicator: E1.1 (EF01) – Taxiing Delay
Metrics: E1.1.1 – Difference between nominal taxi period and taxi period (a positive value indicating that the taxi period was longer than nominal)
Measurement: The nominal taxi period is determined by dividing the nominal taxi distance by the nominal taxi speed (15 knots). The taxi period starts with the first movement after pushback and ends when the aircraft reaches 40 knots on the runway.
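This delay computation can be sketched as follows (the function and parameter names are illustrative, not taken from the simulator):

```python
KT_TO_MPS = 0.514444  # knots to metres per second

def taxiing_delay(nominal_distance_m, taxi_start_s, taxi_end_s,
                  nominal_speed_kt=15.0):
    """E1.1.1: actual taxi period minus nominal taxi period, in seconds.
    nominal_distance_m: forward distance moved while below 40 kt.
    taxi_start_s: first movement after pushback; taxi_end_s: aircraft
    reaches 40 kt on the runway. A positive result indicates a delay."""
    nominal_period = nominal_distance_m / (nominal_speed_kt * KT_TO_MPS)
    actual_period = taxi_end_s - taxi_start_s
    return actual_period - nominal_period
```

For instance, a nominal distance of about 1543 m corresponds to a 200 s nominal period at 15 kt, so an actual taxi period of 260 s would count as a 60 s delay.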

The results were presented as a list of delay values, defined as the difference between actual and nominal taxi periods; positive values indicate a delay compared to the nominal taxi time. This resulted in a distribution of delay values for each simulation, with a certain mean value and standard deviation. The final value in Table 4-23 represents a significance value for the difference between the means of the two distributions for the baseline and advanced conditions.

For all efficiency measurements, this value was determined by performing a t-test on the two samples, assuming either equal or unequal variances depending on the outcome of an F-test. Variances were assumed to be equal when the outcome of the F-test reached the first level of significance (0.68). The t-test was not a paired test, because variances in pseudo-pilot behaviour could be expected. Considering the null hypothesis for efficiency values in the test plan (see Ref. [9]), the t-test was performed as a one-tailed test. This means that the null hypothesis could be rejected if there was a significant difference between the sample means and the mean in the baseline condition was larger than the mean in the advanced condition. In simpler terms, the difference between the two sample means was considered significant if the significance value determined in the t-test was less than 0.05.
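The variance-dependent choice of test statistic can be sketched as follows (a minimal sketch; `equal_var` stands for the outcome of the preceding F-test, and the p-value lookup against the t distribution is omitted):

```python
import math

def two_sample_t(a, b, equal_var):
    """t statistic and degrees of freedom for the two-sample comparison:
    pooled variance when the F-test suggests equal variances, otherwise
    Welch's approximation for unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    if equal_var:
        sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
        t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
        df = na + nb - 2
    else:
        se2 = va / na + vb / nb
        t = (ma - mb) / math.sqrt(se2)
        df = se2 ** 2 / ((va / na) ** 2 / (na - 1)  # Welch-Satterthwaite
                         + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

The one-tailed p-value then follows from the t distribution with df degrees of freedom; the difference counts as significant if that p-value is below 0.05 and the baseline mean exceeds the advanced mean.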

Run ID  Baseline (B) or  Visibility  Traffic  Mean [s]  SD [s]  Significance
        Advanced (A)     (1, 2)      Sample
27      B                            A          79.92   133.19
37      A                            A          86.53   132.52  0.38
34      B                            B          40.58    94.96
26      A                            B          62.98    96.25  0.06
31      B                            C          26.34    64.60
35      A                1           C          30.43    79.64  0.37
29      B                            D          11.25    45.04
25      A                            D          19.75    68.92  0.26
33      B                            E          54.18   103.79
30      A                            E          54.21    74.85  0.50
38      B                            F         -10.75    59.15
23      A                2           F          34.75    53.79  0.00

Table 4-23: E1.1.1 – Taxiing Delay Results


In Table 4-23, only traffic sample F shows a significant result, and for it the null hypothesis must be rejected. Traffic sample B is close to significance; however, the null hypothesis cannot be rejected. The result is supported by the earlier results on throughput for both samples. It seems that all aircraft were handled a bit faster (trends in runway and runway crossing throughput) in the baseline condition. However, none of the other traffic scenarios shows this behaviour, so that for the overall results the null hypothesis cannot be rejected.

Line-up queue delay was one of the more difficult measurements to define. Since this value is not related to a nominal value, the actual measurement represents the period of time each aircraft remains in a line-up queue before take-off, i.e. the time difference between exiting and entering a line-up queue. The queue was exited as soon as the departure runway area was entered. The aircraft entered a queue when it reduced its speed to 0 knots for the first time after having been handed over to the departure runway controller.

Indicator: E1.2 (EF05) – Line-up Queue Delay
Metrics: E1.2.1 – Difference between exiting time of queue and entering time of queue
Measurement: The exiting time of the queue is when the aircraft enters the departure runway. The entering time of the queue is when the aircraft reduces its speed to 0 knots for the first time after having been handed over to the departure runway controller.

Again, results were presented as a list of time values, which resulted in distributions of line-up queue times for each simulation, with a certain mean value and standard deviation. The final value in Table 4-24 again represents a significance value for the difference between the means of the two distributions for the baseline and advanced conditions.

Run ID  Baseline (B) or  Visibility  Traffic  Mean [s]  SD [s]  Significance
        Advanced (A)     (1, 2)      Sample
27      B                            A         195.33   215.01
37      A                            A         279.60   229.57  0.15
34      B                            B         124.80   156.13
26      A                            B         164.80    90.90  0.25
31      B                            C          52.30    79.38
35      A                1           C         149.40   175.68  0.07
29      B                            D         135.43   207.47
25      A                            D          61.57   162.90  0.24
33      B                            E           0.00     0.00
30      A                2           E          66.75   124.03  0.04
38      B                            F          69.56   121.48
23      A                            F          48.22   126.67  0.36

Table 4-24: E1.2.1 – Line-up Queue Delay Results

In Table 4-24, only traffic sample E shows a significant result, and for it the null hypothesis must be rejected. Traffic sample C is close to significance; however, the null hypothesis cannot be rejected. The result is supported by earlier results regarding departure throughput: aircraft seemed to have been handled faster in the baseline condition. However, the result for sample E in the baseline condition must certainly be seen as exceptional, since there was no line-up queue delay at all. Furthermore, none of the other traffic scenarios shows this behaviour, so that for the overall results the null hypothesis cannot be rejected.

Departure delay was impossible to define as specified, given the working procedures of Malpensa ground control. Since ground control only worked with planning times for going off-blocks, no scheduled departure times were available. In order to find a meaningful measurement, the difference between the actual departure time (ATD) and the expected off-blocks time (EOBT) was determined. Obviously, the difference between this measurement and the measurement suggested in the experiment plan (Ref. [9]) is the missing information on planned taxi periods. Since results are compared between the same traffic samples, though, these periods are expected to be the same in both cases, so that a comparison between the baseline and advanced conditions is possible, albeit with a somewhat artificial measurement.

Indicator: E1.3 (EF07) – Departure Delay
Metrics: E1.3.1 – Difference between scheduled time of departure and actual time of departure
Measurement: Scheduled departure times were not available. The difference between the actual take-off time and the expected off-blocks time is considered instead as a measure for planning efficiency.

Again, results were presented as a list of measurements, representing distributions of time differences (ATD - EOBT) for each simulation, with a certain mean value and standard deviation. The final value in Table 4-25 represents a significance value as regards the difference between the two distributions for the baseline and advanced conditions.

Run ID  Baseline (B) or  Visibility  Traffic  Mean [s]  SD [s]  Significance
        Advanced (A)     (1, 2)      Sample
27      B                            A         593.27   172.99
37      A                            A         605.80   145.83  0.42
34      B                            B         566.30   136.72
26      A                            B         635.70   206.01  0.19
31      B                1           C         463.30   124.52
35      A                            C         524.10   151.81  0.17
29      B                            D         461.14    91.49
25      A                            D         433.86   104.60  0.31
33      B                            E         628.08   113.55
30      A                            E         659.08   171.44  0.30
38      B                            F         481.67   116.47
23      A                2           F         568.00   131.23  0.09

Table 4-25: E1.3.1 – Departure Delay Results

In Table 4-25 only sample F is close to significance, which is again supported by the earlier mentioned throughput results for that sample; however, the null hypothesis cannot be rejected. None of the other traffic scenarios shows this behaviour, so that for the overall results the null hypothesis cannot be rejected.

The next measurement to be looked at was runway crossing delay (only RWY 35L). Even though this measurement was a bit more complex, it could easily be extracted from the NARSIM-Tower simulator loggings. Arrival time at a runway crossing was defined as the time at which an aircraft entered a pre-defined area just before the stop bar, and crossing time was defined as the time at which the aircraft entered a very small area around the runway centre line. The difference between these two times was defined as the crossing delay, which again was compared for identical traffic samples in the baseline and advanced conditions.

Indicator: E1.4 Crossing Delay
Metrics: E1.4.1 Difference between the time of crossing the runway and the arrival time at the runway crossing
Measurement: Arrival time at the runway crossing is when the aircraft enters an area around the stop bar before the runway; the crossing time is when the aircraft enters an area around the centreline of the crossed runway.
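The pairing of the two event times per aircraft can be sketched as follows. The callsign-keyed dictionaries and the example callsign are illustrative assumptions, not the actual NARSIM-Tower logging format:

```python
def crossing_delays(stopbar_times, centreline_times):
    """Runway-crossing delay per aircraft: centreline entry time minus
    stop-bar area entry time. The dictionaries map callsign -> event time
    in seconds; this is an illustrative stand-in for the NARSIM-Tower
    event logging, whose real format is not reproduced here."""
    delays = {}
    for callsign, t_arrive in stopbar_times.items():
        t_cross = centreline_times.get(callsign)
        # keep only aircraft that actually crossed after reaching the stop bar
        if t_cross is not None and t_cross >= t_arrive:
            delays[callsign] = t_cross - t_arrive
    return delays

# One aircraft reaches the stop-bar area at t = 100 s and crosses at t = 185 s
d = crossing_delays({"AZA123": 100.0}, {"AZA123": 185.0})
```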

Results were presented as a list of measurements, representing distributions of crossing delay for each simulation, with a certain mean value and standard deviation. The final value in Table 4-26 represents a significance value as regards the difference between the two distributions for baseline and advanced conditions.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Mean    SD      Significance
27      B    1           A        85.65   51.42
37      A    1           A        97.35   62.98  0.28
34      B    1           B       104.74   63.26
26      A    1           B        83.96   22.20  0.07
31      B    1           C        69.09   20.95
35      A    1           C        81.78   42.58  0.10
29      B    2           D       163.75   63.63
25      A    2           D       145.25   73.54  0.36
33      B    2           E       219.36  101.01
30      A    2           E       178.36   53.44  0.13
38      B    2           F       100.00   33.85
23      A    2           F       142.71   43.42  0.03

Table 4-26: E1.4.1 – Runway Crossing Delay Results

In Table 4-26 only the result for sample F is indeed significant. Sample B is close to significance; however, the two results are contradictory. While all earlier results already indicated slightly better handling of traffic in the baseline condition for sample F, the result for sample B is supported by the trend towards better runway crossing throughput in the advanced condition. None of the other traffic scenarios shows either behaviour, so that for the overall results the null hypothesis cannot be rejected.

Pushback delay could be determined very straightforwardly. It was defined as the difference between pushback time and ready-for-pushback time. Pushback time was the time at which the aircraft first had a positive or negative speed, and ready-for-pushback time was the time at which the pseudo-pilot switched to the ground control frequency for the first time. This always happened shortly before EOBT.

Indicator: E1.5 (EF03) Pushback Delay
Metrics: E1.5.1 Difference between pushback time and ready-for-pushback time
Measurement: Ready-for-pushback time is the time when the pseudo-pilot switches to the ground frequency for the first time; pushback time is when the aircraft first has a positive or negative speed.

Results were presented as a list of measurements, representing distributions of pushback delay for each simulation, with a certain mean value and standard deviation. The final value in Table 4-27 represents a significance value as regards the difference between the two distributions for baseline and advanced conditions.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Mean    SD      Significance
27      B    1           A        47.93   26.83
37      A    1           A        40.93   23.72  0.23
34      B    1           B        63.60   61.42
26      A    1           B        63.70   43.99  0.50
31      B    1           C        40.45   27.17
35      A    1           C        59.36   44.30  0.12
29      B    2           D        26.43    9.78
25      A    2           D        35.00   21.25  0.18
33      B    2           E        60.25   37.75
30      A    2           E       114.58  125.08  0.09
38      B    2           F        49.11   39.88
23      A    2           F        55.44   54.82  0.39

Table 4-27: E1.5.1 – Pushback Delay Results

None of the results in Table 4-27 can be considered significant, so that for the overall results the null hypothesis cannot be rejected. This essentially means that there was no exceptional pushback delay in the simulations.

Finally, two measurements looked at the efficient use of airport resources, i.e. runways and taxiways, namely the total arrival and total departure periods.

ID: E2
Low-level Objective: Available resources shall be used more efficiently when A-SMGCS Level I functionality is in use.
Hypothesis: With the use of EMMA A-SMGCS Level I, resources (in terms of runway occupancy time) will be used more efficiently than without A-SMGCS Level I.

The arrival taxi period was defined as the difference between engine shutdown, when the aircraft stops for the last time in the simulation (only completed arrivals were counted), and touchdown, which is when the aircraft is less than one metre above the ground.

Indicator: E2.1 (EF11) Taxi Period of Arrival
Metrics: E2.1.1 Taxi period from touchdown until engine shut-down
Measurement: Touchdown is when the aircraft is less than 1 metre above the ground; engine shut-down is when the aircraft stops for the last time in the simulation.


Results were presented as a list of measurements, representing distributions of arrival periods for each simulation, with a certain mean value and standard deviation. The final value in Table 4-28 represents a significance value as regards the difference between the two distributions for baseline and advanced conditions.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Mean    SD      Significance
27      B    1           A       264.91   91.01
37      A    1           A       266.91   97.24  0.47
34      B    1           B       267.58   94.35
26      A    1           B       289.73  108.13  0.19
31      B    1           C       254.50   91.66
35      A    1           C       237.85   81.28  0.24
29      B    2           D       224.85   78.41
25      A    2           D       249.54  118.89  0.27
33      B    2           E       249.75   75.05
30      A    2           E       266.56  106.64  0.31
38      B    2           F       185.14   42.91
23      A    2           F       194.57   57.18  0.37

Table 4-28: E2.1.1 – Arrival Period Results

The result in Table 4-28 shows that none of the simulations had significant differences in the total arrival period. For the overall results the null hypothesis cannot be rejected. This means that differences in efficiency mainly had to do with departure operations.

The departure taxi period was defined as the difference between take-off time, when the aircraft is more than one metre above the ground, and pushback time, which is the first time at which the aircraft has a positive or negative speed value.

Indicator: E2.2 (EF10, EF12) Taxi Period of Departure
Metrics: E2.2.1 Taxi period from pushback until take-off
Measurement: Pushback time is when the aircraft first has a positive or negative speed; take-off time is when the aircraft is more than 1 metre above the ground.

Results were presented as a list of measurements, representing distributions of departure periods for each simulation, with a certain mean value and standard deviation. The final value in Table 4-29 represents a significance value as regards the difference between the two distributions for baseline and advanced conditions.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Mean    SD      Significance
27      B    1           A       644.20  172.36
37      A    1           A       663.67  154.70  0.37
34      B    1           B       601.00  125.41
26      A    1           B       670.30  171.47  0.16
31      B    1           C       521.30  112.03
35      A    1           C       567.40  120.10  0.19
29      B    2           D       531.43   89.48
25      A    2           D       495.43  116.94  0.27
33      B    2           E       666.92  113.97
30      A    2           E       643.50   90.06  0.29
38      B    2           F       532.11  112.34
23      A    2           F       608.00   78.53  0.06

Table 4-29: E2.2.1 – Departure Period Results

Keeping in mind the previous result, this final measurement combines all important efficiency measurements. As can be seen in Table 4-29, none of the results is significant, so that the null hypothesis cannot be rejected. Sample F is close to significance, which shows that in this sample there seems to have been an exceptionally good baseline run, with a mean gain of about one minute per aircraft. However, this trend is not reflected in any of the other results, and there are even contradictory trends for the same visibility condition, so that no further conclusions should be drawn.

Regarding efficiency values, the same trends are detected; however, no significant results are obtained, with the exception of traffic sample F, which continually shows better and even significantly better efficiency values for the baseline condition. This must be considered exceptional, since no other traffic sample showed that behaviour. Beyond that, there were also trends that favoured the advanced condition (see crossing efficiency in traffic sample B). Thus, the null hypothesis, which states that controllers work just as efficiently or even better with A-SMGCS than without A-SMGCS, cannot be rejected. Considering the set-up of the simulations and the chosen procedures, points of improvement would be to carry out simulations with clear departure targets, such as the expected time of departure (ETD), and with a continuous stream of inbound and outbound flights at the very upper capacity level of the airport. In order to compare the improvements under bad visibility conditions, procedures should not differ between good and bad visibility conditions.


The data above leads to the following conclusions. The assessment of the efficiency-related impacts of the EMMA system through the RTS runs was achieved by measuring the following indicators: taxiing delay (EF01), line-up queue delay (EF05), departure delay (EF07), crossing delay, pushback delay (EF03), taxi period of arrival (EF11), and taxi period of departure (EF10, EF12). The latter two indicators refer to the efficiency of managing the resources of the airport, while the remaining indicators assess the punctuality of the relevant airport operations. The collection of measurements emerging from each RTS run with and without the EMMA system led to two samples of time values expressing delays or total durations of operations. The assessment of each indicator was based on checking the significance of the difference of the mean values of the measurements collected with the EMMA system switched on and switched off, respectively. A t-test (at the 5% significance level) was performed for each indicator in order to validate the statistical significance of this difference. However, no sufficient statistical significance was detected for any test, leading to the conclusion that the EMMA system does not significantly improve the aforementioned categories of delays. For more details on the set-up of the Malpensa RTS (i.e. scenarios and traffic samples) the reader is referred to documents [9] and [14]. The following tables give an overview of the efficiency-related results obtained from the shadow-mode trials performed at the Malpensa test site:



VISIBILITY CONDITION 1

QUESTIONS
x.1 – How do you assess the impact of A-SMGCS on the …?
x.2 – If positive, how do you estimate the benefits compared with the existing baseline?

Taxi-in Delay
  1.1: Positive 4 (12.5%), Negative 0, No impact 28 (87.5%)
  1.2(a): 0÷5%: 4 (100%), 5÷10%: 0, more: 0
Taxi-out Delay
  2.1: Positive 6 (20%), Negative 0, No impact 24 (80%)
  2.2(a): 0÷5%: 6 (100%), 5÷10%: 0, more: 0
Departure Queuing Delays
  3.1: Positive 0, Negative 0, No impact 32 (100%)
  3.2(a): N/A
Mean Departure Delays
  4.1: Positive 4 (12.5%), Negative 0, No impact 28 (87.5%)
  4.2(a): 0÷5%: 4 (100%), 5÷10%: 0, more: 0
Mean Departure Taxi Time
  5.1: Positive 4 (12.5%), Negative 0, No impact 28 (87.5%)
  5.2(a): 0÷5%: 4 (100%), 5÷10%: 0, more: 0
Mean Arrival Taxi Time
  6.1: Positive 0, Negative 0, No impact 32 (100%)
  6.2(a): N/A
Minimum Taxi Time
  7.1: Positive 4 (12.5%), Negative 0, No impact 28 (87.5%)
  7.2(a): 0÷5%: 4 (100%), 5÷10%: 0, more: 0
Maximum Taxi Time
  8.1: Positive 4 (12.5%), Negative 0, No impact 28 (87.5%)
  8.2(a): 0÷5%: 4 (100%), 5÷10%: 0, more: 0
Mean Nr. of Aircraft in the Departure Queue
  9.1: Positive 0, Negative 0, No impact 32 (100%)
  9.2(a): N/A
Maximum Nr. of Aircraft in the Departure Queue
  10.1: Positive 0, Negative 0, No impact 32 (100%)
  10.2(a): N/A
Number of Communications
  11.1: Positive 20 (62.5%), Negative 0, No impact 12 (37.5%)
  11.2(a): 0÷5%: 8 (40%), 5÷10%: 12 (60%), more: 0

Table 4-30: Answer Distribution for Efficiency Questionnaire in Visibility Condition 1


VISIBILITY CONDITION 2

QUESTIONS
x.1 – How do you assess the impact of A-SMGCS on the …?
x.2 – If positive, how do you estimate the benefits compared with the existing baseline?

Taxi-in Delay
  1.1: Positive 23 (71.875%), Negative 0, No impact 9 (28.125%)
  1.2(a): 0÷5%: 21 (91.3%), 5÷10%: 2 (8.7%), more: 0
Taxi-out Delay
  2.1: Positive 20 (62.5%), Negative 0, No impact 12 (37.5%)
  2.2(a): 0÷5%: 18 (90%), 5÷10%: 2 (10%), more: 0
Departure Queuing Delays
  3.1: Positive 14 (43.75%), Negative 0, No impact 18 (56.25%)
  3.2(a): 0÷5%: 14 (100%), 5÷10%: 0, more: 0
Mean Departure Delays
  4.1: Positive 10 (31.25%), Negative 0, No impact 22 (68.75%)
  4.2(a): 0÷5%: 10 (100%), 5÷10%: 0, more: 0
Mean Departure Taxi Time
  5.1: Positive 22 (68.75%), Negative 0, No impact 10 (31.25%)
  5.2(a): 0÷5%: 22 (100%), 5÷10%: 0, more: 0
Mean Arrival Taxi Time
  6.1: Positive 26 (81.25%), Negative 0, No impact 6 (18.75%)
  6.2(a): 0÷5%: 24 (92.3%), 5÷10%: 2 (7.7%), more: 0
Minimum Taxi Time
  7.1: Positive 22 (68.75%), Negative 0, No impact 10 (31.25%)
  7.2(a): 0÷5%: 22 (100%), 5÷10%: 0, more: 0
Maximum Taxi Time
  8.1: Positive 22 (68.75%), Negative 0, No impact 10 (31.25%)
  8.2(a): 0÷5%: 22 (100%), 5÷10%: 0, more: 0
Mean Nr. of Aircraft in the Departure Queue
  9.1: Positive 8 (25%), Negative 0, No impact 24 (75%)
  9.2(a): 0÷5%: 8 (100%), 5÷10%: 0, more: 0
Maximum Nr. of Aircraft in the Departure Queue
  10.1: Positive 8 (25%), Negative 0, No impact 24 (75%)
  10.2(a): 0÷5%: 8 (100%), 5÷10%: 0, more: 0
Number of Communications
  11.1: Positive 28 (87.5%), Negative 0, No impact 4 (12.5%)
  11.2(a): 0÷5%: 22 (78.6%), 5÷10%: 6 (21.4%), more: 0

Table 4-31: Answer Distribution for Efficiency Questionnaire in Visibility Condition 2

As concerns the answers to question x.2(b) (in case of negative or no impact, explain the reason), the same considerations as for capacity (see the next chapter) also apply to efficiency.


4.3.3 Capacity

Regarding capacity there was only one objective with an associated hypothesis, namely to find out whether capacity at the runways and on the airport would be increased using A-SMGCS Level I:

ID: C1
Low-level Objective: Capacity shall increase when A-SMGCS Level I functionality is in use.
Hypothesis: With the use of EMMA A-SMGCS Level I, capacity at the runways and on the airport will be increased as compared to a situation without A-SMGCS Level I.

Table 4-32: Capacity Low-level Objective and Hypothesis

Indicators C1.1–C1.4 were analysed in the same manner. Results were looked at for different traffic samples so that a comparison between the use of A-SMGCS and no use of A-SMGCS could be made for the same traffic sample. The combined output of three different traffic samples could then be compared for two different visibilities. As mentioned earlier in this document, a direct comparison between VIS-1 and VIS-2 simulations was not possible due to different working procedures and runway configurations. However, it was interesting to find out whether there were comparable trends in the data.

In general, throughput was determined by looking at traffic development within a certain period of time. For example, the number of aircraft per hour at 30 minutes into the simulation consists of the number of aircraft from the start of the simulation until one hour into the simulation. Accordingly, the number of aircraft at simulation start is counted from −30 minutes until +30 minutes into the simulation. Although this seems a very artificial way to determine throughput, the associated graphs could show the development of traffic within a simulation run and also give the maximum achieved throughput. They could also show shifts in the simulation run, i.e. indicate whether aircraft were handled earlier or later. In particular, the development of the throughput curve was considered a measure for such a shift: a quicker build-up to maximum throughput and a quicker decrease from maximum throughput mean that aircraft were handled more quickly (cf. Figure 4-7).

The first indicator to be looked at was Runway Departure Throughput. It was measured as a list of take-off times of aircraft.

Indicator: C1.1 (CA01) Runway Departure Throughput
Metrics: C1.1.1 Number of take-offs in a period of time
Measurement: NARSIM-Tower event logging: list of take-off times.
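The centred one-hour counting window described above can be sketched as follows; the exact window boundaries and the 4-minute sampling step are assumptions inferred from the text and the figure axes, and the take-off times are synthetic:

```python
def hourly_throughput(event_times_min, t_min):
    """Events per hour at time t: count events in the one-hour window
    centred on t, i.e. [t - 30, t + 30) minutes. Window convention is
    inferred from the text; boundary handling is an assumption."""
    return sum(1 for e in event_times_min if t_min - 30 <= e < t_min + 30)

# Take-off times in minutes since simulation start (synthetic sample)
takeoffs = [2, 5, 9, 14, 21, 26, 33, 41, 55, 70]

# Throughput curve sampled every 4 minutes over a two-hour run,
# as in the throughput figures of this section
curve = [hourly_throughput(takeoffs, t) for t in range(0, 121, 4)]
peak = max(curve)
```

The shape of `curve` (build-up to `peak`, then decay) is what the report reads as an indication of whether aircraft were handled earlier or later.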

The simulation runs showed the following maximum values, which were all reached at 30 minutes into the simulation:

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Runway Departure Throughput  Throughput Indicator
27      B    1           A       15
37      A    1           A       15                                   -0.10
34      B    1           B       10
26      A    1           B       10                                   -0.28
31      B    1           C       10
35      A    1           C       10                                   -0.33
29      B    2           D        7
25      A    2           D        7                                    0
33      B    2           E       12
30      A    2           E       12                                   -0.16
38      B    2           F        9
23      A    2           F        9                                   -0.47

Table 4-33: C1.1.1 – Runway Departure Throughput Results

[Chart: Number of Take-offs per Hour over simulation time (0–120 min), comparing Run 23 and Run 38]

Figure 4-7: Example for Throughput Build-up and Decrease in Traffic Sample F

As can be seen from Table 4-33, a difference in maximum departure throughput could not be detected. It seems, though, that there is a trend that traffic was handled faster in the baseline condition. Since no distinction was made between A-SMGCS Level I and II, it is difficult to say whether this trend is caused by the ground controller releasing the flights earlier in the baseline condition, or by one of the runway controllers reacting to possible warnings given by the RIA system at the runway threshold. Thus, the results must be compared with the other measurements made in the following.


The next indicator to be looked at was Runway Arrival Throughput. It was measured as a list of touchdown times of aircraft.

Indicator: C1.2 (CA02) Runway Arrival Throughput
Metrics: C1.2.1 Number of landings in a period of time (scenario-fixed)
Measurement: NARSIM-Tower event logging: list of landing times.

The simulation runs showed the following maximum values, which were all reached at 30 minutes into the simulation:

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Runway Arrival Throughput  Throughput Indicator
27      B    1           A       23
37      A    1           A       23                                 0
34      B    1           B       33
26      A    1           B       33                                 0.06
31      B    1           C       28
35      A    1           C       28                                 0
29      B    2           D       13
25      A    2           D       13                                 -0.03
33      B    2           E       16
30      A    2           E       16                                 -0.07
38      B    2           F        7
23      A    2           F        7                                 0

Table 4-34: C1.2.1 – Runway Arrival Throughput Results

As can be seen from Table 4-34, a difference in maximum arrival throughput could not be detected. There is also no trend that flights were handled faster or slower (see also Figure 4-8), which can be explained by the fact that flights landed more or less automatically at pre-defined points in time and could only be influenced marginally by pseudo-pilots, who might have changed the arrival speed on final.

The next indicator to be looked at was Runway Crossing Throughput. It was measured by defining a very small area around the centreline of runway 35L and marking all aircraft that entered the area and left it again within 15 seconds. For these aircraft, a list of times of entering the defined centreline area was given.


[Chart: Number of Landings per Hour over simulation time (0–120 min), comparing Run 27 and Run 37]

Figure 4-8: Example for Identical Runway Arrival Throughput in Traffic Sample A

Indicator: C1.3 Runway Crossing Throughput
Metrics: C1.3.1 Number of crossings in a period of time
Measurement: NARSIM-Tower event logging: list of average times of entering and exiting runway 35L.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Runway Crossing Throughput  Throughput Indicator
27      B    1           A       18
37      A    1           A       18                                  -0.1
34      B    1           B       23
26      A    1           B       23                                  0.46
31      B    1           C       23
35      A    1           C       23                                  0.06
29      B    2           D        4
25      A    2           D        4                                  -0.03
33      B    2           E       11
30      A    2           E       11                                  0.04
38      B    2           F        7
23      A    2           F        7                                  -0.3

Table 4-35: C1.3.1 – Runway Crossing Throughput Results

[Chart: Number of RWY Crossings per Hour over simulation time (0–120 min), comparing Run 26 and Run 34]

Figure 4-9: Example for Runway Crossing Throughput in Traffic Sample B

As can be seen from Table 4-35, a difference in maximum runway crossing throughput could not be detected. There is no trend that flights were crossing earlier or later, and there is also no correlation with the possible trend described for runway departure throughput. An example of early crossing in the advanced condition is shown in Figure 4-9.

The next indicators to be looked at were the Hand-over Throughput indicators.

Indicator: C1.4 Hand-over Throughput
Metrics: C1.4.1 (CA05) Number of pushbacks in a period of time. Measurement: NARSIM-Tower event logging: list of times of pushback initiation.
Metrics: C1.4.2 Number of hand-overs from GND to TWR in a period of time. Measurement: NARSIM-Tower event logging: list of times of frequency hand-over.


First of all, the number of pushbacks was analysed. It was measured as a list of times when pushback was initiated. A pushback was defined as the first time an aircraft moved backwards. It should be noted that aircraft leaving a stand on the airport in the forward direction were not counted.

The second hand-over throughput indicator concerns the number of hand-overs from ground to tower controller positions. It was measured as a list of times when a frequency hand-over from ground to tower occurred. Since the ground controller had the possibility to switch to two different tower frequencies (TWR1 and TWR2), there were actually two such indicators. Except for traffic sample A, where there were direct hand-overs from ground to TWR1 due to aircraft coming from the northern gates, most hand-overs were from ground to TWR2.

As can be seen from Table 4-36, a difference in maximum pushback throughput could not be detected. There is no trend that flights were pushed back earlier or later, which means that the ground controller complied with the off-blocks planning. An example of pushback throughput is shown in Figure 4-10.

[Chart: Number of Pushbacks per Hour over simulation time (0–120 min), comparing Run 23 and Run 38]

Figure 4-10: Example for Identical Pushback Throughput in Traffic Sample F

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Pushback Throughput  Throughput Indicator
27      B    1           A       6
37      A    1           A       6                            0.03
34      B    1           B       4
26      A    1           B       4                            0.03
31      B    1           C       6
35      A    1           C       6                            -0.08
29      B    2           D       1
25      A    2           D       1                            0
33      B    2           E       6
30      A    2           E       6                            0.12
38      B    2           F       7
23      A    2           F       7                            -0.01

Table 4-36: C1.4.1 – Pushback Throughput Results

As can also be seen from Table 4-36, most aircraft in traffic sample D did not push back (there was only one pushback) but left a stand in the forward direction. Although it might have been possible to also look at the off-blocks throughput considering stands, this cannot be expected to differ essentially from the pushback throughput, as the runway departure throughput showed no differences at all.

[Chart: Number of Hand-overs GND-TWR per Hour over simulation time (0–120 min), comparing Run 30 and Run 33]

Figure 4-11: Hand-over Problems in Traffic Sample E

Comparably, the figures for hand-overs from ground to tower positions hold no surprises (Table 4-37), the only exception being the hand-overs from ground to TWR1 in traffic sample E (see also Figure 4-11).

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Hand-over Throughput  Throughput Indicator
27      B    1           A       8 + 8 = 16
37      A    1           A       8 + 8 = 16                    0.05
34      B    1           B       7 + 3 = 10
26      A    1           B       7 + 3 = 10                    -0.32
31      B    1           C       10 + 1 = 11
35      A    1           C       10 + 1 = 11                   0.01
29      B    2           D       4 + 3 = 7
25      A    2           D       4 + 3 = 7                     -0.2
33      B    2           E       11 + 4 = 15
30      A    2           E       11 + 2 = 13                   -0.88
38      B    2           F       7 + 2 = 9
23      A    2           F       7 + 2 = 9                     -0.27

Table 4-37: C1.4.2 – GND to TWR Hand-over Throughput Results

After a short analysis it was found that the reason for the difference between the two runs was that pseudo-pilots made a wrong frequency switch when crossing behind runway 35L to get to runway 35R for take-off, which led to double counting of hand-overs in run 33. Furthermore, the indicators seem to show a trend towards faster frequency switches in the baseline runs. Frequency switches, however, are initiated by pseudo-pilots, so it is very doubtful whether this trend is actually meaningful.

The final indicator considered for the capacity analysis was the number of aircraft under control of controller positions GND, TWR1 and TWR2 at any given moment in the simulation. This indicator was analysed by showing the distribution of the number of aircraft under control over the simulation time. Given the earlier results, differences between the baseline and advanced runs cannot be expected. Nevertheless, this distribution should give a better indication of how the traffic developed than the results of hand-over throughput.

Indicator: C1.5 (CA07, CA08) Number of Aircraft under Control
Metrics: C1.5.1 Number of aircraft under control of GND. Measurement: NARSIM-Tower event logging: list of the number of aircraft under control every 20 seconds after simulation start.
Metrics: C1.5.2 Number of aircraft under control of TWR1. Measurement: as for C1.5.1.
Metrics: C1.5.3 Number of aircraft and vehicles under control of TWR2. Measurement: as for C1.5.1.
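The mean-difference columns of the following tables compare two equally-sampled time series of aircraft counts (one value per 20-second interval). A minimal sketch, assuming the sign convention advanced minus baseline, which matches the interpretation given after Table 4-38 (positive values mean more aircraft under control in the advanced run); the sample values are synthetic:

```python
def mean_count_difference(advanced_counts, baseline_counts):
    """Mean of the pointwise difference (advanced minus baseline) between
    two equally long series of aircraft-under-control counts sampled every
    20 seconds. Sign convention is an assumption inferred from the text."""
    assert len(advanced_counts) == len(baseline_counts)
    diffs = [a - b for a, b in zip(advanced_counts, baseline_counts)]
    return sum(diffs) / len(diffs)

# Synthetic 20-second samples for two runs of equal length
adv = [2, 3, 4, 4, 3]
base = [2, 3, 3, 3, 3]
md = mean_count_difference(adv, base)
```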


[Chart: Number of A/C under Control GND over simulation time (0–4400 s), comparing Run 31 and Run 35]

Figure 4-12: Example for Number of Aircraft under Ground Control (Traffic Sample C)

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Number of A/C GND  Mean Difference Number of A/C GND
27      B    1           A       4
37      A    1           A       4                          0.20
34      B    1           B       6
26      A    1           B       5                          -0.64
31      B    1           C       4
35      A    1           C       5                          0.42
29      B    2           D       3
25      A    2           D       3                          0.3
33      B    2           E       8
30      A    2           E       6                          0.04
38      B    2           F       4
23      A    2           F       5                          0.22

Table 4-38: C1.5.1 – Number of A/C GND Results


As Table 4-38 shows, there is no trend in the data for the maximum number of aircraft under control of the ground position. The mean difference between the advanced and baseline results (measured per 20 seconds of simulation time), however, shows a trend (the only exception is traffic sample B) that more aircraft were under control of the ground position in the advanced condition than in the baseline condition. This would indicate that the ground controller was able to handle more aircraft at a time with the help of the extra surveillance supplied by the A-SMGCS Level I functionality.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Number of A/C TWR1  Mean Difference Number of A/C TWR1
27      B    1           A       7
37      A    1           A       7                           -0.15
34      B    1           B       7
26      A    1           B       6                           0.19
31      B    1           C       6
35      A    1           C       6                           0.24
29      B    2           D       1
25      A    2           D       1                           -0.08
33      B    2           E       3
30      A    2           E       3                           0.22
38      B    2           F       3
23      A    2           F       2                           -0.10

Table 4-39: C1.5.2 – Number of A/C TWR1 Results

Regarding the aircraft under control of TWR1, results are less conclusive. Again there is not much difference between the maximum values reached, but this time also the mean difference does not show a real trend (see Table 4-39), although positive values (meaning that more aircraft are under control in the advanced situation) seem to be slightly higher.

(B = baseline, A = advanced; visibility condition 1 or 2)

Run ID  B/A  Visibility  Sample  Maximum Number of A/C TWR2  Mean Difference Number of A/C TWR2
27      B    1           A       5
37      A    1           A       6                           0.14
34      B    1           B       7
26      A    1           B       6                           -0.11
31      B    1           C       6
35      A    1           C       6                           -0.18
29      B    2           D       4
25      A    2           D       4                           0.01
33      B    2           E       5
30      A    2           E       5                           -0.08
38      B    2           F       3
23      A    2           F       3                           -0.08

Table 4-40: C1.5.3 – Number of A/C TWR2 Results

Finally, Table 4-40 shows the results for TWR2. Again there is not much difference in the maximum values, but this time the mean differences between the number of aircraft under control in the advanced and baseline situations show a more negative trend, meaning that in the advanced condition the TWR2 controller was handling less traffic at the same time than in the baseline situation.

It should also be noted that the overall number of aircraft under control of any position at any given time in the simulation shows a positive trend as well. Considering that pushback throughput is comparable in both the baseline and advanced situations (in time off-blocks), this means that aircraft stayed at the airport a bit longer in the advanced situation. Thus, given the previous results of indicator C1.5, it seems that aircraft were under control of the ground controller a bit longer in the advanced situation than in the baseline situation. This is also supported by the fact that throughput from GND to TWR control positions shows a negative trend, meaning that throughput from ground to runway controllers is higher for the baseline situation.

The EMMA validation low-level objectives tested through the Malpensa RTS imply that with the use of EMMA A-SMGCS Level I, capacity at the runways and on the airport will be increased as compared to the baseline situation. The indicators used to address this objective were: runway departure throughput (CA01), runway arrival throughput (CA02), runway crossing throughput, hand-over throughput (measured by the pushback throughput and the number of hand-overs from the ground to the tower controller), and number of aircraft under control (CA07, CA08). Statistical analysis (t-test at the 5% significance level) was employed in order to investigate any significant differences in the performance of the EMMA system from the baseline under any of the aforementioned indicators.

In particular, measuring the capacity indicators provided the following conclusions:
i) no significant difference was detected between the baseline and the EMMA system, at either visibility condition, regarding runway departure throughput, runway arrival throughput, runway crossing throughput, maximum pushback throughput, and the number of hand-overs;
ii) although no trend was detected in the maximum number of aircraft handled by the ground controller, it was verified that the ground controller could handle more aircraft at a time with the help of the EMMA system than in the baseline situation.

The results showing reduced throughput in the advanced situation with the active A-SMGCS Level I and II system should not be overrated, since overall throughput values remained constant, meaning that throughput per hour effectively did not change.


The fact remains, however, that it took the ground controller somewhat more time to control aircraft in his sector in the advanced condition. Increased inbound throughput or problems with pushback throughput can be excluded as reasons: the numbers show that aircraft left the gates on time and that inbound traffic was pre-programmed and remained the same. An explanation could thus be that the controller took more time to study the additional information provided through A-SMGCS Level I, simply because there was no time constraint (departure time planning) that had to be met after pushback. Obviously, the result cannot be considered structural, as traffic sample B shows exactly the opposite. Although hand-over throughput from ground to tower was less efficient under the advanced condition, a slight improvement in hand-over throughput to TWR2 and also a better runway crossing throughput coincide with a tendency towards fewer aircraft under control for the ground position. Generally, it must be noted that capacity values did not improve in the advanced situation, but they also showed no definite trend of deterioration. Causes lie in the set-up of the traffic scenarios (constant supply of inbound and outbound traffic) and in the fact that no punctuality targets were set for runway departure times. The following tables give an overview of the capacity-related results obtained from the shadow-mode trials performed at the Malpensa test site:



VISIBILITY CONDITION 1

Question x.1: How do you assess the impact of A-SMGCS on the …?
Question x.2: If positive, how do you estimate the benefits compared with the existing baseline?

Answers to question x.1:

Topic                                  Positive (%)   Negative (%)   No Impact (%)
1.1  Departure Throughput              1  (3.125)     0              31 (96.875)
2.1  Arrival Throughput                0              0              32 (100)
3.1  Mean Nr. of Push-back Clearances  8  (25)        0              24 (75)
4.1  Max Nr. of Push-back Clearances   8  (25)        0              24 (75)
5.1  Mean Nr. of Simultaneous Taxiing  4  (12.5)      0              28 (87.5)
6.1  Max Nr. of Simultaneous Taxiing   4  (12.5)      0              28 (87.5)

Answers to question x.2(a):

Topic                                     0÷5% (%)    5÷10% (%)   more (%)
1.2(a)  Departure Throughput              1  (100)    0           0
2.2(a)  Arrival Throughput                N/A         N/A         N/A
3.2(a)  Mean Nr. of Push-back Clearances  8  (100)    0           0
4.2(a)  Max Nr. of Push-back Clearances   8  (100)    0           0
5.2(a)  Mean Nr. of Simultaneous Taxiing  4  (100)    0           0
6.2(a)  Max Nr. of Simultaneous Taxiing   4  (100)    0           0

Table 4-41: Answer Distribution for Capacity Questionnaire in Visibility Condition 1


VISIBILITY CONDITION 2

Question x.1: How do you assess the impact of A-SMGCS on the …?
Question x.2: If positive, how do you estimate the benefits compared with the existing baseline?

Answers to question x.1:

Topic                                  Positive (%)   Negative (%)   No Impact (%)
1.1  Departure Throughput              20 (62.5)      0              12 (37.5)
2.1  Arrival Throughput                0              0              32 (100)
3.1  Mean Nr. of Push-back Clearances  20 (62.5)      0              12 (37.5)
4.1  Max Nr. of Push-back Clearances   20 (62.5)      0              12 (37.5)
5.1  Mean Nr. of Simultaneous Taxiing  20 (62.5)      0              12 (37.5)
6.1  Max Nr. of Simultaneous Taxiing   20 (62.5)      0              12 (37.5)

Answers to question x.2(a):

Topic                                     0÷5% (%)    5÷10% (%)   more (%)
1.2(a)  Departure Throughput              18 (90)     2 (10)      0
2.2(a)  Arrival Throughput                N/A         N/A         N/A
3.2(a)  Mean Nr. of Push-back Clearances  20 (100)    0           0
4.2(a)  Max Nr. of Push-back Clearances   20 (100)    0           0
5.2(a)  Mean Nr. of Simultaneous Taxiing  20 (100)    0           0
6.2(a)  Max Nr. of Simultaneous Taxiing   20 (100)    0           0

Table 4-42: Answer Distribution for Capacity Questionnaire in Visibility Condition 2

Concerning answers to question x.2(b) (in case of negative or no impact, explain the reason), the main reasons why some ATCos did not consider A-SMGCS able to provide added value with respect to the baseline situation (i.e. did not give a positive answer to question x.1) are summarised in the following:
• Visibility Condition 1: the ATCo is able to manage traffic mainly by outside view and therefore believes that efficiency and capacity will not increase with A-SMGCS;
• Visibility Condition 2: in order to really benefit from the introduction of A-SMGCS, it is necessary to introduce new rules and procedures that allow the new potentialities and functionalities of the system to be fully exploited;
• Concerning arrival indicators (e.g. arrival throughput), since the TWR ATCo has very little influence on the arrival sequence, no benefits are expected in this case.


4.3.4 Human Factors

H5  H5.1 (HF03)  Controller Attitudes using A-SMGCS Level I
    H5.1.1  Comfort and Satisfaction Index: see appendix Ref. [9].
    H5.1.2  Ease-of-Task Performance Index: see appendix Ref. [9].
    H5.1.3  Acceptability Index: see appendix Ref. [9].

ID  Low-level Objective                                  Hypothesis
H6  Controller acceptability of A-SMGCS Level II         The controller will accept the A-SMGCS Level II
    related tools.                                       related tools and procedures.

H6  H6.1  Controller Attitudes using A-SMGCS Level II
    H6.1.1  Comfort and Satisfaction Index: see appendix Ref. [9].
    H6.1.2  Ease-of-Task Performance Index: see appendix Ref. [9].
    H6.1.3  Acceptability Index: see appendix Ref. [9].

The qualitative measurements for situation awareness were collected through completion of the SASHA-Q questionnaire (see Ref. [9]). The following impacts were investigated through this questionnaire: i) the impact of the EMMA system on situation awareness (all runs), ii) situation awareness between visibility conditions 1 and 2 (all runs), iii) situation awareness between controller positions (all runs), iv) the impact of the EMMA system on situation awareness between visibility conditions (nominal runs), and v) the impact of the EMMA system on situation awareness between visibility conditions (non-nominal runs). The participants’ ratings on the individual questions were analysed using the t-test and a one-way ANOVA F statistic. Neither test revealed significant differences in the mean value of any of the aforementioned categories of measurements.

The workload-related impacts were measured with and without the EMMA system in two different ways: i) by completing the NASA-TLX rating scale (see Ref. [9]) at the end of each experimental run, and ii) by rating the controllers’ workload during the experiments (i.e. every three minutes of simulation time). The analysis of the corresponding measurements used t-tests for equality of means and one-way ANOVA F statistics on the individual NASA-TLX ratings and the totals. The average scores of the workload rated during the experiments were also analysed using a t-test. The analysis of these measurements led to the following results:
i) No significant difference was identified in workload with vs. without the EMMA system (taking into account the measurements from all RTS runs).
ii) No significant difference was identified in workload between visibility conditions 1 and 2 or between the two controller positions (for all runs, nominal runs, and non-nominal runs).
iii) During the debriefing session after completion of all RTS runs, the controllers pointed out that the detailed labels provided by the HMI relieved them of a great deal of effort. Furthermore, it was stated that by indicating any conflict the system enables the controllers to verify what they already see on the apron, thereby relieving their mental workload.
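The one-way ANOVA comparison applied to the NASA-TLX totals can be sketched with a minimal stdlib-only F statistic. The ratings in the example are invented for illustration and are not EMMA measurements:

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic across groups of ratings
    (e.g. NASA-TLX totals per experimental condition)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented NASA-TLX totals for the two conditions (not EMMA data):
without_emma = [42, 55, 38, 47]
with_emma = [40, 52, 41, 45]
f = one_way_anova_f([without_emma, with_emma])
# Far below the 5%-level critical value F(1, 6), about 5.99,
# i.e. no significant workload difference in this sketch.
print(round(f, 3))  # prints 0.048
```

With two groups this F test is equivalent to the squared two-sample t statistic, which is why the report can quote both interchangeably.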


4.3.5 Conclusions

The capacity and efficiency questionnaires aimed to collect the opinion of the controllers on the impacts of the system on the corresponding capacity and efficiency indicators (i.e. the controllers were prompted to state whether the impact under each indicator is expected to be: i) positive, ii) negative, or iii) neutral). The analysis of this type of subjective data was based on determining the frequencies of the alternative answers per question and appraising the emerging figures. The analysis of the capacity and efficiency related questionnaires provided the following results:

• No controller foresaw negative impacts on the capacity and efficiency indicators under either of the two visibility conditions.

• Under visibility condition 1, the majority of the controllers believed that the use of EMMA A-SMGCS will have no impact on the capacity and efficiency indicators apart from the Number of Communications: 62.5% of the controllers believed that the number of communications will decrease. A minor but not negligible proportion of controllers was of the opinion that the use of the system may slightly increase the mean number of Push-back Clearances, the maximum number of Push-back Clearances, the mean number of Simultaneous Taxiing aircraft, and the maximum number of Simultaneous Taxiing aircraft.

• The corresponding results for visibility condition 2 imply that the majority of the controllers expect an increase in the capacity indicators (with the exception of the Arrival Throughput). However, a substantial proportion of them (around 37.5%) was of the opinion that the system will not affect capacity.

• Under visibility condition 2, the majority of the controllers (over 60%) expected a decrease in the following efficiency indicators: Taxi-in Delay, Taxi-out Delay, Mean Departure Taxi Time, Mean Arrival Taxi Time, Minimum Taxi Time, Maximum Taxi Time, and Number of Communications.
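The frequency analysis used for these questionnaires can be sketched as follows; the helper name is ours, and the example reuses the 20 positive / 12 'no impact' split reported for departure throughput under visibility condition 2 (Table 4-42):

```python
from collections import Counter

def answer_distribution(answers):
    """Count and percentage of each answer category for one
    questionnaire item."""
    counts = Counter(answers)
    n = len(answers)
    return {cat: (c, round(100 * c / n, 3)) for cat, c in counts.items()}

# 32 controller answers to item 1.1 in visibility condition 2:
responses = ["positive"] * 20 + ["no impact"] * 12
dist = answer_distribution(responses)
print(dist["positive"], dist["no impact"])  # prints (20, 62.5) (12, 37.5)
```

The same routine, applied per question, yields exactly the count/percentage pairs tabulated in Tables 4-41 and 4-42.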

The results of the V&V process at the Malpensa site signify the technical adequacy and the user acceptance of the EMMA A-SMGCS (Levels I and II). In particular, the results of the EMMA Verification process for Malpensa imply that the technical performance of the EMMA system complies with the predetermined threshold values and standards. It should be emphasised, though, that the Verification indicators VE-14 ‘Alert Response Time’, VE-15 ‘Routing Process Time’, VE-16 ‘Probability of Continuous Track’, VE-17 ‘Matrix of Detection’, and VE-18 ‘Matrix of Identification’ were not measured at all, while short-term data were used for the statistical analysis of the remaining indicators. The Validation results regarding human factors provided through the RTS runs at the NARSIM-Tower simulator are compatible with the corresponding results that emerged from the shadow-mode trials. Furthermore, based on the controllers’ judgments at both platforms (i.e. RTS and shadow-mode trials), the system is expected to facilitate the controllers’ operations so as to improve safety and to increase capacity and efficiency in visibility condition 2. However, insufficient statistical evidence was identified during the RTS runs to support this statement. Furthermore, with the exception of the assessment of the capacity and efficiency indicators in the RTS runs, the remaining assessment results emerging from either the RTS runs or the shadow-mode trials were based on descriptive statistics. Consequently, the statistical significance of the differences in the performance of the EMMA system vs. the baseline system was not tested in these cases. It must therefore be stressed that the validity of the corresponding assessment results is subject to the level of statistical evidence provided by the corresponding descriptive statistics.


5 Analysis of Results with regard to EMMA Requirements

5.1 Consequences for Technical Requirements

Detailed results of all technical requirements for all three test sites are provided in Appendix A. The objective of this section is not to summarise the appendix, but to highlight some remarkable results of the tests and analysis of the technical requirements.

Technical tests at Prague revealed some weaknesses in the test methodology and in the way requirements are expressed. In particular:

Tech_Surv_32 (ICAO §4.2.3)
This requirement specifies that the level of an aircraft when airborne should be determined within ±10 m. There are two objections to this requirement:
i) It seems unnecessarily stringent from an operational viewpoint, and the ICAO Manual gives no rationale for how the number was derived.
ii) It is not verifiable without very specialised (and costly) test equipment and test aircraft.

Tech_Surv_35 Probability of Detection
Tech_Surv_36 Probability of False Detection
Tech_Surv_37 Probability of Identification
Tech_Surv_38 Probability of False Identification
Tech_Cont_11 Probability of Detection of an Alert Situation
Tech_Cont_12 Probability of False Alert
All of these requirements are difficult to measure and verify by short-term testing. Results are highly dependent on the measurement method, and there are significant temporal variations. The EMMA tests indicate that verification of such requirements really needs continuous long-term observation over a period of several months.

The requirement for the Probability of Identification should specify that the targets in question are identifiable, i.e. suitably equipped and co-operating. We suggest the following change to the requirement (ED-87A §3.2.3): ‘The probability that the correct identity of an identifiable aircraft, vehicle or object is reported at the output of the surveillance element shall be 99.9% at minimum.’

Tech_Gen_45 (ICAO §2.6.6.2)
This requirement states that position reports should use a common reference point on aircraft and vehicles. This in itself is not a verifiable metric; it merely specifies the reference point to use when determining the accuracy metric.

It was found that the most useful means of assessing surveillance detection capability and coverage is to plot target position reports onto an aerodrome map over a period of time. This method clearly identifies areas with gaps in coverage and areas where false reports occur. The MOGADOR tool developed by DSNA also provided Matrices of Detection and Identification, which proved useful.

At all sites, runway status, heading, and LVP/non-LVP information is provided by manual input at the controller working positions.
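The map-plotting assessment of coverage described above can be approximated numerically by binning position reports into grid cells; the function, cell size, and coordinates below are illustrative assumptions, not the MOGADOR implementation:

```python
def detection_matrix(reports, x_range, y_range, cell=100.0):
    """Bin target position reports (x, y in metres, local aerodrome
    frame) into grid cells. Cells inside the movement area that never
    receive a report point at possible surveillance coverage gaps."""
    nx = int((x_range[1] - x_range[0]) // cell)
    ny = int((y_range[1] - y_range[0]) // cell)
    grid = [[0] * nx for _ in range(ny)]
    for x, y in reports:
        i = int((x - x_range[0]) // cell)
        j = int((y - y_range[0]) // cell)
        if 0 <= i < nx and 0 <= j < ny:
            grid[j][i] += 1
    return grid

# Invented reports clustered in one corner of a 400 m x 200 m area:
reports = [(50 + k, 50) for k in range(30)]
grid = detection_matrix(reports, (0, 400), (0, 200))
gaps = sum(1 for row in grid for count in row if count == 0)
print(gaps)  # prints 7: cells never hit are candidate coverage gaps
```

Accumulated over days or weeks of recordings, such a grid serves the same purpose as the plotted map: empty cells mark coverage gaps, and cells hit while no target was present mark areas of false reports.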
Depending on the selected configuration, conflict and infringement detection rules are automatically applied according to pre-configurable behaviour. In Toulouse-Blagnac, one controller informed the validation team that during the night:
- all controller positions were grouped;
- when there was no dominant wind, the runway heading was optimised for each flight in order to reduce standard instrument departure (SID) or standard terminal arrival route (STAR) lengths.


This means that successively there was a landing on 14L, a take-off on 32R, followed by another landing on 14L. Manual update of the runway heading for each movement cannot reasonably be considered, but numerous alarms are triggered otherwise. Traffic is low, so disabling the surface conflict alert is an easy way out, but it is not a very satisfying solution. DSNA did not publish any specific requirements for this configuration in the course of EMMA; it therefore remains an open point to be addressed in the future.

In Malpensa, a remarkable result also concerned the surface conflict alerting system (SCA). The SCA includes among the conflict cases (triggering an alarm) the event of an aircraft landing on the same runway as a departing aircraft, if separation is reduced below a set minimum. Operational procedures adopted at Malpensa, however, allow the landing aircraft to continue the approach if the pilot accepts responsibility and the departing aircraft has exceeded a set speed threshold. This results in unwanted alarms generated by the SCA. For this reason, a new requirement was added specifying that the alarm is triggered only if the departing aircraft has not exceeded the set speed threshold. While adding this requirement eliminates the unwanted alarms, it also inhibits detection of conflict cases when the pilot has not accepted responsibility. Conflict case detection cannot be improved, since the possible acceptance of responsibility by the arriving pilot is unknown to the SCA system.
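The amended alerting rule can be sketched as follows; the function name and the threshold values are illustrative assumptions, not the Malpensa settings:

```python
def sca_alarm(separation_m, departing_speed_kt,
              min_separation_m=2400.0, speed_threshold_kt=80.0):
    """Surface Conflict Alert rule as amended for the same-runway
    arrival/departure case: alert only while the departing aircraft
    is still below the speed threshold; above it, the landing
    aircraft may continue under pilot responsibility, so the alarm
    is suppressed. Threshold values here are illustrative only."""
    return (separation_m < min_separation_m
            and departing_speed_kt < speed_threshold_kt)

print(sca_alarm(1800, 60))   # separation lost, slow departure -> True
print(sca_alarm(1800, 110))  # departure past threshold -> suppressed, False
```

The second case is exactly the residual blind spot noted above: if the pilot has in fact not accepted responsibility, the suppressed alarm cannot be recovered, because that acceptance is unknown to the SCA.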

5.2 Consequences for Operational Requirements

Operational requirements are the parent requirements for most of the technical requirements. Thus, most of the operational requirements have been verified by technical tests or even by a plausibility check, e.g. ‘check whether the installed system is a modular one’. However, some operational requirements also need to be proven from an operational point of view, e.g. ‘Op_Perf-05: For the surveillance service, the allowable error in reported position shall be consistent with the requirements set by the control task of the controller: 12 m.’ Such an operational requirement had to be verified technically, but its operational feasibility also had to be verified by asking the system operators (ATCos) for their acceptance of the experienced performance. If the ATCos accepted the performance, it could be stated that the operational requirement had been fully verified, technically and operationally.

Note that when requirements are verified (technically and operationally), it cannot be concluded that they are validated, because a lower performance, for instance, has not been proven operationally feasible by a real experiment. However, even when requirements cannot be validated, they can be questioned, with the following result pattern:
• requirements that could not be verified technically but were verified operationally (the lower performance was accepted by the ATCos), or
• requirements that could be verified technically but not operationally,
can be rejected, or at least improved to meet the users’ needs or to soften the technical requirement.
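This result pattern can be summarised in a small sketch; the names are ours for illustration and are not part of the EMMA documentation:

```python
from enum import Enum

class Outcome(Enum):
    FULLY_VERIFIED = "verified technically and operationally"
    SOFTEN_REQUIREMENT = "not met technically but accepted by the ATCos"
    REVISE_REQUIREMENT = "met technically but not accepted operationally"
    NOT_VERIFIED = "failed both checks"

def classify(tech_ok: bool, ops_accepted: bool) -> Outcome:
    """Combine the technical test result with the ATCos'
    operational acceptance, as in the result pattern above."""
    if tech_ok and ops_accepted:
        return Outcome.FULLY_VERIFIED
    if ops_accepted:
        return Outcome.SOFTEN_REQUIREMENT
    if tech_ok:
        return Outcome.REVISE_REQUIREMENT
    return Outcome.NOT_VERIFIED

# Op_Perf-05 (12 m position accuracy): measured error within the bound
# and the performance accepted by the controllers:
print(classify(True, True).name)  # prints FULLY_VERIFIED
```

The two middle outcomes are the interesting ones: each points at a requirement that should be rejected or reworded rather than at a system defect.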

Detailed results of all operational requirements for all three test sites can be found in Appendix B.


5.3 Impact of V&V Results on A-SMGCS Concept

The EMMA A-SMGCS concept (Level I and II) is described in the main EMMA sub-project 1 documents D131u_OSED [2], D135u_ORD [3], and D142au_TRD-GND [4], which are based on the ICAO A-SMGCS Manual [18] and the basic EUROCONTROL A-SMGCS concept documents (cf. [15] and [16]). The EMMA A-SMGCS implementations and V&V activities focussed on the EUROCONTROL Level I and II concepts, even if EMMA outlined a more comprehensive concept that also considers higher-level A-SMGCS services (e.g. planning, routing, on-board services). The EMMA concept states (cf. section 2.1 of [3] and section 1.2 of [18]): ‘The objective of an A-SMGCS is to optimise the efficiency, capacity and safety of operations at an aerodrome. The surface movement infrastructure existing at many airports today can be enhanced by providing positive identification of traffic, improving all weather situation awareness, improving communications and navigation aids, and by providing route planning tools.’

Except for the ‘improving communications and navigation aids’ and ‘providing route planning tools’ aspects, which are higher-level A-SMGCS services covered by the EMMA follow-up project EMMA2, these objectives could be reached with the EMMA A-SMGCS implementations in the simulator and on-site at the airports. For instance, the simulation trials revealed that A-SMGCS is able to reduce the average taxi time, the R/T communication load, and the controller’s reaction time in case of a conflict situation (see [12] and [14] for more details). These objective operational improvements, measured on the real-time simulation test platform, could also be confirmed by controllers’ subjective statements in the field. Controllers at Prague Ruzyně were asked to estimate their perceived safety and efficiency when working with A-SMGCS compared to earlier times when they did not use an A-SMGCS.
The positive answers and also the feedback from the shadow-mode trials with DSNA and ENAV ATCos at Toulouse-Blagnac and Milan-Malpensa showed that A-SMGCS provides significant operational improvements that will result in operational benefits for all stakeholders of an A-SMGCS (see chapter 4 for more details). Thus, in general, the EMMA V&V results fully support the feasibility of the level I and II concept and expected operational improvements were validated.


6 References

[1] European Airport Movement Management by A-SMGCS (EMMA)

D112u Long-term Measurements of A-SMGCS Performance - CDG Case Study Version 1.0 N. Marcou (DSNA) et al., Paris, 2006 www.dlr.de/emma

[2] European Airport Movement Management by A-SMGCS (EMMA) D131u Air-Ground Operational Service and Environmental Description (OSED-update) Version 1.0 M. Möller (AIF) et al., Toulouse, 2006 www.dlr.de/emma

[3] European Airport Movement Management by A-SMGCS (EMMA) D135u Operational Requirements Document Version 1.0 O. Delain (EEC) et al., Paris, 2006 www.dlr.de/emma

[4] European Airport Movement Management by A-SMGCS (EMMA) D142au Technical Requirements Document - Part A: Ground Version 1.0 A. Gilbert (PAS) et al., Horten, 2006 www.dlr.de/emma

[5] European Airport Movement Management by A-SMGCS (EMMA) D311 Ground System Requirements for Prague-Ruzyně Airport Version 1.0 A. Gilbert (PAS) et al., Horten, 2005 www.dlr.de/emma

[6] European Airport Movement Management by A-SMGCS (EMMA) D361 Site Acceptance Test Report - Prague Version 1.0 A. Gilbert (PAS) et al., Horten, 2006 www.dlr.de/emma

[7] European Airport Movement Management by A-SMGCS (EMMA), D612 Verification and Validation Test Plan for Prague Ruzyně Airport, Version 1.0, J. Jakobi, F. Morlang (DLR), et al., Braunschweig, 2006 www.dlr.de/emma

[8] European Airport Movement Management by A-SMGCS (EMMA), D613 Verification and Validation Test Plan for Toulouse-Blagnac Airport, Version 1.2, S. Paul (TATM), N. Marcou (DSNA), Toulouse, 2006 www.dlr.de/emma

[9] European Airport Movement Management by A-SMGCS (EMMA), D614b Validation Plan for RTS of Malpensa Airport, Version 1.0, J. Teutsch, R.J.D. Verbeek (NLR), Amsterdam, 2006 www.dlr.de/emma

[10] European Airport Movement Management by A-SMGCS (EMMA), D621 Verification and Validation Methodology for A-SMGCS, Version 1.0, K.G. Zografos (AUEB-RC/TRANSLOG), Athens, 2005 www.dlr.de/emma

[11] European Airport Movement Management by A-SMGCS (EMMA), D622 Verification and Validation Indicators and Metrics for A-SMGCS, Version 1.0, K.G. Zografos (AUEB-RC/TRANSLOG), Athens, 2005 www.dlr.de/emma

[12] European Airport Movement Management by A-SMGCS (EMMA), D631 Prague Test Report, Version 1.0, J. Jakobi (DLR), et al., Braunschweig, 2006 www.dlr.de/emma

[13] European Airport Movement Management by A-SMGCS (EMMA), D641 Toulouse A-SMGCS Verification and Validation Results, Version 1.0, O. Mongenie (DSNA), S. Paul (TATM), Toulouse, 2006 www.dlr.de/emma

[14] European Airport Movement Management by A-SMGCS (EMMA), D651 Malpensa A-SMGCS Verification and Validation Results, Version 1.0, S. Carotenuto (SICTA), J. Teutsch (NLR), Naples, 2007 www.dlr.de/emma

[15] European Organisation for the Safety of Air Navigation (EUROCONTROL), Operational Concept and Requirements for A-SMGCS Implementation Level 1, Edition Number 2.0, EATM Publication, Brussels, 2006 www.dlr.de/emma

[16] European Organisation for the Safety of Air Navigation (EUROCONTROL), Operational Concept and Requirements for A-SMGCS Implementation Level 2, Edition Number 2.0, EATM Publication, Brussels, 2006 www.dlr.de/emma

[17] European Organisation for Civil Aviation Equipment (EUROCAE) ED-87A Minimum Aviation System Performance Specification for A-SMGCS, Paris, 2001 www.eurocae.org

[18] International Civil Aviation Organisation (ICAO) Advanced Surface Movement Guidance and Control Systems (A-SMGCS) Manual, Doc 9830, First Edition - 2004, ICAO Publication, Montreal, 2004 www.icao.org

[19] European Organisation for Civil Aviation Equipment (EUROCAE) ED-116 Minimum Operational Performance Specifications for Surface Movement Radar Sensor Systems for Use in A-SMGCS, Paris, 2004 www.eurocae.org

[20] European Organisation for Civil Aviation Equipment (EUROCAE) ED-117 Minimum Operational Performance Specifications for Mode S Multilateration Systems for Use in A-SMGCS, Paris, 2003 www.eurocae.org


7 Abbreviations

Abbreviation  Description
A/C           Aircraft
ADS           Automatic Dependent Surveillance
ADS-B         ADS-Broadcast
AGL           Aerodrome Ground Lighting
AIT           Advanced Intelligent Tape
ANOVA         Analysis of Variance
ANS CR        Air Navigation Services of the Czech Republic
APP           Approach Control (Centre)
ARP           Airport Reference Point
ARR           Arrival
ART           Alert Response Time
A-SMGCS       Advanced Surface Movement Guidance and Control System
ASR           Airport Surveillance Radar
ASTERIX       All Purpose Structured EUROCONTROL Automated Radar Information Exchange
ATC           Air Traffic Control
ATCo          Air Traffic Controller
ATD           Actual Time of Departure
AUX           Auxiliary Mass Storage Unit
CA            Capacity
CAA           Civil Aviation Authorities
CAT           Category
CDD           Clearance Delivery Dispatcher
CDG           Charles-de-Gaulle
CROSS         Crossing
CV            Coverage Volume
CWP           Controller Working Position
D             Deliverable
DEP           Departure
DLR           Deutsches Zentrum für Luft- und Raumfahrt
DSNA          Direction des Services de la Navigation Aérienne
DTI           Direction de la Technique et de l'Innovation
EFF           Effective/Efficiency
EGNOS         European Geostationary Navigation Overlay Service
EMMA          European Airport Movement Management by A-SMGCS
ENAV          Ente Nazionale di Assistenza al Volo
EOBT          Estimated/Expected Off-Block Time
ES            Extended Squitter
ESUP          EUROCAT Support System


EUROCAE       European Organisation for Civil Aviation Equipment
FDPS          Flight Data Processing Service
FoM           Figure of Merit
GEC           Ground Executive Controller
GND           Ground (Control)
GPS           Global Positioning System
GS            Ground Station
GTW           Gateway
H             Hypothesis
HF            Human Factors
HMI           Human Machine Interface
ICAO          International Civil Aviation Organization
ID            Identification
ISA           Instantaneous Self Assessment
KVM           Keyboard/Video/Mouse
LAN           Local Area Network
LVO           Low Visibility Operations
LVP           Low Visibility Procedure
MASPS         Minimum Aviation System Performance Standards
MLAT          Multilateration
MOD           Matrix of Detection
MOI           Matrix of Identification
MOSQUITO      Mode-S Squitter Vehicle Tracking Unit
MVP           Machine Vision Processor
N/A           Not Applicable
NARSIM        NLR ATC Research Simulator
NASA          National Aeronautics and Space Administration
NLR           Nationaal Lucht- en Ruimtevaartlaboratorium
NMEA          National Marine Electronics Association
NTP           Network Time Protocol
OF            Operational Feasibility
OFZ           Obstacle Free Zone
OI            Operational Improvement
ORD           Operational Requirements Document
PA            Precision Accuracy
PCT           Probability of Continuous Track
PD            Probability of Detection
PDAS          Probability of Detection of an Alert Situation
PFA           Probability of False Alert
PFD           Probability of False Detection
PFID          Probability of False Identification
PID           Probability of Identification


R/T           Radio Telephony
RANC          Radar Absorption Noise and Clutter
RCMS          Remote Control and Monitoring System
RF            Radio Frequency
RP            Reference Point
RPA           Reported Position Accuracy
RPD           Reported Position Discrimination
RPR           Reported Position Resolution
RPS           Recording and Playback System
RPT           Routing Process Time
RTS           Real-Time Simulations
RVA           Reported Velocity Accuracy
RX            Receiver
SAC           System Area Code
SAF           Safety
SASHA         Situational Awareness Rating Scale for SHAPE
SAT           Site Acceptance Test
SCA           Surface Conflict Alerting
SD            Standard Deviation
SDF           Surveillance/Sensor Data Fusion
SDS           Surveillance Data Server
SE            Standard Error
SHAPE         Solutions for Human Automation Partnership in European ATM
SIC           System Identification Code
SID           Standard Instrument Departure
SME           Subject Matter Expert
SMGCS         Surface Movement Guidance and Control System
SMR           Surface Movement Radar
SPAM          Situation Present Assessment Method
SQB           Mode-S Squitter Beacons
STAR          Standard Terminal Arrival Route
SU            System Usability
SURV          Surveillance
SUS           System Usability Scale
TEC           Tower Executive Controller
TECAMS        Technical Control and Monitoring System
TECOS         Terminal Co-ordination System
TLX           Task Load Index
TPC           Tower Planning Controller
TRD           Technical Requirements Document
TRUR          Target Report Update Rate
TWR           Aerodrome Control Tower


UTC           Universal Time Co-ordinated
V&V           Verification and Validation
VE            Verification
VIS           Visibility
VSDF          Video Sensor Data Fusion


8 Figures and Tables

8.1 List of Figures

Figure 2-1: EMMA Test-Bed Set-up at Prague
Figure 2-2: Surveillance Processing Chain when Tracking MOSQUITO-equipped Vehicles at TLS
Figure 2-3: Mosquito NMEA String Output
Figure 2-4: ADS-B 1090ES Surface Pos. and Ident. Processing by MOSQUITO (High Update Rate)
Figure 2-5: Mosquito Position and Identity Internal Processing Time
Figure 2-6: Quantification of Position Error Related to Latency
Figure 2-7: Ex. of De-synchronisation of Internal 1090ES TX Emission Cycle from GPS Cycle
Figure 2-8: AS-680 ASTERIX Report Generation
Figure 2-9: ADS-B GTW Internal Processing
Figure 2-10: Track Duplication Scenario Example
Figure 2-11: Track Duplication (Step 1)
Figure 2-12: Track Duplication (Step 2)
Figure 2-13: Track Duplication (Step 3)
Figure 2-14: Track Duplication (Step 4)
Figure 3-1: Bar Chart for Means, SD, and p-values for 30 Items of RTS Acceptance Questionnaire
Figure 3-2: Acceptability Ratings for the Use of A-SMGCS
Figure 3-3: A-SMGCS Impact on System Usability between Visibility Conditions (All Runs)
Figure 3-4: A-SMGCS Impact on SU between Visibility Conditions (Nominal Runs)
Figure 3-5: A-SMGCS Impact on SU between Visibility Conditions (Non-nominal Runs)
Figure 3-6: Answer Trend for the Acceptance Questionnaire
Figure 3-7: Answer Trend for the Usability Questionnaire
Figure 4-1: Bar Charts of the Mean Reaction Time for TEC and GEC Position (RTS 1 only)
Figure 4-2: Total Average Taxi Times for RTS 1 and RTS 2 [sec]
Figure 4-3: Means of R/T Load between A-SMGCS and Baseline for each CWP [sec/hr] (RTS 1)
Figure 4-4: Means of R/T Load between A-SMGCS and Baseline for each CWP [sec/hr] (RTS 2)
Figure 4-5: Total Means for ISA Workload between A-SMGCS and Baseline Test Conditions
Figure 4-6: Answer Trend for the Safety Questionnaire
Figure 4-7: Example for Throughput Build-up and Decrease in Traffic Sample F
Figure 4-8: Example for Identical Runway Arrival Throughput in Traffic Sample A
Figure 4-9: Example for Runway Crossing Throughput in Traffic Sample B
Figure 4-10: Example for Identical Pushback Throughput in Traffic Sample F
Figure 4-11: Hand-over Problems in Traffic Sample E
Figure 4-12: Example for Number of Aircraft under Ground Control (Traffic Sample C)

8.2 List of Tables

Table 2-1: Technical Verification Indicators .... 10
Table 2-2: Summary of Technical Verification Results .... 12
Table 2-3: Malpensa Verification Results .... 28
Table 2-4: Combined Analysis for Verification Results of Prague and Malpensa .... 31
Table 3-1: t-test for 30 Items of the Acceptance Questionnaire .... 33
Table 3-2: Social-Demographic Data of the Sample Size .... 35
Table 3-3: Debriefing Questionnaire - Means, SD, and p-value for ‘General’ OF Items .... 37
Table 3-4: Debriefing Questionnaire - Means, SD, and p-value for ‘Surveillance’ OF Items .... 39
Table 3-5: Debriefing Questionnaire - Means, SD, and p-value for ‘Control’ OF Items .... 40
Table 3-6: Debriefing Questionnaire - Means, SD, and p-value for ‘HMI’ OF Items .... 42
Table 3-7: Debriefing Questionnaire - Means, SD, and p-value for ‘Procedure’ OF Items .... 45
Table 3-8: Alert Performance Results with Crossing Runway Conflicts .... 46
Table 3-9: Answer Distribution for Acceptance Questionnaire .... 56

EMMA

Verification and Validation Analysis Report

Save date: 2007-06-29 Public Page 124 File Name: D671_Analysis_V1.0.doc Version: 1.0

Table 3-10: Answer Distribution for Usability Questionnaire .... 56
Table 4-1: Allocation of Controllers to the Groups and RTS Phases .... 59
Table 4-2: Combination of Experimental Factors .... 60
Table 4-3: Low-level Objectives, Indicators, and Measurement Instruments for OIs in the RTS .... 61
Table 4-4: Means, SD, and SE for the TEC’s Reaction Time (RTS 1 only) .... 62
Table 4-5: Means, SD, and SE for the GEC’s Reaction Time (RTS 1 only) .... 62
Table 4-6: t-tests for Paired Differences: Reaction Time of TEC and GEC position (RTS 1 only) .... 63
Table 4-7: Debriefing Questionnaire - Means, SD, and p-value for ‘Safety’ OI Items .... 64
Table 4-8: Taxi Time Results (RTS 1) .... 65
Table 4-9: Taxi Time Results (RTS 2) .... 65
Table 4-10: R/T Load Means, SD, and Sample Size (RTS 1) .... 67
Table 4-11: R/T Load Means, SD, and Sample Size (RTS 2) .... 68
Table 4-12: R/T Load Test for Significance (two-way ANOVA [F-test]) (RTS 1) .... 68
Table 4-13: R/T Load Test for Significance (two-way ANOVA [F-test]) (RTS 2) .... 69
Table 4-14: Debriefing Questionnaire - Means, SD, and p-value for ‘Efficiency/Capacity’ OI Items .... 71
Table 4-15: Raw Data of Wrong Answers with the SAHA On-line Query (RTS 1 only) .... 75
Table 4-16: ANOVA with Repeated Measurements for ISA Workload .... 76
Table 4-17: Debriefing Questionnaire - Means, SD, and p-value for ‘Situation Awareness’ OI Items .... 77
Table 4-18: Debriefing Questionnaire - Means, SD, and p-value for ‘Workload’ OI Items .... 78
Table 4-19: Debriefing Questionnaire - Means, SD, and p-value for ‘Human Error’ OI Items .... 78
Table 4-20: Safety Results under VIS-1 Conditions (time in [h.mm.ss]) .... 83
Table 4-21: Safety Results under VIS-2 Conditions (time in [h.mm.ss]) .... 84
Table 4-22: Answer Distribution for Safety Questionnaire .... 84
Table 4-23: E1.1.1 – Taxiing Delay Results .... 86
Table 4-24: E1.2.1 – Line-up Queue Delay Results .... 88
Table 4-25: E1.3.1 – Departure Delay Results .... 89
Table 4-26: E1.4.1 – Runway Crossing Delay Results .... 90
Table 4-27: E1.5.1 – Pushback Delay Results .... 91
Table 4-28: E2.1.1 – Arrival Period Results .... 92
Table 4-29: E2.2.1 – Departure Period Results .... 93
Table 4-30: Answer Distribution for Efficiency Questionnaire in Visibility Condition 1 .... 95
Table 4-31: Answer Distribution for Efficiency Questionnaire in Visibility Condition 2 .... 96
Table 4-32: Capacity Low-level Objective and Hypothesis .... 97
Table 4-33: C1.1.1 – Runway Departure Throughput Results .... 98
Table 4-34: C1.2.1 – Runway Arrival Throughput Results .... 99
Table 4-35: C1.3.1 – Runway Crossing Throughput Results .... 101
Table 4-36: C1.4.1 – Pushback Throughput Results .... 103
Table 4-37: C1.4.2 – GND to TWR Hand-over Throughput Results .... 104
Table 4-38: C1.5.1 – Number of A/C GND Results .... 105
Table 4-39: C1.5.2 – Number of A/C TWR1 Results .... 106
Table 4-40: C1.5.3 – Number of A/C TWR2 Results .... 107
Table 4-41: Answer Distribution for Capacity Questionnaire in Visibility Condition 1 .... 109
Table 4-42: Answer Distribution for Capacity Questionnaire in Visibility Condition 2 .... 110
Table 8-1: Functional Tests for Level II A-SMGCS .... 150
Table 8-2: Performance Tests for Level II A-SMGCS .... 156
Table 8-3: Interface Tests for Level II A-SMGCS .... 160
Table 8-4: Results General Operational Requirements .... 173
Table 8-5: Results Surveillance Service Requirements .... 194
Table 8-6: Results Control Service Requirements .... 205
Table 8-7: Discussion of Operational Performance Requirements - Example 1: RPA .... 208
Table 8-8: Discussion of Operational Performance Requirements - Example 2: PD .... 210
Table 8-9: Example for the ‘Matrix of Detection’ .... 211


APPENDIX A DETAILED RESULTS FOR TECHNICAL REQUIREMENTS

Appendix A.1 Functional Tests

This section describes the tests performed to verify compliance of the EMMA Level II A-SMGCS with the functional requirements listed in the EMMA Technical Requirements Document [4]. In the following tables, the comment ‘FAT’ means that the requirement was tested only during factory acceptance testing (FAT), not during site acceptance testing (SAT).

Each entry below gives the EMMA REQ ID, the requirement description, its reference, and the result and comments per test site (PRG = Prague, TLS = Toulouse-Blagnac, MXP = Milan-Malpensa).

Service Provision

Tech_Gen_01
Requirement: For all levels of implementation, the A-SMGCS shall provide equipment to support the Surveillance service.
Reference: Op_Serv-01 [ICAO §2.2.1]
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_02
Requirement: For levels II and above, the A-SMGCS shall provide equipment to support the Control service.
Reference: Op_Serv-14 [ICAO §2.2.1]
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_03
Requirement: For levels III and above, the A-SMGCS shall provide equipment to support the Routing and Guidance functions.
Reference: ICAO §2.2.1
PRG/TLS: EMMA2. The routing function is included in the EMMA system. The guidance function will be included in the EMMA2 system.
MXP: EMMA2.

Tech_Gen_04
Requirement: For levels III and above, the A-SMGCS shall provide equipment to support the Planning of surface movements.
Reference: ICAO §2.2.2
PRG/TLS: EMMA2. Not within the scope of EMMA.
MXP: EMMA2.

Modularity, Scalability and Adaptability

Tech_Gen_05
Requirement: The A-SMGCS equipment shall comprise hardware and software modules.
Reference: Op_Ds-1
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_06
Requirement: The system should be based as far as practicable on commercial off-the-shelf (COTS) hardware and software components.
Reference: MASPS §1.8.2
PRG: OK. The EMMA Level II system uses COTS hardware and software components from Park Air and ERA. The software is configured and adapted for the EMMA test-bed application.
TLS: The installed system fulfils this requirement.
MXP: OK, where available.


Tech_Gen_07
Requirement: The modules shall be such that the system can be dimensioned according to the needs of different aerodromes.
Reference: ICAO §2.2.5, ICAO §2.4.1
PRG: OK. The EMMA Level II system is modular, and permits the A-SMGCS to be dimensioned to suit user requirements.
TLS: The installed system fulfils this requirement: it has been installed at several aerodromes.
MXP: OK, the modularity allows adaptability to different aerodromes (complexity, functionalities, etc.).

Tech_Gen_08
Requirement: It should be possible to utilise the existing SMGCS infrastructure and to add additional modules of ground equipment when required for the operation.
Reference: ICAO §3.3.1.2
PRG/TLS: OK. The installed system fulfils this requirement: this operation has been performed in Toulouse.
MXP: OK, the system is upgradable from the existing infrastructure.

Tech_Gen_09
Requirement: The modules shall be such that components can be added in order to expand the system in terms of functionality and number of users.
Reference: Op_Evo-4 [MASPS §1.8.2]
PRG: OK. The modular, open system architecture permits expansion in terms of functionality and number of users.
TLS: The installed system fulfils this requirement: this operation has been performed in Toulouse.
MXP: OK, the system is expandable.

Tech_Gen_10
Requirement: An open architecture using COTS equipment and standard interfaces is recommended in order to permit system enhancements at minimal cost.
Reference: Op_Evo-5 [MASPS §1.8.3]
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_11
Requirement: The equipment should be capable of being installed at any aerodrome.
Reference: Op_Env-1
PRG/TLS: OK. The installed system fulfils this requirement: it has been installed at several aerodromes.
MXP: OK.

Tech_Gen_12
Requirement: In order to accommodate changes to the aerodrome layout after installation, it should be possible to reconfigure the A-SMGCS equipment.
Reference: Op_Evo-1 [ICAO §2.6.10], Op_Evo-2 [MASPS §1.8.3]
PRG: OK. The modular, open system architecture and system support tools permit reconfiguration to accommodate changes to the aerodrome layout after installation.
TLS: Verification hypothesis: Ver.gen.4.
MXP: OK.

Tech_Gen_13
Requirement: Adaptation of the equipment to the site configuration should be done through an appropriate database (sensor positions, airport topography/topology).
Reference: Op_Evo-1 [ICAO §2.6.10.1], Op_Evo-2 [MASPS §1.8.3]
PRG: OK. The system support equipment includes an appropriate database with editing facilities.
TLS: Verification hypothesis: Ver.gen.4.
MXP: OK, in terms of system configuration, geography, etc.


Traffic Types

Tech_Gen_14
Requirement: The design of the equipment shall be such that its functional performance is independent of the different types of aircraft and vehicle that are likely to use the aerodrome during the life expectancy of the equipment.
Reference: Op_Range-3 [ICAO §2.6.2], Op_Range-4 [ICAO §2.6.2]
PRG: OK. The surveillance system supports all known aircraft and vehicle types that are likely to use the aerodrome during the life expectancy of the equipment.
TLS: Verification hypothesis: Ver.gen.2.
MXP: OK, provided A/C are adequately equipped.

Operating Conditions

Tech_Gen_15
Requirement: The performance of A-SMGCS equipment should not be significantly degraded due to meteorological conditions or poor visibility.
Reference: Op_Env-4 [ICAO §2.6.5.c], Op_Range-1 [ICAO §2.2.3]
PRG: Weather conditions were good throughout the SAT period. Further long-term testing has been carried out using the MOGADOR tool as part of the SP6 V&V exercise. Results are published in the D6.3.1 document [12].
TLS: Hypothesis Ver.sur.7 partially verified. Verification hypothesis: Ver.sur.8. Indicators: VE-2 and VE-9.
MXP: OK, this is the goal of A-SMGCS.

Tech_Gen_16
Requirement: The outdoor equipment shall operate in the following environmental conditions:
• Temperature: -25°C to +55°C
• Rainfall: up to 16 mm/hr
• Hail: up to a diameter of 12 mm at 17 m/s
• Wind speed: up to 80 kt operational; up to 120 kt survival (3-second gust)
Reference: EUROCAE ED-116, ED-117
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_17
Requirement: The outdoor equipment, including any enclosure, should utilise materials, coatings and finishes which are resistant to weathering and to industrial pollutants such as sulphur dioxides and/or nitric oxides.
Reference: EUROCAE ED-116, ED-117
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_18
Requirement: The indoor equipment shall operate in the following environmental conditions:
• Temperature: +10°C to +30°C
• Relative humidity: 10% to 80% non-condensing
Reference: EUROCAE ED-116, ED-117
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.
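The environmental limits in Tech_Gen_16 and Tech_Gen_18 amount to simple range checks that a monitoring script could apply to sensor readings. A minimal sketch, assuming hypothetical reading and limit names (this is illustrative, not part of the EMMA system):

```python
# Environmental limits taken from Tech_Gen_16/18 (derived from EUROCAE ED-116/117).
OUTDOOR_LIMITS = {
    "temperature_c": (-25.0, 55.0),
    "rainfall_mm_per_hr": (0.0, 16.0),
    "wind_speed_kt": (0.0, 80.0),  # operational limit; 120 kt survival (3-s gust)
}
INDOOR_LIMITS = {
    "temperature_c": (10.0, 30.0),
    "relative_humidity_pct": (10.0, 80.0),  # non-condensing
}

def out_of_limits(readings, limits):
    """Return the (name, value) pairs that violate their (lo, hi) range."""
    return [(name, readings[name])
            for name, (lo, hi) in limits.items()
            if name in readings and not (lo <= readings[name] <= hi)]

# Example: an outdoor sensor reporting a 90 kt wind.
violations = out_of_limits(
    {"temperature_c": 12.0, "rainfall_mm_per_hr": 4.0, "wind_speed_kt": 90.0},
    OUTDOOR_LIMITS,
)
```

Only the wind-speed reading falls outside its operational range in this example.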


Tech_Gen_19
Requirement: The equipment shall comply with all relevant health and safety legislation, European Standards, or Codes of Practice, including but not limited to the following:
• Grounding and power distribution
• Inflammable atmospheres
• Human exposure to radiation
• Electro-mechanical detonators
• Hazardous substances
Reference: EUROCAE ED-116, ED-117, Op_Evo-4
PRG: OK. The equipment is CE marked and approved by the relevant authorities.
TLS: The installed system fulfils this requirement.
MXP: OK, guaranteed by CE marking.

Power Requirements

Tech_Gen_20
Requirement: Electrical equipment should operate from standard mains voltage and frequency at the airport.
Reference: EUROCAE ED-116, ED-117
PRG: OK. The system is designed to operate from standard 230 V AC mains supplies available at the airport.
TLS: The installed system fulfils this requirement.
MXP: OK.

Installation Requirements

Tech_Gen_21
Requirement: Ground equipment should be installed so as not to affect flight operations.
Reference: Op_Env-2
PRG: OK. The only items of EMMA test-bed equipment installed close to the movement area are the gap-filler sensors. These are mounted on existing lighting pylons. The installation has been approved by the Czech Airports Authority.
TLS: The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_22
Requirement: Any A-SMGCS equipment sited close to the movement areas should be:
• Lightweight
• Frangible, where appropriate
• Capable of withstanding the effects of jet blast.
Reference: Op_Env-3
PRG: OK. See previous test.
TLS: The installed system fulfils this requirement.
MXP: OK, for the operative A-SMGCS modules (except for vehicle equipment, which is a prototype version in EMMA1).

Tech_Gen_23
Requirement: Any A-SMGCS equipment installed in the movement area shall comply with obstacle limitation requirements.
Reference: ICAO Annex 14, Volume I
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.


Tech_Gen_24
Requirement: Siting of equipment should take into account the adverse effects of signal reflections and shadowing caused by aircraft in flight, vehicles or aircraft on the ground, buildings, snow banks or other raised obstacles (fixed or temporary) in or near the aerodrome environment, so that performance requirements are met.
Reference: Op_Env-4 [ICAO §2.6.5.b]
PRG: OK. The gap-filler sensors are mounted close to the taxiways to be surveyed, so that shadowing by objects entering the field of view is avoided. Since the gap-filler employs optical sensors, signal reflections are not an issue.
TLS: Hypothesis Ver.sur.7 partially verified. Verification hypothesis: Ver.sur.8. Indicators: VE-2 and VE-9.
MXP: OK.

Tech_Gen_25
Requirement: Audible noise and vibration from the equipment shall be confined to within acceptable levels commensurate with the environment. This is particularly important in the tower visual control room(s).
PRG: OK. Except for an audible alarm in the event of the system detecting any pre-defined situation in which controllers shall be alerted, the only audible noise from the EMMA equipment is the very low hum of cooling fans.
TLS: The installed system fulfils this requirement.
MXP: OK, a proper design guarantees acceptable noise levels.

Electromagnetic Compatibility

Tech_Gen_26
Requirement: Equipment shall have appropriate EMI/EMC characteristics for operation in an airport environment. The EU EMC directive 89/336/EEC is applicable.
Reference: Op_Env-4 [ICAO §2.6.5.a], Op_Env-6, Op_Env-5
PRG: OK. Equipment is CE marked. Test certificate available.
TLS: The installed system fulfils this requirement.
MXP: OK, guaranteed by CE marking.

Tech_Gen_27
Requirement: Equipment and associated data links shall include appropriate lightning conductors and transient protection to ensure continued operation during lightning storms without equipment failure.
PRG: OK. Power supplies to all EMMA units are transient protected. Lightning conductors and data links are part of the existing installation at Prague.
TLS: The installed system fulfils this requirement.
MXP: OK, guaranteed by CE marking.

Traffic Context

Tech_Supp_01 (Traffic Context)
Requirement: This function should gather data from other airport systems to provide relevant information about the status of runways and taxiways, the runway(s) in use and meteorological conditions.
Reference: Fn-14 [Op_Serv-10]
PRG: OK. Runway status and LVP/non-LVP information is provided by manual input at the CWPs.
TLS: The installed system fulfils this requirement. Runway status, heading and LVP/non-LVP information is provided by manual input at the CWPs.
MXP: OK, context information is available for EMMA1 functionalities (Conflict Alert, etc.).


Tech_Supp_02 (Traffic Context)
Requirement: It should be possible to manually update topological and topographical data that cannot be updated automatically.
Reference: Fn-15
PRG: OK. This traffic context information is the topological and topographical information. It can be edited from TECAMS and transferred to the CWPs.
TLS: The installed system fulfils this requirement. It is possible to update topological and topographical data using specific off-line tools and then distribute them among system applications.
MXP: OK, the geographical database is manually editable by proprietary software tools.

Tech_Supp_03 (Traffic Context)
Requirement: This function should provide all topological and topographical information necessary for the A-SMGCS, including:
• Airport layout: geographical representation of various airport areas (TWY, RWY, etc.)
• Reference points: holding positions, stop bars (and other airfield lighting), RWY thresholds, etc.
• Fixed obstacles
Reference: Fn-16 [Op_Serv-08] [Op_Serv-09]
PRG: OK. The display shows:
• The aerodrome map with RWYs, TWYs, parking stands, etc.
• Holding positions, stop bars, RWY thresholds, centrelines and other information as required by ANS CR.
• Fixed obstacles.
TLS: The installed system fulfils this requirement.
MXP: All graphical attributes useful for operative personnel are included and reported on the CWP (Controller Working Position).

Tech_Supp_04 (Traffic Context)
Requirement: The technical support equipment should have a user-friendly and efficient means of entering and editing traffic context data.
Reference: Op_If-2
PRG: OK. The Editor function provides means to edit topological and topographical information. Multiple layers of information are available. There is a Layer Manager tool.
TLS: The installed system fulfils this requirement. The editor is an off-the-shelf commercially available tool.
MXP: OK, SELEX has a friendly and powerful graphic editor (XMG).

Tech_Supp_05 (Level II Traffic Context)
Requirement: For the Conflict/Infringement detection, the function should provide updated information on:
• Airport configuration: runways in use, runway status, restricted areas
• Applied procedures and working methods: LVP, multiple line-ups
Reference: Fn-17 [Op_Serv-21]
PRG: OK. Runway status and LVP/non-LVP information is provided by manual input at the CWPs.
TLS: The installed system fulfils this requirement. Runway status, heading and LVP/non-LVP information is provided by manual input at the CWPs. Depending on the selected configuration, conflict and infringement detection rules are automatically applied according to pre-configurable behaviour.
MXP: OK.


Tech_Supp_06 (Operational Change)
Requirement: This function shall be capable of accommodating any operational change of the aerodrome, for instance a physical change in layout (runways, taxiways and aprons), or a change in the aerodrome procedures or rules.
Reference: Fn-28, Fn-55 [Op_Evo-1, MASPS §1.8.3, ICAO §3.4.5.2.b]
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Supp_07 (Adaptation to Local Procedures)
Requirement: This function should permit configuration of the Conflict/Infringement detection parameters in order to adapt to local rules.
Reference: Fn-38 [Op_Serv-29]
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK, SCA (Surface Conflict Alert) is highly configurable.

Tech_Supp_08
Requirement: For each runway, the traffic context function should define a topological area, the runway protection area, composed of two boundaries: a ground boundary to detect the mobiles on the surface, and an air boundary to detect airborne aircraft.
Reference: Op_Serv-18
PRG: OK. There are various boundary areas for configuration of the alerting function.
TLS: The installed system fulfils this requirement.
MXP: OK, proper volumes.

Tech_Supp_09
Requirement: The length of the ground boundary should include the runway strip. The width should be defined according to the meteorological conditions: one width for non-LVP and a wider width for LVP.
Reference: Op_Serv-19
PRG: OK. There are areas for the runway strips with different widths depending on the selection of normal or LVC.
TLS: The installed system fulfils this requirement.
MXP: OK.

Tech_Supp_10
Requirement: The air boundary should be parameterised and configurable to take into account the two stages of alert, as well as the meteorological conditions. The time to threshold parameter should typically be:
• Non-LVP: Prediction around T1 = 30 s, Alert around T2 = 15 s
• LVP: Prediction around T1 = 45 s, Alert around T2 = 30 s
Reference: Op_Serv-20
PRG: OK. Associated with each area are various attributes including time to threshold. The values for the EMMA tests at Prague are set to: Non-LVP: T1 = 30 s, T2 = 15 s; LVP: T1 = 45 s, T2 = 30 s.
TLS: The installed system fulfils this requirement. The values for the EMMA system are set to: Non-LVP: T1 = 46 s, T2 = 40 s; LVP: T1 = 60 s, T2 = 50 s.
MXP: OK, please refer to the corresponding deliverable (SCA settings: thresholds, timing, etc.).

Service Monitoring

Tech_Supp_11
Requirement: The system should provide a technical workstation with suitable HMI for monitoring the status of each major item of equipment.
Reference: Fn-32 [Op_Mon-02, ICAO §2.7.4.4]
PRG: OK. The TECAMS HMI shows the status of the major components of the EMMA system. It is possible to start and stop subsystems from the TECAMS HMI.
TLS: Partially installed on the EMMA system, available on the operational system.
MXP: In EMMA1 only some monitoring status functionalities are provided at CWP level. A full CMS (Control & Monitoring System) is provided in the operational system.
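The two-stage alerting of Tech_Supp_10 reduces to comparing the predicted time to the runway protection area's air boundary against the configured T1/T2 parameters. A sketch using the Prague values quoted above (function and parameter names are illustrative assumptions, not the EMMA implementation):

```python
# Time-to-threshold parameters [s] as set for the EMMA tests at Prague (Tech_Supp_10).
ALERT_PARAMS = {
    "non_lvp": {"t1_prediction_s": 30, "t2_alert_s": 15},
    "lvp":     {"t1_prediction_s": 45, "t2_alert_s": 30},
}

def alert_stage(time_to_threshold_s, lvp_active):
    """Classify an approaching aircraft by its predicted time to the air boundary.

    Returns 'ALERT' (stage 2), 'PREDICTION' (stage 1), or None (no alert yet).
    """
    params = ALERT_PARAMS["lvp" if lvp_active else "non_lvp"]
    if time_to_threshold_s <= params["t2_alert_s"]:
        return "ALERT"
    if time_to_threshold_s <= params["t1_prediction_s"]:
        return "PREDICTION"
    return None
```

Note how the same predicted time yields different stages depending on visibility: 20 s out is only a stage-1 prediction in non-LVP (between T2 = 15 s and T1 = 30 s) but already a stage-2 alert in LVP (below T2 = 30 s).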


Tech_Supp_12
Requirement: A clear indication should be given in the event of any failure of an item of equipment.
Reference: Fn-32 [Op_Mon-02, ICAO §2.7.4.4]
PRG: OK. Tested for ASR, ESUP, SMR, MLAT, VSDF and AGL sources. The TECAMS Monitor window shows the SDS block in red, indicating the failure, and the sensor source block turns white, indicating that the node is not connected.
TLS: Partially installed on the EMMA system, available on the operational system.
MXP: See above.

Tech_Supp_13
Requirement: Operationally significant failures should be reported to the Controller HMI.
Reference: Fn-32 [Op_Mon-02, ICAO §2.7.4.4]
PRG: OK. Tested for ASR, ESUP, SMR, MLAT, VSDF and AGL sources.
TLS: Partially installed on the EMMA system, available on the operational system.
MXP: OK, a synthesis of significant events/status is reported at CWP level.

Tech_Supp_14
Requirement: The system should be capable of expansion to accept and integrate data from other surveillance sensor sources in the future.
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.


System Configuration and Control

Tech_Supp_15
Requirement: The system should provide a technical workstation with suitable HMI for configuring and controlling each major item of equipment.
PRG: OK. A TECAMS (Technical Control and Monitoring System) is included.
a) The Monitor window shows the system status.
b) The ‘Config’ button provides submenus for configuring the parameters of Database, System, RANC, SDS and CWP.
Status colours:
• Green means that everything is OK
• Red means that the subsystem is reporting failure
• Yellow means that the subsystem is reporting warning
• White means that the node is not connected
• Grey means that the node is not monitored
The Editor function provides means to edit topological and topographical information. Multiple layers of information are available. There is a Layer Manager tool.
TLS: Partially installed on the EMMA system, available on the operational system.
MXP: Partially installed on EMMA1.
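The TECAMS colour convention above is a plain status-to-colour mapping; a monitoring HMI might encode it as follows (a minimal sketch with illustrative enum names, not the TECAMS code):

```python
from enum import Enum

class NodeStatus(Enum):
    """Subsystem states shown in the Monitor window, with their display colours."""
    OK = "green"             # everything is OK
    FAILURE = "red"          # subsystem is reporting failure
    WARNING = "yellow"       # subsystem is reporting warning
    NOT_CONNECTED = "white"  # node is not connected
    NOT_MONITORED = "grey"   # node is not monitored

def display_colour(status: NodeStatus) -> str:
    """Colour used to draw the node's block in the Monitor window."""
    return status.value
```

Tying colour to the enum value keeps the HMI rendering and the status model in one place, so adding a state cannot leave it without a colour.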


Recording and Playback

Tech_Supp_16
Requirement: The system should provide recording equipment capable of continuously recording and archiving relevant data in order to reconstruct the image at any controller working position later.
Reference: ICAO §2.6.8.1
PRG: OK. An RPS (Recording and Playback System) is included. The RPS is able to select and display a precise reproduction of the picture at any of the CWPs. All operator interactions, except cursor movements, are reproduced. The data is logged onto tape, which can be stored for playback whenever needed. For the EMMA system at Prague, the tapes will be reused after 30 days. Each tape has sufficient capacity to store five days of continuously recorded data.
TLS: In the EMMA system, recording is performed by the DSNA tool ELVIRA.
MXP: OK, an RPB (Recording & Playback) sub-system is included. However, more flexible software tools have also been adopted for the V&V analysis.

Tech_Supp_17
Requirement: The system should provide playback means to directly replay recorded data within the operational system, as part of the requirement for immediate checking of suspect equipment and initial incident investigation.
Reference: ICAO §2.6.8.2
PRG/TLS: OK. The installed system fulfils this requirement. However, playback should be performed on an off-line CWP.
MXP: OK, see above.


Data Communication Protocols

Tech_Supp_18
Requirement: The Supporting functions should comply with the general interface requirements given in section 3.3 of the TRD (Ref. [6]).
PRG: Compliance has been verified.
TLS: Compliance is verified: the Toulouse-Blagnac A-SMGCS utilizes standard data communication interface protocols and data formats. The system software applications use an extensive client-server architecture and inter-process communication. The inter-process communication supports process distribution via LAN, using TCP/IP and UDP/IP.
MXP: The A-SMGCS implemented at Malpensa airport is compliant with the technical standards considered as reference for the EMMA project, in terms of interface protocols and data format.

Data Formats

Tech_Supp_19 (Data Format)
Requirement: The data interchange with ground systems should be performed in a standardized format in order to ensure an adequate exchange of information.
Reference: Fn_If-5 [Op_If-4, Op_Ds-4, ICAO §2.6.16.2]
PRG/TLS: The installed system fulfils this requirement. All major data exchanges among the different subsystems are based on the ASTERIX format.
MXP: Where possible, standard formats are adopted (i.e. a proper category of the ASTERIX format).

Reliability

Tech_Gen_28 (Continuity)
Requirement: The equipment should be capable of sustained operation 24 hours a day throughout the year.
Reference: [ICAO §2.2.3, §2.7.4], Op_Perf-12 [ICAO §2.7.4.2, MASPS §3.1.1.2]
PRG/TLS: OK. The installed system fulfils this requirement.
MXP: OK.

Tech_Gen_29
Requirement: Equipment should be installed and configured in such a way that all possible essential maintenance can be carried out without interrupting operation.
PRG: OK. All EMMA equipment is easily accessible for maintenance purposes. Maintenance of EMMA equipment does not interfere with operations.
TLS: The system is not installed on the operational network.
MXP: OK. In EMMA1 this aspect is not essential, since the system is not intended to be operational (e.g. all modules in EMMA1 are provided in a single configuration instead of a redundant one).
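Tech_Supp_18/19 call for inter-process exchange of ASTERIX-formatted data over TCP/UDP. An ASTERIX data block is framed as a one-byte category, a two-byte big-endian length covering the whole block (including the three header bytes), and the record payload. A minimal framing sketch, not EMMA code, with a dummy payload:

```python
import struct

def frame_asterix_block(category: int, records: bytes) -> bytes:
    """Frame an ASTERIX data block: CAT (1 byte) + LEN (2 bytes, big-endian,
    counting the 3 header bytes as well) + record payload."""
    return struct.pack(">BH", category, 3 + len(records)) + records

def parse_asterix_block(block: bytes):
    """Split a framed block back into (category, records)."""
    category, length = struct.unpack(">BH", block[:3])
    return category, block[3:length]

# Round trip for a block of (dummy) CAT 010 surface-movement records.
blk = frame_asterix_block(10, b"\x01\x02\x03")
```

Decoding the records themselves would additionally require the FSPEC and data-item definitions of the specific ASTERIX category; only the outer framing is shown here.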


Result and Comments EMMA REQ ID Requirement Description Reference PRG TLS MXP

Tech_Gen_30 Integrity

Appropriate data integrity checks should be em-ployed to ensure that erroneous data is not pro-vided to users.

Op_Perf-09 [Op_Perf-25 Fn_Perf-07 ICAO §2.7.3.1 MASPS §3.1.1.1]

OK. Will be checked in EMMA2 SSA.

OK.

Tech_Gen_31 Availability

Appropriate levels of redundancy should be pro-vided for equipment that is to be continuously available.

Op_Perf-13 [ICAO §2.7.4.2 MASPS §3.1.1.2] ICAO §2.7.5.1 Op_Perf-10 [ICAO §2.7.4.1 MASPS §3.1.1.2]

OK. Redundancy is not required for the EMMA trials.

Will be checked in EMMA2 SSA.

Redundancy is not required for the experimental system provided in EMMA1.

Service Monitoring (Built-in Test) Tech_Gen_32 Equipment Status

The A-SMGCS shall include built-in test equipment (BITE) to monitor the operational status of all A-SMGCS equipment.

Op_Mon-01 [Fn-31 ICAO §2.5.1.2] Op_Ds-5 [ICAO 2.7.5.1]

OK. The installed system fulfils this requirement.

OK.

Tech_Gen_33 Performance

The BITE shall detect operationally significant failures, and shall generate alerts when the system must not be used for the intended operations.

Op_Mon-02 [Fn-32 ICAO §2.7.4.3]

OK. The scrollable Status Log field displays system status, warning, and failure messages together with the time at which they occur. Detailed information such as date, time, status, and identification is available. A description of the message is displayed by double-clicking on the message line.

The installed system fulfils this requirement.

OK.

Tech_Gen_34 Data Validation

The BITE shall perform a continuous validation of data provided to the user and generate a timely alert to the user when the system must not be used for the intended operation.

Op_Mon-03 [Fn-33 ICAO §2.7.3.2]

OK. Same as previous. The operational status of the system is monitored. The definition of the required modules and performance levels depending on operations is within the scope of the operational feasibility phase.

OK, but the HMI and management level of BITE information is not provided in EMMA1 (it is present in the operational system).


System Failure Tech_Gen_35 Appropriate redundancy should be provided to ensure that a failure of one item of equipment does not result in a loss of basic functions.

Op_Perf-11 [ICAO §2.7.5.2.b]

There is no redundancy provided within the EMMA test bed system. In the event of a failure of EMMA equipment, ATC will be able to resume normal operations using the commercial A-SMGCS equipment.

Will be checked in EMMA2 SSA.

Redundancy is not required for the experimental system provided in EMMA1.

Tech_Gen_36 Equipment should be both fail-safe and fail-soft. Op_Perf-11 [ICAO §2.7.5.2.a MASPS §3.1.1.2] Op_Ds-6 [ICAO §2.6.9.1 §2.7.5.3] Op_Ds-7 [ICAO §2.6.9.2]

Subject to Functional Hazard Analysis (FHA), System Safety Assessment (SSA), and appropriate operational procedures. The term ‘fail-soft’ means that the system is so designed that, even if equipment fails to the extent that loss of some data occurs, sufficient data remain on the display to enable the controller to continue operation without assistance of the computer.

Will be checked in EMMA2 SSA.

A-SMGCS operational systems satisfy safety requirements but this aspect has not been stressed in EMMA1.

Tech_Gen_37 Operationally significant failures such as loss of a data source or unreliable or degraded performance should be reported at the Technical Workstation and to Clients.

Op_Mon-05 [Fn-34 ICAO §2.7.5.3] Op_Mon-04 [Fn-35 ICAO §2.7.5.3]

OK. The source node turns white, indicating that the node is disconnected, and the SDF node turns yellow, indicating warning status.

Will be checked in EMMA2 SSA.

See above.

Tech_Gen_38 All critical items of equipment should be provided with audio and visual indication of failure given in a timely manner.

Op_Mon-06 [Fn-01 ICAO §3.6.2.2 §3.6.5.1]

The system alarm window pops up with the name of the data source that is disconnected. An audible alarm sounds.

Will be checked in EMMA2 SSA.

See above.

System Restart Tech_Gen_39 All items of the equipment should be self-restartable.

Op_Ds-8 [Op_Perf-14 ICAO §2.6.9.4]

OK. The SDF does a controlled shut down, then automatically reboots and resumes operation.

Will be checked in EMMA2 SSA.

OK.


Tech_Gen_40 The recovery time after a restart should not exceed 60 seconds.

Op_Ds-8 [Op_Perf-14 ICAO §2.6.9.4]

OK. Recovery time for SDS: 36 s; recovery time for RANC: 28 s.

Will be checked in EMMA2 SSA.

This has not been analysed in EMMA1.

Tech_Gen_41 The restart of an item of equipment should include the restoration of pertinent information on actual traffic and equipment performance.

Op_Ds-9 [ICAO §2.6.9.4]

OK. Will be checked in EMMA2 SSA.

OK.

Basic Function (HMI) Tech_HMI_01 The HMI should provide users with displays, indicators and input devices to permit each user to interact with all of the A-SMGCS functions relevant to his or her role.

Fn-24 [Op_If-1 MASPS §2.5.3]

OK. The installed system fulfils this requirement.

OK.

Display Characteristics Tech_HMI_02 Ambient Light

The HMI should employ high-resolution, high contrast-ratio displays appropriate for viewing in all ambient light levels found in the user environment.

Fn_Perf-24 [Op_If-1] ICAO §2.6.15.4 [MASPS §2.5.3]

OK. 1024 lines x 1280 pixels resolution. 19-inch LCD, type Viewsonic V191b in the old TWR. Barco TCD251 in the operational VCR.

The installed system fulfils this requirement.

OK.

Tech_HMI_03 Operational Conditions

Where appropriate it should be possible to configure the HMI according to local requirements.

Fn-29 [Op_Evo-1 Op_Range-1 MASPS §2.5.2]

OK. The HMI is configurable to Prague requirements.

The installed system fulfils this requirement. The EMMA HMI is configured with the same settings as the operational system.

OK.

Input Devices Tech_HMI_04 Entry Means

The HMI shall employ keyboard, mouse and on-screen menus and icons for data entry.

Fn_Perf-22 [Op_If-1 Op_If-2 MASPS §2.5.3]

OK. The HMI employs keyboard, mouse, on-screen menus and icons for data entry.

The installed system fulfils this requirement.

OK.

Tech_HMI_05 Input Actions

The HMI design shall be functionally simple, involving the controllers in a minimum number of input actions.

ICAO §2.6.15.3 OK. The HMI is simple and intuitive, requiring few input actions.

The installed system fulfils this requirement.

OK, this has been specified in controller sessions.

Display Controls


Tech_HMI_06 Display Capabilities

The HMI shall allow the user to configure the display capabilities (e.g. range scale selection, pan/zoom, brightness, map overlays).

Fn-21 [Op_If-1 MASPS §2.5.3]

a) OK. The displayed range can be increased and decreased in discrete steps independently for each window.

b) OK. The picture can be panned with the mouse independently for each window.

c) OK. The display brightness can be increased or decreased.

d) OK. Map layers can be turned on/off independently.

The installed system fulfils this requirement.

OK.

Tech_HMI_07 Manual Label Attribution

The surveillance service shall provide to the user the ability to manually put the right callsign in the label associated to a vehicle equipped with cooperative equipment.

Fn-22 [Op_Serv-12]

OK. The target is labelled and the line is removed from the list.

In the EMMA system, vehicle callsigns are those detected by the ADS-B sensors. Vehicle ADS-B transmitters allow the callsign to be pre-configured.

The capability is only available at service level, not for controllers (configuration parameter).

Design Principles Tech_HMI_08 Harmonisation

The HMI design should adhere to established principles for ATC equipment.

Fn-27 [Op_If-1 MASPS §2.5.3] Op_If-5 [ICAO §2.6.15.1] Fn-23 [Op_If-1 Op_If-5]

OK. The HMI design has been harmonised with the existing ATM HMI and agreed with the Prague ANS Provider (ANS CR).

The EMMA CWP HMI is harmonised according to DSNA specifications.

OK, the HMI is the result of controller requirements.

Tech_HMI_09 Situation Assessment

The presentation of information on the HMI should be clear and uncluttered to permit rapid situation assessment.

Fn_Perf-03 [Op_If-1 MASPS §2.5.3]

OK. The installed system fulfils this requirement.

OK.

Tech_HMI_10 The HMI should be designed such that its functionality can evolve as the A-SMGCS evolves.

Op_Ds-2 [MASPS §2.5.2] Op_Evo-3 [Op_Ds-2 MASPS §2.5.2]

OK. The installed system fulfils this requirement.

OK.

Tech_HMI_11 The HMI design should try to minimise the need for user interaction.

Fn_Perf-23 [Op_If-1 Op_If-2 MASPS §2.5.3] Fn-25 [Op_If-1 MASPS §2.5.3] Fn_Perf-21 [Op_Evo-3 ICAO §2.6.15.7 MASPS §2.5.3] Op_If-1 ICAO §3.5.13.8 Fn-26 [Op_If-1 MASPS §2.5.3]

OK. The HMI is simple and intuitive, requiring few input actions.

The installed system fulfils this requirement.

OK.

Traffic Situation Display Tech_HMI_12 Display Airport Traffic Situation

The HMI should provide, at each controller working position, a traffic situation display capable of presenting labelled target tracks superimposed on an airport and approach map.

Fn-18 ICAO §2.5.1.7

OK. The HMI provides a traffic situation display with labelled target tracks superimposed on multilayered maps covering the aerodrome and the approaches. In addition to the main window, two inset windows can be displayed.

The installed system fulfils this requirement. The approach map is shown on a separate HMI window.

OK.


Basic Function (Surveillance) Tech_Surv_01 Ground Traffic Information

The surveillance equipment should detect and continuously provide accurate positional reporting on aircraft, vehicles and obstacles:

a) whether moving or static,
b) within the aerodrome movement area,
c) within the runway strips, and
d) within any designated protected area as required by airport authorities

Fn_13 [ICAO §2.2.4] ICAO §2.5.1.1.a ICAO §2.5.1.1.c Op_Serv-06 ICAO §4.2.1

OK. This requirement implies that the verification test shall be performed:
- for moving and static mobiles
- on the whole manoeuvring area
Hypothesis Ver.sur.2 verified. Hypotheses Ver.sur.3 and Ver.sur.7 partially verified. Verification hypothesis: Ver.sur.8. Indicators: VE-2 and VE-9.

OK.

Tech_Surv_02 Approach Traffic Information

The surveillance equipment should detect and continuously provide accurate positional reporting on aircraft on approach,

a) out to a distance such that inbound aircraft can be integrated into the A-SMGCS operation, and

b) up to an altitude so as to cover missed approaches and low-level helicopter operations.

ICAO §2.5.1.4 Op_Serv-07 [ICAO §2.5.1.5]

OK. Surveillance of the approach is provided by the Eurocat 2000 system. There is no visible difference between the positions displayed on the Eurocat displays and those on the EMMA displays. NOTE: the required accuracy is not stated in the ICAO manual.

Hypothesis Ver.sur.2 verified. Indicator VE-1.

Multi-sensor fusion ensures continuity of track from approach to airport and vice versa, provided the target is detected by sensors (approach radars, SMR, MLAT, etc.).


Tech_Surv_03 Having detected a target in any of the areas defined above, the surveillance equipment should provide users with information on:

a) Target position
b) Target identity (for identifiable cooperative targets)
c) Target classification (for non-cooperative or unidentifiable targets)
d) Track history (at least the last three reported positions)

Op_Serv-04 ICAO §2.5.1.1.b

OK. Targets are displayed by:

a) A target symbol showing the target position
b) A leader line connecting the symbol to a label containing the target identity and other information
c) A raw video (SMR) image that permits the user to classify the target by its size
d) A selectable number of track-history dots showing the past positions of the target

Hypothesis Ver.hmi.1 verified. The installed system fulfils this requirement.

OK.
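The track-history item in Tech_Surv_03 d) maps naturally onto a bounded per-track buffer that retains only the most recent reported positions; a minimal sketch, in which the class and field names are illustrative and not taken from the EMMA SDF:

```python
from collections import deque

class Track:
    """Per-target track keeping a bounded history of reported positions."""

    def __init__(self, track_id: str, history_len: int = 3):
        self.track_id = track_id
        # deque(maxlen=...) silently discards the oldest entry, so only
        # the most recent `history_len` positions are ever retained.
        self.history = deque(maxlen=history_len)

    def update(self, x: float, y: float) -> None:
        """Record a new reported position (local Cartesian metres)."""
        self.history.append((x, y))

t = Track("OK123")
for pos in [(0, 0), (5, 0), (10, 1), (15, 3)]:
    t.update(*pos)
print(list(t.history))  # [(5, 0), (10, 1), (15, 3)]
```

With `history_len=3` the buffer always holds at most the last three reported positions, matching the minimum stated in the requirement.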

Tech_Surv_04 Objects detected on the movement area should be classified according to size.

MASPS §3.2.1.1 OK. The SMR raw video image permits the user to classify the target by its size. For large and medium aircraft, the shape is discernible.

The size of objects may be discernible from the overlaid raw video.

OK, the SMR raw video supports this objective.

Tech_Surv_05 Mobile Velocity

The surveillance equipment should provide users with information on: target speed and direction of movement.

Fn-53 Op_Serv-30 Fn-11

OK. The length of the PTL indicates the speed of the target; the direction of the PTL shows the direction of movement. Target speed in knots is given in the target information window.

The installed system does not display this information.

OK, the speed magnitude is presented, while direction is temporarily not considered.

Surveillance Equipment Tech_Surv_06 The surveillance equipment shall comprise multiple sensor systems and data fusion.

ICAO §3.4.1.3 OK.

Existing sensor systems at Prague are used for EMMA.

The installed system fulfils this requirement.

OK.

Tech_Surv_07 The surveillance system should be capable of expansion to accept and integrate data from other surveillance sensor sources in the future.

ICAO §3.4.1.3 OK. The system can accept data from multiple sensors.

The installed system fulfils this requirement.

OK.


Tech_Surv_08 Mobile Position

The surveillance equipment shall include at least one non-cooperative surveillance sensor system to detect and determine the position of mobiles and obstacles on the movement area of the airport. Currently, SMR is preferred as a non-cooperative sensor.

Fn-06

OK. The existing SMR at Prague will provide data for EMMA. In addition, a gap-filler system is used to provide coverage in SMR blind spots.

The installed system fulfils this requirement.

OK.

Tech_Surv_09 SMR systems should comply with the minimum operational requirements given in ED-116 [19].

Fn-06 OK. The technical tests of surveillance capabilities aim at assessing the performance of the A-SMGCS and each sensor (including SMR).

OK.

Tech_Surv_10 Mobile Position

The surveillance equipment shall include at least one cooperative surveillance sensor system to detect and determine the position of cooperative mobiles on the movement area of the airport. Currently, MLAT is preferred as a cooperative sensor.

Fn-07 OK. The existing MLAT system at Prague will provide data for EMMA. An ADS-B receiver capability will also be provided.

The installed system fulfils this requirement.

OK.

Tech_Surv_11 The MLAT system should comply with the minimum operational requirements given in ED-117 [20].

Fn-07 OK. The technical tests of surveillance capabilities aim at assessing the performance of the A-SMGCS and each sensor (including MLAT).

OK.

Tech_Surv_12 Mobile Identity

The MLAT system shall also determine the identity of cooperative mobiles on the movement area of the airport.

Fn-08

OK. The MLAT system provides the Mode S code of detected aircraft and can interrogate to obtain the Mode A code. The SDS uses the Mode A code to obtain the aircraft identification from flight plan data.

Hypothesis Ver.sur.7 partially verified. Indicators: VE-2 and VE-9. The installed system has been configured to detect and identify mobiles on the manoeuvring area.

OK.

Tech_Surv_13 Aircraft Position

The SDF shall connect to the airport’s approach RDPS to obtain the positions of airborne aircraft in the required areas.

Fn-09 OK. The installed system fulfils this requirement.

OK.


Tech_Surv_14 Aircraft Identity

The approach RDPS shall provide the identity of airborne aircraft.

Fn-10 OK. The aircraft are presented with callsign or Mode A code if the callsign is not available.

Hypothesis Ver.sur.7 partially verified. Indicators: VE-2 and VE-9.

OK.

Target Reports Tech_Surv_15 Each surveillance sensor system shall transmit continuous target position reports to the SDF.

MASPS §2.5.1.1 OK. Hypothesis Ver.sur.5 verified.

Indicator: VE-11.

OK.

Tech_Surv_16 As a minimum, each target report from a surveillance sensor system should include the following information:

• Data Source Identifier
• Target Report Descriptor
• Target Position
• Time of Measurement

If available, the following additional information should be provided:

• Target Identifier (e.g. Callsign or SSR Mode A code)
• Target Size Classifier
• Measured Height
• Estimated Accuracy of Position

Aircraft shall be identified and labelled based on the Mode A code set on the aircraft’s secondary radar transponder.

MASPS §2.5.1.1 OK. The installed system fulfils this requirement. Aircraft can also be identified by the callsign directly reported by transponders.

OK.

Tech_Surv_17 Target position reports shall use a common refer-ence system, WGS-84 datum. Target positions may be in LAT/LON or Cartesian coordinates re-ferred to a common reference point on the aero-drome surface.

MASPS §2.5.1.1 OK. The installed system fulfils this requirement.

OK, target position is in Cartesian co-ordinates referred to a common reference point.
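The two conventions allowed by Tech_Surv_17 (WGS-84 LAT/LON versus local Cartesian relative to a common aerodrome reference point) can be related with a flat-earth projection that is adequate over aerodrome-scale distances; a sketch, in which the spherical-earth constant and the sample coordinates are illustrative and not the configuration of any EMMA site:

```python
import math

# Mean Earth radius in metres (spherical approximation -- adequate
# over aerodrome-scale distances of a few kilometres).
EARTH_RADIUS_M = 6_371_000.0

def wgs84_to_local_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Project a WGS-84 LAT/LON to east/north metres relative to a
    common aerodrome reference point (equirectangular approximation)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = (lon - ref_lon) * math.cos(ref_lat) * EARTH_RADIUS_M
    north = (lat - ref_lat) * EARTH_RADIUS_M
    return east, north

# A target 0.01 degrees north of a reference point at 50 deg N:
e, n = wgs84_to_local_xy(50.01, 14.25, 50.0, 14.25)
print(round(e, 1), round(n, 1))  # ~0.0 m east, ~1112 m north
```

For higher-accuracy conversions an operational system would use a proper geodetic library and the WGS-84 ellipsoid rather than this spherical shortcut.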


Tech_Surv_18 Target position reports should use a common reference point on aircraft and vehicles. This point has been defined as the geometrical mid-point on the longitudinal axis of the target.

MASPS §3.2.1.2 The reference point is used as the datum for position accuracy measurements. See the performance test section.

The installed system does not fulfil this requirement. For non-cooperative targets the reported position is referred to the scatter plot barycentre. For cooperative ADS-B targets the reported position is the one provided by the on-board equipment. For cooperative targets detected by MLAT the reported position is the one detected by the system from the transmitting antenna.

A common reference point shall be used for accuracy tests. Concerning the target positions considered in the data fusion process and presented to the controller, different positions are measured by different sensors:

- SMR gives the barycentre position
- MLAT gives the 1090 MHz antenna position on the aircraft/vehicle
- ADS-B gives the GPS antenna position on the aircraft/vehicle

Tech_Surv_19 The SDF shall perform correlation of target report data from the sensors and track target movements in order to determine the best estimate of the target position at each update.

MASPS §3.2.1.2 OK. NOTE: Probability of Continuous Track (PCT) will be measured by the MOGADOR tool in SP6.

The installed system fulfils this requirement.

OK.

Tech_Surv_20 Provide a seamless transition

The SDF should provide a seamless transition between the airborne track for an aircraft and its ground track.

Op_Serv-13 [Fn-30 Fn-54 ICAO §2.5.1.6]

OK. 10 consecutive departures and 10 consecutive arrivals observed.

Hypothesis Ver.sur.3 partially verified. Indicator: VE-5.

OK.

Other Information about Traffic Tech_Surv_21 Other Information about Traffic

The SDF should obtain other information about the traffic through appropriate interfaces to other systems.

Fn-12 [MASPS §2.5.1.1]

OK. The installed system fulfils this requirement.

OK.


Tech_Surv_22

For each aircraft the information required will include:

• ATC Callsign
• Mode A code
• Mode S code
• Departure Airport
• Destination Airport
• Estimated Time of Arrival / Departure
• Stand identifier

Additional information that may be required includes:

• Aircraft type
• Wake Vortex Category
• Slot time (if applicable)
• SID/STAR
• Stand status (occupied/free)
• Assigned runway
• Estimated and Actual Off Block Time

Op_Serv-05 OK. At Prague the following information is required:

• ATC Callsign
• Mode A Code
• Mode S Address
• Aircraft Type
• Wake Vortex Category
• Departure/Destination Airport
• Estimated Time of Arrival/Departure
• Slot Time (when applicable)
• Stand Identifier
• SID/STAR (as applicable)

The installed system fulfils this requirement.

OK.

Tech_Surv_23 For each vehicle the information required may include:

• ATC Callsign
• Transponder code (Mode S or other type)
• Vehicle type
• Vehicle fleet identifier

Op_Serv-05 OK. The installed system fulfils this requirement.

The requirement is satisfied except for the transponder code and vehicle type. The transponder code is only a means to correlate a flight plan with a track. The vehicle surveillance data enables the ATCO to identify the vehicle fleet by a letter contained in the callsign (e.g. ‘M’ for management vehicles).

Tech_Surv_24 For each tracked target, the SDF should extend the target report data to include the other relevant information available.

OK. The installed system fulfils this requirement.

OK.

Basic Function (Control) Tech_Cont_01 The Control function should continuously process the target reports from the SDF to compare the traffic situation in real time with a set of predefined alert situations.

OK. The installed system fulfils this requirement.

OK (ref. SCA function).


Tech_Cont_02 The Control function should output an alert report to clients whenever a predefined alert situation occurs.

Op_Serv-22 [ICAO §2.5.4] Op_Serv-15

OK. Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the surface conflict alert (SCA) system at Toulouse airport. The installed system fulfils this requirement.

OK (ref. SCA function).

Alert Display Tech_HMI_13 Display Alerts

On receipt of an alert report, the HMI should display alerts at the appropriate controller working positions.

Fn-20 [Op_If-1 Op_Serv-22 MASPS §2.5.3]

OK. The alert is indicated by the label(s) of involved target(s) turning amber or red, depending on the type and severity of the alert.

The installed system does not fulfil this requirement. Alert reports are displayed at all controller positions. The appropriate controller can acknowledge the alert pertaining to him; however, this function is not configured in the current EMMA system.

Partially compliant. Warning/alarm alerts are shown on all CWPs (which have different rules). Displaying alerts at the appropriate positions only (APP CWPs, GND CWPs, etc.) is possible but not provided in the EMMA1 test bed.

Tech_HMI_14 Alert Continuity

The alert should be displayed continuously as long as the alert situation persists.

Fn_Perf-19 [Op_Perf-19 ICAO §3.4.5.14]

OK. The installed system fulfils this requirement.

OK.

Tech_HMI_15 Alert Hierarchy

In the event of multiple simultaneous alerts, it may be appropriate to prioritise them in some way, e.g. by listing or colour coding. In any case, a Stage 1 alert will always have lower priority than a Stage 2 alert.

Fn-37 [Op_Serv-27]

Stage 1 alert is colour-coded amber; Stage 2 is colour-coded red. A window can be configured to pop up showing a list of active alerts, sorted by type.

The EMMA system alert display is configured:

• To report Stage 2 alerts (red colour-coded) on a dedicated window and on the traffic situation display;
• To report Stage 1 alerts (yellow colour-coded) only on the traffic situation display.

SCA has ‘alarms’ and ‘warnings’ displayed in different ways. Alarms have greater priority.


Alert Situations Tech_Cont_03 Conflict on Runway

The Control function should detect any predefined conflict situation on the runway and generate a conflict alert report. The conflict situations should be configurable from the configuration database.

Op_Serv-16 [Fn_03] ICAO §2.5.4.3.a ICAO §2.5.4.2.d

OK. OK.

Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the surface conflict alert (SCA) system at Toulouse airport. The installed system fulfils this requirement.

OK.

Tech_Cont_04 Runway Incursion

The Control function should detect whenever a target enters any predefined runway strip area and generate an incursion alert report. The runway strip boundaries should be configurable from the configuration database.

Fn_03 [ICAO §2.5.4.3.d] ICAO §2.5.1.7

OK. An alert is given at the CWP whenever a target enters one of these areas.

Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the surface conflict alert (SCA) system at Toulouse airport. The installed system fulfils this requirement.

OK.
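The incursion alerts of Tech_Cont_04 to Tech_Cont_06 all reduce to testing whether a target position lies inside a configurable area loaded from the configuration database; a ray-casting point-in-polygon sketch, where the rectangular strip polygon is illustrative and not a real runway strip definition:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; `polygon` is a list of (x, y)
    vertices of a runway-strip, protected, or restricted area."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray cast to the right of (x, y) cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Illustrative rectangular runway strip in local Cartesian metres.
strip = [(0, -150), (3000, -150), (3000, 150), (0, 150)]
print(point_in_polygon(1500, 0, strip))    # True  -> incursion alert
print(point_in_polygon(1500, 400, strip))  # False -> no alert
```

Keeping the area vertices in the configuration database, as the requirements state, means new protected or restricted areas can be added without code changes.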

Tech_Cont_05 Protected Area Incursion

The Control function should detect whenever a target enters any predefined protected area and generate a protected area alert report. The protected areas should be configurable from the configuration database.

ICAO §2.5.4.3.b OK. An alert is given at the CWP whenever a target enters the area.

Hypothesis Ver.con.1 verified as part of D4.2.1 site accep-tance procedure for the sur-face conflict alert (SCA) sys-tem at Toulouse airport. The installed system fulfils this requirement.

OK.


Tech_Cont_06 Restricted Area Incursion

The Control function should detect whenever an aircraft target enters any predefined restricted area and generate a restricted area alert report. Targets other than aircraft targets should not trigger the alert. The restricted areas should be configurable from the configuration database.

Op_Serv-17 [Fn-05]

OK. An alert is given at the CWP whenever an aircraft target enters the area, but not when vehicles or unclassified targets enter the area.

Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the surface conflict alert (SCA) system at Toulouse airport. The rules for generating alerts for unauthorised access into restricted areas are fully configurable. The current configuration assumes that an unclassified target shall be treated as an aircraft (conservative approach). The installed system fulfils this requirement.

OK.

Tech_Cont_07 Route Deviation

Once a route has been assigned to a mobile, and the mobile has started on that route, the Control function should detect when the target begins to deviate from that route by more than a predefined distance and generate a deviation alert report. The deviation limit should be configurable from the configuration database.

ICAO §2.5.4.3.c ICAO §2.5.4.3.e

Deviation alerting is an EMMA2 function – not included in this SAT.

Not within the scope of A-SMGCS levels 1 & 2.

The conformance monitoring module is not provided in EMMA1.

Stages of Alert Tech_Cont_08 Conflict alerts should be configurable in two stages (1 and 2) according to the severity of the situation. Stage 2 is more severe than Stage 1. Incursion alerts, restricted area alerts and deviation alerts should be configurable as either Stage 1 or Stage 2 according to local requirements at the airport.

Op_Serv-27 OK. Both Stage 1 (amber) and Stage 2 (red) alerts are given at the CWP, depending on the severity defined for each situation.

Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the surface conflict alert (SCA) system at Toulouse airport. The installed system fulfils this requirement.

OK (ref. to req. ‘Tech_HMI_15 Alert Hierarchy’).


Alert Reports Tech_Cont_09 As a minimum, each alert report transmitted from the Control function to clients should include the following information:

• Data Source Identifier
• Alert Report Identifier
• Type of Alert
• Alert Stage
• Time of Alert
• Identity of target(s) in alert situation

Fn-36 [MASPS §2.5.1.2]

OK. The target report contains the required data.

Hypothesis Ver.hmi.2 verified. The installed system fulfils this requirement.

OK.
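The minimum alert-report content of Tech_Cont_09 can be captured in a small record type; a sketch whose field names follow the bullet list of the requirement, while the alert-type string, stage enumeration, and sample values are illustrative:

```python
from dataclasses import dataclass, asdict
from enum import Enum

class AlertStage(Enum):
    STAGE_1 = 1  # lower severity (amber)
    STAGE_2 = 2  # higher severity (red)

@dataclass(frozen=True)
class AlertReport:
    """Minimum alert-report content per Tech_Cont_09."""
    data_source_id: str
    alert_report_id: int
    alert_type: str        # e.g. "RUNWAY_INCURSION" (illustrative value)
    stage: AlertStage
    time_of_alert: float   # seconds since epoch
    target_ids: tuple      # identities of the target(s) in the alert

report = AlertReport("SDF-1", 42, "RUNWAY_INCURSION",
                     AlertStage.STAGE_2, 1183111200.0, ("OK123",))
print(asdict(report)["alert_report_id"])  # 42
```

A frozen dataclass keeps each report immutable once emitted, which suits the requirement that a fresh report be transmitted at every position update while the alert situation persists.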

Tech_Cont_10 An alert report should be transmitted for each target position update for as long as the alert situation persists.

Fn-36 [MASPS §2.5.1.2]

OK. Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the surface conflict alert (SCA) system at Toulouse airport. The installed system fulfils this requirement.

OK.

Table 8-1: Functional Tests for Level II A-SMGCS


Appendix A.2 Performance Tests

This section describes the tests performed to verify compliance of the EMMA Level II A-SMGCS with the performance requirements listed in [4]. In the following tables, the comment ‘FAT’ in the Test Procedure column means that the requirement is only tested during factory acceptance, not at SAT; the comment ‘V&V’ means that the requirement is tested during operational verification and validation (SP6), not at SAT.

EMMA REQ ID | Requirement Description | Ref | Result and Comments (PRG / TLS / MXP)

Capacity Tech_Gen_42 The equipment shall have sufficient capacity to process data for 300 targets simultaneously.

Op_Range-2

OK. The system detects and tracks all targets.

Not tested. The operational system has this capability. Performance analysis is not the focus of EMMA1, which foresees experimental systems (test bed).

Tech_Gen_43 When first installed, the processing equipment shall have a margin of spare capacity of at least 30%.

MASPS §3.1.2 Not tested at SAT. Not tested. See above.

Co-ordinate System Tech_Gen_44 All geographical information shall be referenced to a common reference point on the aerodrome. This point shall be referenced in WGS-84.

ICAO §2.6.6.1 OK. The display shows the aerodrome map with overlaid targets and a cursor. The LAT/LON position is continuously displayed together with the bearing and range information next to the cursor.

ARP: N 50º 06.05’ E 14º 15.60’
RWY 06: N 50º 06.011’ E 14º 13.578’
RWY 24: N 50º 06.957’ E 14º 16.402’
RWY 13: N 50º 06.481’ E 14º 14.722’
RWY 31: N 50º 05.428’ E 14º 16.900’
SMR: N 50° 06.372’ E 14° 06.022’

The installed system fulfils this requirement. The reference point is the Toulouse ARP.

OK.


Tech_Gen_45 The reference point for target position data shall be the mid-point of the target’s longitudinal axis.

ICAO §2.6.6.2 This is a definition, not a testable requirement. Accuracy measurements are referred to this reference point.

The installed system does not fulfil this requirement. See Tech_Surv_18.

Ref. to req. ‘Tech_Surv_18’.

Coverage (Surveillance)
Tech_Surv_25 The A-SMGCS equipment should provide surveillance coverage throughout the movement area up to a height of at least 200 feet above the aerodrome surface, and on the approaches to each runway out to a distance of at least 10 NM.

ICAO §3.4.1.5 [§3.4.2.2 §3.4.3.6] ICAO §4.1.1.4

OK. Good visibility and no precipitation at time of test. Test results are presented in D6.3.1 [12].

Hypothesis Ver.sur.2 verified. Indicator: VE-1.

OK.

Accuracy and Resolution (Surveillance) Tech_Surv_26 Reported Position Accuracy

The reported position accuracy of the surveillance data transmitted from the SDF to clients should be 7.5m or better at a confidence level of 95%.

Op_Perf-05 [Op_Perf-15 Fn_Perf-01 MASPS §3.2.3 ICAO §4.2.3]

OK. Good visibility and no precipitation at time of test. Test results are presented in D6.3.1 [12].

Hypothesis Ver.sur.3 partially verified. Indicator: VE-5.

OK.
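As an illustration of how such a percentile criterion can be evaluated, the sketch below checks a list of position errors against the 7.5 m / 95% bound of Tech_Surv_26. This is not the EMMA test procedure (the actual measurements are reported in D6.3.1 [12]); the function name and the sample values are invented:

```python
# Minimal sketch: does at least the required fraction of position-error
# samples (in metres) fall within the accuracy limit?
def meets_accuracy_requirement(errors_m, limit_m=7.5, confidence=0.95):
    """True if at least `confidence` of the samples are within `limit_m`."""
    if not errors_m:
        raise ValueError("no error samples")
    within = sum(1 for e in errors_m if e <= limit_m)
    return within / len(errors_m) >= confidence

# 96 of 100 samples within 7.5 m -> 96% >= 95%, requirement met.
print(meets_accuracy_requirement([5.0] * 96 + [9.0] * 4))  # True
```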

Tech_Surv_27 Reported Position Resolution

The resolution of the position data in a target report should be better than 1 m.

Op_Perf-06 [Fn_Perf-02 MASPS §3.2.3]

OK. The smallest observable change was 1 pixel, corresponding to 0.1m on the 50m-range scale.

Hypothesis Ver.sur.3 partially verified. Indicator: VE-5.

OK.

Tech_Surv_28 Reported Speed Accuracy

The accuracy of the target speed data transmitted from the SDF to clients should be better than 5m/s at a confidence level of 95%.

Op_Perf-16 [Fn_Perf-04 MASPS §3.2.3 ICAO §4.1.1.8 §4.1.1.10]

OK. Good visibility and no precipitation at time of test. Test results are presented in D6.3.1 [12].

Not tested. Not tested.

Tech_Surv_29 Reported Direction Accuracy

The accuracy of the direction of movement data transmitted from the SDF to clients should be better than 10° at a confidence level of 95%.

Op_Perf-16 [Fn_Perf-04 MASPS §3.2.3 ICAO §4.1.1.8 §4.1.1.10]

OK. Good visibility and no precipitation at time of test. Test results are presented in D6.3.1 [12].

Not tested. Not tested.

Tech_Surv_30 Reported Speed Resolution

The resolution of the speed data in a target report should be better than 1 m/s.

Op_Perf-17 [Fn_Perf-05 MASPS §3.2.3]

OK. Not tested OK.


Tech_Surv_31 Reported Direction Resolution

The resolution of the direction of movement data in a target report should be better than 1.5°.

Op_Perf-17 [Fn_Perf-05 MASPS §3.2.3]

OK. Not tested. OK.

Tech_Surv_32 Aircraft Level Accuracy

For an airborne aircraft, the accuracy of the target measured-height transmitted from the SDF to clients should be 10 m or better at a confidence level of 95%.

Op_Perf-07 [Fn_Perf-06 ICAO §4.2.3]

Requires specially equipped test aircraft. NOTE: Could not be tested in SP6.

Not tested. Not tested.

Velocity Range
Tech_Surv_33 The surveillance equipment should be capable of detecting and tracking targets within the following velocity ranges:

• 0 to 250 kt for aircraft on final approach, missed approach and runways
• 0 to 80 kt for aircraft on runway exits
• 0 to 80 kt for vehicles on the movement area
• 0 to 50 kt for aircraft on straight taxiways
• 0 to 20 kt for aircraft on taxiway curves
• 0 to 10 kt for aircraft and vehicles on stands and stand taxi lanes
• Any direction of movement

Op_Range-5 [ICAO §2.6.4] Op_Range-6 [ICAO §4.1.1.8]

OK. All targets are detected and tracked.

Verification hypothesis Ver.gen.3.

OK.

Update Rate Tech_Surv_34 Update Rate

An updated target report shall be transmitted from the SDF to the clients at least once per second for each target.

Fn_Perf-08 [MASPS §3.2.3 ICAO §4.2.4] Op_Serv-03

OK. Test results are presented in D6.3.1 [12].

Hypothesis Ver.sur.5 verified. Indicator: VE-11.

OK.
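A simple way to check the once-per-second criterion of Tech_Surv_34 against a recording is to look at the largest gap between consecutive reports of each target. The sketch below, with invented timestamps, shows the idea (the real analysis is in D6.3.1 [12]):

```python
# Minimal sketch: largest interval (seconds) between consecutive target
# reports of a single track; the requirement is that it never exceeds 1 s.
def max_update_interval(timestamps_s):
    ts = sorted(timestamps_s)
    return max((b - a for a, b in zip(ts, ts[1:])), default=0.0)

reports = [0.0, 0.9, 1.7, 2.6]  # report times for one target, seconds
print(max_update_interval(reports) <= 1.0)  # True
```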

Data Integrity (Surveillance) Tech_Surv_35 Probability of Detection

The probability that an actual aircraft, vehicle or object is detected and reported at the output of the SDF should be 99.9% at minimum.

Fn_Perf-09 [Op_Perf-01 MASPS §3.2.3]

OK. Good visibility and no precipitation at time of test. NOTE: This parameter is also measured long-term using the MOGADOR tool in SP6. Test results are presented in D6.3.1 [12].

Hypothesis Ver.sur.7 partially verified. Indicator: VE-2.

Refer to test results.


Tech_Surv_36 Probability of False Detection

The probability that anything other than an actual aircraft, vehicle or object is detected and reported at the output of the SDF should not exceed 10E-3 per reported target.

Fn_Perf-10 [Op_Perf-02 MASPS §3.2.3]

OK. Good visibility and no precipitation at time of test. NOTE: This parameter is also measured long-term using the MOGADOR tool in SP6. Test results are presented in D6.3.1 [12].

Verification hypothesis: Ver.sur.8. Indicators: VE-3.

OK.
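Once reported targets have been matched against reference data (a task performed long-term by a tool such as MOGADOR), the probability-of-detection and false-detection figures in Tech_Surv_35 and Tech_Surv_36 reduce to simple ratios. The sketch below, with invented counts, shows only that final step:

```python
# Minimal sketch: derive PD and PFD from match counts.
# detected_true / total_true     -> probability of detection (>= 99.9%)
# false_reports / total_reports  -> probability of false detection (<= 10E-3)
def pd_pfd(detected_true, total_true, false_reports, total_reports):
    return detected_true / total_true, false_reports / total_reports

pd, pfd = pd_pfd(detected_true=9995, total_true=10000,
                 false_reports=5, total_reports=10000)
print(pd >= 0.999, pfd <= 1e-3)  # True True
```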

Tech_Surv_37 Probability of Identification

The probability that the correct identity of a cooperative aircraft, vehicle or object is reported at the output of the SDF should be 99.9% at minimum.

Fn_Perf-11 [Op_Perf-03 MASPS §3.2.3]

OK. Good visibility and no precipitation at time of test. NOTE: This parameter is also measured long-term using the MOGADOR tool in SP6. Test results are presented in D6.3.1 [12].

Hypothesis Ver.sur.7 partially verified. Indicator: VE-9.

OK.

Tech_Surv_38 Probability of False Identification

The probability that the identity reported at the output of the SDF is not the correct identity of the actual aircraft, vehicle or object, should not exceed 10E-3 per reported target.

Fn_Perf-12 [Op_Perf-04 MASPS §3.2.3]

OK. Good visibility and no precipitation at time of test. NOTE: This parameter is also measured long-term using the MOGADOR tool in SP6. Test results are presented in D6.3.1 [12].

Verification hypothesis: Ver.sur.8. Indicators: VE-10.

OK.

Data Integrity (Alert) Tech_Cont_11 Probability of Detection of Alert Situation

The probability of detection of an alert situation should be greater than 99.9%.

Fn_Perf-25 [ICAO §4.5.1]

OK. Test results are presented in D6.3.1 [12].

Verification hypothesis Ver.con.3. Indicator: VE-12.

OK.

Tech_Cont_12 Probability of False Alert

The probability of false alert should be less than 10E-3.

Fn_Perf-26 [MASPS §3.3.3 ICAO §4.5.1] Op_Perf-20 Op_Perf-21

Insufficient data. Verification hypothesis Ver.con.4. Indicator: VE-13.

OK.

Timeliness (Alert)


Tech_Cont_13 Alert Response Time

Having received the target report from the surveillance element, the time taken for the Control function to detect and report any alert situation should be not more than 0.5 s.

Fn_Perf-27 [MASPS §3.3.2.4 ICAO §4.5.2] Op_Perf-18 [ICAO §2.5.4.4]

OK. Test results are presented in D6.3.1 [12].

Hypothesis Ver.con.5 verified as part of D4.2.1 to range between 0 ms and 2 ms (depending on the number of conflicts). Indicator: VE-14.

OK.

Accuracy and Resolution (HMI) Tech_HMI_16 Map Accuracy

The accuracy of all map information to be presented on the HMI display(s) should be 1m or better.

Fn_Perf-14 [Op_Perf-05]

OK.
ARP: N 50° 06.050’ E 14° 15.600’
RWY 06: N 50° 06.011’ E 14° 13.578’
RWY 24: N 50° 06.957’ E 14° 16.402’
RWY 13: N 50° 06.481’ E 14° 14.722’
RWY 31: N 50° 05.428’ E 14° 16.900’
SMR: N 50° 06.372’ E 14° 06.022’

Not tested. OK.

Tech_HMI_17 Display Resolution

The resolution of the HMI displays should be sufficiently high that quantisation errors are negligible. As a minimum, the display resolution should be 1024 lines of 1280 pixels.

Fn_Perf-15 [Op_Perf-05 Op_Perf-06 MASPS §3.6.1.1]

OK. The display resolution is 1024 by 1280.

The display resolution is 1024 by 1280.

OK.

Tech_HMI_18 Position Registration Accuracy

The position registration accuracy of all information presented on the HMI display(s) should be one pixel.

Fn_Perf-16 [Op_Perf-05 Op_Perf-09 MASPS §3.6.1.2] Fn-19 [Op_Serv-11]

Not tested at SAT. Not tested. Not tested.

Timeliness Tech_HMI_19 Target Display Latency

The Target Display Latency should not exceed 250 ms.

Fn_Perf-17 [Op_Perf-05 Op_Perf-09 MASPS §3.6.1.3]

OK. Not tested. OK.


Tech_HMI_20 Information Display Latency

The Information Display Latency should not exceed 250 ms for safety-critical information. For information that is not safety-critical, this value can be relaxed.

Fn_Perf-18 [Op_Perf-09 MASPS §3.6.1.4]

Not tested at SAT. Not tested. Not tested.

Tech_HMI_21 Response Time to Operator Input

The Response Time to Operator Input shall not exceed 250 ms.

Fn_Perf-20 [Op_If-1 MASPS §3.6.1.5]

OK. The window opens (almost) instantaneously. OK. The response is (almost) instantaneous.

Not tested. Not tested.

Table 8-2: Performance Tests for Level II A-SMGCS


Appendix A.3 Interface Tests

This section describes the tests performed to verify compliance of the EMMA Level II A-SMGCS with the interface requirements listed in [4]. In the following tables, the comment ‘FAT’ means that the requirement is only tested during factory acceptance, not at SAT.

Columns: EMMA REQ ID, Requirement Description, Reference, Result and Comments (PRG, TLS, MXP)

Interface Principles
Tech_Gen_46 Wherever possible and practicable, the A-SMGCS should utilise standard data communications interface protocols and data formats.

Op_Ds-3 [MASPS §1.8.4]

OK. The installed system fulfils this requirement.

OK.

Tech_Gen_47 The system software applications should use extensive client-server architecture and inter-process communication.

Op_Ds-4 [ICAO §2.6.16.2]

OK. The installed system fulfils this requirement.

OK.

Tech_Gen_48 The inter-process communication level should support process distribution via LAN, using TCP/IP.

Op_Ds-4 [ICAO §2.6.16.2]

OK. The installed system fulfils this requirement.

OK.

Time Synchronisation Interface Tech_Gen_49

The A-SMGCS should be synchronised with an airport central time source so that all date and time indications used within the system agree with the reference time. The synchronisation standard used should be the Network Time Protocol (NTP).

OK. The installed system fulfils this requirement.

Partially compliant. The EMMA1 test bed uses a UTC reference clock connected via serial line to one node of the test bed. The operational system is ready to accept UTC reference clock information via LAN using the standard NTP protocol.
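The NTP synchronisation required by Tech_Gen_49 rests on the protocol's standard clock-offset estimate: with client send/receive times t0, t3 and server receive/send times t1, t2, the offset is ((t1 - t0) + (t2 - t3)) / 2. A small sketch of that computation, with invented timestamps:

```python
# Standard NTP clock-offset estimate from the four protocol timestamps
# (client send t0, server receive t1, server send t2, client receive t3).
def ntp_offset(t0, t1, t2, t3):
    return ((t1 - t0) + (t2 - t3)) / 2.0

# Client clock 1.375 s behind the server, symmetric 0.125 s one-way delay:
print(ntp_offset(10.0, 11.5, 11.5, 10.25))  # 1.375
```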

Data Sources
Tech_Surv_39 Cooperating mobiles shall be adequately equipped to communicate their position, identity, and other relevant data to the A-SMGCS.

Op_If-3 ICAO §2.6.3.2

OK. At the time of testing 40 vehicles were equipped with Mode S squitter beacons.

The installed system fulfils this requirement for aircraft and 10 vehicles (SAT).

5 vehicles equipped with Mode S squitter beacons (detected by MAL subsystem) and ADS-B over WLAN (detected by proper legacy system).


Tech_Surv_40 The equipment should interface to a flight data processing system of the ATM at the airport. In some cases, it may also be necessary to interface to a separate code-callsign database.

Op_If-4 [ICAO §2.6.16.1]

OK. NOTE: Only flight data for aircraft due to arrive or depart within the next 40 minutes are displayed.

The installed system does not fulfil this requirement: the issue will be solved in EMMA2. The current EMMA system is connected to a test network which does not provide flight plans continuously.

OK.

Tech_Surv_41 To receive data on airborne aircraft in the vicinity, the A-SMGCS surveillance equipment should be interfaced to the approach surveillance system at the airport.

Op_If-4 [ICAO §2.6.16.1]

OK. The installed system fulfils this requirement.

OK.

Tech_Surv_42 The equipment should interface to a processing system of the airport to obtain stand information regarding aircraft about to land or depart.

Op_If-4 [ICAO §2.6.16.1]

OK. Not within the scope of A-SMGCS levels 1 & 2.

Not included in EMMA1 (interoperability with Airport Operator database is proposed in EMMA2).

Tech_Surv_43 If necessary to achieve the required performance, the equipment should interface to a MET system to obtain meteorological data.

Op_If-4 [ICAO §2.6.16.1]

A MET data interface is not necessary for this phase of EMMA.

Not within the scope of A-SMGCS levels 1 & 2. Manual switching of SCA depending on visibility conditions.

Not included in EMMA1 test bed.

Tech_Surv_44 To obtain information about the status of stop bar lights, the equipment should interface to the aerodrome ground lighting system.

Op_If-4 [ICAO §2.6.16.1]

OK.

Not within the scope of A-SMGCS levels 1 & 2.

Not included in EMMA1 test bed.

Tech_Surv_45 The equipment should be capable of interfacing to any other system specified by the local authority.

Op_If-4 [ICAO §2.6.16.1]

OK. The system is not installed on the operational network.

OK, but for safety reasons and to avoid interference with the operational system, the integration of the EMMA test bed with the operational context is limited.

Data Communication Protocols
Tech_Surv_46 The Surveillance function should comply with the general interface requirements given in section 3.3. OK. See general requirements. OK.

Tech_Cont_14

The Control function should comply with the general interface requirements given in section 3.3.

OK. See general interface requirements.

OK.


Data Formats
Tech_Surv_47 Target position reports output from surveillance sensor systems should be in the ASTERIX data format.

MASPS §2.5.1.1 OK. The installed system fulfils this requirement.

OK, if requested.

Tech_Surv_48 Data format

The SDF shall be able to receive input target report data from the SMR Extractor and from the GFS in the ASTERIX CAT010 data format.

Fn_If-1 [MASPS §2.5.1.1 ICAO §2.6.16.2]

OK. Specified in Annex E and Annex F of [5].

The installed system fulfils this requirement.

OK, if requested.

Tech_Surv_49 Data format

The SDF shall be able to receive input target report data from MLAT in the ASTERIX CAT010 data format.

Fn_If-2 [MASPS §2.5.1.1 ICAO §2.6.16.2]

OK. Specified in Annex C of [5].

The installed system fulfils this requirement.

OK, if requested.

Tech_Surv_50 Data format

The SDF shall be able to receive input target report data from approach radar systems in the ASTERIX CAT001, CAT034, and CAT048 data formats, in accordance with local requirements.

Fn_If-3 [MASPS §2.5.1.1 ICAO §2.6.16.2]

OK. Specified in Annex B of [5].

The installed system fulfils this requirement. It is interfaced with the DACOTA system.

OK, if requested.

Tech_Surv_51 Data format

The SDF shall be able to output target report data in the ASTERIX CAT011 data format.

Op_Ds-4 [Fn_If-4 ICAO §2.6.16.2]

OK. Specified in Annex J of [5].

The installed system does not fulfil this requirement: the output of the SDF is in the ASTERIX CAT062 data format. CAT011 has been superseded by CAT062.

OK, if requested.
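The SDF input formats listed in Tech_Surv_48 to Tech_Surv_50 amount to a dispatch of incoming records by ASTERIX category. The sketch below illustrates this; the channel names are invented, and only the category numbers come from the requirement text:

```python
# Illustrative mapping of ASTERIX input categories to the SDF input
# channels named in the requirements (channel labels are invented).
INPUT_CHANNELS = {
    10: "surface sensors",   # CAT010: SMR extractor, GFS, MLAT
    1:  "approach radar",    # CAT001 target reports
    34: "approach radar",    # CAT034 service messages
    48: "approach radar",    # CAT048 target reports
}

def route(category):
    """Return the input channel expected to carry this ASTERIX category."""
    try:
        return INPUT_CHANNELS[category]
    except KeyError:
        raise ValueError(f"unsupported ASTERIX input category: {category}")

print(route(10))  # surface sensors
```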

Tech_Surv_52 It should be possible to route the output data of the surveillance data fusion to pre-defined clients.

Op_Serv-02 OK. The installed system fulfils this requirement.

OK.

Tech_Surv_53 Client processes using the surveillance data shall be able to receive and decode data in the ASTERIX CAT011 or CAT062 data format.

Op_Serv-02 OK. The installed system does not fulfil this requirement, except that the SDF data format is ASTERIX CAT062. CAT011 has been superseded by CAT062.

OK, if requested.

Tech_Cont_15

The Control function shall be able to receive input target report data from the SDF in the ASTERIX CAT011 or CAT062 data format.

OK. The installed system does not fulfil this requirement, except that the SDF data format is ASTERIX CAT062.

OK, if requested.


Tech_Cont_16

Alert reports output from the Control function systems should be in the ASTERIX CAT011 or CAT062 data format.

OK. The installed system does not fulfil this requirement: the output of the control function system is in the ASTERIX CAT010 data format.

OK, if requested.

Table 8-3: Interface Tests for Level II A-SMGCS


APPENDIX B DETAILED RESULTS FOR OPERATIONAL REQUIREMENTS

Operational requirements are the parent requirements of most of the technical requirements; thus most of the operational requirements have been verified by technical test or even by a plausibility check, e.g. ‘yes, the installed system is a modular one’. However, some operational requirements also need to be proven from an operational point of view, e.g. ‘Op_Perf-05: For the surveillance service, the allowable error in reported position shall be consistent with the requirements set by the control task of the controller: 12m.’ Such an operational requirement had to be verified technically, but its operational feasibility also had to be verified by asking the system operators (ATCos) for their acceptance of the experienced performance. If the ATCos accept the performance, it can be stated that the operational requirement has been fully verified, technically and operationally. Note: When requirements are verified (technically and operationally) it cannot be concluded that they are validated, because a lower performance, for instance, has not been proven for its operational feasibility by a real experiment. However, even when requirements cannot be validated they can be questioned, with the following result pattern:

• Requirements that could not be verified technically but operationally (the lower performance was accepted by the ATCos), or
• Requirements that could be verified technically but not operationally

can be rejected or at least can be improved to meet the users’ needs or to soften the technical requirement.

Appendix B.1 General requirements

Columns: EMMA Req. No., Requirement Description, EMMA TRD Ref. & VE-ID, Comments and Results for PRG, TLS, MXP (includes references to technical tests (VE-Indicators) and operational feasibility tests (VA-Indicators))

Verified by Tech_Gen_05 The installed system fulfils the Tech_Gen_05 requirement. The modularity of the A-SMGCS is well accepted by the controllers. They consider that, after a short adaptation period, the HMI can easily be customised to the user’s preferences. Hypothesis Opf.Pre.2 validated.

OK. Op_Ds-1 An A-SMGCS shall be composed of different modules required for particular user needs or technological choices.

Tech_Gen_05

Technically verified Technically verified Operationally verified

Technically verified


Verified by Tech_HMI_10 The installed system fulfils the Tech_HMI_10 requirement. No validation hypothesis associated to this requirement.

OK. Op_Ds-2 The A-SMGCS design concept must be built upon the integration of the fundamental and principal system elements and facilitate the upgrading of those elements whilst maintaining, where possible, the same HMI and references. This is important when considering harmonisation, familiarisation and training requirements, and will allow the evolution of the system design through to a full A-SMGCS with the minimum negative impact on the users’ ability to interface with the system.

Tech_HMI_10

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Gen_46 The installed system fulfils the Tech_Gen_46 requirement. Test of this operational requirement not in the scope of the EMMA project.

OK. Op_Ds-3 Standards like Standards and Recommended Practices (SARPS) shall be written and used to permit interoperability between the A-SMGCS elements developed by different manufacturers.

Tech_Gen_46

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Gen_47, Tech_Gen_48

The installed system fulfils the Tech_Gen_47 and Tech_Gen_48 requirements. No validation hypothesis associated to this requirement.

OK. Op_Ds-4 The data interchange between systems should be performed in a standardized format in order to ensure an adequate exchange of information. ASTERIX will be the standard to be used for surveillance data.

Tech_Gen_47 Tech_Gen_48

Technically verified Technically verified Not assessed operationally

Technically verified


Verified by Tech_Gen_32 VA-83 not confirmed: ‘The display enables to recognize a degrading accuracy of surveillance.’

The installed system fulfils the Tech_Gen_32 requirement. The controllers were not able to evaluate the statement: ‘The EMMA display enables to detect a loss of accuracy of the surveillance’.

OK. Op_Ds-5 A self-checking system with failure alerts shall be in the system design.

Tech_Gen_32

Technically verified Operationally not fully verified

Technically verified Operationally not verified

Verified by Tech_Gen_36 Subject to Functional Hazard Analysis (FHA), System Safety Assessment (SSA), and appropriate operational procedures. The term ‘fail-soft’ means that the system is so designed that, even if equipment fails to the extent that loss of some data occurs, sufficient data remain on the display to enable the controller to continue operation without assistance of the computer. VA-79 confirmed significantly: ‘The information displayed in the A-SMGCS is helpful for avoiding conflicts.’

Technical requirement Tech_Gen_36 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Ds-6 Equipment which shows control data shall both be fail-safe and fail-soft.

Tech_Gen_36

Technically verified Operationally verified

Not assessed technically Not assessed operationally

Technically verified


Verified by Tech_Gen_36 Subject to Functional Hazard Analysis (FHA), System Safety Assessment (SSA), and appropriate operational procedures. The term ‘fail-soft’ means that the system is so designed that, even if equipment fails to the extent that loss of some data occurs, sufficient data remain on the display to enable the controller to continue operation without assistance of the computer. VA-21 not confirmed: ‘I think procedures in case of A-SMGCS failure are defined clear enough.’

Technical requirement Tech_Gen_36 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Ds-7 In case of a failure of an element of an A-SMGCS, the failure effect shall be such that the element status is always in the ‘safe’ condition.

Tech_Gen_36

Technically verified Operationally not verified

Not assessed technically Not assessed operationally

Technically verified

Verified by Tech_Gen_39 The SDS does a controlled shutdown, then automatically reboots and resumes operation. Tech_Gen_40 Recovery time for SDS: 36 s; recovery time for RANC: 28 s.

Technical requirements Tech_Gen_39 and Tech_Gen_40 will be checked in EMMA2 SSA. No validation hypothesis associated to this requirement.

OK. Op_Ds-8 An A-SMGCS shall be self-restartable. Tech_Gen_39 Tech_Gen_40

Technically verified Not assessed technically Not assessed operationally

Not assessed


Verified by Tech_Gen_41 Technical requirement Tech_Gen_41 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Ds-9 The restart of an A-SMGCS shall include the restoration of pertinent information on actual traffic and system performance.

Tech_Gen_41

Technically verified Not assessed technically Not assessed operationally

Not assessed

Verified by Tech_Gen_24 The gap-filler sensors are mounted close to the taxiways to be surveyed, so that shadowing by objects entering the field of view is avoided. Since the gap-filler employs optical sensors, signal reflections are not an issue. Tech_Gen_26 Equipment is CE marked. Test certificate available.

The installed system partly fulfils the Tech_Gen_24 requirement. The installed system fulfils the Tech_Gen_26 requirement. No validation hypothesis associated to this requirement.

OK. Op_Env-4 The system shall have adequate immunity to adverse effects such as:
a) radio interference, including that produced by standard navigation, telecommunications and radar facilities (including airborne equipment);
b) signal reflections and shadowing caused by aircraft in flight, vehicles or aircraft on the ground, buildings, snow banks or other raised obstacles (fixed or temporary) in or near the aerodrome environment; and
c) meteorological conditions or any state of the aerodrome resulting from adverse weather in which operations would otherwise be possible.

Tech_Gen_24 Tech_Gen_26

Technically verified Technically partly verified Not assessed operationally

Not completely assessed in EMMA1.

Verified by Tech_Gen_26 Equipment is CE marked. Test certificate available.

The installed system fulfils the Tech_Gen_26 requirement. No validation hypothesis associated to this requirement.

OK. Op_Env-5 Those elements of A-SMGCS which require the use of radio spectrum should operate in properly allocated frequency bands in accordance with appropriate national and international radio regulations.

Tech_Gen_26

Technically verified Technically verified Not assessed operationally

Technically verified


Verified by Tech_Gen_26 Equipment is CE marked. Test certificate available.

The installed system fulfils the Tech_Gen_26 requirement. No validation hypothesis associated to this requirement.

OK. Op_Env-6 A-SMGCS equipment shall not cause interference to standard radio navigation, surveillance and communication systems.

Tech_Gen_26

Technically verified Technically verified Not assessed operationally

Technically partially verified

Verified by Tech_HMI_03 The HMI is configurable to Prague requirements.

The installed system fulfils the Tech_HMI_03 requirement. No validation hypothesis associated to this requirement.

OK. Op_Evo-1 An A-SMGCS shall be capable of accommodating any operational change of the aerodrome after being installed, for instance a physical change in layout (runways, taxiways and aprons), or a change in the aerodrome procedures, rules...

Tech_HMI_03

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Gen_13 The system support equipment includes an appropriate database with editing facilities.

Technical requirement Tech_Gen_13 not assessed (related verification hypothesis: Ver.gen.4). This operational requirement is too vast to be properly validated. No validation hypothesis associated to this requirement.

OK. Op_Evo-2 An A-SMGCS shall be capable of accommodating any technological change after being installed.

Tech_Gen_13

Technically verified Not assessed technically Not assessed operationally

Technically verified


Verified by Tech_HMI_10 The installed system fulfils the Tech_HMI_10 requirement. The limited A-SMGCS implementation does not allow testing this operational requirement. No validation hypothesis associated to this requirement.

OK. Op_Evo-3 A-SMGCS evolution shall have a minimum negative impact on the users’ ability to interface with the system. This is important when considering harmonisation, familiarisation and training requirements.

Tech_HMI_10

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Gen_19 The equipment is CE marked and approved by the relevant authorities.

The installed system fulfils the Tech_Gen_19 requirement. No validation hypothesis associated to this requirement.

OK. Op_Evo-4 The design principle of an A-SMGCS shall permit modular enhancements such as implementation of further A-SMGCS levels.

Tech_Gen_19

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Gen_10

The installed system fulfils the Tech_Gen_10 requirement. No validation hypothesis associated to this requirement.

OK. Op_Evo-5 The design principle of an A-SMGCS shall permit system enhancements at minimal cost.

Tech_Gen_10

Technically verified Technically verified Not assessed operationally

Technically verified


Verified by Tech_HMI_01. There is broad acceptance of the HMI layout. Confirmed significantly: VA-88, VA-90, VA-91, VA-92, VA-93, VA-94, VA-96, VA-97, VA-98, VA-99, VA-100, VA-101, VA-102, VA-103, VA-104, VA-105, VA-106, VA-108, VA-130, VA-132, VA-134. Not confirmed: VA-107, VA-133.

The installed system fulfils the Tech_HMI_01 requirement. Hypothesis Ver.hmi.4 not tested, but hypotheses Ver.hmi.1, Ver.hmi.2, and Ver.hmi.3 validated. Hypothesis Opf.hmi.1 validated.

OK. Op_If-1 A-SMGCS shall enable users to interface efficiently.

Tech_HMI_01

Technically verified Operationally mostly verified Validity of the Req. in terms of its generality is questioned

Technically partly verified Operationally verified

Technically verified Operationally partially verified

Verified by Tech_HMI_04 The HMI employs keyboard, mouse, on-screen menus and icons for data entry. VA-84 confirmed significantly: ‘The display layout is easy to customize to my own preferences.’

The installed system fulfils the Tech_HMI_04 requirement. Hypothesis Ver.hmi.3 validated. Hypothesis Opf.hmi.1 validated.

OK. Op_If-2 A-SMGCS shall enable operators to update traffic context or to configure the system to interface efficiently.

Tech_HMI_04

Technically verified Operationally verified

Technically verified Operationally verified

Technically verified Operationally verified


Verified by Tech_Surv_39 At the time of testing 40 vehicles were equipped with Mode S squitter beacons.

The installed system fulfils the Tech_Surv_39 requirement for aircraft and 10 MOSQUITO-equipped vehicles. No validation hypothesis associated with this requirement.

OK. Op_If-3 A-SMGCS shall be capable of interfacing with all cooperative mobiles in order to collect the required traffic data. In particular, it shall interface with existing and future embedded systems.

Tech_Surv_39

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Surv_40 Only flight data for aircraft due to arrive or depart within the next 40 minutes are displayed.

The installed system fulfils the Tech_Supp_19 requirement. The installed system does not fulfil the Tech_Surv_40 requirement: the issue will be solved in EMMA2. The installed system does not fulfil this operational requirement, due to the type of connections allowed for an R&D implementation. No validation hypothesis associated with this requirement.

OK, but many items are out of the scope of the EMMA1 test bed. Some will be provided in EMMA2 (e.g., an interface with the Airport Operator database for CDM applications and a tighter integration with the FDP subsystem located in the ACC).

Op_If-4 In order to fully benefit from an A-SMGCS by all parties concerned, the system should be capable of interfacing with the following ground systems:

• Air traffic management (ATM), including the Integrated Initial Flight Plan Processing System (IFPS), departure management, etc.;

• Approach surveillance system to take into account airborne aircraft;

• Stand management systems;

• Existing and future ATS systems;

• MET systems;

• Visual aids;

• Any other system as part of the Collaborative Decision Making Process (CDM).

Tech_Supp_19 Tech_Surv_40

Technically verified Technically partly verified Not assessed operationally

Technically partly verified Partially assessed operationally


Verified by Tech_HMI_08 The HMI design has been harmonised with existing ATM HMI and agreed with the Prague ANS Provider (ANS CR).

The installed system fulfils the Tech_HMI_08 requirement. This operational requirement cannot be tested in shadow-mode trials. No validation hypothesis associated with this requirement.

OK. Op_If-5 The operation of A-SMGCS interfaces should not interfere with other ATC responsibilities, such as the observation of aerodrome activity and the requirements to provide alerting service.

Tech_HMI_08

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Gen_15 Weather conditions were good throughout the SAT period. Further long-term testing will be carried out using the MOGADOR tool as part of the SP6 V&V exercise. Results are published in the D6.3.1 document [12]. ATCos worked with the fully operational A-SMGCS for 7 months under all prevailing visibility conditions.

The installed system partly fulfils the Tech_Gen_15 requirement (Ver.sur.7 partly verified and Ver.sur.8 not tested). The limited time period allocated to the shadow-mode trials did not allow this requirement to be tested operationally. However, a significant number of false plots and track duplications were observed during a validation session in a heavy rain situation. Hypothesis Opf.pre.4 could not be tested during the shadow-mode trials (A-SMGCS visibility transition procedures had not been defined when the validation sessions took place).

OK. Op_Range-1 A-SMGCS shall be capable of operating in all visibility conditions.

Tech_Gen_15

Technically verified Operationally verified

Technically partly verified Not assessed operationally

Technically partly verified Not assessed operationally


Verified by Tech_Gen_42 The system detects and tracks all targets.

Technical requirement Tech_Gen_42 not assessed (related verification hypothesis: Ver.gen.1). Hypothesis Opf.pre.6 could not be tested during the shadow-mode trials (movement rates had not been modified when the validation sessions took place).

OK. Op_Range-2 A-SMGCS shall be able to handle all traffic movements in its area of interest at any instant in time.

Tech_Gen_42

Technically verified Not assessed technically Not assessed operationally

Technically verified

Verified by Tech_Gen_14 The surveillance system supports all known aircraft and vehicle types that are likely to use the aerodrome during the life expectancy of the equipment.

Technical requirement Tech_Gen_14 not assessed (related verification hypothesis: Ver.gen.2). No validation hypothesis associated with this requirement.

OK. Op_Range-3 An A-SMGCS shall support operations involving all aircraft types and all vehicle types.

Tech_Gen_14

Technically verified Not assessed technically Not assessed operationally

Technically verified

Verified by Tech_Gen_14 The surveillance system supports all known aircraft and vehicle types that are likely to use the aerodrome during the life expectancy of the equipment.

Technical requirement Tech_Gen_14 not assessed (related verification hypothesis: Ver.gen.2). This requirement cannot be tested in shadow-mode trials. No validation hypothesis associated with this requirement.

OK. Op_Range-4 An A-SMGCS shall be adaptable to cater for future aircraft types and vehicle types.

Tech_Gen_14

Technically verified Not assessed technically Not assessed operationally

Technically verified


Verified by Tech_Surv_33 All targets are detected and tracked.

Technical requirement Tech_Surv_33 not assessed (related verification hypothesis: Ver.gen.3). No validation hypothesis associated with this requirement.

OK. Op_Range-5 The system shall be capable of supporting operations of mobiles within the following parameters:

• Minimum and maximum speeds for aircraft on final approach and runways;

• Minimum and maximum speeds for aircraft on taxiways;

• Minimum and maximum speeds for vehicles; and

• Any heading.

Tech_Surv_33

Technically verified Not assessed technically Not assessed operationally

Technically verified

Verified by Tech_Surv_33 All targets are detected and tracked.

Technical requirement Tech_Surv_33 not assessed (related verification hypothesis: Ver.gen.3). No validation hypothesis associated with this requirement.

OK. Op_Range-6 The A-SMGCS should cover the following speeds:

• 0 to 50 knots for aircraft on straight taxiways;

• 0 to 20 knots for aircraft on taxiway curves;

• 0 to 80 knots for aircraft on runway exits;

• 0 to 250 knots for aircraft on final approach, missed approach and runways;

• 0 to 80 knots for vehicles on the movement area; and

• 0 to 10 knots for aircraft and vehicles on stands and stand taxi lanes.

Tech_Surv_33

Technically verified Not assessed technically Not assessed operationally

Not completely assessed technically

Hypothesis Opf.pre.3 validated. OK. Op_Resp-1 Although the responsibilities and functions may vary, they shall be clearly defined for all users of A-SMGCS.

Operationally verified

Operationally verified Technically verified


In EMMA, the test of this requirement will mostly focus on the division of responsibility between the equipment and human operators (ATCos). Hypothesis Opf.pre.3 validated.

OK. Op_Resp-2 An A-SMGCS shall be designed so that the responsibilities and functions may be assigned to the following: a) The automated system; b) Controllers; c) Pilots; d) Vehicle drivers; g) Airport authorities; h) System operators.

Operationally verified

Operationally verified Technically verified

Hypothesis Opf.pre.3 validated. Op_Resp-3 The airport authority shall be responsible for notifying the A-SMGCS category operating at its aerodrome and the procedures that may be applied.

Operationally verified

Operationally verified Technically verified

Table 8-4: Results General Operational Requirements


Appendix B.2 Surveillance Service Requirements

Table columns: Comments and Results (includes references to technical tests (VE-Indicators) and operational feasibility tests (VA-Indicators)) | EMMA Req. No. | Requirement Description | EMMA TRD Ref. & VE-ID | PRG | TLS | MXP

Verified by Tech_Gen_32.

The installed system fulfils the Tech_Gen_32 requirement. The operational status of the system is monitored. The definition of required modules and levels of performance depending on operations is within the scope of the operational feasibility phase. Hypothesis Opf.pre.2 validated. Hypothesis Opf.pre.5 could not be tested during the shadow-mode trials (A-SMGCS fallback procedures had not been defined when the validation sessions took place).

OK. Op_Mon-1 The operational status of all A-SMGCS equipment shall be monitored by the system, and alerts shall be provided when the system must not be used for the intended operation.

Tech_Gen_32

Technically verified Technically verified Operationally partly verified

Technically verified


Verified by Tech_Gen_33 The scrollable Status Log field displays system status, warning and failure messages together with the time as they occur. Detailed information such as date, time, status, and identification is available. A description of the message is displayed by double clicking on the message line. Tech_Supp_11 The TECAMS HMI shows the status of the major components of the EMMA system. It is possible to start and stop subsystems from the TECAMS HMI.

The installed system fulfils the Tech_Gen_33 requirement. The installed system partly fulfils the Tech_Supp_11 requirement. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Mon-2 Monitoring of the performance of an A-SMGCS should be provided such that operationally significant failures are detected and appropriate remedial action is initiated to restore the service or provide a reduced level of service.

Tech_Gen_33 Tech_Supp_11

Technically verified Technically partly verified Not assessed operationally

Technically partly verified


Verified by Tech_Gen_34 The scrollable Status Log field displays system status, warning and failure messages together with the time as they occur. Detailed information such as date, time, status, and identification is available. A description of the message is displayed by double clicking on the message line.

The installed system partly fulfils the Tech_Gen_34 requirement: the operational status of the system is monitored. The definition of required modules and levels of performance depending on operations is within the scope of the operational feasibility phase. Hypothesis Opf.pre.2 validated. Hypothesis Opf.pre.5 could not be tested during the shadow-mode trials (A-SMGCS fallback procedures had not been defined when the validation sessions took place).

OK. Op_Mon-3 The A-SMGCS shall perform a continuous validation of data provided to the user and timely alert the user when the system must not be used for the intended operation.

Tech_Gen_34

Technically verified Technically partly verified Operationally partly verified

Technically verified Operationally partly verified

Verified by Tech_Gen_37 The source node turns white indicating that the node is disconnected and the SDS node turns yellow indicating warning status.

Technical requirement Tech_Gen_37 will be checked in EMMA2 SSA. Hypothesis Opf.pre.5 could not be tested during the shadow-mode trials (A-SMGCS fallback procedures had not been defined when the validation sessions took place).

OK. Op_Mon-4 The system shall allow for a reversion to adequate back-up procedures if failures in excess of the operationally significant period occur.

Tech_Gen_37

Technically verified Not assessed technically Not assessed operationally

Technically verified


Verified by Tech_Gen_37 The source node turns white indicating that the node is disconnected and the SDS node turns yellow indicating warning status. VA-83 not confirmed: ‘The display enables to recognize a degrading accuracy of surveillance.’

Technical requirement Tech_Gen_37 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA. However, the controllers were not able to evaluate the statement: ‘The EMMA display enables to detect a loss of accuracy of the surveillance’.

OK. Op_Mon-5 Operationally significant failures in the system shall be clearly indicated to the control authority and any affected user.

Tech_Gen_37

Technically verified Operationally not verified

Not assessed technically Operationally not verified

Not assessed technically Operationally not verified

Verified by Tech_Gen_38 The system alarm window pops up with the name of the data source that is disconnected. An audible alarm sounds.

Technical requirement Tech_Gen_38 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Mon-6 All critical elements of the system should be provided with audio and visual indication of failure given in a timely manner.

Tech_Gen_38

Technically verified Not assessed technically Not assessed operationally

Technically verified


Verified by VE-2, Tech_Surv_35. Measured values for PD: Short term 99.65%; Long term 97.1% - 99.4%. The 99.9% requirement could not be met easily, but the controllers accepted the lower performance as meeting their operational needs. VA-33, VA-36, VA-48, VA-54, VA-55 confirmed significantly: ‘When visual reference is not possible I think the A-SMGCS surveillance display can be used to give clearances in a safe and efficient way’.

The installed system partly fulfils the Tech_Surv_35 requirement (hypothesis Ver.sur.7 partially verified). Hypothesis Opf.sur.9 not validated (the controllers consider that there are too many missing reports to control the traffic in a safe and efficient way). Hypothesis Opf.sur.6 could not be tested (the position discrimination verification test had not been performed when the validation sessions took place).

Please, ref. to test results. Op_Perf-01 The probability that an actual aircraft, vehicle or obstacle is detected and reported at the output of the surveillance element of the A-SMGCS shall be 99.9% at minimum.

VE-2 Tech_Surv_35

Technically not verified Operationally verified Op_Perf-01 not completely proven valid

Technically partly verified Operationally not verified

Technically partly verified


Verified by VE-3, Tech_Surv_36. Measured values for PFD: Short term 0.07%; Long term 0.04% - 0.16%. The 0.1% requirement could not be met easily, but the controllers accepted the higher PFD performance as meeting their operational needs. VA-33, VA-36, VA-48, VA-54, VA-55 confirmed significantly: ‘When visual reference is not possible I think the A-SMGCS surveillance display can be used to give clearances in a safe and efficient way’.

Technical requirement Tech_Surv_36 not assessed (related verification hypothesis: Ver.sur.8). Hypothesis Opf.sur.11 not validated (the controllers consider that the probability of false detection is not acceptable).

Please, ref. to test results. Op_Perf-02 The probability that anything other than an actual aircraft, vehicle or obstacle is detected and reported by the surveillance element of the A-SMGCS shall not exceed 10E-3 per reported target (0.1%).

VE-3 Tech_Surv_36

Technically verified (Short Term only) Operationally verified

Not assessed technically Operationally not verified

Technically partly verified


Verified by VE-9, Tech_Surv_37. Measured values for PID: Short term 99.72%; Long term 78.8% - 94.1%. The 99.9% identification requirement could not be met easily, but the controllers accepted the lower PID performance as meeting their operational needs. VA-33, VA-36, VA-48, VA-54, VA-55 confirmed significantly: ‘When visual reference is not possible I think the A-SMGCS surveillance display can be used to give clearances in a safe and efficient way’.

The installed system partly fulfils the Tech_Surv_37 requirement (hypothesis Ver.sur.7 partially verified). Hypothesis Opf.sur.10 not validated (the controllers did not reach any consensus on the acceptance of the probability of identification).

Please, ref. to test results. Op_Perf-03 The probability that the correct identity of an aircraft, vehicle or obstacle is reported at the output of the surveillance element shall be 99.9% at minimum.

VE-9 Tech_Surv_37

Technically not verified Operationally verified Op_Perf-03 not completely proven valid

Technically partly verified Operationally not verified

Technically partly verified


Verified by VE-10, Tech_Surv_38. Measured values for PFID: Short term 0%; Long term 3.2% - 19.7%. The 0.1% false identification requirement could not be met easily, but the controllers accepted the higher PFID performance as meeting their operational needs. VA-33, VA-36, VA-48, VA-54, VA-55 confirmed significantly: ‘When visual reference is not possible I think the A-SMGCS surveillance display can be used to give clearances in a safe and efficient way’.

Technical requirement Tech_Surv_38 not assessed (related verification hypothesis: Ver.sur.8). Hypothesis Opf.sur.12 not validated (the controllers consider that the probability of false identification is not acceptable).

Please, ref. to test results. Op_Perf-04 The probability that the identity reported at the output of the surveillance element is not the correct identity of the actual aircraft, vehicle or obstacle shall not exceed 10E-3 per reported target (0.1%).

VE-10 Tech_Surv_38

Technically verified (Short Term only) Operationally verified Op_Perf-04 not completely proven valid

Not assessed technically Operationally not verified

Technically partly verified
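The four surveillance indicators reported above (PD, PFD, PID, PFID) are ratios of counted events checked against the 99.9% / 0.1% limits of Op_Perf-01 to Op_Perf-04. The sketch below is purely illustrative and is not part of the EMMA tool chain; all counts are hypothetical example values.

```python
# Illustrative sketch: computing the surveillance performance indicators
# referenced by Op_Perf-01..04 from raw counts. All counts are hypothetical.

def ratio(part: int, whole: int) -> float:
    """Return part/whole as a fraction; 0.0 when whole is 0."""
    return part / whole if whole else 0.0

# Hypothetical measurement counts from a test session.
actual_targets     = 10_000  # opportunities to detect a real mobile/obstacle
detected_targets   =  9_965  # of which the surveillance element reported
reported_targets   = 10_020  # everything the element reported (incl. false)
false_reports      =      7  # reports not matching any real target
identified_targets =  9_930  # reports carrying an identity
correct_identities =  9_925  # of which the identity was correct

pd   = ratio(detected_targets, actual_targets)        # Prob. of Detection
pfd  = ratio(false_reports, reported_targets)         # Prob. of False Detection
pid  = ratio(correct_identities, identified_targets)  # Prob. of Identification
pfid = ratio(identified_targets - correct_identities,
             identified_targets)                      # Prob. of False Identification

# Thresholds from Op_Perf-01..04: >= 99.9% for PD/PID, <= 0.1% for PFD/PFID.
results = {
    "Op_Perf-01 (PD >= 99.9%)":  pd   >= 0.999,
    "Op_Perf-02 (PFD <= 0.1%)":  pfd  <= 0.001,
    "Op_Perf-03 (PID >= 99.9%)": pid  >= 0.999,
    "Op_Perf-04 (PFID <= 0.1%)": pfid <= 0.001,
}
for req, met in results.items():
    print(f"{req}: {'met' if met else 'NOT met'}")
```

With these example counts PD falls just short of the 99.9% limit while the other three indicators pass, mirroring the pattern reported in the trials where the thresholds "could not be met easily".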


Verified by Tech_Surv_26 & VE-5. Measured values for RPA: Short term 3.2m (static); Long term N/A.

VA-1, VA-2, VA-3 confirmed significantly: ‘When visual reference is not possible, the displayed position of the - aircraft in the runway sensitive area - vehicle in the runway sensitive area - aircraft on the taxiways is accurate enough to exercise control in a safe and efficient way.’ VA-27 confirmed significantly: ‘When visual reference is not possible, A-SMGCS facilitates to give traffic information to pilots so that they can avoid other traffic.’ VA-35 not confirmed: ‘I think that the A-SMGCS surveillance display could be used to determine that an aircraft is on stand or has left the stand.’ VA-33, VA-36, VA-48, VA-54, VA-55 confirmed significantly: ‘When visual reference is not possible I think the A-SMGCS surveillance display can be used to give clearances in a safe and efficient way’.

The installed system partly fulfils the Tech_Surv_26 requirement (hypothesis Ver.sur.3 partially verified). The installed system partly fulfils the Tech_HMI_17 requirement. Technical requirements Tech_HMI_16, Tech_HMI_18 and Tech_HMI_19 not tested. Hypothesis Opf.sur.4 partly validated (the controllers consider that the target position accuracy is acceptable; nevertheless, many controllers require an adaptation of the procedures to take accuracy margins into account. Due to the lack of technical data, this adaptation could not be performed prior to the validation activities).

Please, ref. to test results. Op_Perf-05 For the surveillance service, the allowable error in reported position shall be consistent with the requirements set by the control task of the controller: 12m.

VE-5 Tech_Surv_26 (Tech_HMI_16) (Tech_HMI_17) (Tech_HMI_18) (Tech_HMI_19)

Technically verified (Short Term only) Operationally verified Op_Perf-05 not completely proven valid (12m has not been proven)

Technically partly verified Operationally partly verified

Technically partly verified
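The 12 m limit of Op_Perf-05 is a bound on the error between the reported and actual target positions. The following sketch is hypothetical and not taken from the EMMA test bed; it only illustrates how reported positions could be compared against reference ("ground truth") positions, e.g. from a survey-grade receiver on a test vehicle.

```python
# Illustrative sketch: checking reported position accuracy against the
# 12 m limit of Op_Perf-05. Positions and names below are hypothetical.
import math

LIMIT_M = 12.0  # allowable reported-position error per Op_Perf-05

def horizontal_error(reported, reference):
    """Euclidean distance in metres between two local (x, y) positions."""
    return math.hypot(reported[0] - reference[0], reported[1] - reference[1])

# Hypothetical (reported, reference) position pairs in local metres.
samples = [((105.2, 310.0), (103.1, 309.2)),
           ((532.8, 640.4), (529.5, 638.0)),
           ((912.0, 120.7), (908.6, 118.9))]

errors = [horizontal_error(r, t) for r, t in samples]
print(f"max error: {max(errors):.1f} m, "
      f"within limit: {all(e <= LIMIT_M for e in errors)}")
```

In practice a verification campaign would aggregate such errors over many samples (e.g. as a 95th percentile) rather than a simple maximum; the sketch only shows the per-sample comparison.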


Verified by VE-11. Measured values for RPR: Short term 0.1m; Long term N/A. Tech_Surv_27: The smallest observable change was 1 pixel, corresponding to 0.1m on the 50m-range scale.

The installed system partly fulfils the Tech_Surv_27 requirement (hypothesis Ver.sur.3 partially verified). The installed system partly fulfils the Tech_HMI_17 requirement. Hypothesis Opf.sur.4 partly validated (the controllers consider that the target position accuracy is acceptable; nevertheless, many controllers require an adaptation of the procedures to take accuracy margins into account. Due to the lack of technical data, this adaptation could not be performed prior to the validation activities).

Please, ref. to test results. Op_Perf-06 The mobile position resolution shall be at least 1 m.

VE-6 Tech_Surv_27 (Tech_HMI_17)

Technically verified (Short Term only) Technically partly verified Operationally partly verified

Technically verified

Requires specially equipped test aircraft. NOTE: Could not be tested in SP6.

Technical requirement Tech_Surv_32 not tested. Hypothesis Opf.sur.4 partly validated (the controllers consider that the target position accuracy is acceptable; nevertheless, many controllers require an adaptation of the procedures to take accuracy margins into account. Due to the lack of technical data, this adaptation could not be performed prior to the validation activities).

Please, ref. to test results. Op_Perf-07 Where airborne traffic participates in the A-SMGCS, the level of an aircraft when airborne shall be determined within ±10m.

Tech_Surv_32

Technically not tested Not assessed technically Operationally partly verified

Technically verified


Verified by VE-11. Measured values for TRUR: Short term 0.47s; Long term N/A.

Hypothesis Ver.sur.5 verified. Hypothesis Opf.sur.8 validated (the controllers consider that the consistency between displayed and actual traffic is good).

OK. Op_Perf-08 Where appropriate, the update rate of an A-SMGCS shall be consistent with the requirements set by the control task of the controller: 1s.

VE-11

Technically verified (Short Term only) Technically verified Operationally verified

Technically verified

Verified by Tech_Gen_30 Technical requirement Tech_Gen_30 will be checked in EMMA2 SSA. Technical requirements Tech_HMI_18, Tech_HMI_19 and Tech_HMI_20 not tested. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Perf-09 A-SMGCS shall preclude failures that result in erroneous data provided to the users.

Tech_Gen_30 (Tech_HMI_18) (Tech_HMI_19) (Tech_HMI_20)

Technically verified Not assessed technically Not assessed operationally

Not assessed technically Not assessed operationally


Redundancy is not required for the EMMA trials.

Technical requirement Tech_Gen_31 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Perf-10 The availability of an A-SMGCS shall be sufficient to support the safe, orderly and expeditious flow of traffic on the movement area of an aerodrome.

Tech_Gen_31

Technically not verified Not assessed technically Not assessed operationally

Technically not verified Not assessed operationally

Verified by Tech_Gen_35 & 36

Technical requirements Tech_Gen_35 and Tech_Gen_36 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Perf-11 A failure of equipment shall not cause: • A reduction in safety (fail soft); and • The loss of basic functions.

Tech_Gen_35 Tech_Gen_36

Technically verified Not assessed technically Not assessed operationally

Technically verified

Verified by Tech_Gen_28

The installed system fulfils Tech_Gen_28 requirement. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Perf-12 An A-SMGCS shall provide a continuous service.

Tech_Gen_28

Technically verified Technically verified Not assessed operationally

Technically verified

Redundancy is not required for the EMMA trials.

Technical requirement Tech_Gen_31 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Perf-13 Any unscheduled break in continuity shall be sufficiently short or rare as not to affect the safety of mobiles.

Tech_Gen_31

Technically not verified Not assessed technically Not assessed operationally

Technically verified


Verified by Tech_Gen_39 The SDS does a controlled shut down, then automatically reboots and resumes operation.

Technical requirement Tech_Gen_39 will be checked in EMMA2 SSA. This operational requirement will be checked in EMMA2 SSA.

OK. Op_Perf-14 When restarting, the recovery times of the A-SMGCS shall be a few seconds.

Tech_Gen_39

Technically verified Not assessed technically Not assessed operationally

Technically verified

Hypothesis Ver.sur.7 partially verified. Hypothesis Opf.sur.9 not validated (the controllers consider that there are too many missing reports to control the traffic in a safe and efficient way).

Please, ref. to test results. Op_Perf-22 The A-SMGCS shall detect obstacles, whether moving or stationary, located anywhere on the movement area of the aerodrome and having an equivalent radar cross section of 1 square meter or more.

Technically partly verified Operationally not verified

Verified by Tech_Gen_01 VA-89 rejected significantly: ‘I think there is too much inconsistency between A-SMGCS and real traffic.’

The installed system fulfils the Tech_Gen_01 requirement. No validation hypothesis is associated with this requirement. However, the controllers do not consider that there are too many inconsistencies between the A-SMGCS HMI and the actual traffic.

OK. Op_Serv-01 A-SMGCS shall provide the surveillance service to the users.

Tech_Gen_01

Technically verified Operationally verified

Technically verified Operationally verified

Technically verified Operationally verified


Verified by Tech_Surv_52

The installed system fulfils the Tech_Surv_52 requirement. In EMMA, the use of A-SMGCS will be limited to a ground position. No validation hypothesis is associated with this requirement.

OK. Op_Serv-02 The users of the surveillance service shall be all control authorities concerned in the manoeuvring area of the aerodrome.

Tech_Surv_52

Technically verified Technically verified Not assessed operationally

Technically verified Not assessed operationally

Verified by Tech_Surv_34 VA-89 rejected significantly: ‘I think there is too much inconsistency between A-SMGCS and real traffic.’

The installed system fulfils the Tech_Surv_52 requirement (Ver.sur.5 hypothesis verified). The installed system fulfils this requirement. The continuity of the service will be tested in EMMA2 SSA. No validation hypothesis is associated with this requirement. However, the controllers do not consider that there are too many inconsistencies between the A-SMGCS HMI and the actual traffic.

OK. Op_Serv-03 The surveillance service shall continuously provide the following airport traffic situation:
• Traffic Information;
• Traffic context.

Tech_Surv_34

Technically verified Operationally verified

Technically verified Operationally partly verified

Technically partly verified Operationally not verified


Verified by Tech_Surv_03: Targets are displayed by:
a) A target symbol showing the target position;
b) A leader line connecting the symbol to a label containing the target identity and other information;
c) A raw video (SMR) image that permits the user to classify the target by its size;
d) A selectable number of track history dots showing the past positions of the target.

The VA indicators support the need for the right information: VA-4, VA-5 not confirmed: ‘When visual reference is not possible, a missing label / missing position report is not a problem to exercise control in a safe and efficient way.’

VA-6 is rejected significantly: ‘When visual reference is not possible, a wrong label is not a problem to exercise control in a safe and efficient way.’

VA-8 is confirmed: ‘When visual reference is not possible, track swapping prevents me to exercise control in a safe and efficient way.’

The installed system fulfils the Tech_Surv_03 requirement (Ver.hmi.1 hypothesis verified). Hypothesis Ver.sur.2 verified. Hypothesis Ver.sur.7 partially verified. Hypothesis Opf.sur.2 validated (the detection coverage is considered acceptable; therefore, no modification of A-SMGCS procedures is required; the apron is only covered by the primary radar and procedures have not been adapted). Hypothesis Opf.sur.3 not validated (the controllers did not reach any consensus on the acceptance of the identification coverage; the apron is only covered by the primary radar and procedures have not been adapted). Hypothesis Opf.sur.9 not validated (the controllers consider that there are too many missing reports to control the traffic in a safe and efficient way). Hypothesis Opf.sur.10 not validated (the controllers did not reach any consensus on the acceptance of the probability of identification).

OK. Op_Serv-04 The surveillance service shall continuously provide the following traffic information:
• Position of all vehicles on the area of interest for vehicles, including intruders;
• Identity of all cooperative vehicles on the area of interest for vehicles;
• Position of all relevant aircraft on the area of interest for aircraft, including intruders;
• Identity of all relevant aircraft on the area of interest for aircraft;
• History of the mobiles’ positions (e.g. the 3 last positions displayed).

Tech_Surv_03

Technically verified Technically partly verified Operationally partly verified

Technically verified


Verified by Tech_Surv_22 The following information is available:
• ATC Callsign
• Mode A Code
• Mode S Address
• Aircraft Type
• Wake Vortex Category
• Departure/Destination Airport
• Estimated Time of Arrival/Departure
• Slot Time (when applicable)
• Stand Identifier
• SID/STAR (as applicable)

The installed system fulfils the Tech_Surv_22 requirement. The installed system is able to support this modification, if provided with this information. Aircraft type (from flight plan) and SID (for departing flights) are indicated in the tag, and flight plan information is available in a dedicated window. No validation hypothesis is associated with this requirement.

OK. Op_Serv-05 The traffic information may optionally include other information about traffic, such as:
• Vehicle type;
• Aircraft type;
• Aircraft gate;
• ...

Tech_Surv_22

Technically verified Technically verified Not assessed operationally

Technically verified

Verified by Tech_Surv_01 The installed system partly fulfils the Tech_Surv_01 requirement (Ver.sur.2 hypothesis verified, Ver.sur.3 and Ver.sur.7 hypotheses partly verified, and Ver.sur.8 hypothesis not assessed). Hypothesis Opf.pre.7 validated (the movement area has been fully defined prior to the experiments).

OK, but the apron area should also be considered ‘of interest’ because it is so close to the manoeuvring area.

Op_Serv-06 The area of interest for vehicles shall be the manoeuvring area.

Tech_Surv_01

Technically verified Technically partly verified Operationally verified

Technically verified Operationally verified


Verified by Tech_Surv_02 Surveillance of the approach is provided by the Eurocat 2000 system. There is no visible difference between the positions displayed on the Eurocat displays and those on the EMMA displays. NOTE: the required accuracy is not stated in the ICAO manual.

The installed system fulfils the Tech_Surv_02 requirement (Ver.sur.2 hypothesis verified). Hypothesis Opf.pre.7 validated (the movement area has been fully defined prior to the experiments).

OK. Op_Serv-07 The area of interest for aircraft shall be the movement area, plus a volume around the runways for aircraft on approach to each landing runway direction, at such a distance that inbound aircraft can be integrated into an A-SMGCS operation and that aerodrome movements, including aircraft departures, relevant missed approaches or aircraft crossing the relevant active runways, can be managed.

Tech_Surv_02

Technically verified Technically verified Operationally verified

Technically verified Operationally verified

Op_Serv-08 Cancelled Tech_Supp_03

Verified by Tech_Supp_03 The display shows:
• The aerodrome map with RWYs, TWYs, parking stands, etc.
• Holding positions, stop bars, RWY thresholds, centerlines and other information as required by ANS CR.
• Fixed obstacles.

The installed system fulfils the Tech_Supp_03 requirement. No validation hypothesis is associated with this requirement.

OK. Op_Serv-09 The traffic context shall at least include:
• Airport layout: geographical representation of various airport areas (TWY, RWY…);
• Reference points: holding positions, stop bars (and other airfield lighting), RWY thresholds…

Tech_Supp_03

Technically verified Technically verified Not assessed operationally

Technically verified


Verified by Tech_Supp_01 Runway status and LVP/non-LVP information is provided by manual input at the CWPs.

The installed system fulfils this requirement. No validation hypothesis is associated with this requirement.

OK. Op_Serv-10 The traffic context may optionally include (local issue):
• Status of runways and taxiways (open / closed);
• The reason a runway or taxiway is closed;
• Status of ATS systems: landing system aids, ATIS…
• Other data: meteorological conditions…

Tech_Supp_01

Technically verified Technically verified Not assessed operationally

Technically verified Not assessed operationally


Not tested in SAT VA-1, VA-2, VA-3 confirmed significantly: ‘When visual reference is not possible, the displayed position of the - aircraft in the runway sensitive area - vehicles in the runway sensitive area - aircraft on the taxiways is accurate enough to exercise control in a safe and efficient way.’ VA-16, VA-17 confirmed significantly: ‘I think that the A-SMGCS surveillance display could be used to determine that an aircraft has - vacated the runway - crossed a holding position.’ VA-35 not confirmed: ‘I think that the A-SMGCS surveillance display could be used to determine that an aircraft is on stand or has left the stand.’

Technical requirement Tech_HMI_18 not tested. Hypothesis Ver.sur.3 partially verified. Hypothesis Opf.pre.7 validated (the movement area has been fully defined prior to the experiments). Moreover, the controllers consider that the target position accuracy is acceptable (Opf.sur.4). Nevertheless, many controllers require an adaptation of the procedures to take accuracy margins into account. Due to the lack of technical data, this adaptation could not be performed prior to the validation activities. The controllers consider that the number of pilot position reports could be reduced when using the A-SMGCS. Nevertheless, they do not agree on suppressing the pilot position report to confirm that the aircraft has vacated the runway, because it is safety critical. The controllers slightly agree with the statement that the A-SMGCS is sufficient to determine that an aircraft is on the stand or has left the stand.

OK. Op_Serv-11 Each mobile shall be seen in the correct position with respect to the aerodrome layout and other traffic.

Tech_HMI_18

Operationally partially verified Technically partly verified Operationally verified

Technically partly verified Operationally partially verified


Verified by Tech_HMI_07 The target is labelled and the line is removed from the list.

The installed system partly fulfils the Tech_HMI_07 requirement. Verification hypothesis Ver.hmi.4 not assessed. Hypothesis Opf.hmi.1 validated. However, no consensus was reached on whether the manual modification of the content of mobile reports on the HMI can be performed quickly and efficiently. Many controllers considered that, because of their lack of experience with the A-SMGCS, they could not take a stand on this issue. Nevertheless, some of the controllers noted that the manual modification of the label content was not easy to use because of the unintuitive label selection mode (i.e. click on the mobile instead of the label) and the difficulty of ‘catching’ a quickly moving mobile.

Partially compliant. The vehicle label is assigned by configuration and is normally not changed by operational personnel at the surveillance positions (CWPs).

Op_Serv-12 The surveillance service shall provide the user with the ability to manually put the right callsign in the label associated with a vehicle equipped with mobile cooperative equipment used for different vehicles.

Tech_HMI_07

Technically verified Technically partly verified Operationally partly verified

Technically verified Operationally partly verified


Verified by Tech_Surv_20 10 consecutive departures and 10 consecutive arrivals observed. VA-111 confirmed: ‘The A-SMGCS display gives me sufficient information about airborne traffic in the vicinity of the airport.’ Comment: Tower executive controllers have to use the E2000 as primary display. A seamless A-SMGCS transition has been established, but the E2000 is still used.

The installed system partly fulfils the Tech_Surv_20 requirement (Ver.sur.3 hypothesis partially verified). Hypothesis Opf.sur.4 partly validated (the controllers consider that the target position accuracy is acceptable. Nevertheless, many controllers require an adaptation of the procedures to take accuracy margins into account. Due to the lack of technical data, this adaptation could not be performed prior to the validation activities). Moreover, the controllers are rather satisfied with the amount of information about airborne traffic in the vicinity of the airport. However, several controllers noted that, since this kind of information is available in other tools, it is not useful in a ground tool like the A-SMGCS.

OK. Controllers appreciate continuity of track presentation from approach to the ground and vice versa.

Op_Serv-13 A seamless transition should be provided between the surveillance for an A-SMGCS and the surveillance of traffic in the vicinity of an aerodrome.

Tech_Surv_20

Technically verified Operationally verified

Technically partly verified Operationally partly verified

Technically verified Operationally verified

Table 8-5: Results Surveillance Service Requirements


Appendix B.3 Control Service Requirements

Comments and Results (includes references to operational feasibility tests)

EMMA Req. No.

Requirement Description EMMA TRD Ref. & VE-ID

PRG TLS MXP

Verified by VE-5, Tech_Surv_26 Measured Values for RPA: Short term: 1.2m/s 7.9° Long Term: not assessed

The installed system partly fulfils the Tech_Surv_26 requirement (hypothesis Ver.sur.3 partially verified). Hypothesis Opf.sur.4 partly validated (the controllers consider that the target position accuracy is acceptable. Nevertheless, many controllers require an adaptation of the procedures to take accuracy margins into account. Due to the lack of technical data, this adaptation could not be performed prior to the validation activities).

Please refer to test results. Op_Perf-15 The allowable error in reported position shall be consistent with the requirements set by the Control service: 7.5m when the position is used by the conflict / infringement detection process.

VE-5 Tech_Surv_26

Technically verified (Short Term only) Technically partly verified Operationally partly verified

Not assessed technically Not assessed operationally
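The 7.5 m allowable reported-position error from Op_Perf-15 can be illustrated with a minimal sketch. The helper names, the local-grid coordinates in metres and the simple pass criterion are assumptions for illustration only, not the EMMA verification tooling:

```python
import math

# Op_Perf-15 limit for positions fed to the conflict / infringement detection
ACCURACY_LIMIT_M = 7.5

def position_error_m(reported, reference):
    """Horizontal (Euclidean) distance in metres between a reported track
    position and a ground-truth reference position, both (x, y) tuples on a
    local aerodrome grid."""
    dx = reported[0] - reference[0]
    dy = reported[1] - reference[1]
    return math.hypot(dx, dy)

def meets_op_perf_15(samples):
    """True if every (reported, reference) sample stays within the 7.5 m limit."""
    return all(position_error_m(r, ref) <= ACCURACY_LIMIT_M for r, ref in samples)

# Example: one sample with a 3 m error and one with a 5 m error -> passes.
samples = [((103.0, 50.0), (100.0, 50.0)), ((0.0, 5.0), (0.0, 0.0))]
print(meets_op_perf_15(samples))  # True
```

A real accuracy assessment would of course use a statistical criterion (e.g. a percentile over many samples) rather than a worst-case check; the sketch only shows where the 7.5 m figure enters.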

Verified by VE-8, Tech_Surv_30 Measured Values for RVA: Short term: 1.2m/s 7.9° Long Term: not assessed

It has been chosen not to display the heading and velocity on the Toulouse A-SMGCS HMI. Therefore, this operational requirement could not be verified.

Please refer to test results. Op_Perf-16 The velocity shall be determined to the following accuracy:
• Speed: <5 m/s;
• Direction of movement: <10°.

VE-8 Tech_Surv_28

Technically verified (Short Term only) Not assessed technically Not assessed operationally

Not assessed technically Not assessed operationally

Verified by Tech_Surv_30

It has been chosen not to display the heading and velocity on the Toulouse A-SMGCS HMI. Therefore, this operational requirement could not be verified.

Op_Perf-17 The target report velocity resolution shall be:
• Speed: 1 m/s;
• Direction of movement: 1.5°.

Tech_Surv_30

Technically verified Not assessed technically Not assessed operationally

Please refer to test results.
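The 1 m/s and 1.5° report resolution required by Op_Perf-17 amounts to quantising the output velocity to those step sizes. A minimal sketch with assumed function names, not the fielded SDS behaviour:

```python
# Report resolution per Op_Perf-17 (quantisation steps of the output report)
SPEED_STEP_MS = 1.0       # speed step, metres per second
DIRECTION_STEP_DEG = 1.5  # direction-of-movement step, degrees

def quantise_velocity(speed_ms, direction_deg):
    """Round a (speed, direction) pair to the Op_Perf-17 report resolution;
    direction is wrapped into [0, 360)."""
    q_speed = round(speed_ms / SPEED_STEP_MS) * SPEED_STEP_MS
    q_dir = round(direction_deg / DIRECTION_STEP_DEG) * DIRECTION_STEP_DEG
    return q_speed, q_dir % 360.0

print(quantise_velocity(12.3, 87.2))  # (12.0, 87.0)
```

Resolution (the step size of reported values) is distinct from accuracy (Op_Perf-16): a report can be finely quantised yet still inaccurate.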


Verified by Tech_Cont_13

The installed system fulfils the Tech_Cont_13 requirement (hypothesis Ver.con.5 verified as part of D4.2.1 to range between 0 ms and 2 ms, depending on the number of conflicts). Hypothesis Opf.con.2 not validated (the controllers did not reach any consensus on the acceptance of the alert response time).

OK. Op_Perf-18 The alert resulting from conflict / infringement detection shall be provided to the user well in advance, within a specified time frame, to enable the appropriate remedial action with respect to:
a) Conflict / infringement prediction;
b) Conflict / infringement detection; and
c) Conflict / infringement resolution.

Tech_Cont_13

Technically verified Technically verified Operationally not verified

Technically verified

Verified by Tech_HMI_14

The installed system fulfils the Tech_HMI_14 requirement. Hypothesis Ver.hmi.2 verified. Hypothesis Opf.hmi.2 validated.

OK. Op_Perf-19 The Conflict/Infringement Alert should be displayed continuously while the conflict is detected.

Tech_HMI_14

Technically verified Technically verified Operationally verified

Technically verified


Insufficient data to be verified VA-65 not rejected signif.: ‘Too many unnecessary information alerts were popping up.’ VA-66 not confirmed: ‘I think that all Runway Incursion Alerts are triggered at the right moment.’ VA-68 not rejected signif.: ‘I experienced too many false alerts to work in a safe and efficient way.’

Technical requirement Tech_Cont_12 not assessed (related verification hypothesis: Ver.con.4). Hypothesis Opf.con.3 not validated (the controllers did not reach any consensus on the acceptance of the number of false and nuisance alerts).

OK, but a more accurate configuration and tuning should increase controllers’ satisfaction.

Op_Perf-20 The number of false alerts shall be as low as possible so as not to disturb the user.

VE-13 Tech_Cont_12

Technically not verified Operationally not verified

Not assessed technically Operationally not verified

Insufficient data to assess the Probability of False Alerts. VA-68 not rejected signif.: ‘I experienced too many false alerts to work in a safe and efficient way.’ Comment: Only used in test, not in real traffic.

Technical requirement Tech_Cont_12 not assessed (related verification hypothesis: Ver.con.4). Hypothesis Opf.con.3 not validated (the controllers did not reach any consensus on the acceptance of the number of false and nuisance alerts).

OK. Op_Perf-21 The false alerts shall not impact airport safety.

Tech_Cont_12

Technically not verified Operationally not fully verified

Not assessed technically Operationally not verified

Technically not verified Operationally not verified


Verified by Tech_Gen_02 VA-77 confirmed: ‘Issuing clearances to aircraft is supported well by the A-SMGCS.’ VA-123 confirmed: ‘The A-SMGCS enables me to provide the pilots a better level of service.’ VA-25 confirmed: ‘A-SMGCS helps to issue traffic information.’ VA-27 confirmed: ‘When visual reference is not possible, A-SMGCS facilitates to give traffic information to pilots so that they can avoid other traffic.’

The installed system fulfils the Tech_Gen_02 requirement. No validation hypothesis is associated with this requirement.

OK. Op_Serv-14 A-SMGCS level II shall provide the control service to the users.

Tech_Gen_02

Technically verified Operationally verified

Technically verified Not assessed operationally

Technically verified Operationally verified


Verified by Tech_Cont_02 The installed system fulfils the Tech_Cont_02 requirement (hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the Surface Conflict Alert (SCA) system at Toulouse airport). In EMMA, the use of A-SMGCS will be limited to a ground position. No validation hypothesis is associated with this requirement.

OK. Op_Serv-15 The users of the A-SMGCS control service shall be all control authorities concerned in the manoeuvring area of the aerodrome.

Tech_Cont_02

Technically verified Technically verified Not assessed operationally

Technically verified Operationally verified

Verified by Tech_Cont_03 VA-67 not confirmed: ‘I think that a Runway Incursion monitoring alert function helps me to react in an expeditious and safe manner.’

The installed system fulfils the Tech_Cont_03 requirement (hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the Surface Conflict Alert (SCA) system at Toulouse airport). Technical hypothesis Ver.con.3 not tested. Hypothesis Opf.con.1 partly validated (the hypothesis has been validated for the incursion rules but not for the conflict rules. The controllers consider that the A-SMGCS helps them to detect and prevent incursions on the runway and into restricted areas and stop bar crossings).

OK. Op_Serv-16 The control service shall detect the conflicts/infringements on the runway provided in the annex.

Tech_Cont_03

Technically verified Operationally not fully verified

Technically partly verified Not assessed operationally

Technically verified Operationally not fully verified


Verified by Tech_Cont_06 An alert is given at the CWP whenever an aircraft target enters the area, but not when vehicles or unclassified targets enter the area. VA-17 confirmed signif.: ‘I think that the A-SMGCS surveillance display could be used to determine that an aircraft has crossed a holding position.’ VA-63 confirmed signif.: ‘I think A-SMGCS can help me to detect lit stop bar crossings.’

The installed system fulfils the Tech_Cont_06 requirement (hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the Surface Conflict Alert (SCA) system at Toulouse airport). Technical hypothesis Ver.con.3 not tested. Hypothesis Opf.con.1 partly validated (the hypothesis has been validated for the incursion rules but not for the conflict rules. The controllers consider that the A-SMGCS helps them to detect and prevent incursions on the runway and into restricted areas and stop bar crossings). Moreover, the controllers consider that the A-SMGCS can already help them to detect and prevent incursions on the runway and into restricted areas and stop bar crossings.

OK. Op_Serv-17 The control service shall detect the restricted area incursions caused by an aircraft and/or vehicles into an area such as closed TWY, ILS or MLS critical area, to be defined locally for each aerodrome.

Tech_Cont_06

Technically verified Operationally verified

Technically partly verified Operationally partly verified

Technically verified Operationally verified


Verified by Tech_Supp_08 There are various boundary areas for configuration of the alerting function.

The installed system fulfils the Tech_Supp_08 requirement. Hypothesis Ver.con.2 verified as part of D4.2.1. No validation hypothesis is associated with this requirement.

OK. Op_Serv-18 The runway protection area shall be composed of two boundaries: a ground boundary to detect mobiles on the surface, and an air boundary to detect airborne aircraft.

Tech_Supp_08

Technically verified Technically verified Not assessed operationally

Verified by Tech_Supp_09 There are areas for the runway strips with different widths depending on the selection of normal or LVC.

The installed system fulfils the Tech_Supp_09 requirement. Hypothesis Ver.con.2 verified as part of D4.2.1. Hypothesis Opf.con.2 not validated (the controllers did not reach any consensus on the acceptance of the alert response time).

Op_Serv-19 The length of the ground boundary must at least include the runway strip. The width could be defined, and different, according to the meteorological conditions, e.g. Non-LVP, LVP. As an example based on today’s ILS holding positions:
• In Non-LVP: ground boundary defined by the Cat I holding position;
• In LVP: ground boundary defined by the Cat II / III holding position.
This ground boundary will be used for both prediction and alert stages.

Tech_Supp_09

Technically verified Technically verified Operationally not verified

Technically verified


Verified by Tech_Supp_10 Associated with each area are various attributes, including time to threshold. The values for the EMMA tests at Prague are set to:
• Non-LVP: T1 = 30 s; T2 = 15 s
• LVP: T1 = 45 s; T2 = 30 s

The installed system fulfils the Tech_Supp_10 requirement. Hypothesis Ver.con.2 verified as part of D4.2.1. Hypothesis Opf.con.2 not validated (the controllers did not reach any consensus on the acceptance of the alert response time).

Please refer to the SCA setting parameters.

Op_Serv-20 The air boundary shall be defined as a flight time to threshold and would take into account the two stages of alert, prediction and alert, as well as the meteorological conditions:
• Non-LVP: Prediction around T1 = 30 s, Alert around T2 = 15 s;
• LVP: Prediction around T1 = 45 s, Alert around T2 = 30 s.

Tech_Supp_10

Technically verified Technically verified Operationally not verified

Technically verified
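The two-stage air-boundary logic described by Op_Serv-20 (and configured per Tech_Supp_10) can be sketched as a simple time-to-threshold comparison. Function and parameter names are assumptions for illustration, not the EMMA implementation:

```python
# Two-stage air-boundary check: an arriving aircraft's flight time to the
# runway threshold is compared against T1 (prediction) and T2 (alert), with
# wider margins under LVP, as required by Op_Serv-20.
THRESHOLDS_S = {
    # meteorological conditions: (T1 prediction, T2 alert), in seconds
    "non-LVP": (30.0, 15.0),
    "LVP": (45.0, 30.0),
}

def alert_stage(time_to_threshold_s, conditions="non-LVP"):
    """Return None, 'prediction' (stage 1) or 'alert' (stage 2)."""
    t1, t2 = THRESHOLDS_S[conditions]
    if time_to_threshold_s <= t2:
        return "alert"        # stage 2: typically colour-coded red
    if time_to_threshold_s <= t1:
        return "prediction"   # stage 1: typically colour-coded amber
    return None               # aircraft still outside the air boundary

print(alert_stage(40.0))         # None (non-LVP, outside T1 = 30 s)
print(alert_stage(20.0))         # prediction
print(alert_stage(20.0, "LVP"))  # alert (LVP T2 = 30 s)
```

Whether an alert also fires then depends on the ground-boundary occupancy of the protected runway, which this fragment deliberately leaves out.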

Verified by Tech_Supp_05 Runway status and LVP/non-LVP information is provided by manual input at the CWPs. VA-79 confirmed signif.: ‘The information displayed in the A-SMGCS is helpful for avoiding conflicts.’

The installed system fulfils the Tech_Supp_05 requirement. Hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the Surface Conflict Alert (SCA) system at Toulouse airport. Hypothesis Opf.con.1 partly validated (the hypothesis has been validated for the incursion rules but not for the conflict rules. The controllers consider that the A-SMGCS helps them to detect and prevent incursions on the runway and into restricted areas and stop bar crossings). Moreover, the controllers consider that the information provided by the A-SMGCS is helpful for avoiding conflicts.

OK. Op_Serv-21 For the Conflict/Infringement detection, additional updated and correct traffic context information shall be provided to the system, such as:
• Airport configuration: runways in use, runway status, restricted areas…
• Applied procedures and working methods: LVP, multiple line-ups.

Tech_Supp_05

Technically verified Operationally verified

Technically verified Operationally partly verified

Technically verified Operationally partly verified


Verified by Tech_HMI_13 The alert is indicated by the label(s) of involved target(s) turning amber or red, depending on the type and severity of the alert. VA-17 confirmed signif.: ‘I think that the A-SMGCS surveillance display could be used to determine that an aircraft has crossed a holding position.’

The installed system fulfils the Tech_Cont_02 requirement (hypothesis Ver.con.1 verified as part of the D4.2.1 site acceptance procedure for the Surface Conflict Alert (SCA) system at Toulouse airport). The installed system does not fulfil the Tech_HMI_13 requirement. Hypothesis Opf.con.1 partly validated (the hypothesis has been validated for the incursion rules but not for the conflict rules. The controllers consider that the A-SMGCS helps them to detect and prevent incursions on the runway and into restricted areas and stop bar crossings).

OK. Op_Serv-22 The control service shall alert the users in case of conflict/infringement detection.

Tech_Cont_02 Tech_HMI_13

Technically verified Operationally verified

Technically partly verified Operationally partly verified

Technically verified Operationally verified


Verified by Tech_Cont_07 Both Stage 1 (amber) and Stage 2 (red) alerts are given at the CWP, depending on the severity defined for each situation. Tech_HMI_15 Stage 1 alert is colour-coded amber; Stage 2 is colour-coded red. A window can be configured to pop up showing a list of active alerts, sorted by type.

The installed system fulfils Tech_Cont_08 requirement (hy-pothesis Ver.con.1 verified as part of D4.2.1 site acceptance procedure for the Surface Conflict Alert (SCA) system at Toulouse airport). The installed system fulfils the Tech_HMI_15 requirement. No validation hypothesis associated to this requirement.

OK. Op_Serv-27 The control service shall provide 2 stages of alert.

Tech_Cont_08 Tech_HMI_15

Technically verified Operationally not assessed (but was a basic request by the con-trollers)

Technically verified Not assessed operationally

Technically verified Operationally verified

Neither technically nor operation-ally assessed

The installed system fulfils this re-quirement

OK. Op_Serv-28 Priorities should be established so as to ensure system logic performs efficiently. Conflict alerting priorities should be as fol-lows:

a) Runway incursions. b) Restricted area incursions.

Not assessed technically Not assessed operationally

Not assessed technically Not assessed operationally

Verified by Tech_Supp_07

The installed system fulfils the Tech_Supp_07 requirement. No validation hypothesis associated to this requirement.

OK. Op_Serv-29 In order to efficiently assist ATCo, the auto-mated A-SMGCS control service shall be configurable to adapt to local ATC proce-dures and working methods.

Tech_Supp_07

Technically verified Technically verified Not assessed operationally

Technically verified



Op_Serv-30 — For the conflict/infringement detection, additional updated and correct traffic information, such as mobile velocity, shall be provided to the system.
TRD Ref. & VE-ID: Tech_Surv_05
PRG: Technically verified / Operationally verified. Verified by Tech_Surv_05. VA-75 confirmed with a significant result: 'A-SMGCS provides the right information at the right time.' VA-79 confirmed with a significant result: 'Displayed information helps to avoid conflicts.'
TLS: Technically not verified / Not assessed operationally. The installed system does not fulfil the Tech_Surv_05 requirement. No validation hypothesis is associated with this requirement.
MXP: Technically verified / Operationally verified. OK.

Table 8-6: Results Control Service Requirements


APPENDIX C ADDITIONAL RESULTS FOR OPERATIONAL REQUIREMENTS

Most of the operational requirements have been verified by technical tests or checks (see Appendix B). For some of them, operational feasibility was additionally tested by asking the controllers who actively worked with the system. These additional results have been used to fully verify the requirements, to reject the system performance in this respect, or to question a requirement in terms of its validity. Two examples of operational requirements whose validity has been questioned, 'EMMA Op_Perf-15' and 'EMMA Op_Perf-01', are discussed in the tables below. The first two lines of Table 8-7 and Table 8-8 give the source of the requirement with respect to ICAO and EUROCAE. These two lines are followed by the derived EMMA requirements, operational and technical respectively. Then the results of the technical tests are reported, followed by the results of the respective debriefing questions, which demonstrate the operational feasibility of the requirement.

With the first example, 'Reported Position Accuracy' (Table 8-7), the requirement 'Op_Perf-15: The allowable error in reported position shall be consistent with the requirements set by the Control service: 7.5 m when the position is used by the conflict/infringement detection process' has been verified both technically (at all three test sites the measured position error was below 7.5 m) and operationally (operational feasibility was confirmed by the controllers' acceptance of the position accuracy they experienced). From these results it can be concluded that the operational requirement has been fully verified, technically and operationally, but nothing unambiguous can be said about its validity. Possibly even a lower performance (RPA > 7.5 m) would be accepted by the ATCos, but this could not be tested, and thus no statement on the validity of a lower performance can be given. A lower performance that still met the users' needs might reduce implementation costs. However, the technical tests at three different airport sites showed that a position accuracy of 7.5 m can be met today with standard equipment. Therefore, it is recommended to keep 7.5 m as the minimum performance requirement for the 'Reported Position Accuracy' for implementation levels higher than Level 1. Nevertheless, it must be stated that even lower performances might be valid if they meet the users' needs.

Source ‘Reported Position Accuracy’ Requirement

ICAO doc9830 §4.2.3 The actual position of aircraft, vehicles and obstacles on the surface should be determined within a radius of 7.5 m.

EUROCAE-MASPS §3.2.3 Recommendation for a minimum 'Reported Position Accuracy' performance of 12 m at system Level 1 and 7.5 m at system Level 2 and all higher levels.


EMMA Op_Perf-05 For the surveillance service, the allowable error in reported position shall be consistent with the requirements set by the control task of the controller: 12m.

EMMA Op_Perf-15 The allowable error in reported position shall be consistent with the requirements set by the Control service: 7.5m when the position is used by the conflict / infringement detection process.

EMMA TRD Tech_Surv_26 The reported position accuracy of the surveillance data transmitted from the SDF to clients should be 7.5m or better at a confidence level of 95%.

Technical Tests (Verification) * indicates a significant, positive result

EMMA Index

Name of the Indicator Results at PRG / MXP / TLS

VE-5 Reported Position Accuracy (RPA) RPA = 3.2m – 7.5m*

Operational Feasibility (Validation)

EMMA Index

Statement of the Questionnaire Mean answers of the Controllers (1 - 6)

VA-1 When visual reference is not possible, the displayed position of the aircraft in the runway sensitive area is accurate enough to exercise control in a safe and efficient way.

M = 5.1*

VA-2 When visual reference is not possible, the displayed position of vehicles in the runway sensitive area is accurate enough to exercise control in a safe and efficient way.

M = 4.7*

VA-3 When visual reference is not possible, the displayed position of the aircraft on the taxiways is accurate enough to exercise control in a safe and efficient way.

M = 5.4*


Comments:
• Today, a static RPA of 7.5 m can easily be met technically (all three test sites proved it)
• Controllers accepted this performance as meeting their operational needs
• However, 12 m or even 20 m could also be sufficient for some operational needs (e.g. surveillance only)

Recommendations:
• Keep the ICAO requirement of 7.5 m (ICAO doc9830 §4.2.3 = validated)
• But allow lower accuracy if the users accept it as meeting their operational needs

Table 8-7: Discussion of Operational Performance Requirements - Example 1: RPA
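The 95%-confidence 'Reported Position Accuracy' figure used in Tech_Surv_26 and VE-5 can be checked with a short script. The following Python sketch (function name and sample data are illustrative, not part of the EMMA tool chain) computes the 95th-percentile radial error between reported and reference positions and compares it against the 7.5 m threshold:

```python
import math

def rpa_95(reported, truth):
    """Reported Position Accuracy at 95% confidence: the 95th percentile
    of the radial error between reported and reference positions (metres)."""
    errors = sorted(
        math.hypot(rx - tx, ry - ty)
        for (rx, ry), (tx, ty) in zip(reported, truth)
    )
    # Nearest-rank 95th percentile
    idx = max(0, math.ceil(0.95 * len(errors)) - 1)
    return errors[idx]

# Illustrative samples (metres, local Cartesian coordinates)
reported = [(0.0, 3.0), (2.0, 2.0), (5.0, 1.0), (1.0, 0.5), (4.0, 3.0)]
truth    = [(0.0, 0.0), (0.0, 0.0), (0.0, 0.0), (0.0, 0.0), (0.0, 0.0)]

value = rpa_95(reported, truth)
print(f"RPA(95%) = {value:.1f} m, requirement met: {value <= 7.5}")
```

In a real assessment the reference positions would come from an independent high-accuracy source (e.g. D-GPS-equipped test vehicles) rather than fixed points.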

With the second example, 'Probability of Detection (PD)' (Table 8-8), the technical requirement could only partly be verified. Besides the technical short-term tests (as described by EUROCAE ED-87A §4.6.2 [17]), long-term tests were conducted. Both methods revealed performance shortfalls at all three test sites. Although these performances were lower than 99.9%, the ATCos accepted the experienced performance as meeting their operational needs. This may be because the PD performance depends strongly on different environmental circumstances, which are finally merged into one single performance value. In particular, the long-term assessment tool was used to identify the different performances caused by weather conditions, the area of the aerodrome, the kind of movement, and the equipment status of the movement. The long-term assessment tool also helped to identify the number and duration of detection gaps, which are of serious operational interest.

These long-term results showed that the PD on the runway and the PD of co-operative movements is almost always nearly 100%. The runway and aircraft movements (which are almost always co-operative) are of most interest to the ATCos. This implies that a single value for the overall 'Probability of Detection' performance of a surveillance function does not adequately represent the wide variety of operational requirements on detection performance. Therefore, it is recommended that the PD parameter should at least distinguish between:

• Aerodrome area (RWY, TWY, Apron)

• Movements (aircraft, vehicle, unknown)

• Number and durations of gaps
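The recommended breakdown can be illustrated with a small script. In the Python sketch below, the record format and the numbers are invented for illustration (the actual EMMA long-term assessment tool works on recorded track data); it simply computes a separate PD per aerodrome area and object type instead of one merged value:

```python
from collections import defaultdict

def pd_by_category(samples):
    """Probability of Detection broken down by (aerodrome area, object type).

    `samples` is a list of (area, kind, detected) tuples, one per
    surveillance update opportunity; `detected` is True when the
    movement was actually reported by the surveillance output."""
    seen = defaultdict(lambda: [0, 0])  # category -> [detected, total]
    for area, kind, detected in samples:
        seen[(area, kind)][1] += 1
        if detected:
            seen[(area, kind)][0] += 1
    return {cat: det / tot for cat, (det, tot) in seen.items()}

# Invented records: runway aircraft detected almost always,
# apron vehicles considerably less often.
samples = (
    [("RWY", "aircraft", True)] * 999 + [("RWY", "aircraft", False)] * 1 +
    [("Apron", "vehicle", True)] * 870 + [("Apron", "vehicle", False)] * 130
)

for cat, pd in sorted(pd_by_category(samples).items()):
    print(f"{cat[0]:>5} / {cat[1]:<8}: PD = {pd:.1%}")
```

With such a breakdown, a near-100% runway/aircraft PD is no longer hidden behind a lower aerodrome-wide average.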


Source ‘Probability of Detection’ Requirement

ICAO doc9830 No performance requirement, but recommendations are given on how to prove it.

EUROCAE-MASPS §3.2.3 Recommendation for a minimum probability of detection performance of 99.90% at each of the system levels.

EMMA Op_Perf-01 The probability that an actual aircraft, vehicle, or obstacle is detected and reported at the output of the surveillance element of the A-SMGCS shall be 99.9% at minimum.

EMMA TRD Tech_Surv_35 The probability that an actual aircraft, vehicle, or object is detected and reported at the output of the SDF should be 99.9% at minimum.

Technical Tests (Verification)

EMMA Index

Name of the indicator Results at PRG / MXP / TLS

VE-2 Probability of Detection (PD) PD (short) = 99.7 – 99.9%; PD (long) = 86.5 – 99.9%

Operational Feasibility (Validation)

EMMA Index

Questionnaire Item Mean answers of the Controllers (1 - 6)

VA-55 When visual reference is not possible, I think the A-SMGCS surveillance display can be used to determine whether the runway is clear in order to issue a landing clearance.

M = 5.3*

VA-36 I can rely on A-SMGCS when giving taxi clearances even when visual reference is not possible.

M = 4.9*

VA-48 When an intersection is not visible, line-up from this intersection could be applied in a safe way when using A-SMGCS.

M = 5.1*


Comments:
• A probability of detection of 99.9% could not easily be met
• However, controllers accepted a lower PD performance as meeting their operational needs
• The technical performance requirement 'probability of detection' proved not specific enough to capture the manifold operational needs

Recommendation:
• Op_Perf-01 should be replaced by a comprehensive set of PD requirements with respect to:
  o Aerodrome area (RWY, TWY, Apron)
  o Objects (aircraft, vehicle, unknown)
  o Number and durations of gaps (Matrix of Detection; EMMA VE-17, [12])

Table 8-8: Discussion of Operational Performance Requirements - Example 2: PD

The following Table 8-9 shows an example of such a 'Matrix of Detection'. This matrix has proved useful for gaining further insight into the PD performance. For each valid track, the number of detection gaps and their durations are counted and expressed as a percentage of the total number of valid tracks. These additional quantitative results are important for assessing the operational PD performance: on the runway, for instance, the ATCos would rather accept ten one-second detection gaps than one gap lasting 10 seconds, although both cases would yield the same PD value.

Number of gaps          Duration of Detection Gaps
per track        1 sec   2 sec   3 sec   4 sec   5 sec   >5 sec   Total
0                 N/A     N/A     N/A     N/A     N/A     N/A       %
1                  %       %       %       %       %       %        %
2                  %       %       %       %       %       %        %
3                  %       %       %       %       %       %        %
4                  %       %       %       %       %       %        %
5                  %       %       %       %       %       %        %
>5                 %       %       %       %       %       %        %
Total              %       %       %       %       %       %     100.00 %

Table 8-9: Example for the ‘Matrix of Detection’
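A matrix like Table 8-9 can be filled from recorded tracks. The Python sketch below uses one possible counting rule, classifying each valid track by its number of gaps and the duration of its longest gap; the authoritative procedure is the one defined for EMMA VE-17 [12], and the sample tracks are invented:

```python
import math

def bucket(value, cap=5):
    """Map a gap count or gap duration onto the table buckets 0..cap and '>cap' (cap + 1)."""
    return value if value <= cap else cap + 1

def detection_matrix(tracks):
    """tracks: one list of detection-gap durations (seconds) per valid track.
    Returns {(gap_count_bucket, duration_bucket): share of valid tracks}.
    Duration bucket 0 means the track had no gaps at all."""
    counts = {}
    for gaps in tracks:
        n = bucket(len(gaps))
        d = bucket(math.ceil(max(gaps))) if gaps else 0
        counts[(n, d)] = counts.get((n, d), 0) + 1
    total = len(tracks)
    return {cell: c / total for cell, c in counts.items()}

# Five invented tracks: two gap-free, one with a 1 s gap, one with 2 s and
# 1 s gaps, and one with a 12 s gap (falls into the '>5 sec' column).
tracks = [[], [1.0], [2.0, 1.0], [12.0], []]
for (n, d), share in sorted(detection_matrix(tracks).items()):
    gaps = ">5" if n > 5 else str(n)
    dur = "none" if d == 0 else (">5 s" if d > 5 else f"{d} s")
    print(f"{gaps} gap(s), longest {dur}: {share:.0%}")
```

Because the cells are normalised by the number of valid tracks, the shares across all cells sum to 100%, matching the 'Total' cell of Table 8-9.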