Ref. : TEC/….
eDEP_STP.doc
Date : DD/MM/YYYY
Page : i
EUROCONTROL
STATUS TITLE Company Ref. Template : tec01.dot
EUROCONTROL EXPERIMENTAL CENTRE Brétigny-sur-Orge, FRANCE
eDEP Technical Document
eDEP Acceptance Plan
Document Ref: eDEP-TP-v01    Issue date: 5 December 2003
The information contained in this document is the property of the EUROCONTROL Agency and no part should be reproduced in any form without the Agency’s permission.
The views expressed herein do not necessarily reflect the official views or policy of the Agency.
Document Change Log

Release  Author    Date of release  Description of release                    Modifications (sections affected and relevant information)
0.1      D. Smith                   Initial Draft
0.2      D. Smith                   Major Update                              Coordination
0.3      D. Smith                   Following M1                              CWP updates; integration testing updates
0.4      Graffica  5th Dec. 2003    Updated following datalink functionality  Added datalink test cases
Acceptance and Reviewing Procedures

Name(s)        Date of acceptance/review    Date of approval
Darren Smith
Mike Vere
Document distribution

to/cc  Name            Role
to     Mike Vere       Graffica Technical Lead
to     Sophie Carlier  eDEP Team Member
Table of contents

1. INTRODUCTION
   1.1 PURPOSE
   1.2 RESPONSIBILITIES
   1.3 DOCUMENT STRUCTURE
   1.4 ASSOCIATED DOCUMENTATION
       1.4.1 Internal project documentation
       1.4.2 General documentation
   1.5 OUTSTANDING ISSUES
   1.6 GLOSSARY
2. PROJECT OVERVIEW
   2.1 BACKGROUND
   2.2 DEVELOPMENT PROCESS
   2.3 SOFTWARE & SOFTWARE TESTING
   2.4 PROJECT MILESTONES
   2.5 ACCEPTANCE OVERVIEW
   2.6 QUALITY RELATED MILESTONES (REVIEW, TEST)
   2.7 ACCEPTANCE PROTOCOL
3. DELIVERY REVIEWS
   3.1 PRELIMINARY DESIGN REVIEWS
       3.1.1 Design Completeness
       3.1.2 Design Confidence
       3.1.3 AVENUE Issues
       3.1.4 Other Issues
   3.2 CRITICAL DESIGN REVIEWS
   3.3 CODE REVIEWS
   3.4 QUALITY REVIEWS
4. UNIT TESTING
5. INTEGRATION TESTING
   5.1 GENERAL PRINCIPLES
   5.2 TEST COVERAGE
       5.2.1 Test Coverage Tables
       5.2.2 Improved CWP tools for test verification
   5.3 REQUIRED INTEGRATION TESTING
6. VALIDATION TEST PLAN
   6.1 INTRODUCTION
   6.2 TEST ITEMS
       6.2.1 Documentation
       6.2.2 Software
   6.3 FEATURES TO BE TESTED
   6.4 FEATURES NOT TO BE TESTED
   6.5 APPROACH
   6.6 DEPENDENCIES
   6.7 ACCEPTANCE PROTOCOL
   6.8 TEST DELIVERABLES
   6.9 TESTING TASKS
   6.10 ENVIRONMENT NEEDS
       6.10.1 Hardware Requirement
       6.10.2 Software Requirement
       6.10.3 Tools
       6.10.4 Data Requirement
   6.11 RESPONSIBILITIES
   6.12 STAFFING
   6.13 SCHEDULE
   6.14 RISKS
7. TEST DESIGN SPECIFICATION
   7.1 SCENARIO DATA
       7.1.1 Static Data Overview
       7.1.2 Traffic Data Overview
       7.1.3 Platform Configuration
   7.2 PHASE 1
   7.3 PHASE 2
   7.4 PHASE 3
8. TEST CASE SPECIFICATION
   8.1 STC TSPV (TECHNICAL SUPERVISION) FAMILY
       8.1.1 STC TSPV 010 – Installation / Launch
       8.1.2 STC TSPV 020 – Time Control
   8.2 STC_GRD_TP FAMILY
       8.2.1 STC_GRD_TP 010 – Trajectory Calculation
   8.3 STC_GRD_FM FAMILY
       8.3.1 Introduction
       8.3.2 STC_GRD_FM_RTE – Route Expansion
       8.3.3 STC_GRD_FM_CON – Constraint Application (From IFPLs)
       8.3.4 STC_GRD_FM_SEC – Sector List Computation
       8.3.5 STC_GRD_FM_UNT – Unit List Computation
       8.3.6 STC_GRD_FM_RECALC –
       8.3.7 STC_GRD_FM_HDG – Heading Orders (Closed) / Re-directs
       8.3.8 STC_GRD_FM_CFL – Cleared Flight Level Orders
       8.3.9 STC_GRD_FM_DIR – Direct To Orders
       8.3.10 STC_GRD_FM_OPEN – Open-ended HDG Orders
       8.3.11 STC_GRD_FM_SPD – Speed Control Orders
   8.4 STC_GRD_COORD
       8.4.1 Overview
       8.4.2 OLDI Terminology
       8.4.3 Test Case Scenarios
       8.4.4 STC_GRD_COORD_ACT – Initial Co-ordination Activation from Assumed Sector
       8.4.5 STC_GRD_COORD_ACT(ADV) – Initial Co-ordination from an ‘Advanced’ TS
       8.4.6 STC_GRD_COORD_XFL_ASS – XFL Changes within an Assumed Transferring Sector
       8.4.7 STC_GRD_COORD_XFL_ADV – XFL Changes within an Advanced Transferring Sector
       8.4.8 STC_GRD_COORD_EFL – Entry Flight Level Co-ordination
       8.4.9 STC_GRD_COORD_DIR – DirectTo Co-ordination
       8.4.10 STC_GRD_COORD_RED – Heading Co-ordination (re-direct)
       8.4.11 STC_GRD_COORD_HDG – Open-ended heading
       8.4.12 STC_GRD_COORD_SPD – Speed Control
       8.4.13 STC_GRD_COORD_ABR – Abrogation Examples
       8.4.14 STC_GRD_COORD_CTL – Transfer Of Control
       8.4.15 STC_GRD_COORD_UNM – Unmanned Positions
       8.4.16 STC_GRD_COORD_TIM – Time-out related test cases
       8.4.17 STC_GRD_COORD_PHONE – Manual co-ordination via telephone
   8.5 STC_GRD_FPM
   8.6 STC_GRD_STCA
   8.7 STC_GRD_MTCD
   8.8 STC_GRD_IFPL / ASP
   8.9 STC_CWP
       8.9.1 Overview
       8.9.2 STC_CWP_DES – Design related issues
       8.9.3 STC_CWP_PVD – General PVD tests
       8.9.4 STC_CWP_PERF – Performance Tests
       8.9.5 STC_CWP_TOOL – ToolBox
       8.9.6 STC_CWP_SIL
       8.9.7 STC_CWP_MSG – Message Windows
       8.9.8 STC_CWP_LAB – Radar Track (Flight Label, Symbol, Spd Vector etc.) Appearance
       8.9.9 STC_CWP_INT – Radar Track (Label, Symbol) Interaction
   8.10 STC_AIR
       8.10.1 STC_AIR_NAV – Navigation Start
       8.10.2 STC_AIR_CFL – CFL Orders
       8.10.3 STC_AIR_SPD – Speed Orders
       8.10.4 STC_AIR_HDG – Heading Orders
       8.10.5 STC_AIR_TRN – Turn Orders
       8.10.6 STC_AIR_DIR – Direct To Orders
       8.10.7 STC_AIR_RES – Resume Normal Navigation
       8.10.8 STC_AIR_CTL – Flight Control
       8.10.9 STC_AIR_HMI – Specific HMI issues
   8.11 HYBRID CWP?
   8.12 PERFORMANCE TESTS
   8.13 DATALINK TESTS
       8.13.1 STC_DLINK_ACM
       8.13.2 STC_DLINK_ACLUP
       8.13.3 STC_DLINK_ACLDOWN
       8.13.4 STC_DLINK_CAP
       8.13.5 STC_DLINK_PPD
9. TEST PROCEDURES
   9.1 SCENARIO VERIFICATION
   9.2 LABEL STATES
   9.3 CO-ORDINATION ISSUES
   9.4 SIL WINDOW
   9.5 EXTREME CASES

Index of figures

Index of tables
1. INTRODUCTION
1.1 Purpose

This document describes the mechanisms and procedures through which the eDEP project manager shall accept the Graffica deliverables.
The document contains elements of both an IEEE Software Quality Assurance Plan (SQAP) and a Software Test Plan (STP). These standard templates have been tailored to the needs of eDEP, a small development project.
The document (in its current release) focuses on the Graffica / Eurocontrol interface, and less on the internal Graffica quality issues.
1.2 Responsibilities

Eurocontrol is responsible for this document, with significant contributions from Graffica Ltd.

Graffica is specifically expected to provide information (either within this document or within a separate SQAP) concerning code reviews, partial unit testing, and integration testing.

1.3 Document structure

The document is divided into *** main chapters, the first being this introduction:
1.4 Associated documentation
1.4.1 Internal project documentation
Title                                                  Reference                        Date
[Ref 1]  TRS 221 / 2001                                TRS 221 / 2001                   27/11/01
[Ref 2]  TRS Requirements Document                     TRS_eDEP_2002_ANNEX_v1.0         26/11/01
[Ref 3]  eDEP Architecture Document
[Ref 4]  eDEP Design Document
[Ref 5]  eDEP User Manual
[Ref 6]  eDEP PMP
[Ref 7]  Qinetiq Technical Proposal for TRS 221/2001   Qinetiq/KIS/AMSD/MKTG/3/1/181    Dec 2001
1.4.2 General documentation
Title                                                  Authors                  Date
[Ref 8]  MASS v9.0 HMI Specification                                            v1.0, July 2000
[Ref 9]  SEU Software Testing Guidelines               Boulle & Legris          15/04/99 (v0.2)
[Ref 10]  IEEE Standard for Software Test Documentation        IEEE 829-1998
[Ref 11]  IEEE Standard for Software Quality Assurance Plans   IEEE 730-1998
[Ref 12]  OLDI Specification                                   P. Bailey        v2.3
          http://www.eurocontrol.int/projects/eatchip/odt/documents/standards/oldi_e23.zip
[Ref 13]  EATMP Generic HMI Specification                      STERIA / EUROCONTROL  10 March 2000
[Ref 14]  EATMP HMI Web Site
          https://www.eurocontrol.be/hmi
[Ref 15]  ESCAPE SRD for Flight Manager                        J. Vigner & J.C. Perrin  30/08/01
[Ref 16]  DSI HMI Specification Chp 10 – Sysco                 Poul Stevns      V1.3, July 2000
1.5 Outstanding Issues

EATMP Dynamic specification – similar to DSI?
The exact behaviour of Direct-to is a little shaky
1.6 Glossary

AS Assuming Sector (i.e. downstream sector)
ASP Airspace Server
CDR Critical Design Review
CWP Controller Working Position
FPM Flight Plan Monitor
IFPL Initial Flight Plan
LAB Label
LoA Letters of Agreement
MIL Message In List (i.e. Co-ordination In List)
MOL Message Out List (i.e. Co-ordination Out List)
MTCD Medium Term Conflict Detection
PDR Partial Design Review
PWP Pilot Working Position
SIL Sector Inbound List
SQAP Software Quality Assurance Plan
STCA Short Term Conflict Alert
STP Software Test Plan
TP Trajectory Predictor
TS Transferring Sector (i.e. upstream sector)
VAW Vertical Assistance Window
XFL eXit Flight Level
2. PROJECT OVERVIEW
2.1 Background

The eDEP project shall produce an open, reusable ATM Java toolkit that facilitates the development of specific ATM demonstrators. Hence, the EEC shall place great emphasis on the quality of the software architecture.
The eDEP platform has been under development since April 2001 by Graffica UK Ltd. The 2001 platform consists of approximately 500 Java classes, and concentrates on the GRD / CWP functionality. This software shall be improved and extended during this 2002 TRS.
2.2 Development Process

The 2001 development process was bottom-up and spiral-based. The development iteratively focused on ‘grey’ areas (of relatively high risk), setting out the foundations of a reusable ATM framework. This work also identified the major workflows for 2002.

The 2002 contract shall be based on the iterative waterfall approach. That is, we have the classical waterfall stages of requirements and overall architecture design, followed by iterations of detailed design, implementation and test.
Hence, the acceptance (review, test) procedures shall also be iterative, being applied at key milestone delivery points.
Note: much of the User Requirements / Software Requirements, and to a certain degree the Architecture, were imposed by the TRS (EATMP HMI, ESCAPE-like architecture).
2.3 Software & Software Testing

The eDEP platform is 100% Java, built upon the JDK 1.3 SDK.

From a functional point of view, eDEP contains the following components:
• AIR subsystem
  • PM (Pilot Manager)
  • PWP (Pilot Working Position)
• GRD subsystem
  • ASP (Airspace Component)
  • TME (Time Component)
  • ATG (Air Traffic Generator) / IAS (Integrated Air Surveillance)
  • IFPL (Initial Flight Plan Component)
  • TP (Trajectory Predictor Component)
  • FM (Flight Manager Component)
  • COORD (intra-centre Co-ordination Component)
  • MTCD (Medium Term Conflict Detection Component)
  • STCA (Short Term Conflict Alert Component) [new component]
  • FPM (Flight Path Monitoring)
• CWP subsystem
  • HIPS Conflict Zone Engine
  • PVD (Plan View Display)
  • VAW (Vertical Aid Window) …
The eDEP project is concerned with building a reusable, generic toolkit. From an acceptance / testing point of view, such software is troublesome to evaluate, since by definition toolkits/frameworks are not complete applications. Hence, answering the following questions is not easy:

• is it really reusable?

• is it really generic?

• is it ‘bug-free’?
In order to answer such questions, the toolkit shall be used to construct an ‘example’ application, consisting of an EATMP-like CWP, OLDI-like co-ordination, and an ESCAPE-like AR40 Flight Manager. This application shall then be evaluated against the above criteria.
2.4 Project Milestones

The eDEP platform is proposed in a number of incremental versions:
• eDEP Standalone Edition (Q1 2002)
  • light-weight demonstrator facility
  • ideal for portable PC, PC or web-based demonstrations
  • consists of single CWP and FDPS components
  • already partly developed (80%) – brought as input to the eDEP project
• eDEP Experimentation Edition (2002)
  • builds upon the Standalone Edition
  • provides distribution support for multiple CWPs (maximum 15)
  • provides a simple preparation tool (EEC development)
  • provides a simple AIR subsystem with piloting HMIs
• eDEP Integrated Edition (2003) [beyond current TRS]
More specifically, the delivery schedule is as follows (T0 = 7th January 2002):
ID   Due Date (T0 +)   Work Package                                   Deliverable Description

M1   1.5 months        Standalone Edition (1st Delivery)
     Architecture Document
     Draft Design Document (focusing on FM, COORD, CWP design issues)
     Draft Test Plan Document (developed in collaboration with the EEC)
     Initial prototype delivery

M2   3 months          Standalone Edition (2nd Delivery)
     Updated documentation (Architecture, Design, Test, and User Manual)
     Tested software (GRD and CWP functionality)1
     • full ATC object model
     • ATC core services upgrade (COORD, FM)
     • EATMP CWP

M3   4.5 months        Experimentation Edition (1st Delivery)
     Architecture, Design and Test Plan Document draft updates for
     • AIR focus (HMI + Pilot Manager)
     • Distribution Support
     • ATC Tool Services Upgrade (STCA, FPM)

M4   7 months          Experimentation Edition (2nd Delivery)
     Document updates
     Software (GRD focus)
     • Distribution Support
     • ATC Tool Services Upgrade (STCA, FPM)
     Software (AIR focus)
     • Pilot HMI (PWP) & initial Pilot Manager logic
     • CWP upgrade (e.g. feed / non-feed issues)

M5   10 months         Experimentation Edition (General Availability)
     Document updates
     Full software delivery, including full Pilot Manager logic (inc. basic 4D TP) & performance issues
This test plan describes the testing procedures to be applied for each of these deliverables.
2.5 Acceptance Overview

The following diagram illustrates the development process (iterations not shown) with the intended acceptance checkpoints.
1 AIR behaviour is simulated through the FM (i.e. the aircraft immediately flies controller-entered orders)
[Figure: V-model development process with acceptance checkpoints – User Requirements (URD), Software Requirements (SRD / TRS Annex), Architecture Design (ADD, with PDR), Detailed Design (DDD, with CDR), Code (with Code Review), Unit Tests, Integration Tests, System Tests (System Testing), Acceptance Tests (with Quality Review)]
The eDEP project is strongly influenced by existing systems (ESCAPE, AVENUE) and existing standards (EATMP). Hence, the classical User Requirements and Software Requirements phases were outside the scope of this outsourcing project. In fact, the software requirements, in the form of a TRS Annex, were provided as input to the project.

The EEC intends to perform the following reviews:
• Partial Design Reviews for each major release of the Architecture Design Document (mainly milestones M1 & M3)
• Critical Design Reviews for each release of the Detailed Design Document (mainly in milestones M1, M3)
• limited code reviews (random selection of code)
• Document Quality reviews (consistency, coherency, etc) for stable document releases (mainly at milestones M2, M4, M5)
The EEC intends to perform the following tests:
• System Tests – using a pre-prepared ATC scenario, with the example ATM application built upon the toolkit.
Graffica is expected to perform the following internally (as outlined in the Tender Offer):
• internal design reviews (PDR, CDR)
• internal code reviews
• internal unit testing of certain key areas / algorithms2 (TP, AWS)
• internal integration testing3
2 Full unit testing is not feasible given the project budget.
3 Assuming a focus on the FM/COORD/CWP triple. Note that integration testing is essential if certain functionality is neither testable nor observable through the CWP HMI.
What proof is expected of this? (e.g. test programs (JUnit, profile viewer), test logs, summary reports?)
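As a sketch of the kind of machine-checkable evidence envisaged here, the following minimal self-checking test program (plain Java, no external dependencies; in practice a JUnit test case would play the same role) shows how a small algorithm could be unit-tested while producing a pass/fail log. The class and the `flightLevelToFeet` routine are invented for illustration; they are not part of the eDEP code base.

```java
// Purely illustrative: sketches the kind of test program and pass/fail
// log that could be supplied as unit-testing evidence. All names here
// are hypothetical, not taken from the actual eDEP software.
public class FlightLevelConverterTest {

    // Hypothetical unit under test: a flight level (hundreds of feet)
    // converted to an altitude in feet, as a TP or FM helper might do.
    public static int flightLevelToFeet(int flightLevel) {
        return flightLevel * 100;
    }

    private static int failures = 0;

    // Minimal check helper: prints one PASS/FAIL line per case, giving
    // a human-readable test log as a side effect.
    private static void check(String name, int expected, int actual) {
        if (expected == actual) {
            System.out.println("PASS " + name);
        } else {
            System.out.println("FAIL " + name + ": expected " + expected
                    + ", got " + actual);
            failures++;
        }
    }

    public static void main(String[] args) {
        check("FL100 gives 10000 ft", 10000, flightLevelToFeet(100));
        check("FL345 gives 34500 ft", 34500, flightLevelToFeet(345));
        check("FL0 gives 0 ft", 0, flightLevelToFeet(0));
        System.out.println(failures == 0 ? "ALL TESTS PASSED"
                : failures + " test(s) FAILED");
        if (failures > 0) System.exit(1); // non-zero exit signals failure to a build script
    }
}
```

Captured console output from such a run, plus a short summary report, would be one concrete form the requested proof could take.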
2.6 Quality Related Milestones (Review, Test)

The following tasks / milestones are added to the project.
ID Quality Task Start Date / Duration Notes
MQ1 PDR of Architecture Design (M1) 28/01/02 / 2 weeks Major PDR – see section 3.1
MQ2 CDR of Detailed Design (M1) 11/02/02 / 1 week Major CDR – see 3.2
Special focus on FM/COORD issues
Special focus on CWP re-use issues
MQ3 Initial Code Review (M1) 18/02/02 / 1 week Major litmus test on code qualityFocus on AWS, FM, ASP.Early code reviews permit corrective actions to be done.
MQ4 Validation Testing (M2) 02/04/02 / 2 weeks
MQ5 Document Quality Check (M2) 02/04/02 / 1 week
MQ6 PDR of Architecture Design update (M3)
20/05/02 / 1 week Minor PDR- focus on new components
MQ7 CDR of Detailed Design Update (M3)
20/05/02 / 1 week Major CDR – focus on AIR (PM, PWP)
MQ8 Validation Testing (M4) 01/08/02 / 2 weeks focus on AIR, and any updated GRD components
MQ9 Document Quality Check (M4) 01/08/02 / 1 week
MQ10 Code Review (M4) 15/08/02 / 1 week
MQ11 Full Validation for acceptance (M5)
04/11/02 / 2 weeks
MQ12 Document Quality Check (M5) 04/11/02 / 1 week
MQ13 Code Review (M5) 04/11/02 / 1 week
These tasks represent EUROCONTROL led acceptance tasks, relative to official TRS milestone payments. Internal Graffica reviews/tests are not shown.
2.7 Acceptance Protocol The eDEP project manager and his EUROCONTROL team shall be responsible for the above Quality related tasks. The eDEP project manager is responsible for analysing the test/review results, thus leading to either milestone payment or corrective actions (for Graffica).
The Graffica Technical Lead shall be on-site during validation testing (not necessarily for the full 2 weeks).
Graffica shall provide evidence of internal unit / integration testing.
3. DELIVERY REVIEWS
3.1 Preliminary Design Reviews With the TRS Annex (SRS) as input, the first main activity is Architecture Design, producing the ADD. The overall architecture is heavily influenced by the existing systems, ESCAPE and AVENUE. Hence the Preliminary Design Review (PDR) should be relatively straightforward, mainly focusing on the inter-component interfaces to ensure
• the interfaces and their data flows are sufficient to achieve the software requirements
• the interfaces follow the AVENUE model.
3.1.1 Design Completeness The TRS Annex (and in particular the EATMP Generic HMI Requirements and AIR subsystem requirements) shall be used to verify that each requirement can be allocated to a given architectural element, and in particular the presented external component interfaces.
3.1.2 Design Confidence A number of important test use cases shall be manually ‘executed’ within the design to observe the expected system behaviour. This should present a certain degree of system-wide design confidence.
The ESCAPE FM SRS shall be used to design a set of cases.
3.1.3 AVENUE Issues With respect to AVENUE the following is expected,
• eDEP functional components correspond to AVENUE functional components
• eDEP APIs and Data Dictionary are ‘similar’ to AVENUE APIs/DDs4
Intended areas of focus include the FM/COORD/CWP triple. The eDEP components shall be examined with respect to the following data types,
• IFPL Data : AvFlplt::InitialFP + AvFlplt::BasicRouteFP (expanded points)
• Constraints Data : AvCstrt::ConstraintsOfFP
• TRAJ Data : AvTraj::TrajectoryFP
• Sector List : AvSectt::SectorsFP
• Co-ordination data : AvCdnmTypes::CoordInfo
• Correlation Data : (Not really necessary)
• SSR Chg Data : (Not really necessary)
4 Full AVENUE compliance is not realistic for a prototyping platform. However, it should be possible to plug eDEP CWPs into an AVENUE compliant GRD subsystem without huge amounts of effort.
3.1.4 Other Issues The proposed design shall be analysed from a performance point of view. That is, for a given use case (e.g. a controller order), what is the total number of generated messages, and what factors contribute to their number and size (e.g. number of route points, number of CWP positions, etc.)?
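As a rough illustration of this kind of analysis, the sketch below counts messages and payload size for a controller order. The formula and the byte sizes are assumptions for illustration only; they are not the eDEP design.

```java
// Hypothetical message-cost model for a controller order, for design-review
// discussion only. The contributing factors (route points, CWP positions)
// come from the STP text; the exact formula and sizes are assumptions.
public class MessageCostModel {

    /** One order message in, one TP exchange, plus one flight-data
     *  broadcast per CWP position. */
    static int messagesForOrder(int cwpPositions) {
        return 1             // controller order CWP -> FM
             + 1             // FM <-> TP trajectory recalculation (counted once)
             + cwpPositions; // updated flight data broadcast to each CWP
    }

    /** Rough payload size: fixed header plus one record per expanded route point. */
    static int payloadBytes(int routePoints) {
        final int HEADER = 64, PER_POINT = 32; // assumed sizes
        return HEADER + PER_POINT * routePoints;
    }

    public static void main(String[] args) {
        System.out.println("messages for order, 4 CWPs: " + messagesForOrder(4));
        System.out.println("payload, 20 route points  : " + payloadBytes(20) + " bytes");
    }
}
```

Such a model makes it easy to see which factor dominates as the number of CWP positions or route points grows.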
3.2 Critical Design Reviews Following the delivery of the DDD, the Critical Design Review (CDR) shall focus on
• ability of proposed design to meet software requirements (ATM functionality)
• general design characteristics (see TRS Annex section 3.3)
• specific subsystem design characteristics
• AIR subsystem (TRS annex section 7.2)
• GRD subsystem (FM/COORD focus – section 9.4, 9.5)
• CWP subsystem (section 10.3)
• logical package structuring (gsdk, atc, and example application)
• threading / dynamic considerations
• performance
3.3 Code Reviews During the coding process, the EEC shall conduct limited code reviews, to ensure the software coding standards and comment standards are applied (see TRS Annex for more information).
The coherence and correctness of software comments with respect to the code shall be verified.
The EEC shall select a representative set of classes (e.g. from gsdk.aws, gsdk.entity, atc, atcapp).
Equally, the EEC shall re-examine the overall code to ensure the CDR results still apply (i.e. the implementation is conformant to the design).
It is expected, however, that Graffica shall perform internal code reviews (as outlined in the TRS response sections 6.9-6.18).
3.4 Quality Reviews Once a coherent and complete set of documents has been delivered (ADD, DDD, User Guide, javadoc), they shall be analysed from a quality point of view,
• coherence
• completeness
• correctness
• ease of understanding
4. UNIT TESTING
The EEC shall not perform eDEP unit testing. This is the responsibility of Graffica Ltd.
The EEC recognises that complete coverage testing (black- or white-box) is not feasible given the project time-scales and budget. Hence, a certain amount of unit testing shall be moved into integration testing (in order to avoid the cost of building expensive test harnesses).
However, some unit testing is expected in high-risk areas,
• complex algorithms (e.g. TP)
• graphics library (aws)
Proof of testing is required – e.g. test code.
Note : Graffica has begun to build a top-level test package. This activity is encouraged.
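As an indication of the expected style of proof, the sketch below unit-tests a trajectory-related calculation. The text above mentions JUnit; plain Java assertions are used here only to keep the example self-contained, and the climb-time routine is a hypothetical stand-in, not the eDEP TP API.

```java
// Sketch of the kind of unit test expected for high-risk TP algorithms.
// climbTimeSeconds is a hypothetical stand-in for a TP routine.
public class TpUnitTestSketch {

    /** Time to climb between two flight levels at a constant rate (ft/min). */
    static double climbTimeSeconds(int fromFl, int toFl, double rateFtPerMin) {
        if (rateFtPerMin <= 0) throw new IllegalArgumentException("rate must be > 0");
        double feet = (toFl - fromFl) * 100.0; // one FL = 100 ft
        return feet / rateFtPerMin * 60.0;
    }

    public static void main(String[] args) {
        // FL100 -> FL220 at 2000 ft/min: 12000 ft -> 6 min -> 360 s
        double t = climbTimeSeconds(100, 220, 2000.0);
        if (Math.abs(t - 360.0) > 1e-9) throw new AssertionError("climb time wrong: " + t);

        // Degenerate case: no climb needed.
        if (climbTimeSeconds(250, 250, 1500.0) != 0.0) throw new AssertionError();
        System.out.println("TP unit test sketch passed");
    }
}
```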
5. INTEGRATION TESTING
5.1 General Principles Integration testing is the responsibility of Graffica Ltd. Again, given the project timescales and budget, complete and exhaustive integration testing is not feasible. Hence, a large amount of integration testing can be placed within the validation (system) testing.
However, key subsystem assemblies, which represent high complexity or high risk, must be tested at an integration level.
Equally, if we push a certain amount of integration testing into validation testing, we must verify that the CWP is capable of exercising all major components completely.
A good example is the FM what-if functionality, which is not accessible via the CWP in its current form.
5.2 Test Coverage As previously mentioned, the EEC shall accept that the majority of integration testing is pushed into validation testing for budget reasons.
However, we need to ensure that components’ external interfaces are fully testable from the CWP. That is,
• Each external operation of a given component is called directly or indirectly from the CWP
• The results of this invoked external operation are visible from the CWP (i.e. the test results can be verified).
5.2.1 Test Coverage Tables Graffica should provide the EEC with component test coverage tables.
That is, for each component’s external interface we list each operation and indicate whether it is called from within the CWP (or some other component) and whether the operation results are verifiable.
This should allow us to determine a certain level of confidence and possibly identify hotspots that require testing.
For example,

FMController Operation    Invoked      Verifiable
CreateWorkingPlan         No           No
SetPlannedLevel           Yes – CWP    Yes
RegisterUnitCrossing      Yes – CS     Indirectly
RegisterForSystemPlan     Yes – CWP
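One possible machine-checkable form of such a coverage table is sketched below: an operation that is either not invoked from the CWP or not verifiable is flagged as a hotspot needing dedicated integration tests. The Row layout is an assumption about how the table might be recorded, not a defined eDEP format.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the proposed coverage table as data, with hotspot detection.
// Operation names follow the FMController example; the layout is assumed.
public class CoverageTable {

    static class Row {
        final String operation; final boolean invoked; final boolean verifiable;
        Row(String op, boolean inv, boolean ver) { operation = op; invoked = inv; verifiable = ver; }
    }

    /** An operation is a hotspot if the CWP cannot call it or cannot verify its result. */
    static List<String> hotspots(List<Row> table) {
        List<String> out = new ArrayList<String>();
        for (Row r : table)
            if (!r.invoked || !r.verifiable) out.add(r.operation);
        return out;
    }

    public static void main(String[] args) {
        List<Row> fmController = new ArrayList<Row>();
        fmController.add(new Row("CreateWorkingPlan", false, false));
        fmController.add(new Row("SetPlannedLevel", true, true));
        fmController.add(new Row("RegisterUnitCrossing", true, true)); // verified indirectly
        System.out.println("needs integration tests: " + hotspots(fmController));
    }
}
```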
5.2.2 Improved CWP tools for test verification A number of minor improvements to the CWP / PWP components may greatly assist the integration testing (without the need for expensive test harnesses)
• ability to display time estimates on Flight Legs
• ability to view and distinguish various points on the Flight Leg (sector crossing, unit crossing, significant trajectory points (TOD, BOC, ...))
• ability to completely view the Flight Leg in the vertical domain (with rough sector boundary overlays)
Note : the EEC has developed a traffic analyser tool which implements many of the above improvements. (Admittedly, in its current form it does not extend FlightLeg directly). This will be provided to Graffica for integration testing.
5.3 Required Integration Testing The following areas are critical,
• Core FDPS integration (FM, COORD, TP) + CWP
• Air Subsystem integration (PWP, PM)
The following should be done:
• test all external interfaces (all operations)
• produce a test log summary (note : the EEC would accept screen shots from the traffic analyser tool)
Graffica is free to use subsets of the validation test cases to achieve the above integration testing.
6. VALIDATION TEST PLAN
6.1 Introduction This validation test plan will define the criteria under which EUROCONTROL shall accept the eDEP software developments performed by Graffica Ltd.
The eDEP software platform, described in section 2.3, consists of an ATC toolkit and an example ATM application.
The following documents are supplied as input to validation testing,
• eDEP PMP (see [Ref 6])
• Graffica Quality Plan (see [Ref 7])
• Coding / Comment Standards (see [Ref 7])
6.2 Test Items
6.2.1 Documentation The following test item documentation is supplied,
• Software Requirements (TRS Annex and referenced documents) – see [Ref 2]
• Architecture document (see [Ref 3])
• Design Document (see [Ref 4])
• User Document (see [Ref 5])
• Java Doc Documentation (derived from software)
6.2.2 Software The software is delivered in a phased iterative approach (see section 2.4). The phases and their contents are listed below,
• phase 1- Standalone Edition (milestone 2)
• phase 2 – Initial Experimentation Edition (milestone 4)
• phase 3 – Final Experimentation Edition (milestone 5)
Phase 1 – Core FDPS & CWP. No distribution support. No AIR subsystem; hence the GRD FM / ATG ensure plots are generated from the ATG trajectories.
• GRD subsystem : ASP, TME, IFPL, FM, TP (3D), COORD, MTCD, FPM (5)
• AIR subsystem : none
• CWP subsystem : EATMP labels, co-ordination support

Phase 2 – Distribution support, initial AIR subsystem, improved CWP / GRD.
• GRD subsystem : as phase 1, plus STCA, FPM, IAS (6)
• AIR subsystem : PWP, PM
• CWP subsystem : as phase 1, plus anti-overlap, Hybrid (Feed) function

Phase 3 – Full Pilot Manager functionality, TP with 4D capabilities, performance issues.
• GRD subsystem : as phase 2, plus TP 4D
• AIR subsystem : as phase 2, plus improved PM
• CWP subsystem : as phase 2

5 Partial implementation
6.3 Features to be tested The validation testing shall be incremental, corresponding to the phased delivery of the test items,
• phase 1 (MQ4) – validation testing of milestone (2). Focus on GRD/CWP functionality
• phase 2 (MQ8) – validation testing of milestone (4). Focus on AIR, GRD/CWP and distribution functionality
• phase 3 (MQ11) – validation testing of milestone (5). Complete validation testing + performance measurements
6.4 Features not to be tested The following features shall not be tested,
• HIPS Conflict Zone Engine
• plug-in mechanisms
6.5 Approach Initial validation testing (milestone 2) shall be conducted mainly by the eDEP project manager. However, for later iterations, support staff (pilots, OPS) shall also be used.
The eDEP project manager shall use the TRS Annex (and its referenced documents) and other EEC documents to construct a suitable set of test cases. Input documents include,
• ACE Operational Validation Plan – contains an extensive list of test cases for the ACE platform
• ESCAPE FM SRS – contains many traffic scenario examples that define the FM/COORD behaviour
• EATMP Generic HMI Specification
• DSI HMI Specification – contains useful use cases and sequence diagrams for the co-ordination process (at the HMI level)
The data preparation task is of high importance to the validation process. The prepared static and traffic data should be sufficiently rich to test all major components. This data shall be derived from IPAS (see section ???).
6 ATG converted into IAS (i.e. fed by AIR subsystem)
The vast majority of validation tests shall be conducted via the HMIs – PWP and CWP. Hence, the eDEP platform shall be tested at a black-box level.
6.6 Dependencies EUROCONTROL shall develop on-site a number of tools / functions that have a direct use within the eDEP validation process. These include,
• IPAS Data Converter
• Traffic Viewer tool (including profile validation)
• Data Recording facility
• Replay facility
These tools shall be provided to the validation process.
6.7 Acceptance Protocol
6.8 Test Deliverables The following documents shall be delivered
• System Test Plan, Test Designs, Test Cases and Test Procedures
• System Test Reports and Summary
The following data shall be delivered
• input data
• static data (Airspace Definition)
• traffic data (Initial Flight Plans)
• configuration data
• output data
• data recording (for analysis or replay)
• screen shots
6.9 Testing Tasks The following tasks are identified (build a table with dates and interdependencies)
• Prepare test plan
• Prepare test designs & test cases
• Prepare test procedures (are we going to do this??)
• Prepare input data
• perform phase 1 validation tests
• perform phase 2 validation tests
• perform phase 3 validation tests
For each of the phased validation tests we need to define the procedures and the content (AIR tests etc)
6.10 Environment Needs
6.10.1 Hardware Requirement The eDEP platform, being developed in pure Java, can be deployed on any machine supporting the Java 1.3 runtime environment. However, the target platform for eDEP is a PC / Windows environment.
Hence, the hardware requirements are as follows
• phase 1
• 1 PC with a typical configuration of 800MHz, 256 MB RAM with Dual Head Graphics card (optional)
• phase 2
• ‘x’ PC
• phase 3
• ‘x’ PC
• 2k flat screens
[Note : constraints concerning the graphics card to be confirmed with Mike, i.e. true colour is expected to be required for transparencies to work correctly]
6.10.2 Software Requirement The following COTS software is required
• Java SDK 1.3
• Windows NT 4.0 Release Pack 5
• (possibly) Windows 2000 for 2k flat screen machines (driver problems for NT?)
6.10.3 Tools The EEC eDEP team may supply the following software (TBD)
• IPAS data converter
• data recording / replay
• traffic viewer tool
• improved flight leg allowing times, sector crossing points, COP points, and significant trajectory points (TOD, BOC) to be visualised.
• vertical profile viewer (with superimposed sectorisation)
• tabular data viewer
6.10.4 Data Requirement A suitable static file is required. This file should be sufficiently rich to test the eDEP ASP capabilities. This includes,
• Beacons, Airways, COPS
• Sectors – including
• multi-volume sector definitions
• small sectors (enabling ADVANCED / ADVANCED coordination)
• Airports (and associated SID / STAR) definitions
• Unit Definitions (CWP / Sector mappings)
• Letters of Agreement – these need to be studied with care, in order to test the system
At least two traffic samples are required – standard and heavy (for performance tests). The standard sample should have the following characteristics
• flights with associated SID / STARs
• flights without SID/ STARs
• flights with CTFL points (i.e. pre-prepared profile)
• flights without CTFL points (i.e. FM deduces profile through LoA application)
• various aircraft types
{TBD} It is undecided whether an existing or a hand-crafted traffic sample shall be used; one of each may actually be used.
Concerning the actual exercises, the following criteria are expected,
• unmanned units (no pilot, no CWP) – automated system response (COORDINATION)
• feed units (CWP, no pilot) – typically feed units are mapped to several sectors
• measured units (CWP and pilot) – typically 2 CWPs per unit, mapped to a single sector
6.11 Responsibilities The eDEP project manager shall be responsible for the overall planning and management of the validation process.
The Graffica Development team shall be responsible for correcting reported incidents as soon as possible.
6.12 Staffing Darren Smith : responsible for majority of validation testing
Sophie Carlier : support
Mike Vere : to be defined; on-site (EEC) for acceptance, available (UK) during validation for corrections…
{TBD} What about involving OPS / pilot staff later on?
6.13 Schedule See section 2.6
6.14 Risks The IPAS data conversion software is essential to the project. This shall be developed at an early stage.
Functionality that is not used in the example application?
Functionality that is not observable in the HMI?
Some work may be necessary here, to ensure that the delivered CWP is sufficiently rich to observe most system behaviour. For example, the CWP should be upgraded so that STCA and FPM (deviations) are observable.
7. TEST DESIGN SPECIFICATION
7.1 Scenario Data
7.1.1 Static Data Overview [Note] : Suggested that COM145B be used.
7.1.2 Traffic Data Overview
7.1.3 Platform Configuration
7.2 Phase 1
7.3 Phase 2
7.4 Phase 3
8. TEST CASE SPECIFICATION
8.1 STC TSPV (Technical Supervision) Family
8.1.1 STC TSPV 010 – Installation / Launch Test Case ID STC TSPV 010
Objective Verify the eDEP platform can be easily installed & launched
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications The installation / administration instructions in the User Manual
Output Specifications A launched platform
Test Criterion v1 : Visual check for running CWP and Control Panel v2,v3: Visual check for running CWPs, PWPs, Simulation Engine and Control Panel
Environment Static: Traffic: Config:
Test Dependencies
8.1.2 STC TSPV 020 – Time Control Test Case ID STC TSPV 020
Objective Verify that simulation time can be controlled
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Use the Control panel to test the time controls. Namely.
1) start the timer
2) accelerate / decelerate the time rate
3) freeze / unfreeze the timer
4) stop the timer
Output Specifications The various clock displays (on control panel, PWP, CWP)
Test Criterion v1 : Visual check for running CWP and Control Panel v2,v3: Visual check for running CWPs, PWPs, & Control Panel
Environment Static: Traffic: Config:
Test Dependencies STC TSPV 010
8.2 STC_GRD_TP Family
8.2.1 STC_GRD_TP 010 – Trajectory Calculation Note : in some cases these are very much unit tests. However, it would be worthwhile repeating them at the validation stage.
Test Case ID STC GRD TP 010
Objective Verify that trajectories are correctly computed
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened (assumes extra trajectory info is displayed on the Flight Leg) Via the traffic viewer, various flights are examined v3 : Various Aircraft Types shall be used
Output Specifications Graphical Profile which allows the following to be checked,
the 2d trajectory output (beacon overflight) the vertical profile (Climb rates) the time estimates
Test Criterion FM SRS 3.50 : Climb as early as possible, descend as late as possible
Note : may compare eDEP results to ESCAPE results to get an overall impression of accuracy. May invite other EEC staff.
Environment Static: Traffic: Config:
Test Dependencies STC TSPV Family
8.3 STC_GRD_FM Family
8.3.1 Introduction This family of tests is primarily concerned with validating the correctness of the trajectory-centric data provided by the FM component. This data can be divided into a number of perspectives (which correspond to the internal FM processing model),
• Route Expansion - conversion of the abbreviated IFPL format into a sequence of Constraint points
• Constraint Application – Application of ATC constraints (LoA) to this sequence of expanded points
• Sector Crossing List / Unit Crossing List – summary data derived from the TP trajectory data
The remaining tests are then concerned with tactical order processing (CFL, HDG, DIRECT-TO etc) which then cause a revalidation of the above trajectory-centric data.
Note : There is an obvious overlap between the FM tactical order test cases and the COORDination test cases (after all, in many cases, coordination is the prelude to FM processing).
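The Route Expansion step above can be sketched as follows. The airway table and point names are invented for illustration; the real eDEP / AVENUE data dictionary types (e.g. AvFlplt) are not assumed.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of Route Expansion: turning an abbreviated IFPL route segment
// (entry-point, route-to-follow, exit-point) into the full point sequence.
public class RouteExpansionSketch {

    /** Airway -> ordered list of published points (illustrative data). */
    static final Map<String, List<String>> AIRWAYS = new LinkedHashMap<>();
    static {
        AIRWAYS.put("UA1", List.of("ABC", "DEF", "GHI", "JKL"));
    }

    /** Expand "entry airway exit" into the explicit point list, entry/exit inclusive. */
    static List<String> expand(String entry, String airway, String exit) {
        List<String> pts = AIRWAYS.get(airway);
        if (pts == null) throw new IllegalArgumentException("unknown airway " + airway);
        int from = pts.indexOf(entry), to = pts.indexOf(exit);
        if (from < 0 || to < 0 || from > to)
            throw new IllegalArgumentException("segment not on airway " + airway);
        return new ArrayList<>(pts.subList(from, to + 1));
    }

    public static void main(String[] args) {
        System.out.println(expand("DEF", "UA1", "JKL")); // [DEF, GHI, JKL]
    }
}
```

A real expansion would additionally merge consecutive route segments and any explicit point lists, which is exactly what the STC_GRD_FM_RTE cases below exercise.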
8.3.2 STC_GRD_FM_RTE – Route Expansion This family of tests is concerned with testing the FM behaviour when receiving initial flight plans (IFPL).
8.3.2.1 STC_GRD_FM_RTE_010 – Simple Route Segment Expansion Test Case ID STC_GRD_FM_RTE 010
Objective Verify that IFPL data is correctly expanded (2d)
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened {TBD} : Via the traffic viewer, various flights are examined Flights are chosen for which the preparation data contains route segments (i.e. entry-point route-to-follow exit-point) Various flights shall be chosen that combine 1+ route segments, and contain 0+ explicit point lists
Output Specifications CWP PVD Flight Legs
Test Criterion ESCAPE FM SRS Req 1.100
Environment Static: Traffic: Config:
Test Dependencies STC TSPV Family
8.3.2.2 STC_GRD_FM_RTE_020 – SID / STAR Expansion Test Case ID STC GRD FM RTE 020
Objective Verify that IFPL data is correctly expanded (2d) for SID/STARs
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened {TBD} : Via the traffic viewer, various flights are examined A number of flights are chosen with SID, STAR, and SID/STAR data
Output Specifications CWP PVD Flight Legs
Manual check to ensure all SID, STAR, route segments are correctly expanded
Test Criterion ESCAPE FM SRS Req 1.100
Environment Static: Traffic: Config:
Test Dependencies STC TSPV Family
8.3.3 STC_GRD_FM_CON – Constraint Application (From IFPLs) This family of tests covers the FM processing of IFPLs concerning the application of LoA constraints – namely the allocation of flight levels.
8.3.3.1 STC_GRD_FM_CON_010 – Climbing/Descending Aircraft at Simulation Start Test Case ID STC GRD FM CON 010
Objective Verify that initial GOTO FL profiles are correctly handled (i.e. a/c that begin the simulation within an immediate climb)
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened {TBD} : Via the traffic viewer, various flights are examined
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles Check the vertical profile
Test Criterion
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_RTE
8.3.3.2 STC_GRD_FM_CON_020 – LoA Constraint With Static FL Test Case ID STC GRD FM CON 020
Objective Ensure that simple LoA rules are correctly applied
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened {TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen
with/without SID, STAR definitions
with/without Route Segment definitions (requiring LoA profiles)
These flights shall pass over navigation points which have ‘simple’ LoA
if <condition> then CTFL=value
Note : what can be found in the condition???
e.g. if ADEST=ddd then RFL=220
if RFL>310 then RFL=310
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles
Check the LoA have been correctly applied
Test Criterion FM SRS : Selection of LoA Constraints
subset of section 3.2.2 (2.50, 2.70-2.90, 2.110),
subset of section 3.2.3 (2.120),
subset of section 3.2.4, 3.2.5
FM SRS : Profile computation section 3.3.1
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_RTE
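A minimal sketch of the 'simple' LoA rules exercised above, using the two example rules from the input specification (ADEST=ddd and RFL>310). The rule representation and the Flight fields are assumptions for illustration, not the eDEP FM API.

```java
// Sketch of 'simple' LoA rule application: rules of the form
// "if <condition> then CTFL=value", evaluated at a navigation point.
public class LoaRuleSketch {

    static class Flight {
        String adest; int rfl;
        Flight(String adest, int rfl) { this.adest = adest; this.rfl = rfl; }
    }

    /** Apply the example LoA rules, returning the constrained flight level. */
    static int applyLoa(Flight f) {
        if ("ddd".equals(f.adest)) return 220;  // if ADEST=ddd then CTFL=220
        if (f.rfl > 310) return 310;            // if RFL>310 then CTFL=310
        return f.rfl;                           // otherwise cruise at the requested FL
    }

    public static void main(String[] args) {
        System.out.println(applyLoa(new Flight("ddd", 350))); // 220
        System.out.println(applyLoa(new Flight("xyz", 350))); // 310
        System.out.println(applyLoa(new Flight("xyz", 290))); // 290
    }
}
```

Rule ordering matters: here the destination rule deliberately takes precedence over the RFL cap, which is the kind of behaviour the CON test cases should pin down.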
8.3.3.3 STC_GRD_FM_CON_030 –LoA Constraint With Dynamic FL Test Case ID STC GRD FM CON 030
Objective Ensure that LoA rules are correctly applied
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened {TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen
with/without SID, STAR definitions
with/without Route Segment definitions (requiring LoA profiles)
These flights shall pass over navigation points which have dynamic LoA
if <condition> then CTFL=RFL
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles
Check the LoA have been correctly applied
Test Criterion as STC_GRD_FM_CON_020
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_RTE
8.3.3.4 STC_GRD_FM_CON_040 –Explicit Profile Specification Test Case ID STC GRD FM CON 040
Objective Verify that IFPLs may contain explicit FL values that override default LoA
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened {TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen where planned flight FLs (or Control FL) are specified within the IFPL
Flights shall be chosen with FL values which
• are in accordance with LoA
• contradict LoA
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles
Test Criterion Check the explicit FLs override the default LoA
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_RTE
8.3.4 STC_GRD_FM_SEC – Sector List Computation This family of tests is concerned with issues of Sector traversal computation.
8.3.4.1 STC_GRD_FM_SEC_10 – Horizontal Sector Traversal Test Case ID STC GRD FM SEC 010
Objective Verify that the FM correctly computes the list of intersected sectors for a given trajectory
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened Note : this assumes that the flight leg can show sector crossing points{TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen which traverse sectors horizontally.
Different types of sectors shall be chosen
• mono-volume sectors
• multi-volume sectors (causing several volumes to be traversed in series)
Output Specifications CWP PVD Flight Legs
Test Criterion Check the sector crossing points
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_CON, STC_GRD_TP
8.3.4.2 STC_GRD_FM_SEC_20 – Vertical Sector Traversal Test Case ID STC GRD FM SEC 020
Objective Verify that the FM correctly computes the list of intersected sectors for a given trajectory
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened Note : this assumes that the flight leg can show sector crossing points{TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen which traverse sectors vertically.
Different types of sectors shall be chosen
• mono-volume sectors
• multi-volume sectors (causing several volumes to be traversed in series)
Output Specifications CWP PVD Flight Legs, VAW
Test Criterion Check the sector crossing points
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_CON, STC_GRD_TP
8.3.4.3 STC_GRD_FM_SEC_30 – Sector Re-entry {TBD} : Are we going to cater for this or not?
Test Case ID STC GRD FM SEC 030
Objective Verify that the FM correctly computes the list of intersected sectors for a trajectory that re-enters a sector
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened Note : this assumes that the flight leg can show sector crossing points{TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen which traverse sectors horizontally / vertically.
The flights in question shall re-enter a given sector by one of two means,
• circular flight plan (causing re-entry)
• doughnut shaped multi-volume sectors
Output Specifications CWP PVD Flight Legs
Test Criterion Check the sector crossing points
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_CON, STC_GRD_TP
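The re-entry case above is exactly what a naive 'set of traversed sectors' computation loses. Below is a sketch of a crossing-list computation that preserves re-entry, assuming some earlier step has already sampled the trajectory into a sector name per sample (that sampling step is not shown and is not the eDEP algorithm).

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: build the sector crossing list by collapsing consecutive
// duplicates only, so "A, B, A" is kept as a genuine re-entry rather
// than being deduplicated to {A, B}.
public class SectorListSketch {

    static List<String> crossingList(List<String> sectorPerSample) {
        List<String> crossings = new ArrayList<>();
        String current = null;
        for (String s : sectorPerSample) {
            if (!s.equals(current)) { crossings.add(s); current = s; }
        }
        return crossings;
    }

    public static void main(String[] args) {
        // Circular flight plan: leaves sector A, enters B, then re-enters A.
        List<String> samples = List.of("A", "A", "B", "B", "A", "A");
        System.out.println(crossingList(samples)); // [A, B, A]
    }
}
```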
8.3.5 STC_GRD_FM_UNT – Unit List Computation
8.3.5.1 STC_GRD_FM_UNT_010 – Horizontal Unit Traversal Test Case ID STC GRD FM UNT 010
Objective Verify that the FM correctly computes the list of traversed units, and the appropriate coordination data (e.g. COPs)
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened Note : this assumes that the flight leg can show COP points{TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen
• which traverse units horizontally
• which pass directly over COP points
• which traverse mono-sector units
• which traverse multi-sector units
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles
Test Criterion Check the unit crossing points and COP points
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.5.2 STC_GRD_FM_UNT_020 – Vertical Unit Traversal Test Case ID STC GRD FM UNT 020
Objective Verify that the FM correctly computes the list of traversed units, and the appropriate coordination data (e.g. COPs)
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened Note : this assumes that the flight leg can show COP points{TBD} : Via the traffic viewer, various flights are examined
• A number of scenario flights are chosen
• which traverse units vertically
• which pass directly through COP points
• which traverse mono-sector units
• which traverse multi-sector units
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles
Test Criterion Check the unit crossing points and COP points
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.5.3 STC_GRD_FM_UNT_030 – FM Derived COP Calculation Test Case ID STC GRD FM UNT 030
Objective Verify that the FM correctly computes the COP points
Test Items User Manual, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP flight label, various flight legs are opened Note : this assumes that the flight leg can show COP points{TBD} : Via the traffic viewer, various flights are examined
A number of scenario flights are chosen
• which traverse units horizontally / vertically
• which do not pass through COP points
Hence the FM is forced to select the ‘most likely’ COP point
(e.g. this occurs for Direct To Orders)
Output Specifications CWP PVD Flight Legs, and VAW Vertical Profiles
Test Criterion Check the FM-derived COP points
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.6 STC_GRD_FM_RECALC – Trajectory Recalculation (Placeholder) This is a placeholder. Once deviation monitoring is implemented, the FM is expected to offer an option to perform trajectory recalculations when an a/c deviates (a good example being time corrections (shifts) on trajectories when a/c overfly beacons).
FM SRS 1.210 / 1.220
8.3.7 STC_GRD_FM_HDG – Heading Orders (Closed) / Re-directs The EATMP heading orders are closed (i.e. in eDEP terminology – re-direct orders).
Many of the presented test cases tend to overlap with COORDination test cases (or are the logical conclusion of a successful co-ordination).
In reality, these FM test cases should focus on the validity of the recalculated constraint list and resulting trajectory.
8.3.7.1 STC_GRD_FM_HDG_010 – Simple Heading Order Test Case ID STC_GRD_FM_HDG_010
Objective Verify the FM correctly processes a CLOSED Heading Order (i.e. a re-direct order) for a flight in cruise
Test Items eDEP platform (v1, v2, v3)
Input Specifications Through the CWP an ASSUMED aircraft in horizontal flight is placed onto a heading.
(1) Click on the ahdg field
(2) Using the elastic vector place the a/c on a heading
Output Specifications CWP Flight Label, PVD Flight Legs, and VAW Vertical Profiles
v1-3 :
Change in Label (line 4, field ahdg)
Flight Leg / Vertical profile updated (PVD, VAW)
v1: Monitor the a/c radar plots over time (ATG)
Test Criterion FM SRS section 3.3.3.3
EATMP HMI section 4.5
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
Note : this test case may follow from certain COORDination test-cases
8.3.7.2 STC_GRD_FM_HDG_020 – Heading Order For Climbing/Descending Flight Test Case ID STC_GRD_FM_HDG_020
Objective Verify the FM correctly processes a Heading Order for a climbing/descending flight
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP an evolving a/c is placed onto a heading
i.e. click on ‘ahdg’ field, select heading via elastic vector.
Output Specifications CWP Flight Label
v1-3 : Change in Label (ahdg field). Profile and Flight Leg updated
v1: Monitor the a/c radar plots over time (the a/c should continue to climb, and turn)
Test Criterion FM SRS section 3.3.3.3
EATMP HMI section 4.5
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.7.3 STC_GRD_FM_HDG_030 – Heading Order causing projection of ATC & XFL constraints Test Case ID STC_GRD_FM_HDG_030
Objective Verify the FM correctly projects existing ATC and XFL (boundary points) constraints onto the new trajectory segment (curr-posn, 2d hdg posn, rejoin point)
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP a suitable a/c is placed onto a heading
i.e. click on ‘ahdg’ field, select heading via elastic vector.
Output Specifications CWP Flight Label
v1-3 : Change in Label (ahdg field).
v1: Monitor the a/c radar plots over time (the a/c should continue to climb, and turn)
v1-v3 : The PVD & VAW Flight Legs should conform to ESCAPE SRS expectations
Test Criterion FM SRS section 3.3.3.3
EATMP HMI section 4.5
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.8 STC_GRD_FM_CFL – Cleared Flight Level Orders
8.3.8.1 STC_GRD_FM_CFL_010 – New Cleared Flight Level Test Case ID STC_GRD_FM_CFL_010
Objective Verify the FM correctly processes a CFL order
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP an assumed a/c is given a new CFL.
Output Specifications CWP Flight Label : CFL visible (see EATMP HMI section 3.2.4.2)
No Change in Vertical Flight Leg (VAW, PVD)
v1: Monitor the a/c radar plots over time (the a/c should climb / descend immediately)
Test Criterion FM SRS section 3.3.3.4 – FL Order
EATMP HMI section 5.3
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.8.2 STC_GRD_FM_CFL_020 – New Cleared Flight Level (On Heading) Test Case ID STC_GRD_FM_CFL_020
Objective Verify the FM correctly processes a CFL order for a flight off-plan
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP an assumed a/c is given a HDG order and then a new CFL.
Output Specifications CWP Flight Label : CFL visible (see EATMP HMI section 3.2.4.2), PVD Flight Legs, and VAW Vertical Profiles
v1-3 : Change in Label. No Change in Vertical Flight Leg (VAW, PVD)
v1: Monitor the a/c radar plots over time (the a/c should climb / descend immediately)
Test Criterion EATMP HMI section 5.3
FM SRS section 3.3.3.4 – FL Order
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_HDG_010
8.3.8.3 STC_GRD_FM_CFL_030 – CFL in accordance with XFL
8.3.8.4 STC_GRD_FM_CFL_040 – CFL different to XFL
8.3.9 STC_GRD_FM_DIR – Direct To Orders Much of this is covered in STC_GRD_COORD; here we cover only the 'after co-ordination agreed' issues, that is, the re-computation of the trajectory (especially the projection of the current constraints onto the new direct-to route).
Many of the test cases here are identical to the HDG (RE-DIRECT) test cases.
8.3.9.1 STC_GRD_FM_DIR_010 – Simple Direct To Order Test Case ID STC_GRD_FM_DIR_010
Objective Verify the FM correctly processes a direct-to for a flight in cruise (constant FL)
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP an ASSUMED aircraft in horizontal flight is given a direct-to order.
(1) Click on the XPT field
(2) select a downstream point
or
(1) Click on hdg field
(2) use elastic vector to select direct heading to downstream point
Output Specifications v1-3:
Change in Label (line 4, field ahdg contains direct-to beacon)
Flight Leg / Vertical profile updated (PVD, VAW)
v1: Monitor the a/c radar plots over time (ATG)
Test Criterion FM SRS : section 3.3.3.1
EATMP HMI section 4.5, 5.4
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
Note : this test case may follow from certain COORDination test-cases
8.3.9.2 STC_GRD_FM_DIR_020 – Direct To Order For Climbing/Descending Flight Test Case ID STC_GRD_FM_DIR_020
Objective Verify the FM correctly processes a DirectTo Order for a climbing/descending flight
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP an evolving a/c is placed onto a direct-to
click on xpt field, select direct-to point
or
click on ‘ahdg’ field, select point via elastic vector.
Output Specifications CWP Flight Label
v1-3 : Change in Label (ahdg field).
Flight Leg / Vertical profile updated (PVD, VAW)
v1: Monitor the a/c radar plots over time (the a/c should continue to climb, and turn)
Test Criterion FM SRS : section 3.3.3.1
EATMP HMI section 4.5, 5.4
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.9.3 STC_GRD_FM_DIR_030 – Direct To Order causing projection of ATC & XFL constraints Test Case ID STC_GRD_FM_DIR_030
Objective Verify the FM correctly projects existing ATC and XFL (boundary points) constraints onto the new trajectory segment
Test Items EATMP HMI Spec, eDEP platform (v1, v2, v3)
Input Specifications Through the CWP a suitable a/c is placed onto a direct-to
click on xpt field, select downstream point
or
click on 'ahdg' field, select downstream point via elastic vector.
Output Specifications CWP Flight Label
v1-3 : Change in Label (ahdg field).
v1: Monitor the a/c radar plots over time (the a/c should continue to climb, and turn)
v1-v3 : The PVD & VAW Flight Legs should conform to ESCAPE SRS expectations (i.e. ATC constraints projected onto new segment, XFL constraints projected onto boundary)
Test Criterion FM SRS : section 3.3.3.1
EATMP HMI section 4.5, 5.4
Environment Static: Traffic: Config:
Test Dependencies STC_GRD_FM_SEC
8.3.10 STC_GRD_FM_OPEN – Open-ended HDG Orders This functionality is not currently accessible via the CWP HMI (i.e. EATMP HDG orders are re-directs). However, this functionality shall be incorporated in the AIR component, and hence shall be tested there.
8.3.11 STC_GRD_FM_SPD – Speed Control Orders Placeholder: MACH / CAS orders; achievable / unachievable values.
OLDI does support downstream co-ordination on speed – part of the transfer protocols.
8.4 STC_GRD_COORD
8.4.1 Overview The following sections outline the various test cases required for Co-ordination. The number of test permutations is high due to a number of factors:
• order types (XFL, EFL, DirectTo, Redirect)
• sector starting the (re)co-ordination
• transferring sector (initial activation, or revised co-ordination)
• assuming sector (revised co-ordination)
• status of sector initiating (re)co-ordination
• ASSUMED sector – the sector currently controlling the a/c initiates XFL co-ordination
• ADVANCED sector – the 'next' sector (CS+1) initiates early XFL co-ordination
• ACT transmission status
• already sent – i.e. we are revising a co-ordination status
• not yet sent – i.e. no need for co-ordination (too early)
• LoA verification (ACT or RAP, REV or RRV)
• ensuing conversation (accept, reject, counter)
• HMI complexity (various input mechanisms, rich output)
• Time related issues (e.g. close to boundary certain co-ordinations are no longer possible, or become automatically ‘referred’)
• Manned / unmanned sectors
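The factors above span a combinatorial space. As a rough illustration of its size, the following sketch enumerates a candidate test matrix; the factor values are transcribed from the bullets above purely for illustration and are not official eDEP parameter names:

```python
from itertools import product

# Illustrative factor values drawn from the list above; a real test matrix
# would use the official eDEP / OLDI identifiers and prune invalid combinations.
order_types = ["XFL", "EFL", "DIRECT_TO", "REDIRECT"]
initiating_status = ["ASSUMED", "ADVANCED"]   # sector initiating (re)co-ordination
act_status = ["ACT_SENT", "ACT_NOT_SENT"]     # ACT transmission status
loa_check = ["CONFORMANT", "REFERRED"]        # ACT/REV vs RAP/RRV
reply = ["ACCEPT", "REJECT", "COUNTER"]       # ensuing conversation

permutations = list(product(order_types, initiating_status,
                            act_status, loa_check, reply))
print(len(permutations))  # 4 * 2 * 2 * 2 * 3 = 96 combinations
```

Even this reduced factor set yields 96 combinations before HMI, timing, and manned/unmanned variations are considered, which motivates grouping the test cases into logical conversation families.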
For the test permutations the following diagram shall be used
[Diagram: three adjacent units – Sector A, Sector B, Sector C – traversed by AFR123, with intermediate points labelled A–F. AFR123 is shown ASSUMED in Sector A, and either NO_STATE or ADVANCED in Sectors B and C.]
AFR123 is currently under the control of Sector (Unit) A. Depending upon the sector dimensions, and the configurable ACT transmission time threshold (e.g. 10 minutes) the a/c may be in various states in sectors B and C
• NO_STATE (i.e. no ACT message has been received)
• ADVANCED (i.e. an ACT message has been received)
Note : the ASSUMED / ADVANCED / ADVANCED case still needs to be checked – it gets complex.
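The NO_STATE / ADVANCED distinction above depends only on whether the ACT transmission threshold has been crossed relative to the upstream sector's exit. A minimal sketch, assuming the 10-minute example value above (the threshold is configurable; the function name is illustrative, not an eDEP API):

```python
# Configurable ACT transmission threshold; 10 minutes is the example value
# used in the text, not an official eDEP default.
ACT_THRESHOLD_MIN = 10.0

def downstream_state(minutes_to_upstream_exit: float) -> str:
    """Flight state in the next downstream sector, per the diagram above."""
    act_sent = minutes_to_upstream_exit <= ACT_THRESHOLD_MIN
    return "ADVANCED" if act_sent else "NO_STATE"

print(downstream_state(12.0))  # NO_STATE (too early; no ACT received yet)
print(downstream_state(8.0))   # ADVANCED (ACT has been transmitted)
```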
8.4.2 OLDI Terminology The following sections make use of standard OLDI message terminology. This is convenient since the standard is referenced in many document sources and the message types are well defined. For example, the DSI SYSCO HMI Specification (see [Ref 16]) has been used extensively for the construction of the test cases below.
We have the following message types
Message Type Description
Initial Co-ordination
ACT The ACTivation message facilitates the exchange of co-ordination data - estimates and transfer conditions for a flight, and defines the initial co-ordination contract between AS and TS.
ACT messages are sent for flights which are in accordance with standard conditions (LoA)
This message is sent automatically ‘x’ minutes before sector exit.
RAP The RAP message is equivalent to the ACT message, except that it indicates the flight is not in accordance with standard conditions (LoA). Hence the co-ordination is referred to the assuming sector controller.
This message is sent automatically ‘x’ minutes before sector exit
System Acknowledgement
LAM Logical Acknowledgement Message. Sent automatically by the receiving system when the incoming message has been processed and no controller input is required
SBY Standby Message. Acknowledges that a message proposing transfer conditions (RAP, RRV, CDN) has been received and that the proposal is referred to the controller for a decision
Co-ordination Revisions
REV Revision Message. Sent by TS to AS to indicate revisions to the previously agreed transfer conditions. The new transfer conditions are in accordance with standard conditions (LoA)
RRV As REV except the new transfer conditions are not in accordance with standard conditions (LoA)
ACP Indicates manual acceptance of the transfer conditions
RJC Indicates manual rejection of the proposed transfer conditions (previously agreed conditions are still valid)
CDN Initiates a modification proposal from the AS
Equally, it can be used to forward a counter proposal from the AS to the TS as reply to a RAP or RRV message
The reader is directed to the official OLDI standard for more information (see [Ref 12]).
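The table above implies a simple dispatch rule on the receiving system: messages conformant with standard conditions are acknowledged automatically, while referred proposals require a controller decision. An illustrative sketch (the mapping is transcribed from the table; the code itself is not part of eDEP):

```python
# Messages proposing transfer conditions outside standard conditions (LoA)
# are referred to the controller (SBY); conformant messages are acknowledged
# automatically (LAM). Mapping transcribed from the table above.
REFERRED = {"RAP", "RRV", "CDN"}   # require controller decision -> SBY
AUTOMATIC = {"ACT", "REV"}         # processed automatically     -> LAM

def system_acknowledgement(message_type: str) -> str:
    if message_type in REFERRED:
        return "SBY"
    if message_type in AUTOMATIC:
        return "LAM"
    raise ValueError(f"unknown OLDI message: {message_type}")

print(system_acknowledgement("ACT"))  # LAM
print(system_acknowledgement("RAP"))  # SBY
```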
8.4.3 Test Case Scenarios The test cases are broken down into a number of logical groups, consisting of progressively richer co-ordination 'conversations':
• Initial co-ordinations
• ACT-LAM
• RAP-ACP, or RAP-RJC
• RAP-CDN-ACP, or RAP-CDN-RJC
• Transferring sector led co-ordinations revisions (XFL, DIR, HDG)
• RRV-ACP or RRV-RJC
• RRV-CDN-ACP or RRV-CDN-RJC
• Assuming sector led co-ordination revisions (EFL)
• CDN-ACP or CDN-RJC
Note : for the moment we do not consider the following,
• ACT / SBY – i.e. disagreement between AS and TS LoA databases
• PHONE functionality (i.e. all system co-ordination is abandoned. Verbally agreed conditions shall be entered into the system at a later point)
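For scripting or checking these test cases, the conversation groups can be captured as an explicit set of permitted message sequences. The following sketch transcribes only the bullets above (so ACT/SBY and PHONE cases are deliberately excluded); it is an illustration for test tooling, not a statement of the full OLDI standard:

```python
# Permitted co-ordination 'conversations', transcribed from the groups above.
VALID_CONVERSATIONS = {
    ("ACT", "LAM"),                                   # initial co-ordination
    ("RAP", "ACP"), ("RAP", "RJC"),
    ("RAP", "CDN", "ACP"), ("RAP", "CDN", "RJC"),
    ("RRV", "ACP"), ("RRV", "RJC"),                   # TS-led revisions
    ("RRV", "CDN", "ACP"), ("RRV", "CDN", "RJC"),
    ("CDN", "ACP"), ("CDN", "RJC"),                   # AS-led revisions (EFL)
}

def is_valid_conversation(messages) -> bool:
    """Check a recorded message exchange against the permitted sequences."""
    return tuple(messages) in VALID_CONVERSATIONS

print(is_valid_conversation(["ACT", "LAM"]))  # True
print(is_valid_conversation(["ACT", "SBY"]))  # False (not considered for now)
```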
8.4.4 STC_GRD_COORD_ACT – Initial Co-ordination Activation from Assumed Sector The following set of test cases concerns initial ACTivation, with its numerous variations (LoA violation, controller- or timer-induced).
Note : in the interests of simplicity the test cases deal with a transferring sector which has the subject a/c ASSUMED.
The test cases need also to be applied to smaller sectors where the a/c is only ADVANCED in the transferring sector.
8.4.4.1 STC_GRD_COORD_ACT_010 - Timed Activation (ACT LAM) Test Case ID STC_GRD_COORD_ACT_010
Objective Verify a normal timed ACTivation (default : 12 minutes before sector exit). All Letters of Agreement are met.
Test Items eDEP platform (v1, v2, v3)
Input Specifications CWPTS – Assumed Flight ‘f’ @ 8 minutes to sector exit
CWPAS – next sector for flight ‘f’
Output Specifications preconditions:
CWPTS – Label in ASSUMED format.
CWPAS – Label in NON CONCERNED format
post conditions
CWPTS – Label in ASSUMED format (NS field present)
CWPAS – SIL contains entry for ‘f’. Label in CONCERNED format.
Test Criterion EATMP HMI section 3.2 (Label), 2.3.5.11-12 (SIL,MIL, MOL)
DSI HMI section 10.2.2.5
Environment Static: Traffic: Config:
Test Dependencies
8.4.4.2 STC_GRD_COORD_ACT_020 – Timed Activation (RAP SBY) Test Case ID STC_GRD_COORD_ACT_020
Objective Timed Activation with LoA violation
Test Items eDEP platform (v1, v2, v3)
Input Specifications CWPTS – Assumed Flight ‘f’ @ 8 minutes to sector exit.
Note : Flight 'f' has its XFL set to a value that violates the LoA
CWPAS – next sector for flight ‘f’
Output Specifications preconditions: CWPTS – Label in ASSUMED format.
CWPAS – Label in NON CONCERNED format
Intermediate conditions
(applicable once LAM/SBY handling is implemented)
post conditions
CWPTS –
Label in ASSUMED format (NS field present)
Line 0 (SYSCO field) has ‘EST’ in co-ordination colour
MOL reads “RAP XFL XXXXX”
CWPAS –
SIL contains entry for ‘f’.
Label in CONCERNED format. Referred data in co-ordination colour
Sysco field