
Project funded by the S2R JU

S2R-OC-IP2-02-2015 – IT virtualisation of testing environment

Grant Agreement: 730815 - VITE

VITE: Virtualisation of the Test Environment

Demonstration plan

Issue 1.0 Date 10/08/2018

Number of pages 21 Classification PU

Document Reference

Project Work package Partner Nature Number

VITE WP4 INECO Deliverable 4.1

Partner reference (optional)

Responsible Name/Company Signature Date

Author INECO/ TBD

WP Leader F. Nazzareno / RFI TBD

Project coordinator B Sierra / INECO TBD

S2RJU Project Officer Lea Paties TBD

Ref. Ares(2018)4556873 - 05/09/2018

Demonstration plan

Ref: VITE-WP4-RFI-DEL-4.1

Issue: 1.0 Date:

10/08/2018

Class: PU Page 2 / 21

VITE: Virtualisation of the Test Environment Grant Agreement No: 730815

DOCUMENT CHANGE LOG

Issue Date Affected Sections Comments

0.1 28/05/2018 All First draft

0.2 31/05/2018 4.6 NoBo contribution

0.3 30/06/2018 All Scenarios description

0.4 20/07/2018 All Final draft for review

1.0 10/08/2018 All Final draft for delivery after WP4 partners comments


TABLE OF CONTENTS

1 INTRODUCTION ....................................................................................................... 5

1.1 Purpose ...................................................................................................................... 5

1.2 Intended audience / Classification ............................................................................... 5

1.3 Associated documentation .......................................................................................... 5

1.4 Abbreviations and Acronyms ...................................................................................... 6

2 DEMONSTRATION PLAN OVERVIEW ..................................................................... 8

3 PRE TESTING CAMPAIGN ....................................................................................... 9

3.1 Specific objectives ...................................................................................................... 9

3.2 Methodology ............................................................................................................... 9

3.3 Test specification ........................................................................................................ 9

3.4 Execution, analysis and results ................................................................................. 10

3.5 Evaluation ................................................................................................................. 10

4 OPERATIONAL TEST CAMPAIGN ......................................................................... 11

4.1 Specific objectives .................................................................................................... 11

4.2 Methodology ............................................................................................................. 11

4.3 Test specification ...................................................................................................... 11

4.3.1 Test specification for Demo CEDEX – MULTITEL ............................................. 12

4.3.2 Test Specification Demo RINA – RFI ................................................................. 14

4.4 Execution, analysis and results ................................................................................. 16

4.5 Evaluation ................................................................................................................. 16

4.6 Assessment from Notified Bodies ............................................................................. 16

4.6.1 Test platform documents ................................................................................... 16

4.6.2 Traceability matrix .............................................................................................. 17

4.6.3 Test description/protocol .................................................................................... 18

4.6.4 Test report ......................................................................................................... 18

4.6.5 Release Note ..................................................................................................... 19

5 REPEATABILITY TEST CAMPAIGN ....................................................................... 20

5.1 Specific objectives .................................................................................................... 20

5.2 Methodology ............................................................................................................. 20

5.3 Test specification ...................................................................................................... 20

5.3.1 Test specification for Demo CEDEX – MULTITEL ............................................. 20

5.3.2 Test Specification demo RINA – RFI .................................................................. 20

5.4 Execution, analysis and results ................................................................................. 20

5.5 Evaluation ................................................................................................................. 21


LIST OF TABLES

Table 1. Test cases proposed ....................................................................................................... 12

Table 2. Test cases for CEDEX-MULTITEL campaign (ADIF)....................................................... 14

Table 3. Test cases for RFI-RINA campaign (RFI) ........................................................................ 15

Table 4. Test cases for Repeatability test campaign ..................................................................... 20

LIST OF FIGURES

Figure 1: WP4 Logic ....................................................................................................................... 5

Figure 2: Demo overview ................................................................................................................ 8

Figure 3. Platform Release Note content ...................................................................................... 17

Figure 4. Release Note content ..................................................................................................... 19


1 INTRODUCTION

1.1 Purpose

The purpose of this deliverable is to provide the description and the specification of the test campaign demonstration to be performed in WP4. Within this task, a basic set of relevant common tests will be defined and grouped into different scenarios to achieve the proper demonstration by virtual tests. WP4 interacts closely with the other WPs, as the scheme below shows:

Figure 1: WP4 Logic

This document includes all the information needed for the execution of the test campaign (connection and communication between labs, the list of selected test cases and the reasons for their selection, and the limitations of the labs) for both basic demos.

1.2 Intended audience / Classification

This document is public.

1.3 Associated documentation

[1] VITE-WP2-INE-DEL-2.2-v1.0_Test process framework

[2] VITE-WP2-CED-DEL-2.3-v0.2-Description of the Test Dictionary

[3] VITE-WP3-CED-DEL-3.2-v1.3_Lab architecture Specification

[4] VITE-WP3-CED-DEL-3.3-v1.2-Description of the project data format

[5] VITE-WP3-CED-DEL-3.4-v1.0-FFFIS for the critical interfaces in the distributed lab architecture for remote testing

[6] D3.7 SW tool to create scenarios

[7] D3.8 SW tool to manage project data

[8] D3.10 SW tool for FFFIS compliance evaluation


[9] D3.11 Adaptations to new architecture and results of the tests performed with the tool for FFFIS compliance evaluation

[10] General requirements for the competence of testing and calibration laboratories (ISO/IEC 17025:2017)

[11] Subset-110 UNISIG Interoperability Test – Guidelines v3.6.0

[12] VITE-WP4-INE-TEC-001-v1.0_Scenarios for CED-MUL demo.

1.4 Abbreviations and Acronyms

B2 Baseline 2

B3 MR1 Baseline 3 Maintenance release 1

CES Conditional Emergency Stop

EVC European Vital Computer

EoA End of Authority

FIS Functional Interface Specification

FFFIS Form Fit Functional Interface Specification

FS Full Supervision mode

IXL Interlocking

LAB Laboratory

LEU Lineside Electronic Unit

L1 / L2 / LSTM Level 1 / Level 2 / Level STM

MA Movement Authority

NOK Not OK

OB Onboard

OBU Onboard unit

OS On sight mode

OTC Operational Test case

RBC Radio Block Center

SB Stand By mode

SoM Start of Mission

SH Shunting mode

SPAD Signal passed at danger

SR Staff responsible Mode

SSP Static Speed Profile

STM Specific Transmission Module

SW Software

S2R Shift2Rail


TAF Track Ahead Free

TD Technical Demonstrator

TIU Train Interface Unit

TSR Temporary Speed Restriction

TR Trackside / Trip mode

VITE Virtualisation of the Test Environment

V_LoA Speed at the Limit of Authority

WP Work Package


2 DEMONSTRATION PLAN OVERVIEW

The overall objective of WP4 within the framework of the S2R-OC-IP2-02-2015 IT virtualisation of testing environment project is to prove the lab testing capabilities of the architecture and methods proposed in WP2 and WP3, by:

- defining a demonstration plan and a set of common test specifications

- carrying out 2 basic demos

- collecting and analysing test results

- drafting a test report and conclusions

To fully achieve this objective, WP5 also defined some requirements and guidelines, which are included in section 4.6.

Three sets of tests are defined, according to the objective pursued:

- Pre-testing campaign: interface, communication and synchronisation tests

- Operational test campaign: execution of selected operational scenarios

- Repeatability test campaign: repetition of a dedicated scenario during one day

The test campaign involves onboard units developed by two different suppliers and two lines, in Spain and Italy, equipped by different suppliers. Furthermore, it involves four types of facilities independent from any supplier, with a view to ensuring transparency regarding the testing process and results.

The different actors and resources involved in the VITE demonstration, as well as the functions assigned to each one, are:

- OB laboratories: MULTITEL and RINA

- Trackside laboratories: CEDEX and RFI

- Revision and evaluation companies: the rest of the WP4 partners

The following connections are foreseen:

Figure 2: Demo overview


3 PRE TESTING CAMPAIGN

3.1 Specific objectives

The specific objective of the pre-testing campaign is to check that the architecture defined is detailed enough to allow for a smooth and timely connection between laboratories.

3.2 Methodology

In order to test the established connection between laboratories and to obtain a fluid and stable testing campaign, it has been agreed to carry out a pre-testing phase. These proof-of-concept tests shall include interface, communication and synchronisation tests in order to assess the proposed VITE architecture, as well as to prepare the operational test campaign.

The tests will be carried out in the four labs; some tests apply only to an onboard lab or to a trackside lab.

3.3 Test specification

The following tests are proposed:

Integration tests between the onboard unit and the laboratory: the following checks will be performed:

1. Check TIU Adaptor

2. Check Odometry

3. Check Balise Interface

4. Check Service Brake

5. Check Emergency Brake

6. Check Main Switch Off

7. Check Pantograph

8. Check Air Tightness

9. Check Inhibit regenerative brake

10. Check Inhibit eddy current brake

11. Check Inhibit magnetic shoe brake

12. Check Radio Session management

13. Check Radio Link Supervision

The detailed specification of each of these tests is included in the internal documentation for the onboard labs (MULTITEL and RINA).
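For illustration only, a checklist like the one above can be driven by a small table-based runner that records an OK/NOK verdict per item. The check names below mirror the list, while the lambdas are placeholders: the real pass/fail logic is specified in the labs' internal documentation.

```python
from typing import Callable, Dict

def run_checks(checks: Dict[str, Callable[[], bool]]) -> Dict[str, str]:
    """Execute every check in order and record an OK/NOK verdict per check."""
    results = {}
    for name, check in checks.items():
        try:
            results[name] = "OK" if check() else "NOK"
        except Exception:
            results[name] = "NOK"  # a crashing check counts as failed
    return results

# Placeholder checklist mirroring the onboard-lab integration checks above;
# the verdict logic here is hypothetical stand-in code, not the labs' real tests.
ONBOARD_CHECKS: Dict[str, Callable[[], bool]] = {
    "TIU Adaptor": lambda: True,
    "Odometry": lambda: True,
    "Balise Interface": lambda: True,
}
```

Keeping the checklist as data makes it easy for each lab to substitute its own verdict functions without touching the runner.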

Integration tests between trackside equipment and the laboratory – For the trackside part, the issues to be taken into account are the project data and the IXL/RBC connection to the laboratory test bench.

In both trackside labs the RBC to be used for the demo is a real RBC.


The detailed specification of these tests is included in the internal documentation for the trackside labs (CEDEX and RFI).

Labs integration tests – To test the correct connection between the labs, it has been agreed to perform at least the following checks:

1. Check communication between the simulation modules in both labs.

2. Check the bridge for the Euroradio communication.

3. Check the syntax of the messages exchanged between TCL and OBU ADAPTOR as specified by Subset-111-2 with the developed tool [8].

4. Check remote display of the OB LAB simulation modules (Balise Transmission Simulator, Train Simulator and DMI interactive module) to enable control of the simulation from the TR LAB.

5. Perform an RBC connection and disconnection test.

6. Perform one movement across the line, including SoM and reception of an MA.
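Check 1 above, verifying that the simulation modules in the two labs can reach each other, can be sketched as a simple TCP reachability probe. This is a minimal sketch: the hostnames and ports below are placeholders, and the real endpoints come from each lab's own configuration, not from the VITE specifications.

```python
import socket

# Hypothetical endpoints for the remote simulation modules (assumed names,
# not taken from the project documentation).
ENDPOINTS = {
    "trackside_sim": ("tr-lab.example.org", 5001),
    "onboard_sim": ("ob-lab.example.org", 5002),
}

def check_link(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to the remote module can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pre_test_report(endpoints: dict) -> dict:
    """Reachability verdict (OK/NOK) per configured endpoint."""
    return {name: "OK" if check_link(h, p, timeout=2.0) else "NOK"
            for name, (h, p) in endpoints.items()}
```

Such a probe only establishes basic reachability; checks 2 to 6 exercise the actual application-level protocols on top of it.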

3.4 Execution, analysis and results

The tests will be performed by the corresponding lab personnel and the results of the tests will be detailed in deliverable D4.2.

Once this pre-test phase has been completed and it has been verified that there are no issues with the laboratory connections, the operational test campaign can be launched.

3.5 Evaluation

The results from the pre-testing in WP4 shall establish that the architecture defined in WP3 allows for a smooth and timely connection between labs or otherwise provide some recommendations for improvement of the architecture proposed.


4 OPERATIONAL TEST CAMPAIGN

4.1 Specific objectives

The specific objectives of the operational test campaign are:

- To check the effectiveness of the architecture and related interfaces developed in WP3, including the remote testing approach, in meeting the actual expectations of the railway sector. In particular:

o Configuration management shall be integrated within the lab architecture.

o The equipment tested in the laboratory shall be the real equipment installed on site. The additional equipment necessary to perform the tests shall be as similar as possible to the systems installed on site; this can include virtual equipment if it is demonstrated that the conditions are analogous to the on-site conditions.

o Before the execution of any test campaign, the specific laboratory shall present a document containing the technical limitations associated with the execution of the tests. This document shall be annexed to the test report.

o Test results shall be comparable and therefore need to be identified in the harmonised interfaces.

o The laboratory test report should include not only what has been tested but also how it has been tested, to increase confidence in the results. This includes the configuration information mentioned before (e.g. SW release notes), the number of tries in a test, traceability to the requirements, etc.

o Minimising the effect of human error on the testers, by automating the tests, will contribute to the acceptance of the test results by all stakeholders.

- To gather data for the uncertainties analysis.

4.2 Methodology

This test campaign is composed of 2 basic demos connecting RINA, RFI, CEDEX and MULTITEL in groups of 2, involving the lines Treviglio – Brescia (RFI – RINA) and Venta de Baños – Burgos (CEDEX – MULTITEL).

The main equipment involved in the demonstration CEDEX-MULTITEL is:

Trackside equipment (RBC and IXL simulators) available at the Laboratory of Cedex in Madrid (Spain). The Trackside equipment used will be according to the Base Line (BL) 2.3.0d of the ETCS Specification.

On-Board equipment (On-Board Unit) located at the Laboratory of Multitel in Mons (Belgium). The On-Board unit used will be according to the Baseline 3 of the ETCS Specification;

The main equipment involved in the demonstration RFI-RINA is:

Trackside equipment (RBC and IXL simulators) available at the Laboratory of RFI in Rome (Italy). The Trackside equipment used will be according to the Base Line (BL) 2.3.0d of the ETCS Specification.

On-Board equipment (On-Board Unit) located at the Laboratory of RINA in Genoa (Italy). The On-Board unit used will be according to the Base Line 3 MR1 (3.4.0) of the ETCS Specification.

4.3 Test specification

The selection of the OTCs comes from the work performed by INECO and DICEA as a contribution to WP2. From this input, 13 categories of test cases have been selected as a priority for execution during the demo campaign.


Category Functionality

1 OS Protection Mode transition. Verify that the EVC switches modes from FS/SR to OS and that the start location and the length of the OS area are defined according to the infrastructure requirements.

2 OS Protection Mode transition. Exits from On Sight to FS/SH at a signal.

Transition to SH mode ordered by the RBC.

3 Override Mode transition from OS/FS to TR when performing a SPAD at an EoA.

4 Override Override with authorization in FS/PT/OS/TR mode. Verify the "Override" function is available.

5 Degraded Scenarios Unconditional emergency stops.

6 Degraded Scenarios Conditional emergency stops.

7 Degraded Scenarios Emergency stop revocation.

8 Degraded Scenarios Co-operative shortening of MA.

9 Degraded Scenarios Shortened MA.

10 Movement authority description Supervision of the static speed profile with the train running from the beginning to the end of the line at maximum speed.

11 Movement authority description V_LoA

12 Movement authority description Danger point (including release speed) information is defined correctly when the EoA ends at a signal or a buffer stop.

13 Movement authority description MA after a SoM with known position.

Table 1. Test cases proposed

4.3.1 Test specification for Demo CEDEX – MULTITEL

For the specific demo carried out between CEDEX and MULTITEL, the following list of test cases has been selected. In addition to the test cases mentioned in the table above, the list has been completed with 16 OTCs that, according to the experience of ADIF and RENFE, were considered necessary to complete a testing campaign.

With this list of test cases, 10 scenarios have been created in which the tests are grouped together by concatenating the different events.

The scenarios are included in document [12] VITE-WP4-INE-TEC-001-v1.0_Scenarios for CED-MUL demo.

OTC ID Test Case description Scenario Category

2.8.15 Mode transition from OS to FS at a main light signal

Scenario 9 OS protection (1)

2.10.2 Mode transition from FS to OS at a further location. The driver acknowledges the request before reaching the OS area.

Scenarios 5 and 9 OS protection (2)

2.10.3 Mode transition from FS to OS at a further location. The driver does not acknowledge the request of OS mode.

Scenario 6 OS protection (2)


2.13.2.1 Perform a SPAD at an EoA. Mode transition from OS to TR at stop light signal.

Scenario 5 Override (3)

2.22.6.4 Perform a SPAD at an EoA when the balise group at the main light signal is missed.

Scenario 6 Override (3)

2.13.1.1 Perform a SPAD at an EoA. Mode transition from FS to TR at stop light signal

Scenario 5 Override (3)

2.14.1.1 Override with authorization. FS mode. The radio communication session is established with the RBC.

Scenario 3 Override (4)

2.14.2.1 Override with authorization. OS mode. The radio communication session is established with the RBC.

Scenario 6 Override (4)

2.19 Unconditional emergency stop and movement authority is revoked.

Scenario 8 Degraded Scenarios (5 and 7)

CN4 Conditional Emergency Stop when the train has occupied the track circuit.

Scenario 5 Degraded Scenarios (6)

29.1 Cooperative shortening of MA. The train accepts the new MA

Scenario 8 Degraded Scenarios (8)

2.3.1.1 MA update in FS mode. New EoA at a light signal.

Scenario 3 Degraded Scenarios/Shorten MA (9)

2.1.2 Static speed profile supervision

Scenarios 2 and 10 Supervision / MA description (10)

2.15.1.3 Level transition from L2 to L0. Scenario 1 Level transition/ MA description-V_LOA (11)

2.15.1.3 (closed signal) Level transition from L2 to L0. The light signal at the transition border is displaying a stop aspect.

Scenario 8 Level transition/ MA description-V_LOA (11)

2.22.6.1 Supervision of the release speed and distance to danger point in FS mode. Fixed value given by trackside. Normal conditions.

Scenario 5 Supervision/ MA description (12)

2.8.7 SoM in SB mode. The train is in front of a light signal and the location information is valid.

Scenario 3 SoM (13)

2.6.1.2 Management of TSR information sent by the RBC in level 2. FS mode.

Scenario 4 TSR

2.6.3.2 Management of TSR information sent by the RBC in level 2. OS mode.

Scenario 5 TSR

2.6.5.2 Management of the overlapping TSR information sent by the RBC in level 2.

Scenario 3 TSR


2.8.8 SoM in SB mode. The train is in front of a light signal and without train location information.

Scenario 1 SoM

2.8.14.1 Mode transition from SR to FS at a main light signal.

Scenarios 3 and 7 Staff Responsible

2.9.1 Mode transition from FS to SB Scenario 3 EoM

2.12.1.2 Mode transition from FS to SH selected by the driver.

Scenario 7 Shunting

2.12.2.6.1 Mode transition from OS to SH at a further location ordered by trackside. The driver acknowledges the request before reaching the SH area.

Scenario 4 Shunting

2.13.3.1 Movement protection in SR ("Stop if in SR")

Scenario 3 Staff Responsible

2.15.1.4 Level transition from L0 to L2. Scenario 2 Level transition

2.15.1.4 (closed signal) Level transition from L0 to L2. The light signal at the transition border is displaying a stop aspect.

Scenario 7 Level transition

2.23.1 Start and end conditions of the text messages given by trackside. Approaching powerless section. FS mode.

Scenario 1 Text message

2.26.21 Supervision of the national values transmitted from the trackside.

Scenario 10 Rules for balises (linking)

2.32.10.1 Expiration of T_NVCONTACT with successful attempt to set-up safe connection.

Scenario 6 Degraded/National values

18.3.2 Management of the default balise information. LEU disconnection.

Scenario 9 Degraded Scenarios

22.3 Track conditions: verification of track conditions in the complete line

Scenarios 2 and 10 Track conditions

Table 2. Test cases for CEDEX-MULTITEL campaign (ADIF)

4.3.2 Test Specification Demo RINA – RFI

For the specific demo to be carried out between RINA and RFI, the following list of test cases has been selected to match the 13 categories in section 4.3. In addition, test cases 15, 9 and 36 have been added as a recommendation from RFI.

Category Functionality OTC ID (RFI)

1 OS Protection Mode transition. Verify that the EVC switches modes from FS/SR to OS and that the start location and the length of the OS area are defined according to the infrastructure requirements.

ST_MEC_03_L290_F2

2 OS Protection Mode transition. Exits from On Sight to FS/SH at a signal.

Transition to SH mode ordered by the RBC.

TBD

3 Override Mode transition from FS to TR when performing a SPAD at an EoA.

TBD

4 Override Override with authorization in FS/PT/OS/TR mode. Verify the "Override" function is available.

TBD

5 Degraded Scenarios Unconditional emergency stops.

ST_MEI_01_L290_F2

6 Degraded Scenarios Conditional emergency stops. ST_MEC_01_L290_F2

ST_MEC_01_L290_F1

7 Degraded Scenarios Emergency stop revocation. ST_MEI_02_L290_F1

ST_MEI_02_L290_F2

8 Degraded Scenarios Co-operative shortening of MA. TBD

9 Degraded Scenarios Shortened MA. TBD

10 Movement authority description Supervision of the static speed profile with the train running from the beginning to the end of the line at maximum speed.

ST_MA-FS_01_L290_F1

ST_MA-FS_01_L290_F2

ST_MA-FS_02_L290_F1

ST_MA-FS_02_L290_F2

ST_SOM_03_L290_F2 (valid)

ST_SOM_03_L290_F1 (valid)

ST_SOM_04_L290_F2 (valid)

11 Movement authority description V_LoA

12 Movement authority description Danger point (including release speed) information is defined correctly when the EoA ends at a signal or a buffer stop.

13 Movement authority description MA after a SoM with known position.

Additional proposal

TSR ST_TSR_01_L290_F2

Additional proposal

Level transition ST_LT/L2_01_L290_F2

ST_LT/L2_01_L290_F1

ST_L2/LT_02_L290_F2

ST_L2/LT_02_L290_F1

Additional proposal

Degraded scenarios TBD

Table 3. Test cases for RFI-RINA campaign (RFI)

The test case descriptions are proprietary to Ansaldo/RFI.


4.4 Execution, analysis and results

The corresponding lab personnel will perform each demo test campaign in cooperation with the other members of WP4. Once the test campaign has been executed, the WP4 partners will analyse the results. The results obtained in the test campaign will be detailed in deliverable D4.3.

These results will contribute also to the uncertainties methodology, for both approaches:

- A priori approach: by gathering and analysing the collected data in the test feedback form for each scenario.

- A posteriori approach: with the results from the operational test cases.

After the execution and analysis of the results, the participating members of WP5 can complete the assessment of the demonstration results obtained within WP4 (see section 4.6).

4.5 Evaluation

The evaluation of the results will be done with a focus not on the product, for which the importance of the test campaign would lie in the OK/NOK outcome from an operational and functional point of view, but on the requirements established in WP2.

4.6 Assessment from Notified Bodies

Even though the test campaigns defined are out of the scope of the NoBo assessment, as they are not related to compliance with the TSI requirements, the NoBos involved in the project and participating in WP5 intend to analyse the acceptability of these tests based on the strategy and methodology shown in the demonstrations.

Even though the laboratories involved are accredited against the ISO/IEC 17025 standard [10], this accreditation does not demonstrate the equivalence of the laboratory test conditions and configuration with those on site; therefore, ISO/IEC 17025 accreditation alone cannot be considered a sufficient criterion to accept the shift of tests from on site to the laboratory.

Taking that into account, demonstrating that the laboratory test conditions and configuration are fully controlled and are the same as those on site can increase the reliability of laboratory testing.

This demonstration is proposed to be carried out through the documentation generated by the laboratories at all stages of the test campaigns. Harmonising this documentation, both in terms of the compulsory documents to be produced and of their minimum content, can help all stakeholders to easily review, identify and understand the test environment, thereby increasing the reliability of laboratory tests. This documentation will thus collect and demonstrate the maturity, test monitoring and quality that can be reached by performing test campaigns in laboratories.

The following sections propose the set of documentation expected to be presented by the laboratories during and after the test campaigns in WP4. This documentation shall be understood as the minimum set to be generated, together with its minimum content.

4.6.1 Test platform documents

The platform shall be correctly configured and available prior to starting any test campaign. The platform's associated set of documents shall therefore collect all the information and steps needed to properly set up the test benches used during the test campaigns. This includes a comprehensive description of the test benches, a detailed test description including the protocol to be followed to check the correct functioning of the platform, and the associated test report.


Therefore, the validation of the platform shall be demonstrated by, at least, issuing the following documentation:

Platform description, which shall contain the full architecture in terms of hardware and software, including the tools used for the data conversion. All the components and tools involved shall have an associated passport document demonstrating their control and validation. This shall particularly include a detailed description of all the platform-DUT adaptors required for the test implementation.

The version of the data used for the configuration of the platform shall also be part of this document: for instance, the version of the track plan, the list of balises, the SSP or the gradients.

In case it is not possible to simulate the real environment, the limitations shall also be identified as thoroughly as possible in this documentation. These limitations shall be compared against the campaign needs, in order to conclude on their “fitness for purpose” and their null impact on the results to be reached.

Platform test description, which is the protocol defining the tests to be carried out to verify the correct configuration and integration of the platform, necessary before starting the tests. This document shall include the version of the "Platform description" document to which it is linked.

NOTE: This document is the same as the one presented in [11] as “Integration/Tuning test scenarios” (§ 5.3.4.3).

Platform test reports, with the results of the tests performed compared against the expected results, and a reference to the test log (if any). This document shall reference the version of the "Platform test description" document used for the execution of the tests.

NOTE: This document is the same as the one presented in [11] as "Integration/Tuning test results" (§ 5.3.5).

Platform Release Note, which shall collect and link the versions of the previous documents: the platform description, the platform test description(s) and the platform test report(s).

Figure 3. Platform Release Note content

4.6.2 Traceability matrix

In order to verify that the tests carried out cover the target defined for a specific campaign, traceability shall be established between the requirements to be checked and the tests to be performed. The easiest way to demonstrate this traceability is a matrix that collects all the requirements of the campaign and correlates them with the test activities. 100% coverage ensures that the execution of the tests demonstrates compliance with the campaign requirements.

The requirement validation table shall contain, at least:



Information to identify the requirements, for example:

o Requirement identifier (e.g., paragraph identifier in the normative reference);

o The reference and version of the document from which the requirement has been captured.

The validation/verification reference, which can be the identifier of a test case where the requirement is proved, the protocol under which the requirement is tested, etc.

This table shall be reviewed whenever the protocols or the requirement sources are updated.
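The 100% coverage criterion described above can be checked mechanically. The following minimal Python sketch is illustrative only; the requirement identifiers and test-case names are hypothetical:

```python
# Minimal sketch of a traceability matrix and its coverage check.
# Requirement IDs and test-case identifiers are hypothetical examples.

requirements = {"REQ-001", "REQ-002", "REQ-003"}

# Maps each campaign requirement to the test case(s) that validate it.
traceability = {
    "REQ-001": ["TC-01"],
    "REQ-002": ["TC-02", "TC-03"],
    "REQ-003": [],  # not yet covered by any test
}

covered = {req for req, tests in traceability.items() if tests}
coverage = len(covered) / len(requirements) * 100

print(f"Coverage: {coverage:.0f}%")  # prints "Coverage: 67%" for this example
for req in sorted(requirements - covered):
    print(f"Uncovered requirement: {req}")
```

A campaign would only be considered fully traced when the uncovered set is empty, i.e. coverage reaches 100%.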

4.6.3 Test description/protocol

The test description/protocol shall contain the description of the steps needed to perform the tests and the expected results.

The granularity of this description is chosen by the laboratory, but it shall at least allow all the steps required to execute and correctly reproduce the test to be identified unambiguously.

The test description shall also define the initial conditions and the expected result, together with the input documents (reference and version) used for its elaboration.

4.6.4 Test report

The test report shall summarize the results obtained after completing the steps defined in the associated test description/protocol.

Test reports shall contain:

A reference to the test description (see § 4.6.3);

A reference to the applicable platform configuration (i.e., platform release note, § 4.6.1);

The result of configuration verification, necessary before starting tests (if any);

The results of the tests performed, compared against the expected results, with a reference to the test log (if available).

The results shall be summarized in a systematic and synthetic way, for instance as a table indicating a positive (OK) or negative (NOK) result. Each NOK can be linked to an open point or change request, which shall have an identifier; this identifier can also be added to this document.

NOTE: This document is the same as the one presented in [11] as “Execution test report” (§ 5.4.5).
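As an illustration of the OK/NOK summary table described above, the following sketch uses hypothetical test identifiers and a hypothetical change-request ID:

```python
# Illustrative OK/NOK summary of test results, as described above.
# Test identifiers and the change-request ID are hypothetical.

results = [
    {"test": "TC-01", "verdict": "OK",  "issue": None},
    {"test": "TC-02", "verdict": "NOK", "issue": "CR-042"},  # linked change request
]

for r in results:
    issue = f" (open point: {r['issue']})" if r["issue"] else ""
    print(f"{r['test']}: {r['verdict']}{issue}")
# prints:
#   TC-01: OK
#   TC-02: NOK (open point: CR-042)
```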


4.6.5 Release Note

A Release Note shall be generated at the end of the test activities. Its target is to link the version and reference of the documents involved in a given test campaign, as a final snapshot of the validation activities carried out. These documents are, at least, the Platform Release Note, the traceability matrix, the test description(s) and the test report(s).

Figure 4. Release Note content
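The linking role of the Release Note can be illustrated as a simple record of document references and versions. All references and version numbers below are hypothetical:

```python
# Sketch of a Release Note as a record linking the versions of the
# documents it collects. References and versions are illustrative only.

release_note = {
    "Platform Release Note": {"ref": "PLAT-RN-01", "version": "1.0"},
    "Traceability matrix":   {"ref": "TRACE-01",   "version": "1.1"},
    "Test description":      {"ref": "TD-01",      "version": "2.0"},
    "Test report":           {"ref": "TR-01",      "version": "1.0"},
}

for doc, info in release_note.items():
    print(f"{doc}: {info['ref']} v{info['version']}")
```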



5 REPEATABILITY TEST CAMPAIGN

5.1 Specific objectives

The specific objectives of this test campaign are to gather data for the uncertainty methodology and to assess whether conclusions can be drawn regarding the repeatability of the results obtained in different, independent laboratories.

5.2 Methodology

In order to determine which tests are candidates to be performed in the laboratory rather than on site, based on the quantification of test uncertainty indicators, some functionalities will be repeated during the test campaign.

5.3 Test specification

Of the 13 functionalities prioritised for testing in the VITE project, this exercise will focus on two. These functionalities will be tested through the operational test cases (OTCs) shown below during one day in the lab:

Category | Functionality

On-Sight Protection | Mode transition. Verify that the EVC switches modes from FS/SR to OS and that the start location and the length of the OS area are defined according to the infrastructure requirements.

On-Sight Protection | Mode transition. Exits from On Sight are to FS/SH at a signal. Transition to SH mode ordered by the RBC.

Override | Mode transition from SH/FS to TR when performing a SPAD at an EoA.

Override | Override with authorization in FS/PT/OS/TR mode. Verify that the "Override" function is available.

Table 4. Test cases for Repeatability test campaign

5.3.1 Test specification for Demo CEDEX – MULTITEL

CEDEX/MULTITEL to choose a test case from the categories above.

5.3.2 Test Specification demo RINA – RFI

RINA/RFI to choose a test case from the categories above.

5.4 Execution, analysis and results

The tests will be performed by the corresponding lab personnel and the results of the tests will be detailed in deliverable D4.3.

These results will contribute to the uncertainties methodology, for both approaches:

- A priori approach: by gathering and analysing the data collected in the test feedback form for each scenario. The information will be limited, as only 1 or 2 scenarios are foreseen for this campaign.

- A posteriori approach: with the results from the executed operational test cases.
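As a toy illustration of the a posteriori approach, repeatability between two laboratories could be quantified by comparing the verdicts of the same operational test cases. Laboratory names, test identifiers and verdicts below are hypothetical:

```python
# Toy illustration of a repeatability check between two laboratories.
# Lab names, OTC identifiers and verdicts are hypothetical examples.

lab_a = {"OTC-OS-1": "OK", "OTC-OVR-1": "OK"}
lab_b = {"OTC-OS-1": "OK", "OTC-OVR-1": "NOK"}

# Test cases on which both laboratories agree.
agreements = [t for t in lab_a if lab_a[t] == lab_b.get(t)]
print(f"Agreement: {len(agreements)}/{len(lab_a)} test cases")
# prints "Agreement: 1/2 test cases"
```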


5.5 Evaluation

The results will be evaluated against the test campaign objectives, assessing whether:

- quality data is collected that contributes to the uncertainty methodology;

- the results contribute to the quantification of assessment uncertainty, i.e. confidence in the repeatability of the results.

END OF DOCUMENT