
TG MTS Model Test Suite v2.0

Wim Devos, Pavel Milenov, Slavko Lemajic, Paolo Isoardi

Control and management of agricultural land in IACS workshop

Baveno, 23-25 May 2016

1

Outline

1. Origins: from ATS over Model Based Testing to MTS v2.0

2. Parts: the scope and content

3. Operation: how do the tests work?

4. Delivery

2


Part 1: ORIGINS

3

LPIS Workshop: Applications and Quality, 6-8th October 2009, Tallinn

Objective

[Diagram: ISO 19105 conformance testing workflow]
- Model Conformance Test: Conformance Statement (ICS) + Abstract Test Suite (ATS)
- Data Conformance Test: Executable Test Suite (ETS) + Additional Information for Testing
- Input: the Application Schema or Feature Catalogue of the implementation under test
- Output: Conformance Test Report and analysis of results

ISO 19105 Conformance and testing:
- conformance: fulfilment of specified requirements
- conforming implementation: implementation which satisfies the requirements

4


LPIS Workshop: Applications and Quality, 6-8th October 2009, Tallinn

ATS Structure

Abstract test suite structure, modules and tests, e.g.: module A.1.1 can be assigned the 'Conformant' value if one of the tests A.1.1.1.1, A.1.1.1.2, A.1.1.1.3 or A.1.1.2.1 is 'Conformant' (sketched below).

Test purpose: verify the definition of the reference parcel: a reference parcel demarcated by the farmer who cultivates it (manages/executes his tenure rights: ownership, rent, etc.) on a multi-annual basis.

Test method: inspect the documentation of the application schema or feature catalogue; verify consistency with the ICS.

ATS NOTE: Conformant with the Farmer's block definition.

Test reference: LCM specification

Test type: Basic test

[Diagram: test module tree]
A.1.1 Basic test
├─ A.1.1.1: A.1.1.1.1, A.1.1.1.2, A.1.1.1.3
└─ A.1.1.2: A.1.1.2.1

5
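As a minimal illustration of that OR-aggregation rule, a short Python sketch; this is not part of TG MTS, and the verdict table is invented:

```python
# Minimal sketch of the module verdict rule above: module A.1.1 receives
# 'Conformant' if at least one of its leaf tests is 'Conformant'.
# Test identifiers come from the slide; the data structure is illustrative.

leaf_verdicts = {
    "A.1.1.1.1": "Fail",
    "A.1.1.1.2": "Conformant",
    "A.1.1.1.3": "Fail",
    "A.1.1.2.1": "Fail",
}

def module_verdict(verdicts):
    """Return 'Conformant' if any leaf test verdict is 'Conformant'."""
    return "Conformant" if "Conformant" in verdicts.values() else "Fail"

print("A.1.1:", module_verdict(leaf_verdicts))  # -> A.1.1: Conformant
```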

Baveno 2015

ABSTRACT & EXECUTABLE TEST CASES

[UML activity diagram: "act Prepare MTS executable test cases within ETS suite"]
Actor: MS testing laboratory; item under inspection: the Integrated Administration and Control System ("Perform MTS").
Between "Start preparing executable test cases" and "Finish preparing executable test cases", the laboratory:
- retrieves all abstract test cases («FeatureType» Base types::AbstractTestCase)
- retrieves the eligibility profile documentation («FeatureType» Quality assessment framework::MtsItemUnderInspection)
- prepares, with «trace» links from the executable test cases («FeatureType» Base types::ExecutableTestCase) to the abstract ones:
  - schema definition tests
  - schema mapping tests
  - extended code list tests
  - encoding tests
  - feature type completeness tests
  - attribute / association completeness tests
  - multiplicity tests
  - value type tests
  - constraint tests

BASIS GML...

6


Baveno 2015

MODEL BASED TESTING MESSAGES

MTS outputs: LOG of conformance assessment
FAIL: The source element (x) does not have a corresponding target element.
FAIL: The type of target element (x) does not match the type of source element (x).
FAIL: The target element (x) does not have documentation.
FAIL: The multiplicity of target element (x) does not match the multiplicity of source element (x).
....

TESTING REPORT of conformance testing.... Overall testing report:
- Application schema: (FAIL/PASS)
- Leaf: (FAIL/PASS)
- Feature type: (FAIL/PASS)
- Type: (FAIL/PASS)
- Data type: (FAIL/PASS)
- CodeList: (FAIL/PASS)
- Union: (FAIL/PASS)
- Enumeration: (FAIL/PASS)
- Property (classes): (FAIL/PASS)
- Relationships: (FAIL/PASS)
- Generalization: (FAIL/PASS)
- Association: (FAIL/PASS)
- Aggregation: (FAIL/PASS)
- Composition: (FAIL/PASS)
- Property (relationships): (FAIL/PASS)
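A hypothetical sketch of how such log lines could be produced when comparing a source model element against its mapped target; the element fields and the checks are illustrative assumptions, not the actual MTS implementation:

```python
# Illustrative sketch (not the actual MTS code): compare source model
# elements to their mapped target elements and emit FAIL messages in the
# style shown above. The element fields are assumptions for the example.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelElement:
    name: str
    type_name: str
    multiplicity: str
    documentation: str = ""

def check_mapping(source: ModelElement, target: Optional[ModelElement]) -> list:
    log = []
    if target is None:
        log.append(f"FAIL: The source element ({source.name}) does not have a corresponding target element.")
        return log
    if target.type_name != source.type_name:
        log.append(f"FAIL: The type of target element ({target.name}) does not match the type of source element ({source.name}).")
    if not target.documentation:
        log.append(f"FAIL: The target element ({target.name}) does not have documentation.")
    if target.multiplicity != source.multiplicity:
        log.append(f"FAIL: The multiplicity of target element ({target.name}) does not match the multiplicity of source element ({source.name}).")
    return log

src = ModelElement("referenceArea", "Area", "1..1", "Maximum area for payment")
tgt = ModelElement("GR_AREA", "Decimal", "0..1")  # undocumented, wrong type
for line in check_mapping(src, tgt):
    print(line)
```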

7

Baveno 2015 WS and later

1. during the WS:� lot’s of feasibility concerns expressed by MS

� three volunteers: DE, HU and ES

2. meeting in ISPRA (April 2015):� Result of tests by the volunteers

� Main issues:

• Eligibility profile: catalogue of LC types > set of all system choices,

• MTS scope too broad: MMU, EFA

• LCM bases tests might require database redesign or SW recoding

• Implementation: software, user interface, cost-efficiency,

multilingual support

3. formal request to downsize UML on TG ETS (June 2015)4. ….and then everybody became busy (Autumn 2015+).

8


Part 2: PARTS

9

Information on implementation choices: TG IXIT
(implementation extra information for testing)

Why TG IXIT?

The ETS inspection procedure was conceptually quite simple:
1. Detect critical defects
2. Map the agricultural land of the RP that is physically visible
3. Compare observations with SUT records
4. Perform pass/fail tests for 5 quality expectations

But over time, at the request of the MS, a number of design dependencies were introduced in the procedure for:
- CD application
- waiver use
- contamination

Module A111 was no longer sufficient; a structured approach was needed.

10


Data model test

Why data model tests? To support the correct mapping of the SUT to the TG ETS documentation:
1. The scoping query of the RP population (sketched below)
   - "reference parcel"
   - "non-zero MEA"
   - "declared last year for direct payment"
2. The comparison of ETS observations
   - "recorded referenceArea"
3. The correct reporting and data exchange
   - lpisPointZerostate.gml
   - lpisPolygonZerostate.gml
   - updateEvidence.xml

If not done correctly: comparing apples to pears.
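A minimal sketch of that scoping query, assuming a simple record layout; the field names are illustrative, not TG MTS identifiers:

```python
# Illustrative scoping of the RP population: keep reference parcels with a
# non-zero MEA that were declared last year for direct payment.
# Field names are assumptions for the example.

parcels = [
    {"id": "RP001", "mea_ha": 2.40, "declared_last_year": True},
    {"id": "RP002", "mea_ha": 0.00, "declared_last_year": True},   # zero MEA: out
    {"id": "RP003", "mea_ha": 1.10, "declared_last_year": False},  # not declared: out
]

rp_population = [
    p for p in parcels
    if p["mea_ha"] > 0 and p["declared_last_year"]
]
print([p["id"] for p in rp_population])  # -> ['RP001']
```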

11

Metadata

Why metadata?
- Not new; first initiative at the 2009 Sofia WS (start of the LPIS QA portal)
- Annual questionnaire of DG AGRI
- ECoA questionnaire

What is asked?
- Not the INSPIRE metadata record for cataloguing, but selected elements thereof
- What is needed for understanding the SUT during screening
- "LPIS" + supporting sources (ortho/DEM)

This is the more dynamic part of TG MTS; it forms a standalone report: metadata.xsd

12


Part 3: OPERATION

13

Performing the IXIT

What are the key Regulation choices (features) of an LPIS design?
- A: Who authors the primary RP boundary, the perimeter of the identified unit of land? This defines the reference parcel type.
- B: What process delineates the physical borders of the agricultural land (generates the MEA/referenceArea)?
- C: How are eligible landscape features (under GAEC) adjudicated?
- D: How are RPs assembled (optional further processing choices)?
- E: Is there pro rata reduction on PG with scattered ineligible features?
- F: Is there (Y/N) validation evidence of the required positional accuracy for each individual reference parcel perimeter?

These are answered by 6 poly(cho)tomy questionnaires, A to F, so each SUT is described by 6 adjectives (a sketch of such a record follows below).
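A minimal sketch of such a six-adjective record, assuming one free-text answer per questionnaire; field names and example values are illustrative:

```python
# Illustrative IXIT profile: one answer ("adjective") per questionnaire A-F.
# The field meanings follow the slide; names and example values are assumptions.
from dataclasses import dataclass

@dataclass
class IxitProfile:
    a_boundary_author: str      # A: who authors the primary RP boundary
    b_border_delineation: str   # B: process delineating the physical borders (MEA)
    c_landscape_features: str   # C: adjudication of eligible landscape features
    d_rp_assembly: str          # D: optional RP assembly choices
    e_pro_rata: str             # E: pro rata reduction on PG (yes/no)
    f_positional_evidence: str  # F: positional accuracy evidence (yes/no)

sut = IxitProfile(
    a_boundary_author="farmer",
    b_border_delineation="photo-interpretation",
    c_landscape_features="mapped individually",
    d_rp_assembly="none",
    e_pro_rata="yes",
    f_positional_evidence="no",
)
print(sut)
```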

14


Data model tests 1

Based on a model testing methodology with 3 abstract test cases:
1. featureTypeCompleteness: Is this spatial object represented?
2. attributeTypeCompleteness: Does it have this information element?
3. codeListEnumerationCompleteness: Does it respect conventional naming? Is the list complete?

"This spatial object", "this information element" and "naming" are abstract parameters. To obtain executable tests, they need to be appropriately rephrased in terms of both:
- the ETS data element (e.g. referenceParcel, referenceArea, CP)
- the SUT (is "it" a feature type, an attribute, a method (combination, query)?)

The description of the 41 ETS data elements is stored in:
- UML (online)
- an xls log file

(A sketch of rephrasing an abstract test case into an executable one follows below.)

15
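A minimal sketch of that rephrasing step, binding the abstract parameters to SUT-specific terms; the record fields echo the example records on the following slides, while the instantiation function itself is an illustrative assumption:

```python
# Illustrative instantiation of an executable test case from an abstract one:
# the abstract parameter ("this information element") is bound to an ETS data
# element and to the SUT's own name for it. Field names follow the example
# records later in this deck; the function itself is an assumption.

def instantiate(abstract_case_id, abstract_name, ets_element, sut_term, executable_id):
    return {
        "referencedAbstractTestCaseId": abstract_case_id,
        "referencedAbstractTestCaseName": abstract_name,
        "executableTestCaseId": executable_id,
        "executableTestName": f"check{ets_element[0].upper()}{ets_element[1:]}",
        "executableTestResultDescription":
            f'Does the SUT represent ETS element "{ets_element}" as "{sut_term}"?',
    }

test = instantiate(14, "attributeAndAssociationCompleteness",
                   "referenceArea", "ELG_AREAS.GR_AREA", 1412)
print(test["executableTestName"])  # -> checkReferenceArea
```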

Data model tests 2

To generate the SUT contribution to your 41 executable tests, revise your 2010 ATS archive package in terms of either:
- the SUT feature catalogue, or
- the SUT application schema

DG JRC already proposes 41 "starter" executable tests.

But then comes the tricky part:
- each starter may work "as is" for your SUT,
- but it may not work for your SUT, in which case you will need to rephrase that executable test to obtain the correct result.

If not:
- you could test pears disguised as apples > papple!
- your ETS results become unreliable, i.e. non-conforming!

16


Data model tests: an example

LCM refers to:
1. an area value
2. in hectares
3. as the maximum for the DP payment
≡ input for the IACS process

The LPIS 2012 answer describes:
1. an area value ✓
2. in hectares ✓
3. a polygon of eligible land ✗
≡ output of a mapping process

This is a partial match, and potentially incorrect, as it misses the reference to the DP application/payment.

Example (LCM):
  referencedAbstractTestCaseId: 14
  executableTestCaseId: 1412
  executableTestName: checkReferenceArea
  referencedAbstractTestCaseName: attributeAndAssociationCompleteness
  executableTestResultDescription: Feature class "referenceParcel" within LCM has attribute field "ReferenceArea". The value in this field corresponds to the maximum area for which an application can be made, and it is reported in hectares.

Example (LPIS 2012):
  referencedAbstractTestCaseId: 14
  executableTestCaseId: 1412
  executableTestName: checkReferenceArea
  referencedAbstractTestCaseName: attributeAndAssociationCompleteness
  executableTestResultDescription: Feature class "ELG_AREAS" within the SUT has attribute field "GR_AREA". The value in this field corresponds to the total geometric area of all polygons representing eligible lands (GAC compliant) for a given reference parcel, and it is reported in hectares.

Consequence of an incomplete match (a free adaptation from various real experiences)

1. How to scope (non-zero MEA)?
   - No problem if there are no other mechanisms between this eligible area and the pre-printed form/crosscheck/payment
   - But what if "eligible area" includes 2nd pillar conditions?
     - JRC: "wrong scoping"
     - MS: "no problem, because not available for payment"
     - JRC: sample size not respected (650 + 150 ≠ 800)

2. How to assess observed area conformity?
   - No problem if "eligible area" relates to all 1st pillar conditions
   - But in this case:
     - MS: "pre-printed form populated with last year's application area"
     - MS: "crosscheck done with intersection of AP and RP"
     - MS: "no problem because no impact on payment"
     - JRC: LPIS is not performing its basic roles at all!

18


Metadata collection setup

This is a data model test, but the reference descriptions are located in:
- not the LCM
- the INSPIRE Implementing Rules on metadata
- the INSPIRE Common Data Specification for orthoimagery (CP, DEM)
- ad hoc: ixitQualifier, sensorPlatform, demPlatform

Why different reference models and templates? The LPIS (perimeter (1) and/or border (2)) vector data is supported by ortho-imagery (3) and by a digital elevation model (4) (DEM/DSM).

Expectation: 2-4 or more metadata records in 2 data formats:
- vector
- raster/grid

19

Metadata content

For each named dataset (from the IR MD):
- temporalExtent/phenomenonTime: to which time period does it refer?
- resourceLocator: where does it reside?
- identifier: what is its content/role?
- lineage: what is the production line?
- spatialResolution: what scale or GSD?
- responsibleParty: who is behind it?
- metadataPointOfContact: whom to contact?

The regulations imply at least two metadata records:
1. LPIS: the vector dataset of polygons that hold MEA > 0
2. orthoimagery: the background(s) used during the assessment year

But applicable third-party geospatial data (cadastre / land cover / DEM) also need to be reported. (A sketch of such a record follows below.)
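A minimal sketch of one such metadata record using the element names above; all values are invented placeholders:

```python
# Illustrative metadata record for the LPIS vector dataset, using the
# element names listed above. All values are invented placeholders.

lpis_metadata = {
    "identifier": "LPIS reference parcel layer (MEA > 0)",
    "temporalExtent": "2016-01-01/2016-12-31",
    "resourceLocator": "https://example.org/lpis/2016",  # placeholder URL
    "lineage": "Digitised from 25 cm orthophotos; refreshed annually",
    "spatialResolution": "1:5000",
    "responsibleParty": "National Paying Agency (example)",
    "metadataPointOfContact": "lpis-qa@example.org",
}

for element, value in lpis_metadata.items():
    print(f"{element}: {value}")
```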

20


MTS dynamics

IXIT, the data model and the LPIS vector metadata relate to a single SUT (≡ lot) that is subject to the TG ETS:
- TO DO for 2016
- Until an upgrade, only temporalExtent will be dynamic through updates

Aerial imagery is often subject to a 3-year update cycle:
- one record per separate specification/lineage/contractor
- annual deliveries (and geographic extent) are not requested

Supporting and source data are independent:
- monitor the updates of these external sources

21

Part 4: 2016 DELIVERY

22


Typical MTS data volumes

TG IXIT:         6 values (outcomes)
Data model test: 41 values (local references/descriptions) + 41 x 4 codes (internal date/pass)
Metadata:        10 values/codes x 3 (vector/ortho/source) = 30
GRAND TOTAL:     6 + 41 + 30 ≈ 80 values

23

MTS package and update delivery

Deadlines: the MTS package is synchronous with the ETS reporting package.

Data exchange:
For 2016 (@31 Dec 2017), the 1st complete delivery has 2 options:
- either a package upload of the 3 xsd's (benefit: immediate validation by the portal)
- or 1 xls followed by manual entry in the LPIS QA portal (disadvantage: 2 edits)

After an upgrade of the SUT (@31 Dec 2018, 2019, 2020): package re-upload.

After an update of dynamic data values (@31 Dec 2018, 2019, 2020) for:
- the temporalExtent of the SUT
- a change of image specification/lineage/contractor
- a change of third-party metadata
update the affected values via the LPIS QA portal application.

24


Questions?

25