
Main Report L 1 2 PERFORMANCE EVALUATION - …intersoft-electronics.com/Downloads/Sample Report... ·  · 2016-07-11Level 1 & 2 Performance Evaluation Edition Date: 29 May 2015 IEAP_SAMPLE_L1-L2_Main


Edition: A Date: 29 May 2015 Status: Draft

Main Report

LEVEL 1 & 2 PERFORMANCE EVALUATION

NGSP S-Band PSR: DPR & (M)SSR: DTI529 - CBR


Level 1 & 2 Performance Evaluation Edition Date: 29 May 2015

IEAP_SAMPLE_L1-L2_Main 2/14

IMS-600-10

Revision 1.0: 23/05/2013


EXECUTIVE SUMMARY

As stated in EUROCONTROL (SUR.ET1.ST01.1000-STD-01-01 & SUR.ET1.ST03.1000-STD-01-01) and ICAO (Doc 8071 Vol. III), comprehensive and continuous Radar coverage of high quality and reliability is essential for the uninterrupted provision of Radar services and the application of specific Radar separation standards. Surveillance systems require sophisticated and permanent performance monitoring in real time as well as ongoing subsystem-level Performance Evaluations.

This System (Level 1) and Subsystem (Level 2) Performance Evaluation provides the results of the NGSP (CBR) ATC Radar as measured between 18th and 22nd May 2015. The CBR Radar is a co-located IE (M)SSR (DTI529) and PSR (DPR) System. The (M)SSR operates to 256NM and the PSR to 100NM.

The Level 1 & 2 analysis and results can be found at Annex A and Annex B, respectively.

The reference for performance analysis is bound to the IEAP Quality of Service (QoS). Similarly, the Commissioning Volume (CV) is also bound to the QoS. Therefore, analysis is conducted to 250NM for (M)SSR and 60NM for PSR. Further information pertaining to antenna back angles and flight levels assessed can be found at Annex A.

As this type of analysis and report has been conducted annually, the information provided can be used to track performance, improvements and degradations of the Radar over time. Further, this report acts as a reference and guide to areas on which Maintenance can focus for any necessary improvement. It also provides useful information and focus for OEM service personnel.

As a whole, the (M)SSR falls within specification and performs well, with the exception of a minor shortfall in the Level 1 RMS Range statistics. Level 2 analysis revealed that the performance of the encoder system requires attention, as the azimuth errors equate to more than 500 metres of positional error to the East and North-West. This mechanical measurement has been unobtainable during several previous opportunities, and it is understood that the unit itself is not responsible for the corrective maintenance activity required; however, the unit and/or maintenance authority should begin pushing for corrective action. Further, ‘jitter’ is also significant and will be adding unwanted strain to the interrogator’s ability to track aircraft.

The PSR is a safety-critical system for the management of Violation of Controlled Airspace (VCA) and for aircraft separation within the coverage volume. Results show the overall PSR performance to be approx. 4-6% lower than that achieved during the 2013 optimisation. While not a momentous degradation, it does highlight the requirement for continuous monitoring and improvement. For lower-level aircraft (below 1,500 ft) the performance was 6-8% lower. Annex A details the performance in two configuration states, both of which were satisfactory.

As with any Radar System, new or legacy, the key to providing Radar data of a high quality is ongoing maintenance and monitoring. Radar performance changes continually in real time. This can be a result of a failure, degradation or the environment (e.g. weather, haze, new obstructions, other RF sources, telecommunications such as 4G or PCs). A regime of preventative maintenance and training, in conjunction with classical Radar performance improvement through optimisation and performance assessment, is essential.


DOCUMENT CHARACTERISTICS

General

Level 1 & 2 Performance Evaluation

CBR

Edition: A

Edition Date: 29 May 2015

Status: Draft

Keywords: IEAP, RASS

Abstract: This is the report on the results of the RASS Performance Evaluation conducted

on the NGSP Radar between 18th and 22nd May 2015.

Contact Information

Author: Person1

Reviewed: Person2

QA/Release: Person3

Contact Person: Person1

Tel: +61 2 62419237

E-mail Address: [email protected]

Document Control Information

Document Name: IEAP_SAMPLE_L1-L2_Main_Draft A

Path:

Host System: Windows 7 Professional SP1

Software: Microsoft Word 2010


DOCUMENT CHANGE RECORD

Revision Date Reasons for change Pages Affected Approved by

A 29/05/2015 Draft release for comment All J. De Vries


ACRONYM LIST

ACP Azimuth Change/Count Pulses

ADS-B Automatic Dependent Surveillance - Broadcast

ARP Azimuth Reference Pulse

ASTERIX All-purpose Structured EUROCONTROL Surveillance Information Exchange

BLI Bench Level Instruction

CPI Coherent Processing Interval

DFE Doppler Filter Threshold Editing

DHM Data Handling Module

EDR Extended Data Recorder

HPD Horizontal Polar Diagram

IEAP IE Asia Pacific Pty Ltd

(M)SSR Monopulse Secondary Surveillance Radar

Pd Probability of Detection

PDD Product Description Document

PE Performance Evaluation

PRI Pulse Repetition Interval

PSR Primary Surveillance Radar

RASS Radar Analysis Support System

RASS-R Radar Analysis Support Systems – Real-time measurements

RASS-S Radar Analysis Support Systems – Site measurements

RCD Radar Comparator Dual

RCM Radar Comparator Mono

REX Receiver Exciter

RF Radio Frequency

RSR Relocatable Site Radar

SOC Standard Operating Conditions

SSR Secondary Surveillance Radar

STC Sensitivity Time Control

STDEV Standard Deviation

TCV Technical Commissioning Volume

ToD Time of Detection

ToR Time of Recording

U/S Unserviceable

UTC Coordinated Universal Time

VPD Vertical Polar Diagram



1. BACKGROUND

The customer performs annual Radar Sensor Performance Evaluations (PE) on the customer owned Air Traffic Control (ATC) Radar Sensors, using the RASS suite of measurement tools. The purpose of these Performance Evaluations is to determine the quality of surveillance data a Radar System produces and to obtain a snapshot of the operating state of the Radar's subsystems.

IE Asia Pacific Pty Ltd (IEAP) was contracted by customer to conduct the annual Performance Evaluation (PE) and report on the performance outcome of the CBR NGSP Radar Sensor between 18th and 22nd May 2015. Person1 and Person2 of IEAP performed this work, the analysis of the collected data and production of this report for the customer.

1.1. Current Situation

Prior to the commencement of the Performance Evaluation, an onsite briefing was held with the key stakeholders to determine the performance and any outstanding issues of the Radar System. During the meeting the following topics were covered:

Antenna Group

o Customer had replaced both Azimuth Drive Motors and Clutch assemblies, during the last annual maintenance visit.

o Local Maintainers expressed concern about the PSR performance, specifically the impact of the lightning protection on the recently constructed stairs.

Primary Surveillance Radar (PSR)

o A recommendation from the previous report was to link the maps to the correct files. Local Maintainers advised this had been done and that an instruction (BLI) was put in place to ensure it was checked regularly, including after processor reboot.

o Local Maintainers advised that Permanent Echo #1 specifics were missing from the system. One possible replacement had been identified, though not implemented, as Local Maintainers were unsure whether such a change was within their delegated authority. IEAP advised that the SOC document details the level of authority the unit holds with regard to Permanent Echo adjustments.

o Local Maintainers advised that no optimisation had occurred since the previous PE.

(Monopulse) Secondary Surveillance Radar ((M)SSR)

o The (M)SSR instability highlighted in the previous PE report has been rectified through the replacement of both Interrogators with the later version.

Site Communication and Data Interface (SCDI)

o Local Maintainers advised that since the (M)SSR replacement, the processors are not combining all PSR targets with (M)SSR targets. The same anomaly exists for the CBR NGSP Radar. Raytheon are aware of the issue and have a solution awaiting rollout.

o Local Maintainers advised that instability issues have been rectified by replacing the Aurora I/O card.

Plant equipment

o No issues reported.


1.2. Test Methodology

The tests conducted in a Performance Evaluation are detailed in the IEAP Radar Performance Evaluation Product Description Document. These tests have been identified through consultation with IE staff to provide a measure of the quality of surveillance data produced by the Radar System and a top down analysis of the (M)SSR, PSR and mechanical subsystems.

Each test is performed in accordance with the procedures described in the current NGSP RASS BLI.

In order to maximize the reliability of the real-time analysis conducted, certain criteria for data collections need to be met.

The following table highlights the minimum criteria for data collection as set out by ICAO (Doc8071 Vol3) and EUROCONTROL (SUR.ET1.ST01.1000-STD-01-01 and SUR.ET1.ST03.1000-STD-01-01).

Minimum Duration:
- High density traffic area (en-route or major TMA): 1 hour*
- Medium density traffic areas: 2 hours*
- Low density traffic areas: 4 hours*
* Must include at least 50,000 data samples

Minimum Quantity of Data:
- Probability of Detection: 200 chains, >5 minutes per chain
- Accuracy Analysis: 150 chains
- Systematic Error Estimation: 200 chains, >5 minutes duration in cover of >2 Radars

System Configuration: Normal operational configuration for prevailing traffic and environment

Environment, Weather: Data collected during Anoprop conditions, heavy Angel activity or abnormal conditions (e.g. jamming, interference) should not be used to verify PSR/SSR overall performance

Traffic: Recordings shall be made in Peak Traffic times if possible

Recorded Quality: >99% of recorded data shall be correctly recorded and available for chaining
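The minimum-criteria check above is simple enough to automate when validating a recording session. A minimal sketch, with hypothetical field names that are not part of the RASS toolset:

```python
# Thresholds quoted from the EUROCONTROL minimum data-collection criteria above.
MIN_SAMPLES = 50_000          # applies to all traffic densities
MIN_CHAINS_PD = 200           # Pd / systematic error estimation, >5 min each
MIN_CHAINS_ACCURACY = 150     # accuracy analysis

def meets_collection_minimums(samples: int, chains_over_5min: int) -> bool:
    """Return True if a recording satisfies the quoted minimums."""
    return (samples >= MIN_SAMPLES
            and chains_over_5min >= MIN_CHAINS_PD
            and chains_over_5min >= MIN_CHAINS_ACCURACY)
```

A session such as the ones reported later in Annex A (>125,000 samples, >334 chains) would pass this check comfortably.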

In analysing the data, a number of key factors are taken into account and eliminated to the extent possible, before finalising the results. These include, but are not limited to:

1. Coverage Volume
2. Filtering of reference samples (e.g. ADS-B with FOM > 5)
3. Environment (e.g. Barometric data)
4. Constraints of the Radar design.

It is important to note that there are a number of different root causes that can lead to an out-of-tolerance specification in the Level 1 results. It is for this reason that the Level 2 analysis is performed: to better isolate the root cause or causes of the anomaly.


2. REFERENCES

The following documents apply:

EUROCONTROL SUR.ET1.ST01.1000-STD-01-01, Radar Surveillance in En-Route and Major Terminal Areas, March 1997

EUROCONTROL SUR.ET1.ST03.1000-STD-01-01, Radar Sensor Performance Analysis, June 1997

ICAO (Doc 8071 Vol. III)

3. ANNEXES

Annex A – Level 1 Performance Measurement against OEM, QoS and EUROCONTROL Specifications

Annex B – Level 2 Sensor Baseline Evaluation

Annex C – Test Equipment and Software List


4. CONCLUSIONS

The conclusions drawn from analysis of the data collected during this Performance Evaluation are detailed as follows:

Level 1

PSR overall Pd was recorded at approx. 92.6% and meets the overall Pd specifications (OEM/QoS/EUROCONTROL). This is the reported baseline state IAW the Site Operating Conditions document (i.e. all filtering off). Slight degradation of the Pd at all flight levels was identified when compared to the Pd% results achieved during the 2013 optimisation, particularly below 1,500 ft. Further, the use of circular polarisation does not adversely affect the performance.

Both (M)SSR Interrogators meet the overall Pd specification (QoS/EUROCONTROL) at approx. 99.9% for both (M)SSR-A and (M)SSR-B.

Both (M)SSR Interrogators meet the Systematic Error specification (QoS/EUROCONTROL) for Azimuth and Range Bias.

Both Interrogators meet the Random Error specifications (QoS/EUROCONTROL) for Azimuth and Range (STDEV).

Both Interrogators meet the RMS specification (OEM) for Azimuth, though not for Range.

Both (M)SSR Interrogators meet the Jumps, Code Validation and False Target specifications (QoS/EUROCONTROL) across the board.

Level 2

The PSR Target and Weather receiver channels show sound characteristics for Gain, Sensitivity and Bandwidth.

The operational Beam Switching and STC maps that were created in 2013, through optimisation, remain suitable. This is evident through the Video, Multi-Level and Pd% analysis. Small areas could be improved by adjustment and ongoing analysis, specifically around the 5NM mark and over the CBR area.

Results indicate the Transmitter characteristics were satisfactory.

PSR antenna horizontal beam pattern measurement shows that the 3 dB and 10 dB beamwidths and the side-lobe measurements were within tolerance.

Both interrogators display satisfactory characteristics for Gain, Sensitivity and Bandwidth.

Both (M)SSR Interrogators show satisfactory transmitter characteristics.

The Pulse Repetition Frequency (PRF) for both interrogators was approx. 238Hz. Both Interrogators were operating a double interlace pattern of Mode 3/A – C (MIP 9).

Windowed Video recordings show good power levels for all opportunity aircraft and Site Monitors (SM). There were no signs of punch-through, splits or reflections. The SUM/DELTA relationship shows signs of asymmetry. Analysis of the SM data indicates both transponders are performing as expected; it is worthwhile, however, to monitor the output power of each.

The (M)SSR antenna demonstrates satisfactory characteristics for Uplink measurements. The Downlink measurement highlighted the SUM/DELTA asymmetry, though more specifically for peak amplitudes as the crossovers and notch depth were satisfactory. Overall performance has not changed since the previous Performance Evaluation.


Multi-level analysis of the PSR revealed the PSR/(M)SSR correlation issue discussed at the pre-brief. The PSR-only tracks originate from a number of sources, several of which can be tuned out without degrading performance. This will allow room for raising the PSR sensitivity. Several aircraft without transponders were recorded, emphasising the requirement for optimal PSR performance.

Multi-level analysis of both Interrogators reveals that whilst level 1 specifications are met, the overall performance is not optimum, primarily due to the encoder and asymmetry of the DELTA antenna. Most importantly, the reported position of targets is not accurate in the two areas that align with the significant encoder errors.

Mechanical analysis revealed both encoders had similar performance, with a peak error of -0.10° at approx. 340° azimuth. Between 60° and 90° azimuth, the encoders had an error of approx. -0.08°. This is significant for traffic in those areas, especially as range increases.

There is ‘Jitter’ present for the entire revolution, though most prominent where the antenna boom passes the northerly azimuth motor.

The inclination of the turntable has remained the same for several years, with a peak error of approx. -0.1° at 230° azimuth and approx. 0.1° in the opposite direction. This result is at the specification boundary.


5. RECOMMENDATIONS

The following recommendations are made to improve the performance of the CBR NGSP Radar System:

It is recommended that the unit begin seeking improvement opportunities in PSR performance. Tweaking of the optimised maps and establishment of some Track Initiate Inhibit zones are two common measures that will help return the PSR to optimum performance.

Continue monitoring the (M)SSR Antenna (receive) performance. Any further degradation may add to unwanted Biases. Ensure OBA tables remain up to date.

Ensure that the large encoder error is rectified. Due to the commonality of the recorded result across both encoders, the likely cause is the Rotary Joint. As ‘Jitter’ is also present, several other mechanical components may also be contributors.

In terms of minimum separation requirements, the azimuth error in some areas peaks at -0.10°, particularly to the North-West. Using trigonometry to convert the angular error into a cross-range distance:

Opp / Adj = tan θ. Rearranged: Opp = tan θ × Adj, e.g. 0.00244 × 40 NM ≈ 0.10 NM

At 40 NM: approx. 0.10 NM (181 m)

At 200 NM: approx. 0.49 NM (907.5 m)
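The conversion above is easily reproduced numerically; the function name and interface below are illustrative only, taking the tangent factor directly as used in the report:

```python
NM_TO_M = 1852.0  # metres per nautical mile (by definition)

def cross_range_error_m(tan_theta: float, range_nm: float) -> float:
    """Cross-range displacement: Opp = tan(theta) x Adj, converted to metres."""
    return tan_theta * range_nm * NM_TO_M
```

With the report's tangent factor of 0.00244, this gives approx. 181 m at 40 NM and approx. 904 m at 200 NM (the 907.5 m figure quoted above reflects rounding to 0.49 NM).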

Upon rectification of the reported encoder error, it is advised that the following be completed at a minimum:

o Turning gear measurement and analysis (Gyro and Inclinometer).

o Level 1 recording and analysis (ADS-B as the reference) for positional accuracy.

o Eccentricity file update for format converters.

Investigate and rectify the inclination of the antenna pedestal.

A general recommendation to all owners, operators and maintainers of Radar Systems is as follows:

Optimisation should be performed on an ongoing basis in conjunction with daily system checks. Checking of on-line maps, PSR detections and Receiver noise (CFAR) are examples of items that can be checked regularly. Optimisation is preventative maintenance, and ongoing efforts will assist in:

o Ensuring up-to-date and accurate maps and system parameter values are applied to the system for maximum performance.

o Ensuring that any PSR detection issues are known and can be addressed. This is essential for maintaining safe minimum separation standards for TMA operations.

Note: the TMD system can be utilised to support the majority of these tasks.



Edition: A Date: 29 May 2015 Status: Draft

ANNEX A

LEVEL 1 PERFORMANCE MEASUREMENT AGAINST OEM, QOS AND EUROCONTROL SPECIFICATIONS

NGSP S-Band PSR: DPR & MSSR: DTI529 - CBR


Annex A - Level 1 Performance Measurement Against OEM, QoS and EUROCONTROL Specifications Edition Date: 29 May 2015

IEAP_SAMPLE_L1-L2_Annex A 2/14

IMS-600-10.01

Revision 1.0: 23/05/2013



1. LEVEL 1 PERFORMANCE MEASUREMENT AGAINST OEM, TPSA AND EUROCONTROL SPECIFICATIONS

The following analysis results have been calculated from data that has been simultaneously recorded at various levels throughout the Radar using the RASS-R “EDR” recorder tool in the DHM. This enables direct comparison between different sources of data, as there is a common recording time stamp for the data. The types of data recorded were as follows:

1. ADS-B data was decoded from raw OMEGA channel video from each MSSR Interrogator

2. Combined data (MSSR and PSR) was recorded at the output of the Tracker

3. MSSR only data from the output of the Interrogator cabinet.

All analysis is performed on a coverage measurement volume and has been calculated based on the Time of Detection (ToD) of each target report.

The analysis also takes into account the actual Line of Sight coverage limitations for the Radar location. Figure 1 shows the coverage diagram for the NGSP Radar.

In addition, the Barometric sounding files for the NGSP region (e.g. 94610 Canberra Airport Observations at 12h00 18 May 2015) are downloaded into the analysis tools to ensure the most up-to-date corrections are available for the analysis.

Figure 1: Coverage Diagram


Figure 2 below shows the Commissioning Volume (CV) applied in order to calculate the Probability of Detection (Pd) for PSR and MSSR as defined by the Quality of Service (QoS), with the exception of the lower elevation angle for PSR. The CV for PSR only requires a +0.5 degree lower angle. For the purpose of comparing the current PSR performance (at different flight levels) with the previous optimisation results, a lower elevation angle is used, i.e. -5.0 degrees (dependent on the lowest aircraft detected).

Table 1 details the quantity of data gathered, confirming that the minimum collection criteria set out by EUROCONTROL have been met.

File Reference/s: RCD – Numerous recordings from approx. 0700 to 2359 (UTC) between 18th and 22nd May 2015.

File Reference/s: RCM – Numerous recordings from approx. 0700 to 2359 (UTC) between 18th and 22nd May 2015.

Table 1: Data Collection Values

Reference / Value / EUROCONTROL Specification:

- Data Samples: RCM >125,000 on all occasions; RCD >157,000 on all occasions (Specification: minimum 50,000)

- Number of Chains >5 minutes: >334 on all occasions (Specification: Pd/Systematic Errors >200; Accuracy Analysis >150)

- System Configuration: Normal (Specification: normal operational configuration for prevailing traffic and environment)

- Environment, Weather: No Anoprop conditions (Specification: data collected during Anoprop conditions, heavy Angel activity or abnormal conditions, e.g. jamming or interference, should not be used to verify PSR/SSR overall performance)

- Peak Traffic Included (Y/N): Y (Specification: recordings shall be made in Peak Traffic times if possible)

- Recorded Quality: 100% (Specification: >99% of recorded data shall be correctly recorded and available for chaining)

Figure 2: MSSR and PSR Technical Commissioning Volume


1.1. PSR Performance

The quality parameters of the primary sensor are calculated based on continuous chains of MSSR target reports greater than 5 minutes in run length within the PSR coverage volume. These results are contained in the following sections.

1.1.1. Probability of Detection

The PSR Probability of Detection (Pd) is calculated for the coverage volume of the PSR Radar by using the RASS-R RCM module. Figure 3 shows a tabular display of overall PSR Pd based on the Time of Detection (ToD), from an overnight data collection. The legend on the image indicates % Pd and in terms of the TPSA specification any black cells (below 80% Pd) represent a failure to meet specification, while Grey cells indicate there is insufficient data in those areas to conduct an analysis.
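At its core, the Pd figure is the ratio of actual detections to expected detection opportunities accumulated over all reference chains. A minimal sketch; the chain representation below is an assumption, not the RASS-R RCM data model:

```python
def probability_of_detection(chains):
    """chains: iterable of (detected_scans, expected_scans) per reference chain.

    Returns Pd as a percentage: total detections divided by total
    detection opportunities across all chains.
    """
    detected = sum(d for d, _ in chains)
    expected = sum(e for _, e in chains)
    return 100.0 * detected / expected
```

The same accumulation can be performed per range/azimuth cell to produce the tabular displays shown in the figures.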

As per Table 2 the PSR was recorded at approx. 92.6% probability of detection.

Table 2: Probability of Detection – PSR Overall – 18 May 2015

Parameter: Probability of Detection (%)
Measured Result (Time of Detection): 92.6%
OEM Specification: >80
QoS Specification: >80
EUROCONTROL Specification: >90

Figure 3: PSR Pd Overall


For information only, Figure 4 represents the performance out to 40NM, as this is generally the extent of local TMA operations. The result is shown at Table 3.

Table 3: Probability of Detection – 40NM, 18 May 2015

Parameter: Probability of Detection (%)
Measured Result (Time of Detection): 93.4%
OEM Specification: >80
QoS Specification: >80
EUROCONTROL Specification: >90

Figure 4: PSR Pd – 40NM


Analysis was also performed at various Flight Levels. The Pd% at various Flight Levels is shown at Table 4 and includes previously reported Pd% results and alternate configurations.

NOTE – The number of samples used for each flight level block is only a relative proportion of the total sample recorded and can only be used as an indicative performance % against the respective specifications. Analysis was conducted using the QoS Coverage Volume. Air traffic density would be slightly different for each night of recording.

Table 4: Pd% at various flight levels

Flight Level | 2013 IEAP Opt. (DFE Off, Linear Enabled) | May 2015 (DFE Off, Circular Enabled) | May 2015 (DFE Off, Linear Enabled)

500 - 1,500 ft | 82.8% | 75.9% | 76.7%
1,500 - 5,000 ft | 90.8% | 88.1% | 87.1%
5,000 - 10,000 ft | 95.9% | 91.9% | 92.1%
10,000 - 15,000 ft | 99.1% | 92.7% | 90.5%
15,000 - 25,000 ft | 98.9% | 93.3% | 93.2%
25,000 - 50,000 ft | Not Measured | 91.1% | 90.8%
500 - 25,000 ft (TPSA) | >96.7% | 90.1% | 92.6%

Legend:
Less than 80% = Red
Greater than 90% = Green
Greater than 80% and less than 90% = Black

Table 5 details the average percentage of traffic for varying Flight Levels. The information highlights the importance of optimising the PSR for all heights in the coverage volume.

Table 5: Average Percentage of Traffic

Flight Level Air Traffic %

<500ft 2.06%

500 – 1,500ft 5.43%

1,500 – 5,000ft 12.08%

5,000 – 10,000ft 11.21%

10,000 – 15,000ft 6.98%

15,000 – 25,000ft 22.26%

> 25,000ft 39.98%
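The traffic shares in Table 5 illustrate why optimising for all heights matters: a coverage-wide Pd figure is effectively the per-band Pd values weighted by where the traffic actually flies. A sketch with hypothetical inputs (not figures from this report):

```python
def traffic_weighted_pd(pd_by_band, traffic_by_band):
    """Weight per-band Pd values (%) by each band's share of traffic.

    pd_by_band:      {band_name: Pd_percent}
    traffic_by_band: {band_name: traffic_share} (any consistent units)
    """
    bands = pd_by_band.keys()
    total = sum(traffic_by_band[b] for b in bands)
    return sum(pd_by_band[b] * traffic_by_band[b] for b in bands) / total
```

A band carrying 40% of the traffic pulls the weighted figure far more strongly than one carrying 2%, so a low-level Pd shortfall can be masked in an overall number.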


1.2. MSSR Performance

The following section contains the quality parameters for the MSSR.

1.2.1. Probability of Detection

Figure 5 shows a tabular display of MSSR ‘A’ Pd based on the Time of Detection (ToD), from an overnight data collection. Black cells indicate MSSR Pd below 95% while Grey cells indicate there is insufficient data in those areas to conduct an analysis. The legend on the image indicates % Pd and in terms of the TPSA specification any Red, Orange or Black cells represent a failure to meet the specification.

As per Table 6, the Pd for MSSR-A is approx. 99.9%.

Table 6: Probability of Detection – MSSR-A – 18 May 2015

Parameter: Probability of Detection (%)
Measured Result (Time of Detection): 99.9%
OEM Specification: N/A
QoS Specification: >97
EUROCONTROL Specification: >97

Figure 5: MSSR Pd – MSSR-A


Figure 6 shows a tabular display of MSSR ‘B’ Pd based on the Time of Detection (ToD), from an overnight data collection.

As per Table 7, the Pd for MSSR-B is approx. 99.9%.

Table 7: Probability of Detection – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Probability of Detection (%) | 99.9 | N/A | >97 | >97 |

Figure 6: MSSR Pd – MSSR-B


1.2.2. Positional Accuracy Analysis

1.2.2.1. Systematic Errors

The positions of MSSR targets are checked for systematic errors (bias) with the RASS-R RCD. ADS-B target reports are used as a reference, filtered to exclude faulty transponders and to pass only reports with a figure of merit greater than five. The MSSR target reports are then compared against the reference trajectories to determine the range and azimuth biases. Tables 8 & 9 show the results of the analysis. An XY representation of the Azimuth and Range Error dispersion is shown in Figures 7 & 8.
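A minimal sketch of the bias estimation described above, assuming simple paired radar/reference samples and the figure-of-merit filter; all data values are illustrative, not taken from the RCD.

```python
# Sketch: range/azimuth bias as the mean residual between radar plots and
# FOM-filtered ADS-B reference positions (illustrative data).
def biases(pairs, min_fom=5):
    """pairs: (radar_range_m, ref_range_m, radar_az_deg, ref_az_deg, fom)."""
    kept = [p for p in pairs if p[4] > min_fom]   # figure of merit > 5
    n = len(kept)
    range_bias = sum(r - ref for r, ref, _, _, _ in kept) / n
    az_bias = sum(a - ref for _, _, a, ref, _ in kept) / n
    return range_bias, az_bias

pairs = [
    (10000.0, 10045.0, 45.02, 45.00, 7),
    (20000.0, 20040.0, 90.01, 90.00, 8),
    (30000.0, 30050.0, 135.03, 135.00, 9),
    (40000.0, 40000.0, 0.0, 0.0, 3),   # low figure of merit, filtered out
]
range_bias, az_bias = biases(pairs)
```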

Table 8: Systematic Errors – MSSR-A – 18 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Azimuth Bias (degrees) | 0.02 | N/A | <0.1 | <0.1 |
| Range Bias (metres) | -44.4 | N/A | <100 | <100 |

Figure 7: Azimuth and Range Error – MSSR-A


Table 9: Systematic Errors – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Azimuth Bias (degrees) | 0.04 | N/A | <0.1 | <0.1 |
| Range Bias (metres) | -47.9 | N/A | <100 | <100 |

Figure 8: Azimuth and Range Error – MSSR-B


1.2.2.2. Random Error Analysis

The residual random errors in the measured position of the MSSR target reports are calculated by the RCD as a Standard Deviation (STDEV) result. The results are displayed in Tables 10 & 11 below.

Table 10: Random Errors – MSSR-A – 18 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Azimuth Errors (degrees STDEV) | 0.04 | N/A | <0.08 | <0.08 |
| Range Errors (metres STDEV) | 11.14 | N/A | <70 | <70 |

Table 11: Random Errors – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Azimuth Errors (degrees STDEV) | 0.04 | N/A | <0.08 | <0.08 |
| Range Errors (metres STDEV) | 9.1 | N/A | <70 | <70 |

1.2.2.3. RMS Errors

The NGSP OEM Specification does not specify performance in the bias and random error (STDEV) format of the parameters listed in paras 1.2.2.1 and 1.2.2.2. Instead, it expresses the azimuth and range error performance as RMS values, which combine the bias and random error results into a single figure. The results as calculated by the RCD are displayed in Tables 12 & 13 below.
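The relationship between the three error figures can be checked numerically: for a set of position residuals, the squared RMS equals the squared bias plus the squared (population) standard deviation. A small sketch with illustrative residuals:

```python
import math

# Sketch: RMS combines bias and random (STDEV) error in one figure:
# RMS^2 = bias^2 + stdev^2 (population stdev). Residuals are illustrative.
def rms(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def bias_and_stdev(errors):
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / n
    return mean, math.sqrt(var)

errors = [-50.0, -40.0, -45.0, -42.0, -48.0]   # illustrative range residuals, metres
bias, stdev = bias_and_stdev(errors)
assert abs(rms(errors) ** 2 - (bias ** 2 + stdev ** 2)) < 1e-9
```

This identity is why a large bias alone can push the RMS figure over an RMS-format specification even when the random (STDEV) component is small.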

Table 12: RMS Errors – MSSR-A – 18 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Azimuth Errors (degrees RMS) | 0.04 | <=0.07 | N/A | N/A |
| Range Errors (metres RMS) | 44.0 | <=37 | N/A | N/A |

Table 13: RMS Errors – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Azimuth Errors (degrees RMS) | 0.05 | <=0.07 | N/A | N/A |
| Range Errors (metres RMS) | 45.4 | <=37 | N/A | N/A |


1.2.2.4. Jumps

Errors classified as jumps are calculated in the RCM and are provided as a ratio value in Tables 14 & 15.
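A minimal sketch of how such a jump ratio could be derived, assuming a simple distance window around the reconstructed trajectory; the RCM's actual window parameters are not given here, so the threshold below is illustrative.

```python
# Sketch: a plot is flagged as a jump when it deviates from the reconstructed
# trajectory by more than a window; the result is reported as a ratio (%).
def jump_ratio(deviations_m, window_m=500.0):
    """deviations_m: per-plot distance from the reconstructed trajectory."""
    jumps = sum(1 for d in deviations_m if abs(d) > window_m)
    return 100.0 * jumps / len(deviations_m)

deviations = [12.0, -30.0, 8.0, 650.0, 15.0]   # one gross outlier
ratio = jump_ratio(deviations)
```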

Table 14: Jumps – MSSR-A – 18 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Jumps % | 0.00 | N/A | <0.05 | <0.05 |

Table 15: Jumps – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Jumps % | 0.00 | N/A | <0.05 | <0.05 |

1.2.3. Data Validity Analysis

The data from the MSSR is processed by the RCM to determine the quality of correct code detection. A correct code means the Mode A/C code value corresponds to the current “correct” value for the associated trajectory. Tables 16 & 17 detail the results of the valid Mode A and valid Mode C analysis.
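The classification described above can be sketched as follows; this is an illustration of the three categories in the tables (valid and correct, valid and incorrect, invalid), not the RCM's actual decoding logic.

```python
# Sketch: classify a decoded Mode A/C code against the trajectory's current
# "correct" value (data illustrative).
def classify_code(decoded, valid, correct_code):
    """decoded: code string or None; valid: decoder confidence flag."""
    if not valid:
        return "invalid"
    return "valid_correct" if decoded == correct_code else "valid_incorrect"

assert classify_code("7757", True, "7757") == "valid_correct"
assert classify_code("7557", True, "7757") == "valid_incorrect"
assert classify_code(None, False, "7757") == "invalid"
```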

Table 16: Valid Code Detection – MSSR-A – 18 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Valid and Correct A Code (%) | 99.99 | N/A | >98 | >98 |
| Valid and Correct C Code (%) | 99.91 | N/A | >96 | >96 |
| Valid and Incorrect A Code (%) | 0.01 | N/A | <0.1 | <0.1 |
| Valid and Incorrect C Code (%) | 0.03 | N/A | <0.1 | <0.1 |

Table 17: Valid Code Detection – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Valid and Correct A Code (%) | 99.99 | N/A | >98 | >98 |
| Valid and Correct C Code (%) | 99.87 | N/A | >96 | >96 |
| Valid and Incorrect A Code (%) | 0.00 | N/A | <0.1 | <0.1 |
| Valid and Incorrect C Code (%) | 0.04 | N/A | <0.1 | <0.1 |


1.2.4. False Target Analysis

The RCM compares individual target reports against the reconstructed trajectory. If a report falls outside a predetermined window, it is classified as a false target. False targets are then classified by type, as listed in Tables 18 & 19 below.
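A hedged sketch of that type classification: the window sizes and geometric rules below are invented for illustration and are not the RCM's actual parameters, but they show the kind of range/azimuth geometry that separates sidelobe replies, splits and reflections.

```python
# Sketch: classify a false target by its offset from the true plot
# (thresholds illustrative, not the RCM's actual values).
def classify_false_target(d_range_m, d_azimuth_deg):
    if abs(d_range_m) < 100 and 0.1 < abs(d_azimuth_deg) < 2.0:
        return "sidelobe"      # near-same range, offset in azimuth
    if abs(d_range_m) < 300 and abs(d_azimuth_deg) <= 0.1:
        return "split"         # near-duplicate of the true plot
    return "reflection"        # large offset, geometry via a reflector

assert classify_false_target(20, 1.0) == "sidelobe"
assert classify_false_target(150, 0.05) == "split"
assert classify_false_target(5000, 40.0) == "reflection"
```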

Table 18: False Target Results – MSSR-A – 18 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Multiple Targets % | 0.02 | N/A | <0.3% | <0.3% |
| – reflections % | 0.00 | N/A | <0.2% | <0.2% |
| – sidelobes % | 0.02 | N/A | <0.1% | <0.1% |
| – splits % | 0.00 | N/A | <0.1% | <0.1% |

Table 19: False Target Results – MSSR-B – 19 May 2015

| Parameter | Measured Result (Time of Detection) | OEM Specification | QoS Specification | EUROCONTROL Specification |
|---|---|---|---|---|
| Multiple Targets % | 0.01 | N/A | <0.3% | <0.3% |
| – reflections % | 0.00 | N/A | <0.2% | <0.2% |
| – sidelobes % | 0.01 | N/A | <0.1% | <0.1% |
| – splits % | 0.00 | N/A | <0.1% | <0.1% |


Edition: A Date: 29 May 2015 Status: Draft

Annex B

LEVEL 2 SENSOR BASELINE EVALUATION

NGSP S-Band PSR: DPR & (M)SSR: DTI529 - CBR


Annex B - Level 2 Sensor Baseline Evaluation Edition Date: 29 May 2015

IEAP_SAMPLE_L1-L2_Annex B 2/40

IMS-600-10.02

Revision 1.0: 23/05/2013


1. LEVEL 2 – SENSOR BASELINE EVALUATION

1.1 PSR Performance

RASS measurements were conducted on the Primary Surveillance Radar (PSR) system.

1.1.1 Receiver Sensitivity

The purpose of this measurement is to determine the Receiver's performance across the entire dynamic range, from saturation to noise floor. The test is equivalent to Minimum Discernible Signal (MDS) checks, whereby a known signal is injected into the Low Noise Amplifier (LNA) or High Power Coupler. The injected signal is then sampled at the latest possible stage in the Analogue Receiver and converted to LOG video for measurement.

The result is what is termed the Receiver Sensitivity or Gain curve. This curve provides important information pertaining to the Receiver's characteristics, such as:

- Gain

- Dynamic Range

- Saturation Point

- Noise Floor (MDS)

As this curve provides the correlation between the input and output of the Analogue Receiver, it is used in further testing to assign a power level to the measured voltage, in particular for receiver video recordings.
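A minimal sketch of how the key figures could be extracted from such a gain curve, assuming a simple slope threshold to detect where the receiver stops responding; both the method and the sample points are illustrative.

```python
# Sketch: derive noise floor, saturation point and dynamic range from an
# injected-power vs. output-video table (illustrative data and method).
def analyse_gain_curve(curve, slope_floor=0.1):
    """curve: list of (input_dbm, output_v), sorted by input power."""
    def slope(i):
        (p0, v0), (p1, v1) = curve[i], curve[i + 1]
        return (v1 - v0) / (p1 - p0)
    # noise floor: last flat point before the curve starts responding
    floor = next(curve[i][0] for i in range(len(curve) - 1) if slope(i) > slope_floor)
    # saturation: last responding point before the curve flattens at the top
    sat = next(curve[i + 1][0] for i in reversed(range(len(curve) - 1))
               if slope(i) > slope_floor)
    return floor, sat, sat - floor

curve = [(-120, 0.1), (-110, 0.1), (-100, 2.0), (-90, 4.0),
         (-80, 6.0), (-70, 8.0), (-60, 10.0), (-50, 12.0), (-40, 12.0)]
floor, sat, dyn = analyse_gain_curve(curve)
```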

Table 1 details the results of the receiver sensitivity measurements. Figures 1 through 3 display the gain curves.

Table 1: Receiver Sensitivity Curve - Results

| Beam | Noise Floor | Saturation | Dynamic Range |
|---|---|---|---|
| High | -109 dBm | -49 dBm | 60 dB |
| Low | -109 dBm | -49 dBm | 60 dB |
| Weather | -109 dBm | -49 dBm | 60 dB |
| Specification | -109 dBm | -49 dBm* | 60 dB |

*The specification listed for saturation is derived from the Dynamic Range and Noise Floor values.


Figure 1: PSR - Receiver Linearity – High Beam F1

Figure 2: PSR - Receiver Linearity – Low Beam F1


Comments:

Results are satisfactory, no anomalies identified.

Figure 3: PSR - Receiver Linearity – Wx Beam F1


1.1.2 Receiver Bandwidth

Each Receiver was swept across its bandwidth to determine its frequency response. A bandwidth measurement is taken to ascertain whether or not the Receiver is tuned in frequency to avoid inadvertently accepting an echo from the corresponding Transmitter. The injection and measurement points are identical to those of the Receiver calibration measurements. This test verifies the serviceability of the band-pass filter sections at the front end of the Receiver.

The RFA Transmitter is set to transmit a pulse at the specified power value, and the frequency is swept between two chosen limits. The RFA continuously samples the Radar Receiver's video output signal and uses these values to build a dBm-versus-frequency table. The resulting graph can be analysed to determine the 3 dB bandwidth. Figures 4 through 6 show the Receiver bandwidth for PSR-B. Table 2 details the bandwidth measured for each PSR Receiver.
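A sketch of the 3 dB bandwidth determination from such a level-versus-frequency table, using linear interpolation around the two -3 dB crossings; the sweep samples below are illustrative.

```python
# Sketch: 3 dB bandwidth from a level-vs-frequency sweep (data illustrative).
def bandwidth_3db(sweep):
    """sweep: list of (freq_mhz, level_db) with a single peak."""
    peak = max(level for _, level in sweep)
    threshold = peak - 3.0
    def crossing(points):
        for (f0, l0), (f1, l1) in zip(points, points[1:]):
            if (l0 - threshold) * (l1 - threshold) < 0:
                # linear interpolation between the two samples
                return f0 + (threshold - l0) * (f1 - f0) / (l1 - l0)
    lo = crossing(sweep)          # lower -3 dB crossing, scanning upward
    hi = crossing(sweep[::-1])    # upper -3 dB crossing, scanning downward
    return hi - lo

sweep = [(2698.0, -30.0), (2699.0, -6.0), (2699.5, 0.0),
         (2700.5, 0.0), (2701.0, -6.0), (2702.0, -30.0)]
bw_mhz = bandwidth_3db(sweep)
```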

Table 2: Receiver Bandwidth

| Beam | 3 dB Bandwidth |
|---|---|
| High | 1.33 MHz |
| Low | 1.30 MHz |
| Weather | 1.29 MHz |
| Specification | Not Specified |


Figure 4: PSR - Receiver Bandwidth - High Beam F1

Figure 5: PSR - Receiver Bandwidth - Low Beam F1


Comments:

Results are satisfactory, no anomalies identified.

Figure 6: PSR - Receiver Bandwidth - Wx Beam F1


1.1.3 Transmitter Characteristics

The Transmitter characteristics are measured from the Power Monitor. Pulse shape and forward power are recorded and analysed. The results of the power measurement are detailed in Table 3. The output pulse recorded for each Transmitter is shown in Figures 7 and 8.

Table 3: PSR Power Measurements

| | Pulse Width | Forward Power |
|---|---|---|
| PSR-B (short) | 1.5 µs | 16.6 kW |
| PSR-B (long) | 79.8 µs | 16.6 kW |
| Specification | 1.6 µs ± (short); 82 µs ±5.5 (long) | 16 kW |

Comments:

The forward power of the PSR Transmitter is as expected.

Figure 7: PSR-B Forward Power – Short Pulse

Figure 8: PSR-B Forward Power – Long Pulse


1.1.4 Sectorial STC Measurements

A Sectorial STC Recording can provide a great deal of useful information on how the STC and Beam Combining (BC) Maps have been configured for each Receiver. It gives a 360° graphical representation of the programmed STC range/azimuth bins. STC and Beam Combining are the Analogue Receiver's first line of defence against unwanted PSR returns such as hills and buildings, i.e. clutter.

The Sectorial STC measurement result consists of RF input power versus time, presented versus azimuth. To measure the gain versus time delay, the RFA must be synchronised to the interrogation signal of the Radar under test. The interrogation trigger signal is used both as a start-of-range and an azimuth indicator. The RFA injects pulses of fixed (selectable) power level into the Receiver. The PSR's analogue video output signal is then sampled by the RFA, and the amplitude is passed through the calibration curve in order to build the STC curve: a gain versus time delay curve.
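The step of passing a sampled video amplitude back through the calibration curve can be sketched as a simple interpolation; the calibration points below are illustrative, not measured values.

```python
# Sketch: convert a sampled video voltage back to RF input power via the
# receiver calibration (gain) curve, by linear interpolation.
def voltage_to_dbm(voltage, calibration):
    """calibration: list of (input_dbm, output_v), monotonic in both."""
    for (p0, v0), (p1, v1) in zip(calibration, calibration[1:]):
        if v0 <= voltage <= v1:
            return p0 + (voltage - v0) * (p1 - p0) / (v1 - v0)
    raise ValueError("voltage outside calibrated range")

cal = [(-110.0, 0.5), (-90.0, 2.5), (-70.0, 4.5), (-50.0, 6.5)]
power_dbm = voltage_to_dbm(3.5, cal)
```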

Figures 9 and 10 show the STC map and the Beam Switching map respectively.

Comments:

These are the optimised maps from 2012.

Both maps are performing well, and evidence of their suitability can be seen in the Clutter and Multi-Level analyses.

Figure 9: PSR Sectorial STC

Figure 10: PSR Sectorial Beam Switching


1.1.5 Windowed Video Recordings

After the Receiver’s calibration curve is known, a windowed video (azimuth by range) recording can provide useful information on the performance of the Radar and the RF environment. The windowed video recordings are reported to demonstrate problematic areas such as Receiver saturation (red). In this view, small areas of concern can easily be identified. Additionally, windowed video can be used to demonstrate to technicians where optimisation is required.

Figures 11 and 12 show windowed video recordings comparing the clutter amplitude in normal operation and with maps off.

Figure 11: PSR Windowed Video – Maps Off


Comments:

There are several small areas in coverage with potential for performance improvement through further analysis and ongoing optimisation. A noteworthy area of ground clutter exists to the south and has the Receiver bordering on saturation. This area is highlighted by the red circle and was deliberately switched to low beam (2013 optimisation) to better detect targets over the Gundaroo airfield. The majority of effects caused by the high levels of clutter will be handled by the Doppler filters and the adaptive clutter map. Further, no issues were identified at data level, though ongoing monitoring is recommended.

Figure 12: PSR Windowed Video – Normal Operation


1.1.6 Clutter Map Recordings

Recording of clutter is designed to determine where Receiver saturation occurs and to assist in isolating the cause of missed PSR detections, essentially testing the effectiveness of the clutter maps. Measurement of clutter is taken at the output of the Receiver’s IF Amplifier. The Receiver Sensitivity Curve is required during analysis to determine the intensity of the clutter returns.

Clutter recordings are almost identical to Windowed Video recordings, except they reveal an overall view. The RASS clutter map processor cancels moving targets by integrating video over eight successive scans to identify the environmental clutter.

Both tests are carried out with maps off and maps on (Normal operation). The results detail the level of attenuation used to cancel clutter and indicate whether or not the maps are optimised correctly.
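One way to sketch the cancellation of moving targets across successive scans is a per-cell minimum over the scan stack: static clutter survives while echoes that appear in only a few scans drop out. This is an illustrative approximation, not the RASS processor's actual integration algorithm.

```python
# Sketch: per-cell minimum over 8 scans keeps static clutter and removes
# transient (moving-target) echoes (approach and data illustrative).
def clutter_map(scans):
    """scans: list of equal-length amplitude lists, one per scan."""
    return [min(cell) for cell in zip(*scans)]

# cell 0: static clutter; cell 1: a moving target passing through; cell 2: noise
scans = [
    [50, 0, 2], [51, 0, 1], [50, 40, 2], [52, 45, 3],
    [50, 0, 2], [49, 0, 1], [50, 0, 2], [51, 0, 2],
]
assert clutter_map(scans) == [49, 0, 1]
```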

Results of the clutter analysis are shown in Figures 13 and 14.

Comments:

The effective clutter suppression can be seen here. The Beam and STC maps are operating as expected.

Minor improvements are possible; however, the effect on overall performance will be negligible.

Figure 13: PSR - Clutter Recording – Maps off

Figure 14: PSR - Clutter Recording – Normal Operation


A horizontal polar representation of the previous PPI clutter view is shown in Figures 15 and 16. In this view, the bearing and severity of the clutter are easier to determine. Additionally, the effect of applying too much attenuation to suppress clutter can be visualised.

Comments:

The white circle highlights the area over the Gundaroo region purposely switched to Low Beam so that targets can be detected around the Gundaroo Airfield. The red circles highlight minor areas where improvements are possible through further fine-tuning of the Beam and STC maps.

Figure 15: PSR - Clutter Recording – Maps off

Figure 16: PSR - Clutter Recording – Normal Operation


PSR Antenna Measurements

Many characteristics of the PSR system are determined by recording the emitted RF pulses from the Radar under test and the strength of signals at the RFA Receiver.

The Uplink measurement provides Horizontal Polar Diagrams (HPD) of the PSR antenna in its operational environment by recording the pulses of the Radar. For this purpose the Radar Field Analyser (RFA) is set up in the field with no connection to the Radar. The RFA's antenna picks up the Radar signal and the Uplink software calculates the HPD antenna diagram from this data.

The pulses are recorded in a condensed format (pulse mode) or in detail (scope mode). After the recording, the HPD is extracted from the data through fingerprinting (stagger).

This antenna measurement is completed to check whether the correct horizontal antenna pattern is being radiated as per the OEM specification. Without the correct antenna pattern for the specific Radar type, detection and positioning of aircraft will prove difficult for the Receiver.

1.1.6.1 PSR Uplink Measurements

The PSR uplinks were recorded at the same locations as the SSR uplinks. Before each Horizontal Polar Diagram (HPD) recording was made, a scope recording was conducted to test each site for the presence of significant multipath and other interference.

Table 4 details the results for Beamwidth measured at each site.

Table 4: Beamwidth Measured

| Site | Side-lobes* | 3 dB Beamwidth | 10 dB Beamwidth | PRF |
|---|---|---|---|---|
| Ainslee Hill | >24 dB | 1.41° | 2.63° | 669.8 Hz |
| Phillip Ave | >25 dB | 1.41° | 2.49° | 669.8 Hz |
| Gunning Rd | N/A | N/A | N/A | N/A |
| Specification | >25 dB | 1.4° ±0.05 (S-Band); 1.2° ±0.15 (L-Band) | Not specified | Site Specific |

*A specification of >xx dB means better than the specified value down from the peak, referenced to the level of the main lobe.


The resulting horizontal antenna pattern is shown in Figure 17 below.

The stagger pattern recording from one of the sites can be seen below in Figure 18.

Comments:

Results are as expected, no anomalies identified. It was not necessary to measure the transmit pattern of the PSR Antenna from Gunning Rd, as the other recordings were sufficient to demonstrate sound antenna performance.

Figure 17: PSR - Uplink HPD – Ainslee Hill

Figure 18: PSR - Stagger Pattern


1.2 SSR Performance

1.2.1 Receiver Sensitivity

The Receiver Sensitivity is measured by recording Receiver video levels against a 1090 MHz signal applied at the Receiver's input. The purpose of the measurement is to verify the Receiver sensitivity, gain and linearity tracking between the three channels in each Receiver, and to build the calibration table required for further RASS video recordings.

The SSR Receiver Calibration includes the three extremely important baseline examinations of Sensor Functionality:

1. Amplitude (Gain) Matching

2. Tangential Sensitivity

3. Dynamic Range.

The test procedure involves injecting a known signal into the high power coupler and measuring the performance at the output of the Analogue Receiver.

The method of monopulse processing used in this type of Radar system requires accurately matched gain responses from the SUM and DELTA channels. A mismatch of the SUM and DELTA responses for a given RF input level could result in poor azimuth resolution by degrading the Interrogator's de-garbling performance. During the Performance Evaluation, the gain (mV/dB) of each Receiver is recorded against the position in which the Receiver is installed; the results are detailed in Table 5.
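Gain matching can be sketched as a least-squares slope fit (mV/dB) over the linear region of each channel's sensitivity curve; the sample points below are illustrative, chosen to give a slope near the 25 mV/dB figures in Table 5.

```python
# Sketch: estimate a channel's gain slope (mV/dB) with a least-squares line
# fit over the linear part of the sensitivity curve (data illustrative).
def gain_slope(points):
    """points: list of (input_db, output_mv); returns slope in mV/dB."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

sum_channel = [(-90.0, 500.0), (-80.0, 750.0), (-70.0, 1000.0), (-60.0, 1250.0)]
slope_mv_per_db = gain_slope(sum_channel)
```

Comparing the fitted slopes of the SUM and DELTA channels then gives a direct measure of the gain matching the monopulse processing depends on.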

The serial number of each Receiver is annotated in Table 6.

Table 5: (M)SSR Receiver Sensitivity Curve - Results

| Interrogator | SIGMA | DELTA | OMEGA | Dynamic Range | Tangential Sensitivity |
|---|---|---|---|---|---|
| (M)SSR-A | 25.0 mV/dB | 25.0 mV/dB | 25.0 mV/dB | 82.7 dB | -99.4 dBm |
| (M)SSR-B | 25.0 mV/dB | 25.3 mV/dB | 25.2 mV/dB | 82.9 dB | -99.6 dBm |
| Specification | Unknown | Unknown | Unknown | 80 dB | <-90 dBm |

Table 6: Receiver Amplifier Serial Numbers

| Interrogator | SUM | DELTA | OMEGA |
|---|---|---|---|
| (M)SSR-A | 02-10096 | 02-10097 | 01-10098 |
| (M)SSR-B | 02-10094 | 03-10095 | 02-10131 |


Figure 19 and Figure 20 show the Sensitivity Curve of (M)SSR-A and (M)SSR-B respectively.

Comments:

Results are as expected, no anomalies identified.

Figure 19: (M)SSR-A Receiver Sensitivity Curve

Figure 20: (M)SSR-B Receiver Sensitivity Curve


1.2.2 Receiver Bandwidth

Each Receiver is swept across its operating bandwidth to record its frequency response. The measurement point is identical to that of the Receiver Sensitivity measurements. The purpose of this test is to ensure that the Receivers are centred on 1090 MHz with sufficient bandwidth at the 3 dB points to receive aircraft responses from 1087-1093 MHz, as per ICAO requirements.

Table 7 details the results, whilst Figure 21 and Figure 22 show the frequency response of (M)SSR-A and (M)SSR-B.

Table 7: Receiver Bandwidth

| Interrogator | 3 dB Bandwidth | 40 dB Bandwidth | 60 dB Bandwidth |
|---|---|---|---|
| (M)SSR-A | 1085.5 – 1094.5 MHz | 1078.9 – 1100.7 MHz | 1075.7 – 1104.2 MHz |
| (M)SSR-B | 1085.5 – 1094.2 MHz | 1079.0 – 1100.5 MHz | 1075.9 – 1103.8 MHz |
| Specification | 1086 ±0.5 & 1094 ±0.5 MHz (mean 1090 MHz ±0.5) | >1078 & <1102 MHz | >1065 & <1115 MHz |

Figure 21: (M)SSR-A Receiver Bandwidth


Comments:

Results are as expected, no anomalies identified.

Figure 22: (M)SSR-B Receiver Bandwidth


1.2.3 Transmitter Characteristics

The purpose of this test is to check whether the transmitted pulses are within specification. Aircraft rely on receiving correctly sized and spaced pulses in order to reply; out-of-tolerance spacing, for example, may cause an aircraft to reply incorrectly or to disregard the interrogation. Pulse width and spacing are recorded and analysed, and an example of the (M)SSR pulse train is shown in Figure 23. Figures 24 and 25 show the recorded pulse train on the SUM channel for (M)SSR-A and (M)SSR-B respectively.
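A minimal sketch of the spacing check against the ±0.1 µs tolerances, using the nominal P1-to-P2 and P1-to-P3 spacings from the specification row of Table 9; the dictionary key names are invented for illustration.

```python
# Sketch: check measured interrogation pulse spacings against the +/-0.1 us
# tolerances (nominal values as per the Table 9 specification row).
SPACING_SPEC_US = {"P1-P2": 2.0, "P1-P3_modeA": 8.0, "P1-P3_modeC": 21.0}

def spacing_ok(measured_us, tolerance_us=0.1):
    """Return a per-spacing pass/fail map."""
    return {name: abs(measured_us[name] - nominal) <= tolerance_us
            for name, nominal in SPACING_SPEC_US.items()}

measured = {"P1-P2": 2.00, "P1-P3_modeA": 8.00, "P1-P3_modeC": 21.05}
result = spacing_ok(measured)
```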

The results obtained are shown below in Table 8 and Table 9.

Table 8: (M)SSR Power and Pulse Width

| Interrogator | Pulse | Power (dBW) | Power (kW) | Pulse Width (µs) |
|---|---|---|---|---|
| (M)SSR-A | P1 | 33.26 | 2.11 | 0.86 |
| (M)SSR-A | P2 | 33.10 | 2.04 | 0.86 |
| (M)SSR-A | P3 – 3/A | 33.26 | 2.11 | 0.86 |
| (M)SSR-A | P3 – C | 33.26 | 2.11 | 0.86 |
| (M)SSR-B | P1 | 33.10 | 2.04 | 0.88 |
| (M)SSR-B | P2 | 33.20 | 2.08 | 0.88 |
| (M)SSR-B | P3 – 3/A | 33.10 | 2.04 | 0.88 |
| (M)SSR-B | P3 – C | 33.10 | 2.04 | 0.88 |
| Specification | – | >32 | >1.58 | 0.8 ±0.1 |

Table 9: (M)SSR Pulse Spacing

| Interrogator | P2 (µs) | P3 – 3/A (µs) | P3 – C (µs) |
|---|---|---|---|
| (M)SSR-A | 2.00 | 8.00 | 21.00 |
| (M)SSR-B | 2.00 | 8.00 | 21.00 |
| Specification | 2 ±0.1 | 8 ±0.1 | 21 ±0.1 |

Figure 23: Example Pulse train for P1, P3 and P2


Comments:

Results recorded are as expected, no anomalies identified. The accuracy of the power measurement is approx. ±0.4 dB.

Figure 24: (M)SSR-A - SUM Channel Tx characteristics

Figure 25: (M)SSR-B - SUM Channel Tx characteristics


1.2.4 Windowed Video Recording

Windowed video (azimuth by range) recordings of the log video from the online Receiver system provide useful information on the performance of the (M)SSR, the test Transponder and the RF environment. Recordings were made on several stationary targets to provide an indication of the Radar's Mode Interlace Pattern (MIP) and the Transponder reply strength to each interrogation.

The Pulse Repetition Frequency and Site Monitor power for each Interrogator can be found in Table 10.

Figures 26 through 28 show the Site Monitor and targets of opportunity.

Table 10: Pulse Repetition Frequency

| Interrogator | PRF | SM Power at Rx |
|---|---|---|
| (M)SSR-A_7757 | 262 Hz | -68.8 dBm |
| (M)SSR-B_7737 | 261 Hz | -68.7 dBm |
| Specification | 262 Hz (CBR) | Site Specific |

Figure 26: (M)SSR-A - Windowed Video – Site Monitor (A7757) MIP9


Comments:

The power of both site monitors is considered slightly low and, whilst anomalies have not surfaced to date, monitoring for further degradation is advised.

Figure 27: (M)SSR-B - Windowed Video – Site Monitor (A7745) MIP9

Figure 28: (M)SSR-B - Windowed Video - Opportunity Target (A3754) MIP9


1.2.5 Antenna Measurements

Poor antenna performance means poor (M)SSR performance. The effects of reduced gain, high side lobes and reflected signals received off the ground or via nearby objects are difficult to overcome by subsequent processing in the Radar’s Receiver.

One of the least monitored components in an (M)SSR system is the antenna.

1.2.5.1 (M)SSR Uplink Measurements

To generate an accurate antenna pattern of the Radar’s antenna system, correct transmitter operation must first be confirmed by examining Radar pulses recorded from the RF environment. Reliable antenna patterns also require, as far as possible, the absence of multipath; this too is determined by examination of recorded RF pulses.

Prior to making an HPD measurement at each site, a scope recording was made to ensure the pattern was not affected by multipath interference.

The results from the measurement analysis are detailed at Table 11. Figures 29 thru 31 show the (M)SSR HPD recorded at separate sites.

Table 11: Uplink Results

*Side-lobes 3dB Beamwidth 10dB Beamwidth SUM/OMEGA Crossover

Ainslee Hill >25dB 2.50° 4.52° -18.47dB/-2.52°; -17.01dB/3.05°

Phillip Ave >25dB 2.61° 4.68° -17.52dB/-2.94°; -17.51dB/3.13°

Gunning Rd N/A N/A N/A N/A

Specification >26dB 2.35 ±0.25° <4.5° Not Specified

*A specification of >xx dB means at least that far down from the peak, referenced to the level of the main lobe.
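Figures such as the 3dB and 10dB beamwidths in Table 11 are read off the recorded pattern. The sketch below shows one way of extracting them from sampled (azimuth, level) pairs; the parabolic sample pattern is hypothetical, not a recorded HPD:

```python
def beamwidth(azimuths, levels_db, down_db):
    """Width (deg) of the main lobe at `down_db` below its peak level.
    azimuths: sorted degrees; levels_db: relative pattern levels (dB)."""
    peak = max(levels_db)
    threshold = peak - down_db
    above = [az for az, lv in zip(azimuths, levels_db) if lv >= threshold]
    return max(above) - min(above)

# Hypothetical main lobe: parabolic in dB, -3 dB at +/-1.25 deg.
az = [i / 100 for i in range(-500, 501)]   # -5..+5 deg in 0.01 deg steps
lv = [-3.0 * (a / 1.25) ** 2 for a in az]
print(round(beamwidth(az, lv, 3.0), 2))    # → 2.5
print(round(beamwidth(az, lv, 10.0), 2))   # → 4.56
```

This simple threshold method assumes a single main lobe within the sampled window; a side-lobe within `down_db` of the peak would widen the result and must be excluded first.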

Figure 29: (M)SSR - Uplink HPD – Ainslee Hill


Comments:

Results are as expected. There is a small side-lobe (left of bore-sight), though it is not causing any issues at present. It was not necessary to measure the transmit pattern of the (M)SSR Antenna from Gunning Rd, as the other recordings were sufficient to demonstrate the antenna transmit performance, which is satisfactory.

Figure 30: (M)SSR - Uplink HPD – Phillip Ave

Figure 31: (M)SSR - Uplink HPD – Gunning Rd


1.2.5.2 Downlink Measurements

The downlink HPD is useful for measuring the receive-only antenna pattern of the DELTA channel. Symmetrical SUM/DELTA crossovers are important for correct mono-pulse operation of the Radar.

Results are detailed at Table 12; Figure 32 shows the SUM (red) and DELTA (blue) pattern for the antenna.

Table 12: Downlink Results

Notch Depth Peak Amp/Location SUM/DELTA Crossover

Ainslee Hill -25dB -1.1dB/-1.7°; -2.1dB/1.6° -2.78dB/-1.06°; -2.96dB/1.08°

Specification Not Specified Not Specified Not Specified
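The crossover figures in Table 12 are the points where the SUM and DELTA patterns intersect. A sketch of locating them by linear interpolation over sampled patterns follows; the curves are synthetic, not the recorded data:

```python
def crossovers(az, sum_db, delta_db):
    """Return (azimuth, level) pairs where the SUM and DELTA patterns cross."""
    out = []
    for i in range(len(az) - 1):
        d0 = sum_db[i] - delta_db[i]
        d1 = sum_db[i + 1] - delta_db[i + 1]
        if d0 == 0 or d0 * d1 < 0:             # sign change -> crossing
            t = d0 / (d0 - d1)                  # linear interpolation factor
            a = az[i] + t * (az[i + 1] - az[i])
            lv = sum_db[i] + t * (sum_db[i + 1] - sum_db[i])
            out.append((round(a, 2), round(lv, 2)))
    return out

# Synthetic patterns: SUM peaks at bore-sight, DELTA has a notch there.
az = [i / 10 for i in range(-30, 31)]          # -3..+3 deg
s = [-2.0 * a * a for a in az]                  # SUM lobe (dB)
d = [-25.0 + 8.0 * abs(a) for a in az]          # DELTA rising from the notch
print(crossovers(az, s, d))                     # two crossings near +/-2.06 deg
```

Symmetry of the two returned points (equal levels at mirrored azimuths) is what the comment below checks for on the real data.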

Comments:

The DELTA (Blue) channel shows signs of asymmetry, particularly in the peak amplitudes. An asymmetrical curve adds to the inaccuracy of target resolution. The crossover points are as expected and satisfactory.

Figure 32: (M)SSR – Downlink HPD


1.2.5.3 Uplink/Downlink Locations

Each location is chosen carefully and should include both low and high elevation. Another important factor is the direction from the Radar in which the measurement is taken. Selecting each location as far apart (in relative angle from the Radar) as practically possible, helps determine whether any anomalies are caused by the Antenna or the Environment. It is normal for some relatively low angles to show small signs of punch-through. These sites are used predominantly to isolate other antenna or environmental characteristics.

The approximate average percentage of air traffic movement by relative antenna elevation during the recordings is detailed at Table 13, whilst Table 14 and Figure 33 illustrate the three locations used for this site.

Table 13: Average Percentage of Traffic

Elevation Air Traffic

< 0.0º 27.23%

0º - 1.5º 33.15%

1.5º - 3.0º 20.98%

3.0º - 5.0º 11.50%

5.0º - 10.0º 6.00%

> 10.0º 1.64%

Table 14: Uplink/Downlink Locations

Ainslee Hill Phillip Ave Gunning Rd

Latitude

Longitude

Elevation 171m 193m 81m

Relative Elevation angle -1.45º -0.54º -0.92º
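The relative elevation angles in Table 14 follow from simple geometry between each site and the antenna. A sketch under assumed values follows; the antenna height and ground range used are hypothetical, not the surveyed figures:

```python
import math

EARTH_RADIUS_M = 6_371_000
K_FACTOR = 4 / 3   # standard-atmosphere effective-earth refraction model

def relative_elevation_deg(site_elev_m, radar_elev_m, ground_range_m):
    """Elevation angle of a measurement site as seen from the radar antenna,
    including an effective-earth curvature correction."""
    geometric = math.atan2(site_elev_m - radar_elev_m, ground_range_m)
    curvature = ground_range_m / (2 * K_FACTOR * EARTH_RADIUS_M)
    return math.degrees(geometric - curvature)

# Hypothetical: antenna 200 m AMSL, site at 171 m AMSL, 1.1 km away.
print(round(relative_elevation_deg(171, 200, 1100), 2))  # roughly -1.5 deg
```

At these short ranges the curvature term is small (a few thousandths of a degree); it matters for more distant sites.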

Figure 33: Google Earth Locations


1.3 Multilevel Data Analysis

Multi-Level analysis is, put simply, the display of approximately 18 hours of aircraft detections (PSR & (M)SSR) for the purpose of finding areas of concern. For example, if all aircraft in one particular area are being missed, then the issue must be environmental or Radar related. If, in the same instance, only one aircraft was missed, it is possible that the aircraft was physically undetectable rather than the result of, for example, poor Radar performance.

All images shown in the Multi-Level analysis section are produced by loading data into the RASS-S Inventory tool to visualise any anomaly/problem and to support further investigation into the reason or cause of the issue.

1.3.1 PSR Data Analysis

Shown below is a graphical representation of actual PSR replies (Blue) overlaying (M)SSR replies (Green). Within the PSR 80NM coverage, a PSR detection is expected for each (M)SSR reply. Please note that a vertical view does not show azimuth, though it still contains all detections from 0 – 360 degrees azimuth on the z-axis.

The data displayed in Figure 34 is from an overnight recording commencing on the 18 May 2015.

Comments:

In a baseline configuration, with Linear Polarisation selected, the PSR system performs very well. Several minor holes in coverage still exist, though these are primarily due to limitations of the Radar design and location.

Figure 34: PSR – Linear Polarisation, Filtering disabled


The data displayed in Figure 35 is from an overnight recording commencing on 18 May 2015. The data has been filtered to display all (M)SSR only tracks within the PSR coverage volume and is simply an inverted version of Figure 34 (PPI view).

Comments:

The majority of blue tracks shown are due to the known PSR/(M)SSR correlation problem. The issue is typically more severe past the 30NM mark and is possibly compounded by encoder performance, associated jitter and the SUM/DELTA relationship curve, i.e. the further out in range targets are, the more difficult it is for the (M)SSR to resolve them accurately.

Figure 35: (M)SSR Only – Inside PSR Coverage


Shown below at Figures 36 and 37 are graphical representations of actual PSR only replies (Blue) overlaying (M)SSR replies (Green). This is primarily shown to demonstrate the importance of PSR especially with respect to Violations of Controlled Airspace (VCA). In some instances the (M)SSR Interrogator can fail to detect or track aircraft that have serviceable Transponders. In other cases, aircraft may not have a Transponder or it has been switched off.

Another purpose for displaying recorded data this way is to alert local staff to the issues the PSR is suffering from, for example concentrated areas of false tracks that require attention/optimisation.

Comments:

The majority of PSR only tracks are caused by the known PSR/(M)SSR correlation issue. Once this is rectified, a fresh recording should be analysed for confirmation. A higher concentration of PSR only tracks resides over the Gundaroo CBD; this is normal, as both the PSR and (M)SSR will generally struggle in this area.

‘Planar View’ is used here to further demonstrate that the known correlation issue is global. Once rectified, the PSR only tracks will be drastically reduced.

Whilst not shown, there were several occasions identified where targets were not cooperative with the (M)SSR interrogators, placing emphasis on the importance of PSR performance.

Figure 36: PSR Only Tracks

Figure 37: Planar view


Figure 38 is a Google Earth image overlaid with the PSR only tracks.

Comments:

General aviation traffic (without transponders) can be seen on coastline routes and also across to Rottnest Island. Approach and departure routes can be seen where the PSR track has split from the (M)SSR track due to the known correlation issue. Other areas show false tracks caused by road traffic, sea clutter and wind farms. Ongoing analysis of unwanted false tracks can determine their permanency, and technicians can (carefully) use the tools available within the radar software to remove them. Reducing the false track count allows the sensitivity of the PSR to be raised further.

Figure 38: Nearby False tracks


1.3.1.1 PSR False Track Analysis

The PSR false track statistics can be found at Table 15 and Figure 39 below. The data was recorded with DFE off, as is normally the case at CBR. The analysis excludes VCA and Permanent Echoes.

Table 15: False Track Statistics

Avg. False Tracks Max Min

PSR Baseline 3.3/scan 18 0

PSR Cir. Polarisation 2.4/scan 18 0

Specification N/A 20/scan N/A

Comments:

Maintenance technicians rarely utilise filtering; however, Circular Polarisation is used when poor weather prevails. The purpose of analysing this data is to demonstrate to technicians the effects of the alternate configuration. The results show that no adverse effects come from switching to Circular Polarisation, and other areas of PSR performance are also not affected.
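The per-scan statistics in Table 15 are simple aggregates over the recording (after excluding VCA and Permanent Echoes). A minimal sketch of that reduction follows, using hypothetical per-scan counts:

```python
def false_track_stats(counts_per_scan):
    """Average, maximum and minimum false tracks per scan for a recording.
    Input: number of false tracks counted on each antenna scan."""
    avg = sum(counts_per_scan) / len(counts_per_scan)
    return round(avg, 1), max(counts_per_scan), min(counts_per_scan)

# Hypothetical counts for a short run of scans.
counts = [0, 2, 5, 3, 1, 18, 0, 4, 2, 3]
print(false_track_stats(counts))  # → (3.8, 18, 0)
```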

Figure 39: False Tracks – Baseline Configuration


1.3.1.2 Permanent Echo Analysis

Figures 40 and 41 show the results of monitoring the PSR Permanent Echoes (PE) #1 thru #4. Amongst other reasons, these echoes are used for alignment of the Radar when a site monitor is not available. Table 16 details the statistics for all Permanent Echoes.

Table 16: STDEV Results

Azimuth Range STDEV Azimuth STDEV Range

PE #1 304.30° 6.47 NM 0.26° 0.18 NM

PE #2 236.38° 17.57 NM 0.13° 0.01 NM

PE #3 174.86° 6.95 NM 0.14° 0.01 NM

PE #4 164.93° 17.09 NM 0.93° 0.01 NM
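The STDEV figures in Table 16 are the standard deviations of each echo’s plot positions gathered over many scans. A minimal sketch of the calculation follows; the plot values below are hypothetical, not the recorded data:

```python
import statistics

def pe_spread(azimuths_deg, ranges_nm):
    """Mean position and sample standard deviation of a permanent
    echo's plots, one (azimuth, range) pair per scan."""
    return (round(statistics.mean(azimuths_deg), 2),
            round(statistics.mean(ranges_nm), 2),
            round(statistics.stdev(azimuths_deg), 2),
            round(statistics.stdev(ranges_nm), 2))

# Hypothetical plots for a stable echo near 236.4 deg / 17.57 NM.
az = [236.38, 236.51, 236.25, 236.38, 236.44, 236.32]
rng = [17.57, 17.58, 17.56, 17.57, 17.57, 17.57]
print(pe_spread(az, rng))  # → (236.38, 17.57, 0.09, 0.01)
```

For an echo near 0°/360° the azimuths would first need unwrapping; the echoes in Table 16 are all well away from north, so a plain standard deviation suffices.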

Comments:

Permanent Echo #1 is not an ideal echo and should be replaced. The PSR struggles to hold a solid track for this Permanent Echo due to its physical nature.

Figure 40: Multi-level Data – Permanent Echo #1 and #2

Figure 41: Multi-Level Data – Permanent Echo #3 and #4


1.3.2 (M)SSR Data Analysis

1.3.2.1 Site Monitor Analysis

An image of the (M)SSR Site Monitor (SM) analysis can be found at Figure 42. Table 17 details the SM statistics.

Table 17: STDEV Results

Bearing Range STDEV Azimuth STDEV Range

(M)SSR-A_7757 173.648 137.166 0.022 0.004

(M)SSR-B_7737 173.642 142.284 0.019 0.004

Specification 173.651° 7742: 137.242; 7745: 142.242 ≤ 0.07 Unknown

Comments:

Results are as expected, no anomalies identified.

Figure 42: Multi-level Data – Site Monitor 7757


1.3.2.2 Zero Code Analysis

A zero (not present) code is where the aircraft does not reply to the interrogation, whether Mode 3/A or Mode C. The other reason for a zero code is that the Interrogator cannot read the reply correctly due to, for example, Phasing, Gain Matching or Encoder problems. Table 18 details the results.

Table 18: Zero codes

Avg. Zero C Max Zero C Avg. Zero A Max Zero A

(M)SSR-A 1.07/scan 4 0.01/scan 1

(M)SSR-B 1.06/scan 3 0.01/scan 1

Specification Unknown Unknown Unknown Unknown

Comments:

Results are as expected, no anomalies identified.

1.3.2.3 Positional Accuracy Analysis

The accuracy of a reported target is an integral component of the Radar’s performance. In Figure 43, (M)SSR targets (green) overlay ADS-B targets (Magenta). As stated at Annex A, ADS-B reports are used as the reference due to their significant accuracy, derived from transmitted GPS coordinates. Bias and Accuracy results are detailed in the Level 1 RCD analysis.
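Per matched plot, the comparison behind Figure 43 reduces to the difference between the (M)SSR-reported azimuth and the azimuth computed from the ADS-B reference position. A sketch with hypothetical matched pairs (not the recorded data):

```python
import statistics

def azimuth_bias(radar_az_deg, reference_az_deg):
    """Mean signed azimuth error (deg) between radar-reported azimuths
    and reference (e.g. ADS-B-derived) azimuths, wrapped to [-180, 180)."""
    errors = [((r - ref + 180) % 360) - 180
              for r, ref in zip(radar_az_deg, reference_az_deg)]
    return round(statistics.mean(errors), 3)

# Hypothetical matched plots: radar reads ~0.06 deg high of the reference.
radar = [74.08, 120.05, 359.98, 210.07]
adsb = [74.00, 120.01, 359.93, 210.00]
print(azimuth_bias(radar, adsb))  # → 0.06
```

The modulo wrap matters for plots near north (e.g. 359.98° vs 0.02°), where a naive subtraction would produce a spurious ~360° error.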

Comments:

Various areas within the (M)SSR coverage volume show that the interrogator has not positioned the target accurately. With the current state of the encoder system (detailed further on) and the asymmetry of the SUM/DELTA curve, the accuracy of the interrogators will be degraded in areas.

Figure 43: Azimuth Bias error


1.4 Mechanical Analysis

RASS provides a capability to measure azimuth Encoder performance and inclination of the antenna turntable assembly. Poor Encoder performance can impact overall Radar performance and make merging of sensor data in a Multi Radar Tracking (MRT) or data fusion system difficult.

Encoder performance is measured by placing a three-axis Gyroscope on the antenna under test. The Gyroscope measures the actual rotation of the antenna system while RASS video recording equipment measures the reported position of the antenna via the Radar’s Azimuth Change Pulses (ACP) and Azimuth Reference Pulses (ARP), created by the Encoder. Turntable level is measured by analysing data from an inbuilt Inclinometer packaged within the Gyroscope hardware.

1.4.1 Encoder Performance

The Encoder was tested for correct operation. In the following gyro plots, the green line represents actual antenna position with respect to constant rotation, the blue line represents the position reported by the Radar (derived from the Encoder’s ARP/ACP) and the red line represents the difference, or error, between the actual antenna position and the Encoder’s reported antenna position.

Table 19 details the performance result of both Encoders. Figures 44 & 45 show a snapshot of the recording.

Table 19: Encoder Results

Peak Error Azimuth

Encoder X -0.14° ≈ 330°

Encoder Y -0.14° ≈ 340°

Specification *<0.1° N/A

*The overall azimuth error of the Radar data output should be ≤ 0.1 degrees.
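The “reported position” in these plots is reconstructed from the ACP/ARP stream. A minimal sketch of that reconstruction follows, assuming a hypothetical 4096-count encoder (the actual encoder resolution is not stated in this report):

```python
ACP_PER_REV = 4096   # hypothetical encoder resolution (counts per revolution)

def reported_azimuth_deg(acp_count_since_arp):
    """Azimuth the radar believes it is pointing at, from the number of
    ACP pulses counted since the last ARP (north reference)."""
    return (acp_count_since_arp % ACP_PER_REV) * 360.0 / ACP_PER_REV

def encoder_error_deg(acp_count, gyro_az_deg):
    """Signed difference between reported and actual (gyro) azimuth,
    wrapped to [-180, 180) -- the red line in the gyro plots."""
    err = reported_azimuth_deg(acp_count) - gyro_az_deg
    return ((err + 180) % 360) - 180

print(reported_azimuth_deg(1024))                 # → 90.0
print(round(encoder_error_deg(3755, 330.20), 2))  # → -0.17
```

A missed or doubled ACP shows up as a step in this error curve that persists until the next ARP; jitter shows up as high-frequency noise around it.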

Figure 44: Mechanical – Encoder Analysis ENC-X


NOTE:

Measured Error: the error calculated by the RASS software when the North Reference Pulse (NRP) is assumed to have zero error; hence at 0° azimuth there is zero error. In reality, zero error is actually at the azimuth of the SM, which is the reference for a ‘north alignment’.

Calculated Error: can be difficult to comprehend, but essentially it is the error calculated after taking into account the reference used to ‘north-align’ the radar system, typically the SM. The net ACP (blue line) is shifted vertically up/down until the ACP has zero error (the difference between the blue and green lines) at the SM. Further, the calculated error is confirmed by overlaying (M)SSR targets with a reference (ADS-B).
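The note above amounts to re-zeroing the measured error curve at the Site Monitor azimuth. A sketch of that adjustment follows, with synthetic error samples and a hypothetical SM bearing:

```python
def calculated_error(azimuths_deg, measured_error_deg, sm_azimuth_deg):
    """Shift a measured-error curve so the error is zero at the SM azimuth,
    i.e. use the Site Monitor (not the NRP) as the north reference."""
    # Find the sample closest to the Site Monitor bearing.
    closest = min(range(len(azimuths_deg)),
                  key=lambda i: abs(azimuths_deg[i] - sm_azimuth_deg))
    offset = measured_error_deg[closest]
    return [e - offset for e in measured_error_deg]

# Synthetic curve: measured error is zero at north by construction.
az = [0, 90, 174, 270, 330]
measured = [0.00, 0.02, 0.05, -0.03, -0.14]
print(calculated_error(az, measured, 173.65))  # error at the SM sample -> 0
```

The vertical shift changes where the zero sits but not the shape of the curve, which is why peak-to-peak encoder error is unaffected by the choice of reference.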

Comments:

In this case the calculated error is significant and has surfaced in the data as an azimuth bias, specifically at bearings 74°, 330° and 340°, as shown in the Multi-level analysis.

The performance of each Encoder is similar, suggesting that the cause of the large error is the Rotary Joint, as it is common to both Encoders. Rectifying the encoder performance issue will improve the positional accuracy results found in the Level 1 RCD analysis.


Jitter is severe throughout the rotation, and most significant when the boom passes the northerly azimuth motor. Recent maintenance indicates that the azimuth motors and clutch assemblies are not at fault, though this cannot be stated with certainty due to the possibility of suspect spares.

Figure 45: Mechanical – Encoder Analysis ENC-Y


1.4.2 Turntable Inclination

Table 20 details the inclination of the pedestal turntable. Figure 46 shows a snapshot of the recording.

Table 20: Turntable Results

Peak Error Azimuth

Inclination -0.10° ≈230°

Specification <0.1° N/A

Comments:

The levelling of the pedestal has remained untouched for a number of years. The result shows that the antenna dips down by 0.1° at approx. 230°, and points up by 0.1° at approx. 45°.

It was noted that one of the tower legs appears to have subsided into the ground; this was a visual observation only. For reference, it is the leg closest to the side entry door of the Radar cabin.
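A one-cycle-per-revolution tilt appears as a sinusoid in the inclinometer data versus azimuth; fitting its amplitude and phase gives the dip magnitude and direction. A sketch using a first-harmonic (Fourier) fit follows, with synthetic data shaped like the result above:

```python
import math

def fit_tilt(azimuths_deg, inclination_deg):
    """Fit A*cos(az - phi) to inclinometer samples taken at uniformly spaced
    azimuths; return (amplitude, dip azimuth), where the dip azimuth is the
    bearing at which the fitted curve is most negative."""
    n = len(azimuths_deg)
    c = sum(v * math.cos(math.radians(a))
            for a, v in zip(azimuths_deg, inclination_deg))
    s = sum(v * math.sin(math.radians(a))
            for a, v in zip(azimuths_deg, inclination_deg))
    # For uniformly spaced samples these are the first Fourier coefficients.
    a1, b1 = 2 * c / n, 2 * s / n
    amp = math.hypot(a1, b1)
    peak_az = math.degrees(math.atan2(b1, a1)) % 360   # where the curve peaks
    return round(amp, 3), round((peak_az + 180) % 360, 1)

# Synthetic: 0.10 deg tilt, pointing up near 50 deg, dipping toward 230 deg.
az = list(range(0, 360, 5))
incl = [0.10 * math.cos(math.radians(a - 50)) for a in az]
print(fit_tilt(az, incl))  # → (0.1, 230.0)
```

Because the fit only extracts the first harmonic, sensor noise and higher-order turntable runout do not bias the recovered tilt direction.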

Figure 46: Mechanical – Turntable Inclination



Edition: A Date: 29 May 2015 Status: Draft

ANNEX C

TEST EQUIPMENT AND SOFTWARE LIST

NGSP S-Band PSR: DPR & MSSR: DTI529 - CBR


Annex C - Test Equipment and Software List Edition Date: 29 May 2015

IEAP_SAMPLE_L1-L2_Annex C 2/4

IMS-600-10.03

Revision 1.0: 23/05/2013



RASS HARDWARE

The following RASS equipment was used during the conduct of the Performance Evaluation.

Part Number Description Serial Number Calibration Due

RFA 641 Radar Field Analyser 67/03/118 12/12/15

LPA 114 Log Periodic Antenna 163/02/47

RGI 596 Radar Gyroscope Indicator 57/02/120 23/01/16

RVR 183 Radar Video Recorder 09/01/07 03/02/16

RVI 299 Radar Video Interface 22/01/10 03/02/16

IFL 520 Log IF Module 46/01/61 05/02/16

RIM 782 Radar Interface Module 97/04/83 02/01/16

GPS 450 Global Positioning System 36/02/146 15/12/15

UDR 600 USB Data Recorder 60/01/258 12/12/15

UPM772 USB Power Meter 95/01/78 09/01/16

RASS SOFTWARE

RASS-S software used during this Performance Evaluation was v7.0.5p33

RASS-R software used during this Performance Evaluation was v3.7.1

RCM software module used in this Performance Evaluation was v4.1.0

RCD software module used in this Performance Evaluation was v2.1.1

DHM software module used in this Performance Evaluation was v2.13.0

CMC software module used in this Performance Evaluation was v1.0.4
