
Page 1:

AMDAR Quality Assurance

Bradley Ballish, NOAA/NWS/NCEP/NCO/PMB

SSMC2, Silver Spring, 23 March 2009

Page 2:

Outline

• Monthly reports

• Examples of data quality control (QC) problems

• Comparison of some aircraft temperature, wind and moisture data in the North American area

• Proposed aircraft temperature bias corrections and related issues

• Summary

Page 3:

Regular Monthly AMDAR Reports

• Based on a WMO meeting at ECMWF in June 2002, NCEP prepares monthly aircraft monitoring reports at the website:

• http://www.ncep.noaa.gov/pmb/qap/amdar/

• These standard monthly reports are not issued frequently enough, do not have track-check or stuck-data summaries, and do not have ascent and descent statistics in most parts

• The WMO Integrated Global Observing System (WIGOS) Pilot Project for AMDAR suggests regional centers QC AMDAR data before transmission on the GTS

• This will require much more frequent updates than monthly reports

Page 4:

Japanese Data in Monthly Reports

• In the NCEP AMDAR report for February 2009, the Japanese data looked good

• Of 274 Japanese aircraft reporting data, only 7 had suspect temperatures:

• Units JP9Z4U44, JP9Z4Y4X, JP9Z4Y79, JP9Z4YVV, JP9Z5859, JP9Z585Z and JP9Z5Y79 had warm biases

• No units had suspect winds!

• There were about 100 minor track-check errors, see example on next page

Page 5:

Track-Check Error Example

Aircraft Data for Unit JP9Z58XZ

For 00Z 16 March 2009

Time-Days   Lat     Lon      Press (hPa)
16.07153    27.40   125.00   196.8
16.07639    34.82   140.37   461.7
16.07708    29.53   127.97   196.8
16.11944    34.75   140.28   435.2
16.12083    32.05   132.23   196.8
16.14306    35.27   140.70   558.1
16.14444    35.48   140.78   609.7
16.22917    35.40   139.90   290.1
16.26806    34.32   133.58   300.9
16.28750    33.52   130.43   300.9

Locations and pressures are changing too fast with time, but all data are close to the model background.

All raw data received at NCEP have only the header KAWN (US Air Force), not RJTD as expected. Additional examples can be provided.
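A check of this kind can be expressed as a simple ground-speed consistency test. The Python sketch below is illustrative only, not the operational NCEP track-check code; the report format and the 1100 km/h speed threshold are assumptions. Applied to the first three JP9Z58XZ reports above, both legs imply impossible ground speeds and are flagged.

# Minimal track-check sketch (illustrative, not the operational NCEP QC code).
# Flags consecutive reports whose implied ground speed is impossibly high.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0
MAX_SPEED_KMH = 1100.0  # assumed upper bound for a commercial aircraft

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def flag_impossible_legs(reports):
    """reports: list of (time_in_days, lat, lon); returns indices of suspect reports."""
    suspect = []
    for i in range(1, len(reports)):
        t0, lat0, lon0 = reports[i - 1]
        t1, lat1, lon1 = reports[i]
        hours = (t1 - t0) * 24.0
        if hours <= 0:
            continue  # duplicate or out-of-order times are a separate check
        if haversine_km(lat0, lon0, lat1, lon1) / hours > MAX_SPEED_KMH:
            suspect.append(i)
    return suspect

# First three JP9Z58XZ reports from the table above: both legs are flagged.
reports = [(16.07153, 27.40, 125.00), (16.07639, 34.82, 140.37), (16.07708, 29.53, 127.97)]
print(flag_impossible_legs(reports))  # [1, 2]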

Page 6:

Aircraft Monitoring Example

• On 9 August 2006, aircraft EU3102 started to show a large temperature bias from 300 hPa up compared to the background

• The spurious bias was so large that few spurious temperatures passed QC

• The bias was so large that the aircraft was probably wasting fuel

• If the airlines could check a website with this information, such problems could be found and fixed much sooner

Page 7:

Temperature Bias for unit EU3102 300 hPa up August 2006

[Chart: Temperature Bias (y-axis, degrees) versus Run Day and Time (x-axis).]

Page 8:

Aircraft Track-check Example

• On 11 August 2006, aircraft AFZA01 was flying from the southeast to the northwest with roughly several minutes between reports

• Three groups of reports are shown; groups 1 and 3 have correct locations, while all reports in group 2 are about 12 degrees too far north

• The blue numbers are vector wind differences from the guess (defined in the sketch after this list); group 3 has large differences that all passed QC

• Flying from the end of group 1 to the start of group 2 would cover an impossible distance in several minutes

• This is a tough example for current QC codes to process correctly, as group 2 can track-check against itself

• This problem with South African aircraft has lasted over a year

• Examples of solo track-check errors are common
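For reference, the vector wind difference quoted above is the magnitude of the observed-minus-background wind vector. A one-function Python sketch, with component names assumed for illustration:

# Vector wind difference (m/s) between an observed and a background wind,
# given u (east-west) and v (north-south) components; names are illustrative.
from math import hypot

def vector_wind_difference(u_obs, v_obs, u_ges, v_ges):
    """Magnitude of the observed-minus-background wind vector."""
    return hypot(u_obs - u_ges, v_obs - v_ges)

print(vector_wind_difference(30.0, 5.0, 22.0, -1.0))  # 10.0 m/s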

Page 9:

[Map: aircraft report locations in groups 1, 2 and 3; blue numbers are vector wind differences of observed winds minus the model background.]

Page 10:

Aircraft Temperature Observation Count Comparison for the NA Area

• An impact test adding TAMDAR and Canadian AMDAR data at NCEP did not show a positive impact, so here we examine these data

• The next slide compares the average number of temperature observations of different types, binned to the nearest mandatory pressure level, per GDAS model run in June 2008 for North America (NA); a simple binning sketch follows this list

• Counts for Radiosondes, ACARS, TAMDAR and two types of Canadian AMDAR are compared

• Wind observation counts (not shown) were found to be nearly identical to temperature counts

• Clearly the aircraft counts outnumber those from sondes

• The two main types of Canadian aircraft are labeled CRJ and DHC-8
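As a rough illustration of how such counts can be accumulated, the Python sketch below assigns each report to the nearest mandatory pressure level and tallies by observation type. The level list and the simple nearest-level rule are assumptions for illustration, not the exact GDAS binning.

# Sketch of binning reports to the nearest mandatory level and counting by type.
# The level list and nearest-level rule are illustrative assumptions.
MANDATORY_LEVELS_HPA = [1000, 925, 850, 700, 500, 400, 300, 250, 200]

def nearest_mandatory_level(pressure_hpa):
    """Return the mandatory level (hPa) closest to the reported pressure."""
    return min(MANDATORY_LEVELS_HPA, key=lambda lev: abs(lev - pressure_hpa))

def count_by_type_and_level(reports):
    """reports: iterable of (obs_type, pressure_hpa); returns {(type, level): count}."""
    counts = {}
    for obs_type, pressure in reports:
        key = (obs_type, nearest_mandatory_level(pressure))
        counts[key] = counts.get(key, 0) + 1
    return counts

sample = [("ACARS", 212.0), ("TAMDAR", 487.5), ("Sonde", 700.0), ("ACARS", 245.0)]
print(count_by_type_and_level(sample))
# {('ACARS', 200): 1, ('TAMDAR', 500): 1, ('Sonde', 700): 1, ('ACARS', 250): 1}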

Page 11:

Average Temperature Counts per Run NA Area June 2008

[Chart: Counts (x-axis) versus Pressure in hPa (y-axis) for CRJ, DHC-8, TAMDAR, ACARS and Sondes. Sondes have low counts relative to large ACARS counts.]

Page 12:

Temperature Bias Comparison

• The next slide compares the average temperature bias versus the guess for different observation types, binned to the nearest mandatory pressure level, per GDAS model run in June 2008; a sketch of the bias calculation follows this list

• Biases for sondes, ACARS, TAMDAR and two types of Canadian AMDAR are shown

• Clearly the aircraft temperatures are generally warmer than those from sondes (as found for ACARS and AMDAR by Ballish and Kumar, BAMS, Nov 2008)

• The DHC-8 aircraft have the warmest bias
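The bias statistics amount to averaging observation-minus-guess temperature differences within each observation-type and level bin. A minimal Python sketch follows; the record layout is assumed for illustration and is not the GDAS diagnostic format.

# Sketch of computing mean observation-minus-guess temperature bias by type and level.
# The record layout is an assumption, not the GDAS diagnostic format.
from collections import defaultdict

def mean_bias(records):
    """records: iterable of (obs_type, level_hpa, t_obs, t_guess), temperatures in the
    same units; returns {(obs_type, level_hpa): mean of (t_obs - t_guess)}."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for obs_type, level, t_obs, t_guess in records:
        key = (obs_type, level)
        sums[key] += t_obs - t_guess
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

sample = [("ACARS", 200, -54.1, -54.6), ("ACARS", 200, -53.8, -54.2), ("Sonde", 200, -55.0, -54.9)]
print(mean_bias(sample))  # ACARS about +0.45, Sonde about -0.1 (warm aircraft, cold sonde)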

Page 13:

Temperature Biases Versus Guess NA Area June 2008

[Chart: Temperature Bias in Degrees (x-axis) versus Pressure in hPa (y-axis) for CRJ, DHC-8, TAMDAR, ACARS and Sondes. Sondes are cold compared to aircraft.]

Page 14:

Temperature Bias vs POF for Canadian AMDAR Data

• In the following slide, the temperature biases for Canadian AMDAR type DHC-8 are shown vs the phase of flight (POF)

• This aircraft type has generally warm biases

• The biases vary considerably with the POF

Page 15:

Temperature Biases Versus Guess DHC-8 Aircraft NA Area June 2008

[Chart: Temperature Bias in Degrees C (x-axis) versus Pressure in hPa (y-axis) for TOTAL, LEVEL, DESCENT and ASCENT. The ascent versus descent difference is large.]

Page 16:

Speed Bias vs POF for Canadian AMDAR Data

• In the following slide, the wind speed biases for Canadian AMDAR type CRJ are shown vs the POF

• This aircraft type has speed biases that vary considerably with the POF

• The slide after that shows the same for Canadian aircraft type DHC-8

• Here the speed biases vary even more with the POF

• At the WIGOS February 2009 meeting, it was noted that the Canadian AMDAR data are less accurate at high latitudes due to using magnetic rather than GPS navigation

Page 17:

Speed Biases Versus Guess CRJ Aircraft NA Area June 2008

[Chart: Speed Bias in m/sec (x-axis) versus Pressure in hPa (y-axis) for TOTAL, LEVEL, DESCENT and ASCENT. The ascent versus descent difference is large.]

Page 18:

Wind Speed Biases for Aircraft Type DHC-8 June 2008 NA Area

[Chart: Speed Bias in m/sec (x-axis) versus Pressure in hPa (y-axis) for TOTAL, LEVEL, DESCENT and ASCENT. The ascent versus descent difference is very large.]

Page 19:

Relative Humidity Bias Comparison

• The next two slides show counts of moisture observations and relative humidity bias differences versus the guess for the North American area in June 2008 for sonde, ACARS and TAMDAR data

• The TAMDAR data (at this time) are mainly in the Midwest, yet have higher counts and very good statistics versus the guess

Page 20:

Average Moisture Observation Counts per Run NA Area June 2008

[Chart: Counts (x-axis) versus Pressure in hPa (y-axis) for TAMDAR, ACARS and Sondes. TAMDAR has large counts, but they are in the Midwest only.]

Page 21:

Relative Humidity Biases Versus Guess NA Area June 2008

[Chart: RH Bias in % (x-axis) versus Pressure in hPa (y-axis) for TAMDAR, ACARS and Sondes. TAMDAR biases may be better than sondes.]

Page 22:

Proposed Aircraft Temperature Bias Corrections

• Ballish and Kumar (BAMS, Nov 2008) studied aircraft temperature biases and proposed the bias corrections shown in the next slide for January 2007 for the 15 aircraft types with the largest counts; a sketch of applying such corrections follows this list

• The slide after that shows the same for non-US AMDAR types

• This study did not include TAMDAR or Canadian AMDAR types
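Corrections of this kind reduce to a lookup by aircraft type and pressure layer that is added to the reported temperature. The Python sketch below shows only the mechanics; the layer bounds, the correction values, the type names and the sign convention are placeholders, not the actual Ballish and Kumar coefficients.

# Sketch of applying per-type, per-layer temperature bias corrections.
# Table values, types and sign convention are placeholders, not the published coefficients.
LAYERS_HPA = [(1100.0, 700.0), (700.0, 500.0), (500.0, 300.0), (300.0, 150.0)]  # SFC-700 ... 300-150

CORRECTIONS_DEG_C = {              # correction added to the reported temperature
    "757-222": [-0.2, -0.4, -0.6, -0.8],   # hypothetical values
    "MD-88":   [-0.1, -0.3, -0.5, -0.6],   # hypothetical values
}

def layer_index(pressure_hpa):
    """Return the index of the pressure layer containing the report, or None."""
    for i, (p_bottom, p_top) in enumerate(LAYERS_HPA):
        if p_top <= pressure_hpa <= p_bottom:
            return i
    return None

def correct_temperature(aircraft_type, pressure_hpa, t_obs_c):
    """Return the bias-corrected temperature; unchanged if no correction applies."""
    corrections = CORRECTIONS_DEG_C.get(aircraft_type)
    i = layer_index(pressure_hpa)
    if corrections is None or i is None:
        return t_obs_c
    return t_obs_c + corrections[i]

print(correct_temperature("757-222", 250.0, -52.0))  # -52.8 with the placeholder table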

Page 23:

Proposed ACARS Temperature Bias Corrections January 2007

[Chart: Bias Corrections in Degrees C (y-axis) versus Aircraft Types (x-axis) for layers SFC-700, 700-500, 500-300 and 300-150 hPa. Types shown: 757-222, 757-223, 757-24APF, 737-3H4, 757-232, 767-34AF, A300F4-60, 737-522, MD-11F, 737-832, MD-88, 757-251, A310-203, 767-332 and A310-324. Most corrections are negative.]

Page 24:

Proposed Non-US AMDAR Temperature Bias Corrections January 2007

[Chart: Bias Corrections in Degrees C (y-axis) versus non-US AMDAR Aircraft Types (x-axis), including A320-200, 737-300, A319-100, 747, A321-100, A340-300 and MD-11F, for layers SFC-700, 700-500, 500-300 and 300-150 hPa.]

Page 25:

Aircraft vs Sonde GSI Draws to Temperatures between 200-300 hPa

# Aircraft >> # Sondes, thus warm aircraft data overwhelms the GSI/GFS system

[Histograms: Aircraft Tdiff (obs-ges), Aircraft Tdiff (obs-anl), SOND Tdiff (obs-ges) and SOND Tdiff (obs-anl).]

Page 26:

AMDAR Versus Sonde Counts 300-200 hPa

[Maps: aircraft versus sonde observation counts.]

Page 27:

Suru Saha’s website displays model fits to RAOBs in North America, showing that the GFS analysis and guess maintain a warm bias throughout most of the troposphere, which may be related to the large number of aircraft with warm biases

Page 28:

Model Climate Impact from Aircraft Warm Temperatures

• The next slide courtesy of Dick Dee of ECMWF shows the increase in the number of aircraft reports versus time in the ECMWF reanalysis

• The temperature bias of the ECMWF analysis and background seems to be affected by the large increase in the number of aircraft temperatures, along with other factors

• The NCEP GSI may see a larger bias impact, as it does not thin aircraft data and its satellite radiance bias corrections are anchored to the analysis as truth rather than to radiosondes as truth

Page 29:

Model Climate Bias Impact From Warm Aircraft Temperatures

Global-mean departures of analysis (blue) and background (red) from radiosonde temperatures (K) at 200 hPa, and number of obs/day (×10⁻⁴, green)

Global-mean departures of analysis (blue) and background (red) from aircraft temperatures (K) at 200 hPa, and number of obs/day (×10⁻⁴, green)

Page 30:

Summary

• The standard monthly AMDAR reports are useful but do not contain enough information on aircraft data quality

• In part due to the WIGOS project, more frequent and complete quality information will be needed

• Improvements are needed in the aircraft track-checking

• The TAMDAR data appear to be of useful quality, especially the moisture

• The Canadian AMDAR data show considerable bias differences with the aircraft phase of flight and will need more effort to assimilate them well

• There is evidence that large numbers of relatively warm aircraft temperatures are impacting model analysis bias

• Improvements are needed in the bias correction and/or use of aircraft temperatures, winds and moisture

• NCEP and the ECMWF are both planning to perform aircraft temperature bias corrections

• It is likely that 4DVAR assimilation is needed to get the maximum impact from aircraft data, due to their reporting at off times