Preliminary Assessment Structure (some suggestions for the discussion) Jörg Schulz, EUMETSAT

Page 1

Preliminary Assessment Structure (some suggestions for the discussion)

Jörg Schulz, EUMETSAT

Page 2

Content

• GEWEX needs
• Breakout Groups
• Common Discussion Points
• More questions and some elements of schedule

Page 3

GEWEX Needs

GEWEX needs to consider candidates for homogeneous, long time-series (1980–present) products, including TPW, profiles, and UTH.

GEWEX needs to consider a shorter-term (2000–present) best-possible product for process studies using co-variance among W-E products.
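As a rough illustration of what "co-variance among W-E products" could mean in practice, here is a minimal Python sketch, assuming two monthly gridded product time series (TPW and a hypothetical energy-cycle variable); the function name, array shapes, and synthetic data are illustrative assumptions, not part of any GEWEX product.

```python
# Minimal sketch (assumptions only): anomaly covariance between two gridded
# monthly products, e.g. TPW and a hypothetical energy-cycle variable.
import numpy as np

def anomaly_covariance(tpw, energy, lat):
    """Area-weighted covariance of monthly anomalies.

    tpw, energy : arrays of shape (time, lat, lon), time a multiple of 12
    lat         : 1-D array of grid latitudes in degrees
    """
    def anomalies(x):
        # monthly climatology, then subtract it from every year
        clim = x.reshape(-1, 12, *x.shape[1:]).mean(axis=0)
        return x - np.tile(clim, (x.shape[0] // 12, 1, 1))

    a, b = anomalies(tpw), anomalies(energy)
    cov_map = (a * b).mean(axis=0)                     # covariance per grid cell
    weights = np.broadcast_to(np.cos(np.deg2rad(lat))[:, None], cov_map.shape)
    return cov_map, np.average(cov_map, weights=weights)

# Synthetic example: 10 years of monthly fields on a 2.5-degree grid
lat = np.arange(-88.75, 90.0, 2.5)
tpw = np.random.rand(120, lat.size, 144)
energy = np.random.rand(120, lat.size, 144)
cov_map, global_cov = anomaly_covariance(tpw, energy, lat)
print("globally averaged anomaly covariance:", global_cov)
```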

GEWEX does not generate the actual data sets. It merely endorses and works with data sets that meet its criteria for long- and short-term stability and documentation, and whose producers are willing to be part of broader science questions. GEWEX needs willing PIs.

Page 4

GEWEX Needs

GEWEX needs to develop an assessment strategy that evaluates what we know, what we don’t know, and how we move forward.

GEWEX products are not fixed; they are reprocessed in a cycle of approximately five years. Assessments also need to work on those time frames.

GEWEX needs to start on the logistics of what makes the most sense from a scientist's perspective. That is what we are here to do.

Page 5

Breakout groups

Total column water vapour

− Lead: Marc Schröder, Rapporteur: Janice Bytheway

Temperature and water vapour profiles + UTH / FTH

− Lead: Antonia Gambacorta, Rapporteur: Martin Stengel

Page 6

Common Discussion Points

• What data sets exist (make an inventory; list strengths and weaknesses, e.g., in terms of space-time coverage)?

• What validation data exist (make an inventory; list availability and status of QA)?

• What should be compared: level-3 gridded data sets, level-2 retrievals, or level-1 radiance data?

• How should a comparison be designed? (See the collocation sketch after this list.)
• Discuss a specific list of error parameters, e.g., different types of bias.
• Who should lead the assessment (two co-leaders are needed)? Do we need sub-leaders following the breakout groups?
• Do not forget to touch on technical issues such as databases and formats.
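To make the design questions above more concrete, here is a minimal sketch, assuming point validation data (e.g. station TPW observations) are collocated with a level-3 monthly grid by nearest-neighbour matching and scored with simple error parameters; the grid, station list, and function name are invented for illustration only, not a prescribed design.

```python
# Minimal sketch (illustrative assumptions): collocating point validation data
# with a level-3 monthly gridded product and computing simple error parameters.
import numpy as np

def collocate_and_score(grid, lats, lons, stations):
    """grid: (lat, lon) monthly-mean field; stations: list of (lat, lon, observed value)."""
    diffs = []
    for slat, slon, obs in stations:
        i = int(np.abs(lats - slat).argmin())   # nearest grid latitude
        j = int(np.abs(lons - slon).argmin())   # nearest grid longitude
        if not np.isnan(grid[i, j]):            # skip grid cells without retrievals
            diffs.append(grid[i, j] - obs)
    d = np.asarray(diffs)
    return {"n": d.size,
            "bias": d.mean(),                   # mean difference
            "std": d.std(ddof=1),               # scatter around the mean difference
            "rms": np.sqrt((d ** 2).mean())}    # total RMS difference

# Synthetic example: a 2.5-degree grid and three hypothetical stations
lats = np.arange(-88.75, 90.0, 2.5)
lons = np.arange(-178.75, 180.0, 2.5)
grid = 25.0 + 5.0 * np.random.randn(lats.size, lons.size)   # fake TPW field (kg m-2)
stations = [(52.1, 8.8, 24.0), (-10.5, 150.2, 45.3), (70.0, -45.0, 5.1)]
print(collocate_and_score(grid, lats, lons, stations))
```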

Page 7

Error Parameters: Types of Bias Errors

• Radiometric Bias: Biases due to uncharacterized or ill-applied sensor calibration.

• Retrieval Bias: Biases related to shortcomings in the retrieval itself.

• Sampling/Contextual Bias: Biases related to where a retrieval is or is not performed, or to contextually related uncertainty in a scene. This leads to a skewed data population relative to what is thought to have been collected (illustrated in the sketch below).

• Aggregation/Data Reduction Bias: Loss of required information during conversion to a gridded product or during analysis.

• Cognitive Bias: We, the investigators, misinterpret, withhold, or frame data/results contrary to the full nature of the data.

• Other Considerations: a) Correlated error: "independent" products that share similar biases; b) Tautology: circular reasoning or treating non-independent data as independent during tuning.

This is an example from the aerosol assessment.
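The sampling/contextual bias above can be shown with a small synthetic experiment: a perfect retrieval that is only attempted in clear scenes still yields a biased aggregate if clear scenes differ systematically from the full population. All numbers below are made up for illustration.

```python
# Minimal sketch (synthetic numbers only): sampling bias from clear-sky-only retrievals.
import numpy as np

rng = np.random.default_rng(0)

n = 100_000
cloud_fraction = rng.uniform(0.0, 1.0, n)                          # scene cloudiness
true_tpw = 20.0 + 25.0 * cloud_fraction + rng.normal(0.0, 3.0, n)  # wetter when cloudier

retrieved = true_tpw[cloud_fraction < 0.3]   # retrieval only attempted in clear scenes

print("true mean TPW:      %.2f" % true_tpw.mean())
print("retrieved mean TPW: %.2f" % retrieved.mean())
print("sampling bias:      %.2f" % (retrieved.mean() - true_tpw.mean()))
```

Even though every individual retrieval is exact in this toy case, the gridded or time-averaged product would be biased low because the dry, clear scenes are over-represented in the sample.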

Page 8

More questions and some elements of schedule

• How long will it take and what resources are available?
• Assessment funding should be a multi-agency task; agencies are invited to support this activity.
• Who can take on the function of technical assessment centre?
• Schedule: assessment preparation should be done in 2011, the assessment itself in 2012.
• Write a summary report on this workshop within the next 8-10 weeks?
• Publish a summary in GEWEX News as early as possible!
• Assessment leader(s) need to present results at the GEWEX Radiation Panel meeting each year (next in Tokyo, 30 Aug - 1 Sep 2011).
• EUMETSAT can offer one assessment workshop in Darmstadt, or we try to organise it somewhere nice.