NCAR/TN-281+IA
NCAR TECHNICAL NOTE
July 1987

VERIFY - The MM4 Verification Processor

Thomas W. Bettge
Richard A. Anthes

ACID DEPOSITION MODELING PROJECT AND
ATMOSPHERIC ANALYSIS AND PREDICTION DIVISION
NATIONAL CENTER FOR ATMOSPHERIC RESEARCH
BOULDER, COLORADO






Table of Contents

List of Figures
List of Tables
Preface
Acknowledgements
1.0. Introduction
2.0. Description of VERIFY
     2.1. Brief Overview
     2.2. Verification Against Station Data
     2.3. Verification Against Analyses
     2.4. Precipitation Verification
3.0. VERIFY Code Structure
     3.1. Code Strategy
     3.2. Flow Chart
     3.3. COMMON Storage
     3.4. Parameterization
4.0. VERIFY Users' Guide
     4.1. CRAY JCL
     4.2. UPDATE Modifications
     4.3. NAMELIST Groups/Variables
     4.4. Sample Decks
5.0. Limitations
6.0. New Data Sets and Grids
7.0. Examples of Verification
References
Appendix A: Preparation of MM4 Output
Appendix B: Description of Structure Function
Appendix C: SFCAL Users' Guide
Appendix D: Smoothing Operator
Appendix E: Verification of Meteorological Model


List of Figures

Figure 1.1. The MM4 Modeling System
Figure 1.2. Use of VERIFY
Figure 3.1. VERIFY Flow Chart


List of Tables

Table 2.1. COMMON Storage Data Order
Table 2.2. Station Verification Skill Scores
Table 2.3. Analysis Verification Skill Scores
Table 2.4. Precipitation Verification Skill Scores
Table 3.1. VERIFY Parameters
Table 4.1. Namelist PRIME
Table 4.2. Namelist GRID1
Table 4.3. Namelist GRID2
Table 4.4. Namelist GRID3
Table 4.5. Namelist SCALES
Table 4.6. Namelist PRECIP
Table 4.7. List of Current Grid Types
Table 5.1. List of Station Comparisons
Table 5.2. List of Analysis Comparisons
Table 5.3. List of Precipitation Comparisons


Preface

The purpose of this technical note is to describe and demonstrate the capabilities of a processor (VERIFY) developed to objectively verify forecasts generated by the PSU/NCAR Mesoscale Model (MM4). VERIFY has the ability to examine model performance through verification by means of three distinctly different functions: comparison against rawinsonde station data, comparison against gridded analyses, and comparison against gridded observed precipitation. Designed to execute within the NCAR Scientific Computing Division environment on the CRAY supercomputers, VERIFY uses data sets that reside on the mass storage system.

In this note the capabilities of this processor are outlined, the philosophy and strategy behind the code structure are discussed, a users' guide is given, and the limitations of the processor are presented. This note does not provide a comprehensive description of the processor code, nor of the data sets upon which it operates.

VERIFY was developed primarily to analyze the MM4 performance. Due to the flexibility of its design, however, the results from other models, both regional and global, can easily be incorporated.



Acknowledgements

The National Center for Atmospheric Research is sponsored by the National Science Foundation. A portion of this work has been supported by the U.S. Environmental Protection Agency under Interagency Agreement DW930144-01-1, but has not been subject to EPA review procedures. Hope Hamilton provided expert assistance in the preparation of this report.



1.0. Introduction

The VERIFY program is designed for use as a verification package to measure the skill of forecasts generated by the PSU/NCAR Meso-α Scale (Regional) Limited Area Model (MM4), which is documented by Anthes et al. (1987). It can also be used to verify other regional and global models. Within the modeling system illustrated in Fig. 1.1, VERIFY represents the process labeled 'VERIFY'.

VERIFY performs three major functions:

(1) verification against rawinsonde station data;
(2) verification against gridded analyses; and
(3) verification of precipitation.

A number of supplementary functions, which depend upon the major function being employed, can be performed. They include:

(a) computation of the S1 skill score (defined in Appendix E);
(b) computation of the root mean square (RMS), the standard deviation (SD), and the mean errors (Appendix E);
(c) computation of the correlation coefficient of observed changes versus forecast changes (Appendix E);
(d) computation of precipitation threat and bias scores (Appendix E);
(e) computation of the precipitation spatial correlation coefficient matrix (Appendix E);
(f) computation of the statistical structure of fields using a normalized structure function (Appendix B);
(g) scale decomposition of gridded fields;
(h) graphical displays in time series of all scores; and
(i) horizontal maps of forecast, observed, and difference fields.

While VERIFY was developed primarily to analyze the MM4 performance, its design is flexible enough that results from other regional models (or global models interpolated to regional domains) can easily be incorporated. Because of the basic structure of the code, a number of different data sets on different grids can be compared to one another on a common grid. Figure 1.2 illustrates the types of data sets that VERIFY presently accommodates.

Note in Fig. 1.2 that the output from the MM4 is only one of several data sets which can be processed by VERIFY. The MM4 raw data set, in fact, must be converted from sigma to pressure surfaces prior to acceptance as an input data set to VERIFY. Appendix A contains a sample deck that can be used to perform this conversion.



Figure 1.1. The MM4 Modeling System. (Inputs shown include NMC global data and transmission data.)


Figure 1.2. Use of VERIFY (forecasts and observations). CFM is the NCAR Community Forecast Model, a version of the Community Climate Model. PERIDOT is a limited-area model developed by the Centre de Recherches en Meteorologie (Paris). LFM is the Limited-area Fine Mesh Model developed by the National Meteorological Center (NMC).


VERIFY itself produces an output data set during station verification for input to the auxiliary structure function analysis program SFCAL. The function and use of SFCAL are described in Appendices B and C, respectively.

The purpose of this document is to describe the capabilities of VERIFY, to discuss the philosophy and strategy behind the VERIFY code structure, to present the code structure so that it may be understood and easily modified by the user, and to illustrate use of the program through a users' guide. The types of verification performed within each of the three major functions are described and illustrated, and a discussion of how to introduce a new data set is given.

Finally, the ability to fully intercompare the input data sets shown in Fig. 1.2 is limited; these limitations are discussed in Section 5.0.



2.0. Description of VERIFY

VERIFY performs three major functions: comparison of gridded data sets against rawinsonde station data, comparison of one gridded data set against another, and comparison of gridded precipitation fields. The comparisons consist of computing objective skill scores, filtering data to extract information about the scale structure of a field, and plotting and analyzing fields and field differences on a horizontal map background.

The objective comparisons in VERIFY are performed on the fields listed in Table 2.1. Each of the three major verification components works on slightly different fields and performs slightly different functions. For example, the station data verification performs a structure function analysis, while the gridded data set verification accomplishes scale decomposition by two-dimensional Fourier filtering.

The purpose of this section is to describe the capabilities of each of the three major functions. Section 7.0 of this document contains the formulations, the interpretation, and examples of most of the skill scores mentioned in this section. Further discussion of the verification procedures used in VERIFY is given in Anthes (1983).

2.1. Brief Overview

During each submission of the VERIFY code, only one of the three possible functions can be requested. Once the function is selected, all the time periods desired will be processed and a predetermined set of skill scores will be computed for all of the fields available from the data sets specified. The fields eligible for processing depend upon the function being performed and are listed in Tables 2.2-2.4.

The user has no choice as to which fields are used in the skill score computations for a particular function. If a field is eligible to be used (as indicated in Tables 2.2-2.4), and it exists on both input data sets, skill scores for that field will be computed. If a field is eligible, but does not exist within one or both of the input data sets, it will be ignored.

The results of the skill score computations for each field will be presented in table form on the user's printed output, and will be plotted in time series graph form on microfilm.

The options left to the user's control are those that are either CPU-intensive or inactive unless explicitly requested. These include the format of the scale decomposition of fields, the contouring of fields and field differences on map backgrounds, and the selection of specific thresholds for precipitation verification.

2.2. Verification Against Station Data

The verification of a gridded data set against a set of station observations is performed in the following manner. A set of station observations for a particular time period



Field Number   Variable             Level (mb)
 1             Sea Level Pressure   1013
 2             Convective Precip    1001
 3             Total Precip         1001
 4-8           u, v, H, q, T        1001
 9-13          u, v, H, q, T        1000
14-18          u, v, H, q, T        850
19-23          u, v, H, q, T        700
24-28          u, v, H, q, T        500
29-33          u, v, H, q, T        300
34             Surface Pressure     1001

Note: Sea level is designated as 1013 mb. The surface is designated as 1001 mb.

Table 2.1. COMMON Storage Data Order.


is gathered and sorted according to the fields needed as specified in Table 2.2. The corresponding fields within the gridded data set are gathered, and values of the fields at the station locations are obtained by linear interpolation from the nearest four points. (An interpolation scheme involving the sixteen grid points nearest the station location was tested, but the results were virtually indistinguishable from those using the four-point linear scheme.)
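The four-point scheme is ordinary bilinear weighting. A minimal sketch follows (in Python rather than the processor's Fortran; the function name and the 0-based fractional grid coordinates are our own convention, not VERIFY's):

```python
import numpy as np

def interp_to_station(field, x, y):
    """Four-point (bilinear) interpolation of a 2-D gridded field to a
    fractional grid location (x, y), with 0-based indices and field[j, i]
    ordering (j = north-south, i = east-west)."""
    i, j = int(x), int(y)          # lower-left grid point of the enclosing cell
    fx, fy = x - i, y - j          # fractional offsets within the cell
    return ((1 - fx) * (1 - fy) * field[j, i] +
            fx * (1 - fy) * field[j, i + 1] +
            (1 - fx) * fy * field[j + 1, i] +
            fx * fy * field[j + 1, i + 1])

# A field linear in both directions is reproduced exactly.
grid = np.fromfunction(lambda j, i: 2.0 * i + 3.0 * j, (5, 5))
print(interp_to_station(grid, 1.5, 2.25))  # 2*1.5 + 3*2.25 = 9.75
```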

The skill scores that are computed with the pairs of values for each field at the station locations are indicated in Table 2.2. They include:

(1) root mean square (RMS) of the differences;
(2) mean of the differences;
(3) standard deviation (SD) of the differences; and
(4) correlation coefficient (CC) of the observed change versus the forecast change in the field since the initial time period requested.
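The four scores can be sketched from the paired values as follows (Python for illustration; `station_scores` and its arguments are hypothetical names, and the SD here is the population standard deviation):

```python
import numpy as np

def station_scores(f, o, f0, o0):
    """Station skill scores from paired forecast (f) and observed (o)
    values; f0 and o0 are the paired values at the initial time period,
    used for the change-correlation score."""
    d = f - o
    rms = np.sqrt(np.mean(d ** 2))   # (1) RMS of the differences
    mean = np.mean(d)                # (2) mean of the differences
    sd = np.std(d)                   # (3) SD of the differences
    # (4) CC of observed change vs. forecast change since the initial time
    cc = np.corrcoef(o - o0, f - f0)[0, 1]
    return rms, mean, sd, cc
```

With the population SD, RMS² = mean² + SD², a handy consistency check on the printed score tables.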

In addition, the pairs of values for selected fields at each time period are saved and can be disposed as an output data set upon completion of VERIFY. This output data set can then be used as input to the auxiliary program SFCAL. SFCAL performs a structure function analysis by computing the autocorrelation and cross-correlation functions for all pairs of station points for both the forecast and observed field values. Appendix B contains a more complete description of the use of the structure function for verification. Appendix C contains a users' guide to the SFCAL program.

It is not possible to produce horizontal plots on a map background during the station verification procedure.

2.3. Verification Against Analyses

The comparison of one set of gridded data against another set of gridded data proceeds in the same manner as the station verification. The fields from both data sets are gathered, interpolated via linear interpolation to the verification grid (if necessary), used to compute the objective skill scores, analyzed on a map background (if desired), and decomposed into spectral components (if desired).

The verification scores computed using the gridded fields (or their differences) are listed in Table 2.3. They include:

(1) root mean square (RMS) of the differences;
(2) mean of the differences;
(3) standard deviation (SD) of the differences; and
(4) S1 skill score.
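The S1 score itself is defined in Appendix E. The sketch below follows the common Teweles-Wobus formulation, with `step` standing in for the IDELTS1 grid-increment spacing of Table 4.2; treat it as an assumption about the exact form VERIFY uses:

```python
import numpy as np

def s1_score(f, o, step=1):
    """Sketch of the S1 gradient skill score: 100 times the sum of
    absolute gradient errors divided by the sum of the larger of the
    forecast or observed absolute gradients, over both directions."""
    # finite differences over `step` grid intervals in x and y
    dfx, dox = f[:, step:] - f[:, :-step], o[:, step:] - o[:, :-step]
    dfy, doy = f[step:, :] - f[:-step, :], o[step:, :] - o[:-step, :]
    err = np.sum(np.abs(dfx - dox)) + np.sum(np.abs(dfy - doy))
    denom = (np.sum(np.maximum(np.abs(dfx), np.abs(dox))) +
             np.sum(np.maximum(np.abs(dfy), np.abs(doy))))
    return 100.0 * err / denom

# A perfect forecast scores 0 (lower S1 is better).
o = np.random.default_rng(0).random((10, 10))
assert s1_score(o, o) == 0.0
```

Because S1 depends only on gradients, a forecast in error by a uniform offset still scores 0.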



Scale decomposition of gridded fields is accomplished with the method outlined by Errico (1985). After fields from both data sets are decomposed, the difference between the decompositions is used to compute the usual skill scores. If scale decomposition is requested, it is performed on all the fields indicated in Table 2.3. The spectra (field variance as a function of scale) for each field will also be plotted by default.
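Errico's (1985) procedure for limited-area domains includes detrending steps not reproduced here; the following is only a generic sketch of scale decomposition by two-dimensional Fourier filtering, showing that wave-group bands partition the field:

```python
import numpy as np

def scale_band(field, kmin, kmax):
    """Crude sketch of 2-D Fourier scale decomposition: retain only
    total wavenumbers kmin <= k <= kmax and transform back.  (Errico's
    method detrends the limited-area field first; omitted here.)"""
    spec = np.fft.fft2(field)
    ky = np.fft.fftfreq(field.shape[0]) * field.shape[0]
    kx = np.fft.fftfreq(field.shape[1]) * field.shape[1]
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)   # total wavenumber
    mask = (k >= kmin) & (k <= kmax)
    return np.real(np.fft.ifft2(spec * mask))

# Wave groups that partition all scales sum back to the original field.
f = np.random.default_rng(1).random((16, 16))
total = scale_band(f, 0, 8) + scale_band(f, 8.0001, 100)
assert np.allclose(total, f)
```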

The user may elect to map the gridded fields, as well as their differences, onto the verification grid background. The contour interval for each field can be specified. All decompositions of fields that the user specifies to be plotted will be plotted, but the contour intervals for the decomposed fields will be selected internally.

Table 2.2 records, with an 'x' per field and level, which scores are computed during station verification. The fields are Surface Pressure and the Height, Temperature, Specific Humidity, and Wind at 850, 700, 500, and 300 mb; the score columns are RMS, Mean, SD, CC (forecast change vs. observed change), and Structure Function.

Notes:
(1) The RMS error for the wind field is the RMS vector wind difference.
(2) For structure function analysis, fields are output for later input to program SFCAL.

Table 2.2. Station Verification Skill Scores.

2.4. Precipitation Verification

Gridded precipitation fields are compared in a manner similar to other gridded fields. The total precipitation fields (rainfall at the surface) are obtained from the two input data sets, and the precipitation skill scores are computed. The user may request to plot the precipitation fields, or their differences, on the verification grid map background. During the verification of precipitation, VERIFY processes only the total precipitation field.

Table 2.3 records, with an 'x' per field and level, which scores are computed during analysis verification. The fields are Sea Level Pressure and the Height, Temperature, Specific Humidity, and Wind at 1000, 850, 700, 500, and 300 mb; the score columns are RMS, Mean, SD, S1 Skill Score, and Scale Decomposition.

Notes:
(1) The RMS error for the wind field is the RMS vector wind difference.
(2) Scale decomposition is accomplished by the method of Errico (1985).
(3) The S1 skill score uses a grid length which is specified by the user.

Table 2.3. Analysis Verification Skill Scores.


The skill scores computed during the precipitation verification are summarized in Table 2.4 and Appendix E. They are:

(1) categorical forecast accuracy (the percentage of correct forecasts);
(2) threat score;
(3) bias score; and
(4) correlation matrix.

The scores are computed for the threshold amounts requested by the user. A threshold amount is simply a limit that defines the occurrence of an event. If the forecast (or observed) rainfall at a point is greater than the threshold (say, 0.25 mm), an 'event' is defined to have occurred in the forecast. If the rainfall is less than the threshold at a point, then a 'no event' is defined to have occurred.
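From the event/no-event decisions at all points, the categorical accuracy, threat, and bias scores follow from the standard contingency counts. A hedged sketch (the names, and the handling of values exactly at the threshold, are our own choices, not necessarily VERIFY's):

```python
def precip_scores(fcst, obs, threshold):
    """Threshold-based precipitation scores from event/no-event counts."""
    pairs = list(zip(fcst, obs))
    hits   = sum(f > threshold and o > threshold for f, o in pairs)
    misses = sum(f <= threshold and o > threshold for f, o in pairs)
    false_ = sum(f > threshold and o <= threshold for f, o in pairs)
    nulls  = sum(f <= threshold and o <= threshold for f, o in pairs)
    accuracy = (hits + nulls) / len(pairs)   # fraction of correct forecasts
    threat = hits / (hits + misses + false_) if hits + misses + false_ else 0.0
    bias = (hits + false_) / (hits + misses) if hits + misses else 0.0
    return accuracy, threat, bias

# 5 points, 0.25 mm threshold: 2 hits, 1 miss, 1 false alarm, 1 correct null
acc, threat, bias = precip_scores([0.0, 0.5, 1.0, 0.1, 2.0],
                                  [0.0, 0.4, 0.0, 1.0, 1.5], 0.25)
```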

The correlation matrix verification measures the spatial correlation between the forecast and observed patterns for various spatial lags. For each time period included in the verification, the user specifies the center point of each box within which the lag correlations are to be computed. The forecast grid box (whose size is specified through the parameter MSIZE; see Section 3.4) is moved point-by-point over the observed grid box. At each overlay a correlation coefficient is computed. The final result is a matrix of correlation coefficients from which one may determine the direction in which the forecast box (pattern) must be moved to achieve the highest correlation with the observed box (pattern). An example of the correlation matrix verification is presented in Appendix E.

Field: Total Precipitation (surface)
Scores computed (all marked 'x'): Categorical Forecast Accuracy, Threat Score, Bias Score, Spatial Correlation Matrix.

Note: Scores are computed for all threshold amounts selected by the user.

Table 2.4. Precipitation Verification Skill Scores.



3.0. VERIFY Code Structure

3.1. Code Strategy

The basic strategy within the code structure of VERIFY is based upon the fact that two data sets, possibly in different formats and on different grids, can be compared. The comparison is performed on a third grid which is not necessarily the same as either of the first two. Practically speaking, one can refer to the first grid as the 'forecast' grid, the second as the 'observed' grid, and the third as the 'verification' grid. Within the code these are referred to as grids 1, 2, and 3, respectively, and one can easily compare forecast versus forecast using the generic definitions of the first two grids.

Each grid is treated as its own independent entity. The first two grids are matched against the user-specified third grid. In order for verification to proceed, the data on the first two grids are interpolated, if necessary, to the third grid. Verification is then performed on the third grid. For example, the user may specify that the first set of data is on a Lambert conformal grid, the second is on a stereographic grid, and the verification should take place on a latitude-longitude grid.

After placing the input data sets onto the verification grid, the fields are differenced, statistics are computed, and maps are plotted, all at the discretion of the user.

3.2. Flow Chart

Figure 3.1 shows the general configuration of VERIFY in terms of its major subroutines. The main routine VERIFY simply calls, in order, the subroutines INPUT, SETUP, TIMEL, PRINT, GRAF, and UNSET.

INPUT reads the NAMELIST input groups used to prescribe processing.

SETUP assigns lengths of the data storage arrays based upon the verification grid prescription, computes the weighting factors used in area-averaging statistics on the verification grid, and establishes the internal table of fields eligible for verification.

TIMEL is the main working routine for the verification. It loops over the time periods being processed, gathers the data from the two input grids, calls one of the three major verification component routines, and stores the verification statistics for later output. The data are gathered, interpolated, and stored through GETDAT. The computation and storage of statistics are performed in VERSTN, VERANL, or VERPCP, all of which call lower-level routines to compute statistics and plot maps.

PRINT provides for printed output of the verification statistics on paper.

GRAF generates the plots of verification statistics for disposal to film.

11

Page 23: VERIFY The MM4 i Verification Processoropensky.ucar.edu/islandora/object/technotes:389... · 2020-04-08 · NCAR/TN-281+IA NCAR TECHNICAL NOTE July 1987 VERIFY - The MM4 Verification

UNSET releases local data sets used in processing and disposes any data which may have been created during a run.

Figure 3.1. VERIFY Flow Chart. (Boxes: Input Data; Read and interpolate data to the verification grid; then one of: Verify against station data, Verify against gridded analysis, or Verify precipitation.)


3.3. COMMON Storage

After interpolation to the verification grid, the data from the first two grids are held in COMMON blocks. The order of the data is predetermined. Table 2.1 shows the fields and their order in the data arrays within the COMMON blocks.

The data from grids 1 and 2 are stored in the COMMON blocks named DATA1 and DATA2. Each COMMON block is defined as:

COMMON/DATA1/ LEN1, IEXST1(MFLDS), ID1(20), D1(MAXLEN), W1(LENW)

DATA2 is identical to DATA1, except the variables are named LEN2, IEXST2, etc. The array lengths in the above COMMON statement are defined in subsection 3.4.

LEN1 is the length of each field stored in D1 (the fields are stored sequentially as defined in Table 2.1). IEXST1 is an array which signals the existence of a field within D1. If an element of IEXST1 is 0, then the field does not exist in the first data set, and D1 is filled with 0.0 in the locations where the field would normally be stored. ID1 is a hollerith description of the first data set. D1 contains the data on the verification grid. W1 contains the appropriate weighting factors for each grid point to be used in area-weighted calculations.
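In modern terms, the storage convention amounts to slicing one flat array by field number. A Python sketch of the bookkeeping only (LEN1 here is IDIM*JDIM, and the helper names are ours):

```python
import numpy as np

# Illustrative stand-ins for the COMMON/DATA1/ layout of Table 2.1:
# MFLDS fields stored back-to-back in one flat array D1, each of length
# LEN1, with missing fields zero-filled and flagged in IEXST1.
IDIM, JDIM, MFLDS = 61, 46, 34
LEN1 = IDIM * JDIM
D1 = np.zeros(LEN1 * MFLDS)
IEXST1 = np.zeros(MFLDS, dtype=int)

def store_field(n, field2d):
    """Store field number n (1-based, as in Table 2.1) into D1."""
    D1[(n - 1) * LEN1 : n * LEN1] = field2d.ravel()
    IEXST1[n - 1] = 1

def fetch_field(n):
    """Return field n as a 2-D array (all zeros if never stored)."""
    return D1[(n - 1) * LEN1 : n * LEN1].reshape(JDIM, IDIM)

store_field(1, np.full((JDIM, IDIM), 1013.25))   # sea level pressure
```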

3.4. Parameterization

VERIFY is parameterized through the UPDATE common deck *COMDECK PARAM with 16 parameters. Two of the parameters, IDIM and JDIM, must be set to the size of the verification grid. Six parameters, MM4I, MM4J, XM4DX, XM4PHO, XM4LMO, and XM4PS1, refer to the MM4 model configuration and need be set only if MM4 model output is being processed. The remaining eight parameters are set according to the maximum values expected for running VERIFY, and the default values will usually be sufficient.

Brief descriptions of each parameter are contained in Table 3.1. The default valuesare also given.



VARIABLE  Default  Description

IDIM  61  The east-west dimension of the verification grid. If performing station verification, set to the east-west dimension of the gridded data.

JDIM  46  Same as IDIM, but for the north-south dimension.

MM4I  46  Within the MM4 indexing convention, the data dimension in the i-direction.

MM4J  61  Same as MM4I, but for the j-direction.

XM4DX  80.0  The grid length (km) of the Lambert conformal MM4 model.

XM4PHO  40.0  Center latitude of the Lambert conformal projection.

XM4LMO  -90.0  For the Lambert conformal projection, the first co-latitude at which the image scale is 1.

MFLDS  34  The maximum number of fields possible to process. Predetermined as described in Table 2.1.

MAXLEN  120000  The maximum length of the common data storage arrays. Must be greater than IDIM*JDIM*MFLDS.

MTIMES  25  The maximum number of time periods to process.

IMAX  150  The maximum number of stations to be used in station verification.

MTHRES  6  The maximum number of threshold amounts to be used in the precipitation verification.

MBOXES  2  The maximum number of boxes within which to compute the lag correlation coefficient matrix in precipitation verification.

MSIZE  11  The maximum size of each box (number of grid points) over which to compute the lag correlation coefficient matrix in precipitation verification. The box extends (MSIZE-1)/2 points in all directions from its center. MSIZE is always odd.

MBRKS  5  The maximum number of wave-group breakdowns to be used during scale decomposition in gridded verification.

Table 3.1. VERIFY Parameters.
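The MAXLEN constraint can be checked by arithmetic; with the defaults above:

```python
# Worked check of the MAXLEN constraint for the default configuration:
# the common data arrays hold MFLDS fields of IDIM*JDIM points each.
IDIM, JDIM, MFLDS, MAXLEN = 61, 46, 34, 120000
needed = IDIM * JDIM * MFLDS   # 61 * 46 * 34 = 95,404 words
assert needed <= MAXLEN         # 95,404 fits within the default 120,000
```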



4.0. VERIFY Users' Guide

The use of VERIFY is described in this section by discussing the three major parts of the deck: the CRAY job control language (JCL), the UPDATE modifications to the VERIFY code, and the NAMELIST input groups which specify the processing to be performed. Sample deck setups for each type of verification are shown at the end of this section.

4.1. CRAY JCL

The job control statements in the sample decks are self-explanatory. The only variable with which the user need be concerned is the name of the file that contains the current UPDATE version of the VERIFY program.

4.2. UPDATE Modifications

There are times when the VERIFY program may need to be modified for particular uses. Typically, the user will need only to modify the two PARAMETER statements which specify the size of the verification grid. As seen in the sample decks, they are:

PARAMETER (IDIM=ii)
PARAMETER (JDIM=jj)

IDIM prescribes the east-west dimension of the verification grid, and JDIM the north-south dimension. The default values are IDIM=61 and JDIM=46. For verification against radiosonde station data, these parameters should be set to the dimensions of the gridded data being used in the comparison (usually the data on grid 1).

4.3. NAMELIST Groups/Variables

Six NAMELIST groups contain the input variables used to prescribe the processing to be performed by VERIFY.

The first NAMELIST group, PRIME, contains several primary control variables. The variable names and descriptions, along with the default settings, are given in Table 4.1. Note that only one of the three major verification functions can be performed during a single run.

The second, third, and fourth NAMELIST groups describe the input data set grids (GRID1 and GRID2) and the verification grid (GRID3). The variables in these groups are described in Tables 4.2-4.4. Table 4.7 contains a list of the input and verification grids currently accepted by VERIFY.
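As an illustration, a $PRIME group requesting gridded-analysis verification might look like the following (the variable names are from Table 4.1; the dates, increment, and map selections are invented for the example):

```fortran
C     Hypothetical $PRIME group: gridded-analysis verification every
C     12 h, with difference maps of sea level pressure (field 1 in
C     Table 2.1).  All values are illustrative only.
 $PRIME
   VANAL     = .TRUE.,
   ISDATE    = 84030100,
   IFDATE    = 84030300,
   INC       = 12,
   MAPD      = 1,
   MAPTOT(1) = 1,
 $END
```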



VARIABLE Default Description

VANAL .FALSE. Performs verification of one set of gridded data against another set of gridded data.

VSOUND .FALSE. Performs verification of gridded data against station data.

VPRECIP .FALSE. Performs verification of precipitation; both data sets gridded.

ISDATE 0 Starting date of the verification in the form YYMMDDHH, where YY is the year, MM is the month, DD is the day, and HH is the hour.

IFDATE 0 Final date of verification in the form YYMMDDHH.

INC 0 Increment in hours between verification times.

MAP1 0 Plot flag (film) for fields on the first data set. Plots on verification grid background. 0 = no plots; 1 = plots.

MAP2 0 Same as MAP1, but for second data set.

MAPD 0 Same as MAP1, but for difference between first and second data sets.

MAPTOT(n) 0 Indicates which fields are to be plotted, if any of MAP1, MAP2, or MAPD is not equal to 0. For example, MAPTOT(14)=1 will allow maps of field 14 to be plotted. Table 2.1 contains a list of fields which correspond to n.

MAPSCA(n) 0 Same as MAPTOT, but for scale breakdowns of the field.

CINC(n) (preset) Contour interval for maps of field n. Preset to appropriate values for fields in Table 2.1.

CHIGH(n) (preset) Highest contour level on maps of field n. Preset to appropriate values for fields in Table 2.1.

CLOW(n) (preset) Lowest contour level on maps of field n. Preset to appropriate values for fields in Table 2.1.

CDIF(n) (preset) Contour interval for difference maps of field n. Preset to suitable values for fields in Table 2.1.

Table 4.1. Namelist $PRIME.
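The YYMMDDHH convention used by ISDATE, IFDATE, and INC maps directly onto standard date arithmetic. As an illustration only (the helper name below is ours, not part of VERIFY), the sequence of verification times for a run can be enumerated like this:

```python
from datetime import datetime, timedelta

def verification_times(isdate, ifdate, inc):
    # ISDATE/IFDATE are YYMMDDHH integers; INC is the increment in hours.
    # This helper is a sketch, not VERIFY code.
    t = datetime.strptime(str(isdate), "%y%m%d%H")
    end = datetime.strptime(str(ifdate), "%y%m%d%H")
    times = []
    while t <= end:
        times.append(int(t.strftime("%y%m%d%H")))
        t += timedelta(hours=inc)
    return times

assert verification_times(81042200, 81042400, 12) == [
    81042200, 81042212, 81042300, 81042312, 81042400]
```

Note that Python's `%y` interprets two-digit years 69-99 as 1969-1999, which matches the 1981 dates used in the sample decks below.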


VARIABLE Default Description

IWEST 5 Western-most point (in grid space) to be used in verification statistic computations.

IEAST IDIM-4 Eastern-most point to be used in verification statistic computations. (See Table 3.1 for definition of parameter IDIM.)

JSOUTH 5 As in IWEST, but for southern-most point.

JNORTH JDIM-4 As in IEAST, but for northern-most point.

INFO 'blank' Information label of up to 40 characters which will appear on all film frames created.

NSMTH1 0 Number of smoothing passes to perform on the first data set.

NSMTH2 0 Same as NSMTH1, but for second data set.

SMTH 0.5 The smoothing operator value for the smoothing function. (See Appendix D.)

IDELTS1 4 Grid increment spacing used in computation of the S1 skill score.

NMCGES .FALSE. Flag to indicate which height field to obtain if one of the input data sets is type 3. (See Table 4.7.) A value of .FALSE. will give 'enhanced' height fields (computed from 'enhanced' temperatures). A value of .TRUE. will give the NMC first-guess heights.

ISTATA .FALSE. Flag to indicate disposition of the initial time statistics. If .TRUE., all initial error statistics will be set to 0.0. If .FALSE., initial errors will be computed.

Table 4.1. Namelist $PRIME (Continued).
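NSMTH1/NSMTH2 and SMTH control the smoothing function defined in Appendix D. As an illustration only, and not necessarily VERIFY's exact operator, a generic three-point smoother with weight SMTH applied for a given number of passes behaves like this:

```python
def smooth_pass(field, smth=0.5):
    # Generic three-point smoother (illustrative; see Appendix D for
    # VERIFY's actual operator):
    #   f'[i] = f[i] + (smth/2) * (f[i-1] - 2*f[i] + f[i+1])
    # Interior points only; endpoints are left unchanged.
    out = list(field)
    for i in range(1, len(field) - 1):
        out[i] = field[i] + 0.5 * smth * (field[i - 1] - 2.0 * field[i] + field[i + 1])
    return out

def smooth(field, npasses, smth=0.5):
    # NSMTH1/NSMTH2 analogue: apply the operator npasses times.
    for _ in range(npasses):
        field = smooth_pass(field, smth)
    return field

assert smooth([0.0, 4.0, 0.0], 1) == [0.0, 2.0, 0.0]
```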


VARIABLE Default Description

ITYPE1 0 First data set grid type. (See Table 4.7 for acceptable grids.)

KTYPE1 0 Grid type to which the first data set should be interpolated prior to verification. Valid only for station verification.

Table 4.2. Namelist $GRID1.

VARIABLE Default Description

ITYPE2 0 Second data set grid type. (See Table 4.7 for acceptable grids.)

Table 4.3. Namelist $GRID2.

VARIABLE Default Description

ITYPE3 0 Verification grid type.

Table 4.4. Namelist $GRID3.


VARIABLE Default Description

NSCALES 0 Number of spatial wave-group breakdowns to be verified.

SBREAK(n) 0.0 Array containing the break points of the spatial wave-groups being decomposed. Must contain NSCALES+1 values, and n=1,NSCALES+1.

Table 4.5. Namelist $SCALES.

VARIABLE Default Description

NTHRES 0 Number of threshold amounts to verify.

THRES(n) 0.0 Array of NTHRES threshold amounts.

NBOXES 0 Number of boxes within which to compute spatial lag correlations.

ICEN(box number, time) 0 Two-dimensional array specifying the east-west center point of a box at a certain time. The point is given with respect to the verification grid.

JCEN(box number, time) 0 Same as ICEN, but for the north-south center point.

Table 4.6. Namelist $PRECIP.


Grid Type                                                      Number

MM4 forecasts on Lambert conformal grid.                            1

Rawinsonde station data from RAWINS output.                         2

Gridded analysis data from RAWINS on Lambert conformal
grid. (Enhanced analysis - 15 levels.)                              3

Gridded analysis data from DATAMAP on Lambert
conformal grid. (NMC analysis first-guess fields.)                  4

Gridded precipitation data from Perry Samson,
University of Michigan, on a (61x46) 80-km Lambert
conformal grid centered at (90W, 40N).                              5

Latitude/longitude grid with one-degree resolution and
(113W, 29N) at lower left. Grid is (45x26).                         6

LFM forecast data on a (41x38) stereographic grid.                  7

LFM observed data on a (41x38) stereographic grid.                  8

Latitude/longitude grid with 0.5-degree resolution and
(110W, 30N) at lower left. Grid is (81x41). This is the
grid used in NCAR's Model Intercomparison Project (MIP).            9

Gridded analysis data from RAWINS on Lambert conformal
grid. (Enhanced analysis - 10 levels.)                             10

Table 4.7. List of Current Grid Types.

The fifth NAMELIST group, SCALES, must be present if verification against analyses is being performed. This group is used to specify the characteristics of the scale decomposition of gridded data according to Errico (1985). Table 4.5 contains the information concerning the variables in SCALES.

The sixth NAMELIST group, PRECIP, must be present if precipitation verification is being performed. Its variables are given in Table 4.6. This group contains information relevant to the threshold precipitation amounts and the location of boxes used for lagged spatial correlation coefficient calculations.


4.4. Sample Decks

Listed below are three sample decks that demonstrate the use of VERIFY. It is recommended that the user obtain the current VERIFY program library file name from the author prior to using the processor.

JOB,JN=JOBNAME,T=60,OLM=400,*MS,*D1.
ACCOUNT,AC=UUUUPPPPPPPP.
*
* VERIFY SAMPLE DECK - ANALYSIS VERIFICATION
*
DISPOSE,DN=$OUT,DC=ST,DF=CB,DEFER,TEXT='FLNM=VERIFY,FLTY=PRINT'.
*
* ACQUIRE DATASETS FROM MSS
*   GRID1 DATASET ONTO UNIT 1
*   GRID2 DATASET ONTO UNIT 2
*
ACQUIRE,DN=GRID1,MF=MS,.................
ACQUIRE,DN=GRID2,MF=MS,.................
ASSIGN,DN=GRID1,A=FT01.
ASSIGN,DN=GRID2,A=FT02.
*
* ACQUIRE VERIFY UTILITY PACKAGE (RELOCATABLE BINARY)
*
ACQUIRE,DN=VUTLIB,............
*
* ACQUIRE UPDATE SOURCE OF VERIFY - UPDATE - COMPILE
*
ACQUIRE,DN=VPAK,.................
UPDATE,P=VPAK,F.
CFT,I=$CPL,L=0.
*
* LOAD AND EXECUTE
*
LDR,LIB=VUTLIB.
*
* DISPOSE FILM OUTPUT
*
DISPOSE,DN=$PLT,DC=PT,MF=D1,DF=BI,TEXT='CAMERA=FICHE,TITLE=''MM4 (OSCAR IV CONTROL) VS. RAWINS'''.
EXIT.
\EOF
*////////////////////////////////////////////////////////////
*ID MOD1
*/ MOD FOR VERIFICATION GRID SIZE
*D PARAM.10,11
      PARAMETER (IDIM=61)
      PARAMETER (JDIM=46)
*ID MOD2
*/ MOD FOR SEA LEVEL PRESSURE ON CROSS POINTS FROM RAWINS
*D ANALYS.31
      0, -1, -8, 0, -13, 0, 0, -18, 0, -23,
\EOF


$PRIME
 VANAL=.TRUE.,
 ISDATE=81042200,        :START DATE (YYMMDDHH)
 IFDATE=81042400,        :FINISH DATE (YYMMDDHH)
 INC=12,                 :INCREMENT DATES (HH - HOURS)
 IWEST=5,                :WESTERN VERIFICATION BOUNDARY
 IEAST=56,               :EASTERN VERIFICATION BOUNDARY
 JSOUTH=5,               :SOUTHERN VERIFICATION BOUNDARY
 JNORTH=41,              :NORTHERN VERIFICATION BOUNDARY
 IDELTS1=2,              :S1 SCORE GRID INCREMENT
 NSMTH2=1,
 SMTH=0.5,
 INFO='OSCAR IV CONTROL VS. RAWINS',
 MAP1=1,
 MAPD=1,
 MAPTOT(1)=1,            :MAP SEA LEVEL PRESSURE
 MAPTOT(26)=1,           :MAP 500 MB HEIGHT
 CINC(1)=2.0,            :CHANGE PSLV CONTOUR INTERVAL
 CDIF(1)=1.0,            :CHANGE PSLV DIFF MAP CONTOUR INTERVAL
$END
$GRID1
 ITYPE1=1,               :GRID1 TYPE
$END
$GRID2
 ITYPE2=3,               :GRID2 TYPE
$END
$GRID3
 ITYPE3=1,               :VERIFICATION GRID TYPE
$END
$SCALES
 NSCALES=0,              :SCALE DECOMPOSITION
$END

JOB,JN=JOBNAME,T=60,OLM=400,*MS,*D1.
*
* VERIFY SAMPLE DECK - STATION VERIFICATION
* (JCL SAME AS ANALYSIS VERIFICATION SAMPLE)
*
* DISPOSE STRUCTURE FUNCTION DATASET (INPUT TO PROGRAM SFCAL)
*
DISPOSE,DN=SFDATA,MF=MS,W=password,RT=365,TEXT='FLNM=subdir1/subdir2/name,COMMENT=''any comment you wish to make'''.
\EOF


$PRIME
 VSOUND=.TRUE.,
 ISDATE=81042200,        :START DATE (YYMMDDHH)
 IFDATE=81042412,        :FINISH DATE (YYMMDDHH)
 INC=12,                 :INCREMENT DATES (HH - HOURS)
 IWEST=16,
 IEAST=48,
 JSOUTH=11,
 JNORTH=32,
 INFO='OSCAR IV 15L VS. RAWINSONDE DATA',
$END
$GRID1
 ITYPE1=1,               :GRID1 TYPE
$END
$GRID2
 ITYPE2=2,               :GRID2 TYPE
$END
$GRID3
 ITYPE3=2,               :VERIFICATION GRID TYPE
$END
\EOF

JOB,JN=JOBNAME,T=60,OLM=400,*MS,*D1.
*
* VERIFY SAMPLE DECK - PRECIP VERIFICATION
* (JCL SAME AS ANALYSIS VERIFICATION SAMPLE)
*
\EOF
$PRIME
 VPRECIP=.TRUE.,
 ISDATE=81042200,        :START DATE (YYMMDDHH)
 IFDATE=81042300,        :FINISH DATE (YYMMDDHH)
 INC=12,                 :INCREMENT DATES (HH - HOURS)
 MAP1=1,
 MAP2=1,
 SMTH=.5,
 NSMTH2=1,
 INFO='OSCAR IV 15L VS. SAMSON PRECIP ',
 MAPTOT(3)=1,            :MAP PRECIPITATION
$END
$GRID1
 ITYPE1=1,               :GRID1 TYPE
$END
$GRID2
 ITYPE2=5,               :GRID2 TYPE
$END
$GRID3
 ITYPE3=1,               :VERIFICATION GRID TYPE
$END


$PRECIP
 NTHRES=5,               :NUMBER OF PRECIP THRESHOLDS
 THRES(1)=00.0254,       :THRESHOLDS (CM)
 THRES(2)=00.2540,
 THRES(3)=00.6350,
 THRES(4)=01.2700,
 THRES(5)=02.5400,
 NBOXES=1,               :NUMBER OF CORRELATION BOXES
 ICEN(1,1)=31,JCEN(1,1)=23,    :BOX CENTER AT FIRST TIME
 ICEN(1,2)=31,JCEN(1,2)=23,    :BOX CENTER AT SECOND TIME
 ICEN(1,3)=31,JCEN(1,3)=23,    :BOX CENTER AT THIRD TIME
$END
\EOF
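The five THRES values in this deck are 0.01, 0.10, 0.25, 0.50, and 1.00 inches expressed in centimeters, which can be checked with a throwaway helper (the function name is ours, not VERIFY's):

```python
def inches_to_cm(amounts_in):
    # 1 inch = 2.54 cm exactly.
    return [round(x * 2.54, 4) for x in amounts_in]

assert inches_to_cm([0.01, 0.10, 0.25, 0.50, 1.00]) == [0.0254, 0.254, 0.635, 1.27, 2.54]
```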


5.0. Limitations

Although VERIFY's design is such that any type of grid can be compared to any other type of grid on yet a third type of grid, only a limited number of intercomparisons between the current grids listed in Table 4.7 is possible. An intercomparison of each new grid to all old grids would result in a massive effort. Therefore, it is practical to introduce only those interpolations that are necessary.

The purpose of this section is to outline the grid types that may be intercompared at the present time.

The intercomparisons fall naturally into one of the three major categories. Tables 5.1-5.3 contain the intercomparison combinations for station verification, analysis verification, and precipitation verification, respectively.

Tables 5.2 and 5.3 can be used in the following manner. First, decide which data set is to be referred to as 'grid 1'. Under the column for GRID1/ITYPE1 (which is the actual NAMELIST group/variable), locate that grid type. Look under the column GRID2/ITYPE2 for all the grid types that can be used as 'grid 2' to compare to the first grid. Finally, look under the column GRID3/ITYPE3 for the verification grids that can be used for the grids to be intercompared. GRID3/ITYPE3 will always be either one of the two grids being compared, or a common grid to which both grids can be interpolated.
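The lookup procedure just described is mechanical and could be scripted. The sketch below uses a hypothetical subset of table entries (the dictionary contents are illustrative only; consult Tables 5.1-5.3 for the authoritative combinations):

```python
# Hypothetical subset of the allowed combinations, keyed first by ITYPE1
# (allowed ITYPE2 values), then by (ITYPE1, ITYPE2) (allowed ITYPE3 values).
# These entries are examples, not the complete tables.
ALLOWED_ITYPE2 = {1: {1, 3, 4, 7, 8, 9, 10}}
ALLOWED_ITYPE3 = {(1, 3): {1, 6, 9}}

def check_combination(itype1, itype2, itype3):
    """Mechanize the manual lookup: grid 1 -> allowed grid 2 -> allowed grid 3."""
    if itype2 not in ALLOWED_ITYPE2.get(itype1, set()):
        return False
    return itype3 in ALLOWED_ITYPE3.get((itype1, itype2), set())

assert check_combination(1, 3, 6)
assert not check_combination(1, 3, 5)
```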

For station verification (Table 5.1), both GRID2/ITYPE2 and GRID3/ITYPE3 will always be equal to 2. In addition, during station verification the data set on 'grid 1' can be interpolated to another grid prior to processing. This is prescribed by GRID1/KTYPE1. Using this option, two data sets on different grids can be compared against station data within the same region. For example, if the user wishes to perform station verification on forecasts from the MM4 and the LFM, Table 5.1 indicates that both can be interpolated from their default grids to common grids 6 or 9. Note that station verification is not allowed for data directly on the LFM stereographic grid.

The possibilities for comparison of gridded datasets are numerous and are summarized in Table 5.2. The user should note that when two MM4 grids are being compared, the grid characteristics specified through the PARAMETER statements (see Table 3.1) are assumed to be the same for both data sets. That is, if an MM4 forecast (grid type 1) is being compared to the RAWINS enhanced analysis (grid type 3), both grids must conform to the characteristics specified by parameters MM4I, MM4J, XM4DX, XM4PHO, XM4LMO, and XM4PS1.

As seen in Table 5.3, precipitation comparisons are limited due to the fact that only a few gridded precipitation data sets are available. Further limits are made because grid type 5 is exclusively a (61x46) 80-km Lambert conformal MM4 grid type.


NAMELIST Group/Variables

GRID1/ITYPE1    GRID1/KTYPE1
1               6,9
3               6,9
4               6,9
7               6,9
8               6,9
9               9
10              10

NOTES:

(a) KTYPE1 indicates the grid to which the data set on GRID1/ITYPE1 is interpolated prior to the processing.

(b) GRID2/ITYPE2 is always equal to 2.

(c) GRID3/ITYPE3 is always equal to 2.

Table 5.1. List of Possible Station Verification Comparisons.


NAMELIST Group/Variable

GRID1/ITYPE1    GRID2/ITYPE2    GRID3/ITYPE3
1               1,3,4           1,6,9
1               7,8             6,9
1               9               9
1               10              1
3               3,4             3,6,9
3               8               6,9
3               10              3
4               3,4             4,6,9
4               8               6,9
4               10              4
7               1,3,4           6,9
7               7,8             6,7,9
7               9               9
8               3,4             6,9
8               8               6,8,9
9               1,3,4,7,8,9     9
10              3,4,10          10

NOTE:

When two MM4 grids are being compared, the grid characteristics specified through the PARAMETER statements (see Table 3.1) are assumed to be the same for both data sets.

Table 5.2. List of Possible Analysis Verification Comparisons.


NAMELIST Group/Variable

GRID1/ITYPE1    GRID2/ITYPE2    GRID3/ITYPE3
1               1,5             1
1               1,5,9           9
9               1,5,9           9

NOTE:

Grid type 5 is always a (61x46) 80-km Lambert conformal MM4 grid.

Table 5.3. List of Possible Precipitation Verification Comparisons.


6.0. New Data Sets and Grids

At the present time ten different data sets (see Table 4.7) can be input and analyzed by VERIFY. The addition of a new data set and/or grid is a straightforward procedure requiring only minimal changes in the main body of the code itself.

To introduce a new data set to VERIFY, it must be assigned a grid type number, similar to those in Table 4.7. The grid type number will be used for internal program control and external communication with the user.

Each data set used by VERIFY is extracted by means of a subroutine written expressly for that data set. As shown in Fig. 3.1, all of these subroutines are called through the subroutine GETDAT. A call to the new subroutine which inputs the new data set must be added to GETDAT. The subroutine is called once per time level being processed, and gathers the fields present which correspond to those needed within VERIFY (listed in Table 2.1). The fields are taken from whatever format they may be written in, placed onto the verification grid (interpolated, if necessary), and put into core memory as described in Section 3.0.

All functions within the new subroutine are at the discretion of the person creating the code for the data set desired. That is, if fields must be computed from available data, or if they must be vertically or horizontally interpolated, all methods must be determined and coded, if necessary, by the creator. VERIFY requires only that, upon exit from the subroutine, the fields be on the verification grid and in the proper internal storage order.

The best approach to take when introducing a new data set is to emulate an existing subroutine.
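VERIFY itself is Fortran, but the per-data-set reader pattern described above (one subroutine per grid type, dispatched from GETDAT once per time level) can be sketched in Python terms. All names below are illustrative, including the grid type number:

```python
# Hypothetical rendering of the GETDAT dispatch pattern; nothing here is
# actual VERIFY code.
READERS = {}

def register(grid_type):
    """Associate a reader function with a grid type number."""
    def deco(fn):
        READERS[grid_type] = fn
        return fn
    return deco

@register(11)  # 11 is a made-up grid type number for a new data set
def read_new_dataset(time_level):
    # A real reader would extract the fields for this time level,
    # interpolate them to the verification grid, and return them in
    # VERIFY's internal storage order (Section 3.0).
    return {"time_level": time_level, "fields": {}}

def getdat(grid_type, time_level):
    # Called once per time level being processed, as GETDAT is.
    return READERS[grid_type](time_level)

assert getdat(11, 0) == {"time_level": 0, "fields": {}}
```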

It is possible that a new verification grid will be defined and given a grid type number. In this case the new grid may or may not be associated with a new data set. Nevertheless, if a new data set is to be used as a verification grid, or if a new verification grid is being defined, a few additions must be made to the SETUP subroutine. These additions require that the user become familiar with the function of SETUP.

First, the size of the verification grid must be set: the variables ILEN3 and JLEN3 must be set to its dimensions. Second, the weight contribution for each point on the verification grid must be set for later area-averaging calculations. Third, if the grid is to be used in comparison to rawinsonde station data (grid type 2), the latitude/longitude coordinates of the grid must be set and placed into the array CORDL.

Again, it is suggested that the user examine and replicate the methods already used in SETUP to establish the specifications of the new grid as the verification grid.


7.0. Examples of Verification

The verification techniques employed by the VERIFY program are summarized in Section 2.0. Many of the objective skill scores are well documented and used throughout the community. The report of the NCAR Acid Deposition Modeling Project (1983) defines and gives examples of many of the scores that VERIFY can compute. A few techniques that have been implemented in VERIFY are less well known, however, and may require further explanation.

Several examples of VERIFY's verification methods are given in Appendix E: Verification of Meteorological Model, which contains the equations used in skill score computations, a set of accuracy benchmarks that can be used to aid in the interpretation of the scores, the results from one test case, and an interpretation of the results.


References

Anthes, R. A., 1983: Regional models of the atmosphere in middle latitudes. Mon. Wea. Rev., 111, 1306-1335.

Anthes, R. A., E.-Y. Hsie, and Y.-H. Kuo, 1987: Description of the Penn State/NCAR Mesoscale Model Version 4 (MM4). NCAR Technical Note, NCAR/TN-282+STR, 66 pp.

Errico, R. M., 1985: Spectra computed from a limited area grid. Mon. Wea. Rev., 113, 1554-1562.

The NCAR Acid Deposition Modeling Project, 1983: Regional acid deposition: models and physical processes. NCAR Technical Note, NCAR/TN-214+STR, 386 pp.


Appendix A. Preparation of MM4 Output

Prior to using a forecast generated by the MM4 model, the fields must be converted from sigma surfaces to the pressure surfaces required within VERIFY. The geopotential height must also be computed from the temperature fields. The program MM4PROC has been written to perform this task and can be used for most MM4 cases. It generates a grid type 1 for input to VERIFY. During the vertical interpolation from sigma to pressure surfaces, the variables U, V, and Q are interpolated 'linear in pressure', whereas variables T and H are interpolated 'linear in ln of pressure'.
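The two vertical interpolation rules can be written out explicitly. A sketch with our own function names (not MM4PROC's):

```python
import math

def interp_linear_p(p, p1, p2, v1, v2):
    # U, V, and Q: linear in pressure between levels p1 and p2.
    w = (p - p1) / (p2 - p1)
    return v1 + w * (v2 - v1)

def interp_linear_lnp(p, p1, p2, v1, v2):
    # T and H: linear in ln(pressure) between levels p1 and p2.
    w = (math.log(p) - math.log(p1)) / (math.log(p2) - math.log(p1))
    return v1 + w * (v2 - v1)

# Midway in pressure between 700 and 800 mb:
assert interp_linear_p(750.0, 700.0, 800.0, 10.0, 20.0) == 15.0
```

The ln-of-pressure rule weights the interpolation toward the lower-pressure level, which is the conventional choice for temperature and height because they vary more nearly linearly in ln(p).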

The following job deck will perform the conversion. The comments within the deck are self-explanatory. The user should obtain the current MM4PROC program file from the author prior to using the processor.

JOB,JN=JOBNAME,T=20,OLM=100,*MS.
ACCOUNT,AC=UUUUPPPPPPPP.
*
* SAMPLE DECK TO BE USED TO PREPARE MM4 FORECAST DATA
* FOR THE VERIFICATION PROCESSOR.
*
DISPOSE,DN=$OUT,DC=ST,DF=CB,DEFER,TEXT='FLNM=MM4PROC,FLTY=PRINT'.
*
* GENERALIZED FOR VARIABLE VERTICAL LEVELS (THRU PARAMETER KX)
*
* MAKE THE FOLLOWING CHANGES:
* (1) PROVIDE THE INPUT DATASET FILE NAME (MM4 FORECAST).
* (2) IF NECESSARY, PROVIDE THE TERRAIN INPUT DATASET FILE NAME.
* (3) PROVIDE THE OUTPUT DATASET FILE NAME.
* (4) EDIT INPUT - NUMBER OF VERTICAL LEVELS WITHIN THE MM4
*     FORECAST (KX=15 IS THE DEFAULT).
* (5) NAMELIST $BOOKIE FOR FORECAST DATA INPUT.
*
* ACQUIRE FORECAST DATASET FROM THE MSS
*
ACQUIRE,DN=FORVOL,MF=MS,...........
ASSIGN,DN=FORVOL,A=FT01.
*
* ACQUIRE TERRAIN DATASET FROM THE MSS (IF NECESSARY)
*
*ACQUIRE,DN=TERRAIN,MF=MS,...........
*ASSIGN,DN=TERRAIN,A=FT08.
*
* ASSIGN UNIT 4 TO THE OUTPUT
*
ASSIGN,DN=DATA,A=FT04.


*
* ACQUIRE THE SOURCE CODE - EDIT - COMPILE - EXECUTE
*
ACQUIRE,DN=MM4PROC,...........
EDITOR.
CFT,I=PROC,L=0.
LDR,LIB=IMSLIB.
*
* DISPOSE THE OUTPUT DATASET TO THE MSS
*
DISPOSE,DN=DATA,MF=MS,W=password,RT=365,TEXT='FLNM=first level dir/secondlevel/file name,COMMENT=''Any comment about the file you wish to make'''.
EXIT.
\EOF
EDIT,S=MM4PROC,D=PROC,'KX=15'='KX=10'
\EOF
$BOOKIE
 NUMOUT = 7,              :NUMBER OF LOGICAL FILES TO BE OUTPUT
 INOPT = 3,               :INPUT FILE PROCESSING OPTION
 ISKIP = 11,              :NUMBER OF FILES TO SKIP IF INOPT=3
 IFILES = 1,1,0,1,0,      :FILES PROCESSING COMMAND IF INOPT=2
 IDATE = 81042200,        :DATES WITH WHICH TO ID OUTPUT FILES
         81042212,81042300,81042312,81042400,81042412,81042500,
 MTNS = 'HIST',           :TERRAIN VOLUME TYPE
 SIGMA = 0.050,           :SIGMA LEVELS (HALF-LEVELS BY MODEL DEF.)
         0.150,0.250,0.350,0.450,0.550,0.650,0.750,0.850,0.950,
$END


INPUT DOCUMENTATION:

NUMOUT NUMBER OF FILES TO BE OUTPUT

INOPT    MM4 HISTORY INPUT OPTIONS FOR PROCESSING
         = 1 :PROCESS ALL FILES
         = 2 :PROCESS FILES WHICH HAVE A VALUE OF 1 IN THE
              CORRESPONDING LOCATION IN THE IFILES ARRAY
         = 3 :PROCESS THE FIRST FILE, SKIP ISKIP FILES,
              PROCESS THE NEXT FILE, SKIP ISKIP FILES, ETC.

ISKIP    NUMBER OF FILES TO SKIP IF USING INOPT=3

IFILES   ARRAY OF VALUES 0 OR 1 DEPENDING UPON IF THAT
         CORRESPONDING FILE IS TO BE PROCESSED (=1)
         WHEN USING INOPT=2.

IDATE    ARRAY OF NUMOUT DATES WHICH WILL BE WRITTEN AS
         THE FIRST RECORD ON THE CORRESPONDING OUTPUT
         FILE, IN THE FORM YYMMDDHH (YR-MON-DAY-HOUR).

MTNS     TYPE OF VOLUME FROM WHICH THE TERRAIN WILL BE
         EXTRACTED: 'DFLOWIN' OR 'RAWINS' OR 'HIST'.
         DEFAULT IS 'HIST'.

SIGMA SIGMA LEVELS

\EOF


Appendix B. Description of Structure Function

USE OF STRUCTURE FUNCTION IN VERIFYING

MESO-α SCALE NUMERICAL MODEL

This note describes the use of the structure function (Gandin, 1963) to describe the statistical structure of meteorological fields as a function of horizontal distance in a regional-scale numerical model and to compare the structure of the model-simulated fields with those of the atmosphere. The use of the structure function will quantitatively describe how much spatial variance in model and observed meteorological variables (temperature, mixing ratio, and horizontal wind components) is associated with various horizontal scales. Barnes and Lilly (1976) used the structure function to help plan a mesoscale observation network.

1.0. The Structure Function

The structure function for a pair of observations located at positions $\mathbf{r}_1$ and $\mathbf{r}_2$ is defined as

$$b(\mathbf{r}_1, \mathbf{r}_2) = m(\mathbf{r}_1, \mathbf{r}_1) + m(\mathbf{r}_2, \mathbf{r}_2) - 2m(\mathbf{r}_1, \mathbf{r}_2) \qquad (1)$$

where $m(\mathbf{r}_1, \mathbf{r}_1)$ and $m(\mathbf{r}_2, \mathbf{r}_2)$ are the autocorrelation functions (station variances) and $m(\mathbf{r}_1, \mathbf{r}_2)$ is the cross-correlation function (covariance). The autocorrelation and cross-correlation functions are calculated according to

$$m(\mathbf{r}_1, \mathbf{r}_1) = \overline{f(\mathbf{r}_1)^2} - \overline{f(\mathbf{r}_1)}^{\,2}, \qquad (2)$$


$$m(\mathbf{r}_1, \mathbf{r}_2) = \overline{f(\mathbf{r}_1) f(\mathbf{r}_2)} - \overline{f(\mathbf{r}_1)}\,\overline{f(\mathbf{r}_2)}, \qquad (3)$$

where the temporal mean (over all samples in the data set) is defined as

$$\overline{f(\mathbf{r}_1)} = \frac{1}{N} \sum_{n=1}^{N} f^{n}(\mathbf{r}_1). \qquad (4)$$

In (4), the superscript $n$ refers to the $n$th data point at station $\mathbf{r}_1$.

The structure function is interpreted as follows. As the station separation approaches 0 (that is, $\mathbf{r}_1 \to \mathbf{r}_2$), $b$ approaches 0 (for perfect observations). As the separation of two independent observations of real data approaches 0, $b$ approaches twice the variance associated with instrumental and data-processing uncertainties. If the covariance vanishes at very large station spacing, the structure function approaches twice the station variance as $|\mathbf{r}_1 - \mathbf{r}_2| \to \infty$. Thus, the ratio of the structure function to twice the mean station variance is a measure of the fraction of variance associated with scales smaller than the station spacing.

Barnes and Lilly also computed normalized correlation functions

$$\mu(\mathbf{r}_1, \mathbf{r}_2) = \frac{m(\mathbf{r}_1, \mathbf{r}_2)}{\left[ m(\mathbf{r}_1, \mathbf{r}_1)\, m(\mathbf{r}_2, \mathbf{r}_2) \right]^{1/2}}, \qquad (5)$$

which range between +1.0 for perfect, positive correlation and -1.0 for perfect, negative correlation. For error-free data, $\mu$ varies from 1 at 0 separation to 0 at infinite separation (zero correlation).

The relationship between the normalized correlation function and the normalized structure function, under the assumption that $m(\mathbf{r}_1, \mathbf{r}_1) = m(\mathbf{r}_2, \mathbf{r}_2) = m$, is

$$\mu(\mathbf{r}_1, \mathbf{r}_2) = 1 - \frac{b(\mathbf{r}_1, \mathbf{r}_2)}{2m}. \qquad (6)$$

Thus, either $\mu$ or $b$ may be calculated in order to estimate the variance associated with different scales of motion.


In comparison of the structure function for model and observed data, we may either use model and analysis data on the model grid using pairs of model grid points, or we may use station pairs and model data interpolated to the station locations. The use of grid points would be easier to compute, but probably not as desirable since the observed structure function would then contain the effects of analysis errors. However, the use of operational upper-air stations does not allow determination of mesoscale structure.

In the following discussion, we consider the example of the NCAR Acid Deposition Modeling Project (ADMP). In the ADMP, 72-h simulations are run for a number of case studies belonging to several synoptic types. Suppose we compute the structure function from the 12-h data sets that comprise each 72-h case study. For each case study, we have data sets at 0 h, 12 h, 24 h, 36 h, 48 h, 60 h, and 72 h, for a total of seven data sets per case. For the summer precipitation type, for example, we have studied five cases, which yields 35 data sets for this classification.

The calculation of the structure function involves selecting N upper-air stations in the model domain and saving the interpolated model data at these stations at 0, 12, 24, ..., 72 h of each simulation. These interpolated data are also used in calculation of other verification parameters such as RMS errors and correlation coefficients of observed and forecast changes.

Because the possible number of unique pairs of observations $M = (N-1) + (N-2) + (N-3) + \cdots + 1$, where N is the number of stations, becomes large for the total number of upper-air stations in the domain (Table 1), we generate a large number of data points.

N    M        N    M
5    10       50   1226
10   45       55   1486
15   106      60   1771
20   191      65   2081
25   301      70   2416
30   436
35   596
40   781
45   991

Table 1. Possible number M of unique pairs of stations as a function of total number of stations N.
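The sum defining M has the closed form N(N-1)/2, and the first entries of Table 1 can be checked directly (the helper below is a throwaway sketch, not VERIFY code):

```python
def unique_pairs(n_stations):
    # M = (N-1) + (N-2) + ... + 1, summed directly.
    return sum(range(1, n_stations))

assert unique_pairs(5) == 10
assert unique_pairs(10) == 45
```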


For example, if we use 70 stations, we will have 2416 values of $b$ and $\mu$ for both model and observations. Each data point can be plotted as a function of $d$ (see Fig. 2). The mean of both observations and model in various ranges (bins) of $d$ can then be compared, and the statistical significance of the differences assessed (Fig. 3). The null hypothesis is that the model and observed populations are not significantly different, i.e., they belong to the same population. Verification of the null hypothesis is a desirable outcome, since it indicates that the structure of the model is the same as that of the atmosphere, a desirable property for observing systems simulation experiments and testing various control strategies.

We note that an analysis of the deviations from a constant, such as a climatological mean, would give the same structure functions and autocorrelation functions, because addition of a constant to the values of $f(\mathbf{r})$ does not change $m(\mathbf{r}_i, \mathbf{r}_i)$ or $m(\mathbf{r}_i, \mathbf{r}_j)$. This is shown below.

$$m(\mathbf{r}_i, \mathbf{r}_i) = \overline{f_i^2} - \overline{f_i}^{\,2}$$

Add constant $c$ to $f_i^n$ and compute $m'(\mathbf{r}_i, \mathbf{r}_i)$:

$$m' = \overline{(f_i + c)^2} - \overline{(f_i + c)}^{\,2}
    = \overline{f_i^2} + 2c\,\overline{f_i} + c^2 - \left( \overline{f_i}^{\,2} + 2c\,\overline{f_i} + c^2 \right)
    = \overline{f_i^2} - \overline{f_i}^{\,2} = m$$

Similarly, for any covariance

$$m(\mathbf{r}_i, \mathbf{r}_j) = \overline{f_i f_j} - \overline{f_i}\,\overline{f_j}$$

$$m' = \overline{(f_i + c) f_j} - \overline{(f_i + c)}\,\overline{f_j}
    = \overline{f_i f_j} + c\,\overline{f_j} - \overline{f_i}\,\overline{f_j} - c\,\overline{f_j}
    = m(\mathbf{r}_i, \mathbf{r}_j)$$
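The definitions in (1)-(5) and the constant-shift invariance just derived can be checked numerically. The following Python sketch (our own helper names and synthetic data; not part of VERIFY, which is Fortran) implements the equations directly:

```python
import random

def tmean(samples):
    # Eq. (4): temporal mean over all n samples at one station.
    return sum(samples) / len(samples)

def m(f_a, f_b):
    # Eqs. (2)-(3): m(r_a, r_b) = mean(f_a * f_b) - mean(f_a) * mean(f_b).
    prod = [a * b for a, b in zip(f_a, f_b)]
    return tmean(prod) - tmean(f_a) * tmean(f_b)

def b(f_a, f_b):
    # Eq. (1): structure function.
    return m(f_a, f_a) + m(f_b, f_b) - 2.0 * m(f_a, f_b)

def mu(f_a, f_b):
    # Eq. (5): normalized correlation function.
    return m(f_a, f_b) / (m(f_a, f_a) * m(f_b, f_b)) ** 0.5

# Synthetic 35-sample records at two hypothetical stations.
random.seed(0)
f1 = [random.gauss(0.0, 5.0) for _ in range(35)]
f2 = [x + random.gauss(0.0, 2.0) for x in f1]

# Constant-shift invariance shown above: adding c to f1 changes neither
# the variance m(r1, r1) nor the covariance m(r1, r2).
f1_shift = [x + 7.0 for x in f1]
assert abs(m(f1, f1) - m(f1_shift, f1_shift)) < 1e-9
assert abs(m(f1, f2) - m(f1_shift, f2)) < 1e-9
```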


2.0. Sample Calculation of Structure Function for Temperature for Five Case Studies

The data layout is as follows: for each of the five synoptic cases (Synoptic Case 1 through Synoptic Case 5), temperatures $T_j^n$ are saved at stations $j = 1, 2, 3, \ldots, N$ at times 00, 12, 24, 36, 48, 60, and 72 h, giving 35 samples per station. The covariance and variance are then computed over all 35 samples:

$$m(\mathbf{r}_1, \mathbf{r}_2) = \frac{1}{35} \sum_{n=1}^{35} T_1^n T_2^n - \left( \frac{1}{35} \sum_{n=1}^{35} T_1^n \right) \left( \frac{1}{35} \sum_{n=1}^{35} T_2^n \right)$$

$$m(\mathbf{r}_1, \mathbf{r}_1) = \frac{1}{35} \sum_{n=1}^{35} (T_1^n)^2 - \left( \frac{1}{35} \sum_{n=1}^{35} T_1^n \right)^2$$

This calculation will yield $b$ and $\mu$ for each pair of stations for both forecast and observed variables. When plotted as a function of $d$, the station separation, the data will indicate the degree to which the model atmospheric structure agrees with the actual structure (Fig. 2).


3.0. Simple Example Using Three Stations and Two Synoptic Cases

To illustrate the calculation of the autocorrelation function, we consider a hypothetical example of three stations. A data set consisting of 14 samples per station, with 7 belonging to one synoptic case and 7 to another, is listed in Table 2 and plotted in Fig. 4. From Fig. 4, it is clear that stations 1 and 2 are positively correlated, while stations 1 and 3 are negatively correlated. As shown in Table 2, $\mu(\mathbf{r}_1, \mathbf{r}_2) = 0.755$ and $\mu(\mathbf{r}_1, \mathbf{r}_3) = -0.6934$.

  n   f1(n)  f1(n)²   f1·f2   f2(n)  f2(n)²   f1·f3   f3(n)  f3(n)²
  1     0      0        0       5      25        0       0       0
  2    -3      9      -15       5      25       -6       2       4
  3    -5     25      -10       2       4      -25       5      25
  4    -9     81        0       0       0      -63       7      49
  5   -10    100       30      -3       9      -90       9      81
  6   -10    100       60      -6      36     -100      10     100
  7   -10    100       80      -8      64     -100      10     100
  8    -4     16      -16       4      16       40     -10     100
  9     0      0        0       8      64        0     -10     100
 10     6     36       84      14     196      -60     -10     100
 11    12    144      192      16     256      -96      -8      64
 12    14    196      224      16     256     -112      -8      64
 13    14    196      140      10     100     -112      -8      64
 14    10    100       50       5      25     -100     -10     100
Sum     5   1103      819      68    1076     -724     -21     951

mean(f1) = 0.3571      mean(f1²) = 78.786     m(r1, r1) = 78.43
mean(f2) = 4.857       mean(f2²) = 76.857     m(r2, r2) = 72.0
mean(f3) = -1.50       mean(f3²) = 67.93      m(r3, r3) = 69.43

mean(f1·f2) = 58.5     mean(f1)·mean(f2) = 1.7344      m(r1, r2) = 56.766
mean(f1·f3) = -51.71   mean(f1)·mean(f3) = -0.53565    m(r1, r3) = -51.17

μ(r1, r2) = 56.766 / (78.43 × 72.0)^(1/2) = 0.755

μ(r1, r3) = -51.17 / (78.43 × 69.43)^(1/2) = -0.6934

Table 2. Sample calculation of normalized correlation function for three stations and two synoptic cases (7 times in each case).
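The hand calculation of Table 2 can be repeated in a few lines (an illustrative sketch, not part of VERIFY or SFCAL). Recomputing directly from the station values reproduces the covariance m(r1, r2); because the printed table rounds and totals its columns by hand, the final decimals of μ obtained this way differ somewhat from the printed 0.755 and -0.6934, though the signs and interpretation are the same:

```python
import numpy as np

# Station time series from Table 2 (14 samples each).
f1 = np.array([0, -3, -5, -9, -10, -10, -10, -4, 0, 6, 12, 14, 14, 10], float)
f2 = np.array([5, 5, 2, 0, -3, -6, -8, 4, 8, 14, 16, 16, 10, 5], float)
f3 = np.array([0, 2, 5, 7, 9, 10, 10, -10, -10, -10, -8, -8, -8, -10], float)

def m_cov(fi, fj):
    """Covariance m(ri, rj) = mean(fi*fj) - mean(fi)*mean(fj)."""
    return (fi * fj).mean() - fi.mean() * fj.mean()

def mu(fi, fj):
    """Normalized correlation mu(ri, rj) = m(ri, rj)/[m(ri, ri) m(rj, rj)]^(1/2)."""
    return m_cov(fi, fj) / np.sqrt(m_cov(fi, fi) * m_cov(fj, fj))

print(round(m_cov(f1, f2), 3))         # 56.765 (printed as 56.766 in Table 2)
print(mu(f1, f2) > 0, mu(f1, f3) < 0)  # True True: 1-2 correlated, 1-3 anticorrelated
```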


4.0. Calculation Using Real and Forecast Data

A plot of μ computed from observations and a 72-h model simulation, initialized at 0000 GMT 22 April 1981, is shown in Fig. 5. The data are temperatures at 850 mb. Both observations and model data indicate a decrease of autocorrelation from a station separation of zero to a minimum at around 2,000 km. From 2,000 to about 4,000 km, there is a tendency for an increase of μ. For comparison, Fig. 6 shows the autocorrelation function for 500-mb heights taken from Schlatter (1975). The minimum correlation at, or slightly less than, 2,000 km is similar in both data sets. However, the negative correlations in the 72-h data set reach much lower values (close to -1.0 rather than -0.6 in the climatological data set). Also, there are many more points with large positive values beyond 2,000 km in the 72-h data set. These differences are likely due to the small sample size (N = 7) in the single case study.

Comparing the model and observed correlations in Fig. 5, we note a slight tendency for less scatter in the model data, with a higher concentration of model values near -1.0. This might be explained by the fact that the model has fewer degrees of freedom than the atmosphere and there are no random errors superimposed on the model data.

References

Barnes, S. L., and D. K. Lilly, 1976: Covariance analysis of severe storm environments. Ninth Conference on Severe Local Storms, 21-23 October 1975, Norman, OK, Amer. Meteor. Soc., 301-306.

Gandin, L. S., 1963: Objective analysis of meteorological fields. Translated from Russian, Israel Program for Scientific Translations, 1965, 242 pp. (NTIS TT-65-50007.)

Schlatter, T. W., 1975: Some experiments with a multivariate statistical objective analysis scheme. Mon. Wea. Rev., 103, 246-257.


Figure 1. Large domain (Δx = 160 km) of the regional acid deposition modeling project. Locations of operational upper air stations are numbered (total of 67 in the United States).


Figure 2. Sample plot of normalized correlation function for hypothetical observations (or model data), as a function of station separation d (km).

Figure 3. Sample plot of model (x) and observed (o) correlation function in 500-km bins.


Figure 4. Hypothetical data set for 3 stations and two synoptic cases with 7 times in each case. For this data set, μ(r1, r2) = 0.755 and μ(r1, r3) = -0.6934.


Figure 5. Normalized correlation function for T at 850 mb for 7 time periods at 12-hour intervals beginning at 0000 GMT 22 April 1981. The number of stations is 55. Top: observations. Bottom: model.


Figure 6. Forecast-error correlations for the height field as a function of separation distance (10³ km) between every pair of points for 50 U.S. radiosonde stations (Schlatter, 1975).


Appendix C. SFCAL Users' Guide

During the execution of VERIFY in station verification mode, pairs of values (forecast and observed) for selected fields at all station locations for each time period are saved and can be disposed. This output data set can then be used as input to the program SFCAL, which performs a structure function analysis as described in Appendix B of this document.

The fields which are saved by VERIFY and subsequently analyzed by SFCAL are geopotential height, temperature, and specific humidity at 850 and 500 mb. By default the structure function analysis is performed on all six fields.

The method of analysis, consistent with the sample calculation given in Appendix B, is that all available forecast and observed values at each station location for all the forecast times are processed together. Thus, no distinction is made for individual forecast time periods; rather, all available data for all pairs of stations which are within 500 km, for example, of each other are used to produce a correlation for both the forecast field and the observed field. This allows for a larger sample from which to compute the normalized correlation function.

The namelist input group PRIME offers but two options to the user. The number of cases, NCASES, and the number of time periods per case, NTIMES, are the only variables which need be specified. Note that when NCASES is greater than one, all available information for each station location for all the forecast times and all the cases is processed together. Thus, the sample size can be further increased by analyzing together several forecasts from the same model.

The following sample deck illustrates the use of SFCAL. The user should obtain the current SFCAL program library file from the author prior to using the processor.

JOB, JN=JOBNAME, T=040, OLM=100, *MS, *D1.
ACCOUNT, AC=UUUUPPPPPPPP.
*
**** DECK TO PERFORM STRUCTURE FUNCTION ANALYSIS
**** UPON OUTPUT DATA FROM PROGRAM VERIFY.
*
DISPOSE, DN=$OUT, DC=ST, DF=CB, DEFER, TEXT=' FLNM=SFCAL, FLTY=PRINT '.
*
**** ACQUIRE VERIFY OUTPUT DATASET(S) FROM THE MSS
*
ACQUIRE, DN=SFVOL1, MF=MS, ...........
*ACQUIRE, DN=SFVOL2, MF=MS, ...........
ASSIGN, DN=SFVOL1, A=FT10.
*ASSIGN, DN=SFVOL2, A=FT11.
*
**** ACQUIRE ABSOLUTE BINARY VERSION OF CODE FROM MSS
*
ACQUIRE, DN=SF, ..........
*
**** EXECUTE
*
SF.
*
**** DISPOSE FILM OUTPUT
*
DISPOSE, DN=$PLT, DC=PT, MF=D1, DF=BI, TEXT=' CAMERA=FICHE, TITLE=''ANY TITLE YOU WISH'' '.
*
EXIT.
\EOF
$PRIME
 NCASES=1,    :NUMBER OF CASES TO PROCESS
 NTIMES=7,    :NUMBER OF TIME PERIODS TO PROCESS
$END
\EOF


Appendix D. Smoothing Operator

VERIFY provides an option that can be activated to filter, or smooth, the fields being processed. The smoothing function, described by Shapiro (1970), is a simple nine-point operator with a single smoothing element.

If the nine-point grid is represented by the stencil

x7  x8  x9
x4  x5  x6
x1  x2  x3

the new value of the center point is obtained with the operator

x5' = w1·x5 + w2·(x2 + x4 + x6 + x8) + w3·(x1 + x3 + x7 + x9),

where

w1 = S² - 2S + 1
w2 = (S - S²)/2
w3 = S²/4

The smoothing element S can be assigned values between 0.0 and 0.5, with the value of 0.5 causing maximum damping. A value of 0.5 will eliminate the 2Δx wavelength amplitude completely, one-half of the 4Δx wavelength amplitude, one-quarter of the 6Δx wavelength amplitude, and so on. All waves longer than 2Δx are damped somewhat, however slightly.

In order to eliminate increasingly longer wavelengths, the filter can be applied numerous times with the same smoothing element.
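The operator can be sketched as follows (an illustrative Python version, not the VERIFY code; here boundary points are simply left unchanged, which the actual processor may handle differently):

```python
import numpy as np

def shapiro_smooth(field, S=0.5, passes=1):
    """Nine-point smoother of Shapiro (1970) with smoothing element S.

    Each interior point becomes w1*x5 + w2*(edge neighbors) + w3*(corners),
    with w1 = (1 - S)**2, w2 = (S - S**2)/2, w3 = S**2/4; the weights sum
    to one, so a constant field passes through unchanged.
    """
    w1 = (1.0 - S) ** 2
    w2 = (S - S ** 2) / 2.0
    w3 = S ** 2 / 4.0
    out = np.array(field, dtype=float)
    for _ in range(passes):
        x = out.copy()
        edges = x[:-2, 1:-1] + x[2:, 1:-1] + x[1:-1, :-2] + x[1:-1, 2:]
        corners = x[:-2, :-2] + x[:-2, 2:] + x[2:, :-2] + x[2:, 2:]
        out[1:-1, 1:-1] = w1 * x[1:-1, 1:-1] + w2 * edges + w3 * corners
    return out

# S = 0.5 removes a 2*dx checkerboard completely in a single pass:
i, j = np.meshgrid(np.arange(6), np.arange(6), indexing="ij")
print(np.abs(shapiro_smooth((-1.0) ** (i + j))[1:-1, 1:-1]).max())  # 0.0
```

Repeated passes with the same S progressively damp longer wavelengths, matching the discussion above.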


Reference

Shapiro, R., 1970: Smoothing, filtering, and boundary effects. Rev. Geophys. Space Phys., 8, 359-387.


Appendix E.

VERIFICATION OF METEOROLOGICAL MODEL

1. Introduction

A complex, three-dimensional dynamic model (MM4) has been developed to provide high-resolution (in space and time) meteorological data to drive a Regional Acid Deposition Model (RADM). This model has many optional features (such as planetary boundary-layer physics, treatment of clouds and precipitation, analysis and initialization techniques, and horizontal and vertical resolution). Anthes et al. (1987) describe the model. An important question is what combination of physics, analysis, resolution, and domain size gives the "best" result. A closely related question is how accurate or skillful the model is in predicting or simulating atmospheric flow and precipitation.

The questions of the "best" model and the accuracy of the various models can be addressed in a variety of ways. Ideally, a complete, high-resolution meteorological data set (horizontal resolution at least 80 km, vertical resolution 1 km, and temporal resolution 1 h) with no errors would be available to verify the model output directly. However, there is little likelihood of obtaining such a data set in the foreseeable future, and so much more limited data sets must be used to estimate the accuracy of the model. For most cases, the meteorological data set consists of radiosonde data and surface observations of pressure, temperature, moisture, and precipitation. The upper-air radiosonde network has an average horizontal spacing of 400 km, root-mean-square (RMS) temperature errors of about 1°C, and RMS horizontal wind errors of about 10%. These errors in the horizontal wind make it practically impossible to estimate the actual vertical motion.

Although mesoscale meteorological models similar to MM4 have shown considerable promise in predicting and simulating three-dimensional atmospheric flows, a systematic, quantitative evaluation of a large number of cases of the type under way at NCAR has not yet been attempted. Such a study is needed to establish the overall accuracy of the model and to avoid false conclusions based on results from a limited number of cases.

We recognize that an ideal validation of the meteorological model, from the point of view of how well it performs in the RADM system, would be a calculation of the impact on the acid deposition predicted by RADM of errors in the meteorological data. However, not only do perfect meteorological data not exist (so that the true errors in the meteorology are unknown), the chemical data sets for verifying RADM are even more incomplete than the meteorological data. Furthermore, errors in RADM due to imperfect initial chemical data, emissions, and parameterization of the chemistry make it difficult to isolate the impact of errors in the meteorology when using real data. Therefore, simplified approaches must be made to verify the meteorological model and to study the sensitivity of RADM to the meteorological data. NCAR is studying the problem in two major ways: (1) verify different versions of the meteorological model against the available meteorological data using a variety of objective measures of accuracy; (2) test the sensitivity of RADM to several different meteorological data sets derived from the MM4 system.

The verification of MM4, when done on a large number of cases, allows estimation of a "best" version of the model. It also provides quantitative estimates of the accuracy of the meteorological data provided to RADM. The sensitivity studies provide estimates of the uncertainty introduced into RADM acid deposition estimates by uncertainties or errors in the meteorological data. Until more complete meteorological and chemical data sets are available, these results provide very useful information concerning the accuracy of the RADM system, as well as information concerning the most crucial observations to be obtained in future field programs.

A related study that complements the verification of MM4, supported by non-EPA funds, is the model intercomparison project (MIP). In this project, research and operational groups around the world have been invited to run their models on the same cases to be studied as part of the ADMP. When verified using the same techniques as applied to RADM, additional information concerning the sensitivity of meteorological models to their components is obtained, as well as more information on the inherent uncertainty in meteorological model simulations.

In the examples to follow, an MM4 simulation of the OSCAR IV case (0000 GMT 22 April to 0000 GMT 25 April 1981) is verified using the objective verification program. The version of MM4 used in this simulation will be designated the "control" version in the series of experiments to be carried out during the next years. It consists of ten layers, a bulk aerodynamic formulation of the planetary boundary layer, and parameterizations of convective and nonconvective precipitation and surface fluxes of heat and moisture. The simulation is carried out on a 46 by 61, 80-km grid.

The OSCAR IV case was characterized by rapidly changing weather conditions caused by a developing low pressure system over the upper midwest of the United States. Figure 1.1 shows the observed analyses of the 1000-mb geopotential height fields, with the frontal locations marked, for 0, 24, 48, and 72 h. The cold frontal system progressed steadily across the domain and was associated with considerable rainfall.

The forecast 1000-mb geopotential height fields for 24, 48, and 72 h appear in Figure 1.2. The initial time is excluded since the differences between it and the observed fields are small, due only to initialization. In general, the forecast is quite good out to 48 h, with phase, amplitude, and frontal position errors relatively small. As the forecast progresses, the predicted cyclone drifts to the south and west of the observed one, with the predicted values consistently deeper and the circulation more intense than observed.

The RMS error in the 1000-mb height field, evaluated every 12 h within a subdomain over the eastern United States, is also presented in Figure 1.2. Compared with the persistence error (a common reference for judging a model's skill), the forecast error grows very slowly in time, again indicating the highly accurate nature of the forecast. Note that the initial error is nonzero. This initial error is a measure of the adjustment produced by initialization and of errors produced by vertical coordinate transformations.

2. Objective measures of verification of the meteorological model

There is no one measure of forecast accuracy or skill that completely characterizes the accuracy of the meteorological data set produced by the model. However, a variety of objective measures of accuracy has been developed by the meteorological community to verify models. Each contributes information about the accuracy of different variables produced by the model. The verification program described in this technical note and used to evaluate MM4 contains the most complete set of verification measures (or scores) to be systematically computed for a large number of cases studied by a model of this type. This section provides a summary and brief description of the verification methods applied to MM4. It also provides examples from a preliminary run of one case (OSCAR IV) using the MM4. We emphasize that the scores presented here are only examples and may not be representative of the overall model performance. Additional interpretation of the verification methods is presented by Anthes (1983).

Figure 2.1 shows the portion of the MM4 domain used for computing the verification scores for both the MIP and ADMP studies. This region was chosen to avoid the lateral boundary of the model and much of the data-sparse regions over water and in the western United States. The inner region of Figure 2.1 is used for verifying the precipitation fields.

3. Root-mean-square errors

A standard measure of accuracy that can be applied to many data sets is the root-mean-square (RMS) error. RMS errors give a good overview of the absolute accuracy of a data set, with a few large errors weighted more than many small errors. RMS errors are computed for temperature, winds, mixing ratio, and sea-level pressure using two methods: (1) comparison against point observations and (2) comparison against analyses derived from the point observations. When compared against observations, the RMS score for a temperature forecast is computed as

RMST1 = [ (1/N) Σ_{i=1}^{N} (TFi - TOi)² ]^(1/2),   (3.1)

where TFi is the forecast value of T interpolated to the location of the ith observation of T, TOi is the observed value of T at this location, and N is the number of observations.

When verified against the analysis, the RMS error in T is computed as

RMST2 = [ (1/I) Σ_{i=1}^{I} (TFi - TOi)² ]^(1/2),   (3.2)

where i refers to a grid point in the verification domain, TFi is the forecast value of T at that grid point, TOi is the estimate of the observed temperature at that grid point, obtained by an analysis of T to the grid, and I is the total number of grid points.

The use of (3.1) has the advantage that model data are compared directly to observations; there are no analysis errors to contend with. However, there are errors associated with each observation and with the possibility that the point observation is unrepresentative of the grid average, which is what the model is estimating.

The use of (3.2) has the advantage that an estimate of the grid average value is used to compare against the model estimate. However, it suffers from the possibility of errors introduced by the analysis technique.
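Both (3.1) and (3.2) are the same computation applied to different pairings of values; a minimal sketch (a hypothetical helper, not the VERIFY routine, with made-up numbers):

```python
import numpy as np

def rms_error(forecast, observed):
    """RMS error as in (3.1)/(3.2): the square root of the mean squared
    difference, so a few large errors outweigh many small ones."""
    diff = np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# `forecast` holds model values interpolated to station locations (method 1)
# or model grid-point values (method 2); `observed` holds the matching
# observations or analyzed grid values.
print(rms_error([1.0, 1.0, 1.0, 5.0], [0.0, 0.0, 0.0, 0.0]))  # sqrt(7) = 2.645...
```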

Table 3.1 lists some subjective estimates of RMS errors for observations and analysis (denoted EA). This value represents the observed uncertainty and, therefore, a model cannot be expected to yield errors less than this value. Hence, the values listed may be considered as the minimum value obtainable by any model, or the score produced by a "perfect" model.

The column listed ER represents an estimate of the RMS error computed from the difference between two randomly chosen atmospheric states; Ep approaches ER as time becomes very large (typically greater than two weeks).

The third column (Ep) gives RMS errors of a forecast based on persistence for a single case (OSCAR IV). Model forecast errors are often compared to persistence errors, so that any value less than persistence would be considered to have at least some skill, while a value greater than persistence would indicate a worthless or unskillful forecast.

The column EF gives a value computed from MM4 at 48 h for the OSCAR IV case. It is considerably less than Ep, indicating the skill of the model for this case.

Finally, the last column S gives an estimate of the skill of the forecast, where S is computed from

S = Max[0, 1 - (EF - EA)/ER] × 100%.   (3.3)

According to (3.3), if the forecast agrees with the analysis within the analysis uncertainty, the forecast is judged to have a skill of 100%. The S score decreases to zero as the forecast error approaches that of two randomly chosen states. The data shown in Table 3.1 indicate that the model shows considerable skill when RMS errors are computed. It must be emphasized, however, that this is a single case and the average numbers over many cases could be significantly different.

In the production runs, RMS scores as listed in Table 3.1 are computed for the 700-, 500-, and 300-mb levels, in addition to the 850-mb level.
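Applied to the column values of Table 3.1, (3.3) reproduces the printed skill scores to within rounding (an illustrative sketch; the Max simply floors the score at zero for a forecast no better than a random state):

```python
def skill(EF, EA, ER):
    """Skill S (%) from Eq. (3.3): 100% when EF <= EA, near zero as EF -> ER."""
    return max(0.0, 1.0 - (EF - EA) / ER) * 100.0

# Table 3.1 rows, as (EA, ER, EF):
print(skill(EF=3.0, EA=1.5, ER=16.0))  # 90.625   (printed 90.6, temperature)
print(skill(EF=9.0, EA=3.0, ER=32.0))  # 81.25    (printed 81.3, wind)
print(skill(EF=3.0, EA=2.0, ER=10.0))  # 90.0     (specific humidity)
print(skill(EF=3.0, EA=1.0, ER=24.0))  # ~91.67   (printed 91.7, sea-level pressure)
```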

4. Correlation coefficient between observed and forecast change

At times, such as when extreme atmospheric events occur, the RMS error of a forecast may appear to be large and yet the forecast may still have value. For example, in a case of explosive cyclogenesis in which the minimum pressure decreases by 50 mb in 24 h, a model which predicts only a 40-mb deepening would show a relatively large error of 10 mb compared to average forecasts, but would clearly have value. The correlation coefficient between observed and forecast changes of a variable is useful in quantifying how well a model predicts changes of a variable from the initial conditions. A value of 1.0 indicates a perfect forecast, while a value of 0.0 or less indicates a worthless forecast.

Table 4.1 shows the correlation of observed and forecast changes of several variables, computed using forecast values interpolated to observation points. The MM4 for this one case shows considerable skill in forecasting the changes of temperature, mixing ratio, 850-mb height, and sea-level pressure.
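The score can be sketched as follows (hypothetical station values, not MM4 output; the helper name is illustrative):

```python
import numpy as np

def change_correlation(initial, forecast, observed):
    """Correlation between the forecast change (F - I) and observed change (O - I)."""
    i = np.asarray(initial, dtype=float)
    d_fcst = np.asarray(forecast, dtype=float) - i
    d_obs = np.asarray(observed, dtype=float) - i
    return float(np.corrcoef(d_fcst, d_obs)[0, 1])

# Hypothetical pressure changes at four stations: the model captures the
# pattern of the observed change but only 80% of its amplitude, and the
# correlation of changes still rewards it.
init = np.zeros(4)
obs = np.array([-50.0, -10.0, 5.0, 20.0])
fcst = 0.8 * obs
print(round(change_correlation(init, fcst, obs), 3))  # 1.0: pattern is perfect
```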

5. S1 scores

For some meteorological fields, such as sea-level pressure or geopotential height, the horizontal gradient is more important than the absolute value. The S1 score, described in Anthes (1983), is a common measure for quantifying the skill of a forecast of the sea-level pressure gradient and the gradient of the geopotential height of upper-level constant-pressure surfaces.

Table 5.1 gives examples of S1 scores for 48-h forecasts of sea-level pressure and 500-mb height. The values listed as "perfect" and "worthless" are based on the experience of many operational forecasters. As shown in Table 5.1, MM4 for this one case shows considerable skill compared to persistence.

6. Precipitation verification

Precipitation is an extremely important predicted field in meteorological models. Because observed and model-predicted precipitation is the result of many complex processes and because it often contains very small-scale variations, it is considerably more difficult to forecast than temperature, wind, pressure, and water vapor.

We use a variety of methods to quantify the accuracy of the precipitation forecasts of MM4, including the so-called threat score, the bias score, and categorical forecasts of the occurrence or nonoccurrence of a specified precipitation threshold amount at a given grid point during a specified time period. These scores are discussed by Anthes (1983).


Figure 6.1 shows the observed precipitation over the 24-h period ending 0000 GMT 23 April 1981 (OSCAR IV case). Figure 6.2 shows the MM4 forecast for the same time period. Table 6.1 lists the categorical forecasts, the threat score, and the bias score for precipitation thresholds of 0.025 cm (0.01 inch), 0.254 cm (0.10 inch), 0.635 cm (0.25 inch), 1.27 cm (0.5 inch), and 2.54 cm (1.0 inch). Great caution should be exercised in interpreting the results from this single case, especially for the larger precipitation thresholds, which occur over very few model grid points. However, some conclusions that probably have some general validity may be made. The categorical forecasts (yes or no of a grid point receiving a threshold amount of precipitation) indicate that more grid points are correctly forecast (yes or no) at the larger amounts. This is because the correct forecasts of the larger amounts are dominated by "no" forecasts, which are relatively easy to make. However, the threat score, which is a measure of the accuracy of the predicted area receiving a threshold amount, shows just the opposite. As the area diminishes in size (increasing precipitation threshold), it becomes more difficult to forecast.

The bias score indicates whether the model tends to overpredict threshold amounts (bias score greater than 1) or underpredict threshold amounts (bias score less than 1). The bias scores in Table 6.1 are probably meaningless for the single case presented here. However, this score is very important when averaged over a larger number of cases, because it indicates the presence of systematic model errors.
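The three scores in Table 6.1 can be computed from yes/no masks of the forecast and observed grids. The sketch below uses the standard definitions discussed by Anthes (1983) on toy 3x3 grids, not MM4 output:

```python
import numpy as np

def precip_scores(fcst, obs, threshold):
    """Categorical % correct, threat score, and bias score for one threshold."""
    f = np.asarray(fcst) >= threshold   # forecast yes/no at each grid point
    o = np.asarray(obs) >= threshold    # observed yes/no
    hits = int(np.sum(f & o))
    correct = float(np.mean(f == o)) * 100.0        # categorical forecasts
    threat = hits / max(int(np.sum(f | o)), 1)      # hits/(hits+misses+false alarms)
    bias = int(np.sum(f)) / max(int(np.sum(o)), 1)  # >1 overpredicts area, <1 under
    return correct, threat, bias

fcst = np.array([[0.0, 0.3, 0.8], [0.0, 0.5, 1.4], [0.0, 0.0, 0.2]])
obs = np.array([[0.1, 0.4, 0.6], [0.0, 0.2, 1.1], [0.0, 0.0, 0.0]])
c, t, b = precip_scores(fcst, obs, threshold=0.254)
print(round(c, 1), round(t, 2), round(b, 2))  # 88.9 0.75 1.33
```

For this toy case the model slightly overpredicts the area receiving 0.254 cm (bias above 1) while capturing three of the four forecast-or-observed points (threat 0.75).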

For some purposes, such as estimating annual averages or testing various strategies for control, the details of an individual forecast are not important. It is more important for the model to simulate realistically the structure of the atmosphere. For example, the regional model might forecast the correct intensity and shape of a field, but displace the field by some small distance in space and hence yield large errors at points. A correlation matrix scoring method (Anthes, 1983) is a measure of skill in predicting the intensity and patterns of a field. Figure 6.3 shows the correlation matrix for the 24-h forecast shown in Figure 6.2. When the observed precipitation grid (Figure 6.1) is centered over the forecast grid (Figure 6.2), a correlation coefficient of 0.65 is obtained, as shown by the center value in Figure 6.3. However, when the observed precipitation field is shifted five grid points (400 km) to the east, a higher correlation of 0.74 is obtained, indicating that the model is predicting the precipitation pattern/intensity with an accuracy of 0.74, but with a displacement error toward the east. A perfect forecast by this method would show a correlation coefficient of 1.0 at the center of the matrix.
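A one-dimensional sketch of this idea follows (illustrative only; the actual correlation-matrix score of Anthes (1983) shifts the grid in both directions, and the field here is synthetic):

```python
import numpy as np

def shifted_correlation(obs, fcst, max_shift=2):
    """Correlation between obs shifted s columns east and fcst, s = -max..max.

    A maximum away from s = 0 indicates a displacement error in the
    forecast pattern rather than a wrong intensity.
    """
    n = obs.shape[1]
    scores = {}
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = obs[:, :n - s], fcst[:, s:]
        else:
            a, b = obs[:, -s:], fcst[:, :n + s]
        scores[s] = float(np.corrcoef(a.ravel(), b.ravel())[0, 1])
    return scores

# Forecast pattern identical to observed but displaced one column east:
obs = np.array([[0, 1, 5, 1, 0, 0],
                [0, 2, 8, 2, 0, 0],
                [0, 1, 4, 1, 0, 0]], float)
fcst = np.roll(obs, 1, axis=1)
scores = shifted_correlation(obs, fcst)
print(max(scores, key=scores.get))  # 1: the peak sits at a one-column eastward shift
```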

7. Structure (climatology) of model forecasts

As is well known, a general circulation model has no predictive skill (of episodes or events) when integrated beyond about two weeks. However, such models are extremely useful in answering questions like how the climate will change if carbon dioxide is doubled. Similarly, the ADMP may have considerable use for evaluating the effects of various emissions reductions in spite of large errors at points in individual simulations, if the model has the right "climatology." By climatology, we mean that the model, when run over many cases, has no biases and has the correct statistical properties (such as frequencies of extreme events).

A measure of the realism of the model-simulated fields is the normalized correlation function, μ, which is described in Appendix B. Figure 7.1 shows the forecast and observed correlation functions of the 500-mb temperature field for the OSCAR IV case. Both fields show a decreased correlation with increasing distance. During the 72-h period of this single study, the MM4 (F) is more highly correlated than the observed (O) at the shortest separation distances, is about the same as the observed in the 1000- to 2000-km separation range, and is less correlated at the longer separation distances.

Another way of quantitatively examining the structure of a forecast is to compute the two-dimensional spectra of a field (Errico, 1985). This two-dimensional Fourier decomposition can also be used to measure the errors at various scales of motion. An example of the spectra of the 36-h 500-mb forecast (solid) and observed (long dashed) is shown in Figure 7.2. The horizontal scale varies logarithmically from 4800 km on the left to 350 km on the right. The MM4 has spectra very similar to the observed, although the difference spectra are as large in magnitude at scales of 500 km in length and less. It should be noted that the observed spectra at scales below about 800 km are poorly estimated in this case because of the 400-km spacing of the upper-air observations.
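The idea can be sketched in one dimension (illustrative only; Errico (1985) describes the proper detrended two-dimensional decomposition for a limited-area grid, which this toy version does not implement):

```python
import numpy as np

def row_spectrum(field, dx):
    """Mean power spectrum along the rows of a 2-D field (1-D sketch).

    Returns (wavelength, power) for wavenumbers 1..n/2, with wavelength
    in the same units as dx; the k = 0 (domain mean) term is dropped.
    """
    n = field.shape[1]
    fk = np.fft.rfft(field, axis=1)
    power = (np.abs(fk) ** 2).mean(axis=0)[1:]
    wavelength = n * dx / np.arange(1, len(power) + 1)
    return wavelength, power

# A 5 x 40 field on an 80-km grid containing a single 800-km wave:
x = np.arange(40)
field = np.tile(np.sin(2 * np.pi * 4 * x / 40), (5, 1))
wavelength, power = row_spectrum(field, dx=80.0)
print(wavelength[np.argmax(power)])  # 800.0
```

The spectral peak lands at the wavelength that was put in, which is the sense in which forecast and observed spectra can be compared scale by scale.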

8. Summary

A number of quantitative measures of the accuracy of the mesoscale meteorological model have been developed. These measures can be used to compare the accuracy of different versions of the model and to provide quantitative estimates of the error in the meteorological data supplied to the RADM. Interpretation of the verification scores and methods and examples from one case have been provided. Computation of these scores for additional cases is required to increase the reliability of the error estimates.


References

Anthes, R. A., 1983: A review of regional models of the atmosphere in middle latitudes. Mon. Wea. Rev., 111, 1306-1335.

Anthes, R. A., E.-Y. Hsie, and Y.-H. Kuo, 1987: Description of the Penn State/NCAR Mesoscale Model Version 4 (MM4). NCAR Technical Note NCAR/TN-282+STR, 66 pp.

Errico, R. M., 1985: Spectra computed from a limited area grid. Mon. Wea. Rev., 113, 1554-1562.


Table 3.1. Estimate of RMS errors.

OSCAR IV Case at 48 h

                                                     EA     ER     EF    S(%)
Temperature (°C) at 850 mb                          1.5   16.0    8.0    3.0
Wind (magnitude of vector wind error, m/s)
  at 850 mb                                         3.0   32.0   16.0    9.0
Specific humidity (g/kg)                            2.0   10.0    5.0    3.0
Sea-level pressure (mb)                             1.0   24.0   12.0    3.0
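The basic quantity behind entries such as those in Table 3.1 is a root-mean-square difference between a forecast field and its verifying field. A minimal Python sketch (illustrative only; the function names are ours, and the pairing of model and verifying grids done by VERIFY is not shown):

```python
import numpy as np

def rms_error(forecast, verifying):
    """RMS difference between two fields on the same grid."""
    diff = np.asarray(forecast, dtype=float) - np.asarray(verifying, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

def rms_vector_wind_error(uf, vf, uo, vo):
    """RMS magnitude of the vector wind error, the quantity reported
    for the wind entry in Table 3.1."""
    du = np.asarray(uf, dtype=float) - np.asarray(uo, dtype=float)
    dv = np.asarray(vf, dtype=float) - np.asarray(vo, dtype=float)
    return float(np.sqrt(np.mean(du ** 2 + dv ** 2)))
```

Note that the wind error is the RMS of the vector error magnitude, not the error of the wind speed; a forecast with correct speed but wrong direction still accumulates error.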

Table 4.1. Correlation coefficients between observed and forecast change of a variable for one case (OSCAR IV at 48 h).

                               Correlation    (%)
Temperature at 850 mb             0.91        90.6
Mixing ratio at 850 mb            0.82        81.3
Height of 850 mb surface          0.91        90.0
Sea-level pressure                0.91        91.7


Table 5.1. S1 scores.

OSCAR IV Case at 48 h

                      "Perfect"   "Worthless"   Persistence   MM4
Sea-level pressure        30           80            110       60
500-mb height             20           80             80       38
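The S1 score compares horizontal gradients rather than point values. A hedged Python sketch of the standard Teweles-Wobus form (whether VERIFY applies weighting or boundary trimming is not shown here):

```python
import numpy as np

def s1_score(fcst, obs):
    """Teweles-Wobus S1 score: 100 * sum|dF - dO| / sum max(|dF|, |dO|),
    where dF, dO are differences between adjacent grid points of the
    forecast and observed fields, taken in both grid directions."""
    f = np.asarray(fcst, dtype=float)
    o = np.asarray(obs, dtype=float)
    num = 0.0
    den = 0.0
    for axis in (0, 1):
        df = np.diff(f, axis=axis)
        do = np.diff(o, axis=axis)
        num += np.abs(df - do).sum()
        den += np.maximum(np.abs(df), np.abs(do)).sum()
    return 100.0 * num / den

# Only gradients enter: a forecast with perfect gradients scores 0
# even when every point carries a 10-unit bias.
height = np.add.outer(np.arange(5.0), np.arange(5.0))
print(s1_score(height + 10.0, height))   # -> 0.0
```

Because S1 measures gradient error, experienced analysts regard low scores (the "Perfect" column of Table 5.1) rather than 0 as the practical floor, and scores near 80 as worthless.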

Table 6.1. Verification of MM4 prediction of precipitation for 24-h period ending 0000 GMT 23 April 1981 (OSCAR IV case).

Precipitation    Categorical Forecast    Threat Score    Bias Score
Threshold        (% Correct)

0.025 cm                70.0                 0.58           1.00
0.254 cm                78.0                 0.34           0.75
0.635 cm                88.0                 0.34           0.65
1.270 cm                96.0                 0.29           1.30
2.540 cm               100.0                 0.00           2.50
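Scores of the kind shown in Table 6.1 follow from a 2 x 2 contingency count at each threshold: the threat score is hits/(hits + misses + false alarms), and the bias score is the ratio of forecast to observed occurrences. An illustrative Python sketch (function and variable names are ours, not VERIFY's):

```python
import numpy as np

def precip_scores(fcst, obs, threshold):
    """Categorical precipitation scores for one threshold (cm).

    Returns (threat score, bias score, percent correct), where
      threat = hits / (hits + misses + false alarms)
      bias   = forecast occurrences / observed occurrences
    """
    f = np.asarray(fcst) >= threshold
    o = np.asarray(obs) >= threshold
    hits = int(np.sum(f & o))
    false_alarms = int(np.sum(f & ~o))
    misses = int(np.sum(~f & o))
    correct_nulls = int(np.sum(~f & ~o))
    events = hits + misses + false_alarms
    threat = hits / events if events else 0.0
    bias = (hits + false_alarms) / (hits + misses) if (hits + misses) else 0.0
    pct_correct = 100.0 * (hits + correct_nulls) / f.size
    return threat, bias, pct_correct
```

A bias above 1 (e.g., 2.50 at the 2.540-cm threshold in Table 6.1) means the event was forecast over a larger area than observed, while a threat score of 0 means the forecast and observed areas did not overlap at all at that threshold.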



Figure 1.1. Observed 1000-mb geopotential heights and frontal positions for four time periods during the OSCAR IV period (0000 GMT 22 April to 0000 GMT 25 April 1981). The contour interval is 30 m. [Panels: 22 APR 00Z, 23 APR 00Z, 24 APR 00Z, 25 APR 00Z.]


Figure 1.2. Forecast 1000-mb geopotential heights (contour interval 30 m) and frontal positions for 24, 48, and 72 h of the OSCAR IV case. The root-mean-square (RMS) error of 1000-mb heights for this forecast is shown in the upper left panel. [Panels: DAY 1, DAY 2, DAY 3.]


Figure 2.1. Domain of the MM4 simulations used for computing verification scores and for the ADMP and the model intercomparison project (MIP). The inner portion is used for verification of the precipitation.


Figure 6.1. Observed precipitation for the 24-h period ending at 0000 GMT 23 April 1981. The contours are 0.25, 1.0, and 2.0 cm.


Figure 6.2. Forecast precipitation for the 24-h period ending at 0000 GMT 23 April 1981. The contours are 0.25, 1.0, and 2.0 cm.


Figure 6.3. Correlation matrix of observed and forecast precipitation fields. Each value in the matrix represents a correlation between observed and forecast precipitation amounts on a subset of the forecast domain for various north-south and east-west lags. The value of 0.65 at the center of the matrix represents no lag. The value of 0.74 in the right column represents the correlation computed when the observed grid is shifted five points to the east with respect to the forecast grid.

[An 11-column matrix of lag correlations appeared here; the no-lag (center) value is 0.65, and the maximum, 0.74, occurs with the observed grid shifted five points to the east.]
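The shifted correlations summarized in Figure 6.3 can be sketched as follows. This illustrative Python fragment is not the VERIFY implementation; the sign convention for the lags and the choice of verification subdomain are our assumptions. It correlates the two fields over the region where the shifted grids overlap:

```python
import numpy as np

def lag_correlation(fcst, obs, ilag, jlag):
    """Correlation of the forecast with the observed grid shifted by
    (ilag, jlag) grid points, using only the overlapping region."""
    f = np.asarray(fcst, dtype=float)
    o = np.asarray(obs, dtype=float)
    ny, nx = f.shape
    # Slices selecting the region where the two shifted grids overlap.
    fy = slice(max(ilag, 0), ny + min(ilag, 0))
    oy = slice(max(-ilag, 0), ny + min(-ilag, 0))
    fx = slice(max(jlag, 0), nx + min(jlag, 0))
    ox = slice(max(-jlag, 0), nx + min(-jlag, 0))
    return float(np.corrcoef(f[fy, fx].ravel(), o[oy, ox].ravel())[0, 1])
```

Scanning ilag and jlag over a small range (e.g., -5 to +5) reproduces the kind of 11 x 11 matrix shown in Figure 6.3; a maximum away from the center indicates a well-shaped but displaced forecast pattern.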


[Plot labels: NORMALIZED CORRELATION FUNCTION (500 KM BINS); FOR/OBS; NUMBER OF STATIONS, 500 MB; horizontal axis: DISTANCE (KM).]

Figure 7.1. Normalized correlation function for observed and forecast (72 h) 500-mb temperatures as a function of distance. Each point represents the correlation of all points in 500-km bins.
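One plausible way to build such a distance-binned correlation function is sketched below. This is an illustration under our own assumptions (station coordinates in km, a single covariance normalization, simple 500-km bins), not the VERIFY code itself; with identical forecast and observed inputs it reduces to a spatial autocorrelation.

```python
import numpy as np

def correlation_function(coords_km, f, o, bin_km=500.0):
    """Normalized forecast/observed correlation binned by station separation.

    coords_km -- (n, 2) station positions in km
    f, o      -- forecast and observed values at those stations
    Returns (bin-center distances in km, correlation per bin).
    """
    coords = np.asarray(coords_km, dtype=float)
    fa = np.asarray(f, dtype=float) - np.mean(f)
    oa = np.asarray(o, dtype=float) - np.mean(o)
    # Pairwise separations and pairwise anomaly products.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    bins = (d // bin_km).astype(int).ravel()
    prod = np.outer(fa, oa).ravel()
    count = np.bincount(bins)
    cov = np.bincount(bins, weights=prod) / np.maximum(count, 1)
    corr = cov / (fa.std() * oa.std())     # normalize to a correlation
    centers = (np.arange(len(count)) + 0.5) * bin_km
    return centers, corr
```

For a well-correlated pair of fields the curve starts near 1 at short separations and decays with distance, as in Figure 7.1.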



[Plot labels: SPECTRA, 500 MB; FORECAST ID: OSCAR CASE STARTING FROM 0000 GMT APRIL; OBSERVED ID: ANALYSES DATA FROM HAAGENSEN; horizontal axis: TWO-DIMENSIONAL WAVE NUMBER (10**-5 /M).]

Figure 7.2. Two-dimensional spectra (variance vs. wave number) of 500-mb heights at 36 h of the OSCAR IV case. The horizontal scale varies from 4800 km on the left to 350 km on the right. The solid line represents the forecast, the long dashed line the observed, and the dotted line the difference.
