POINT - SPECIFIC MOS FORECASTS FOR THE
2002 WINTER OLYMPIC GAMES
by
Andrew J. Siffert
A thesis submitted to the faculty of The University of Utah
in partial fulfillment of the requirements for the degree of
Master of Science
Department of Meteorology
The University of Utah
December 2001
ABSTRACT
From 8-24 February 2002, an estimated 1.5 million people will converge on Salt Lake City for
the Winter Olympics. Approximately 3,500 of the world’s best athletes will compete at venues in
and around the Wasatch Mountains. With as many as 100,000 spectators and athletes traveling to
and attending events each day at five outdoor venues, accurate weather forecasts are critical.
Statistical techniques combining the high density MesoWest surface observation network and the
University of Utah Real-Time MM5 modeling system have been developed to improve site-
specific objective model guidance for the Winter Games. Using traditional multiple linear
regression techniques, sets of MOS equations have been developed for the MM5 to provide
hourly, point-specific forecasts of six parameters (temperature, dewpoint, relative humidity, wind
speed, wind direction, and surface pressure) at the outdoor Olympic venues and other weather
sensitive locations.
MM5 MOS is shown to produce substantial improvement over MM5 raw model output for
all forecast variables. For temperature, MM5 MOS performed better at higher elevations
compared to the lower elevations. This was due partly to a prolonged inversion event that
produced large errors in the valley sites. The AVN MOS product slightly outperformed the more
traditional NGM MOS with the MM5 MOS in some cases outperforming both of these products.
Overall, the results show that it is possible to create MOS guidance from mesoscale model output
that improves upon raw model output and is useful for point-specific forecasts on small spatial
scales in complex terrain. MM5 MOS also provides forecast guidance at locations where NGM
and AVN MOS are not available.
The development of MM5 MOS for the five outdoor Olympic venues and other weather
critical locations will prove to be beneficial for the Olympic forecast team, make Olympic events
more enjoyable for the estimated 1.5 million spectators coming to Utah in February 2002, and
provide a legacy forecast product for use after the Winter Games.
CHAPTER 1
INTRODUCTION
The success of mesoscale research models, coupled with an exponential increase in
computing power and storage over the last decade, has led to the use of mesoscale models for
operational weather prediction. Although these models are capable of producing locally detailed
forecasts, such as the lake-land interaction of the Great Salt Lake, systematic biases in the
simulation of surface, boundary layer, and other physical processes can limit their utility for
predicting surface weather quantities like temperature, relative humidity, wind speed, and wind
direction (Carter et al. 1989). During the summer of 2001, the Ninth Conference on Mesoscale
Meteorology and the 18th Conference on Weather Analysis and Forecasting were held in Ft.
Lauderdale, FL. At the conferences, there was a panel and group discussion entitled "How will the
roles of humans in mesoscale weather prediction change during the next few decades?" Much of
the discussion focused on how the role of the forecaster will evolve as short-range mesoscale
model accuracy improves in the future. However, statistical techniques, when used in concert with
mesoscale model output, offer the opportunity to improve the utility of mesoscale model output
today. This thesis describes the use of a statistical technique, known as Model Output Statistics
(MOS), to produce point-specific forecast guidance for the 2002 Winter Olympics. To our
knowledge, this is one of the first uses of MOS with mesoscale model output.
Two common statistical methods have been applied previously to numerical model output.
The first, used initially by Klein et al. (1959), is known as the "perfect prog" method. In this
method, a statistical relationship is developed between observed weather variables for which the
user would like a forecast (e.g., temperature) and existing observed weather elements (e.g., 700
mb height, 1000–500 mb thickness). The resulting equation is then applied directly to output from
a dynamic model. Although a long record of observations can be used for equation development,
a major disadvantage of the perfect-prog method is that it does not correct for systematic model
biases. Therefore, the perfect-prog method has not been used widely in recent years.
The other method, MOS, provides a statistical relationship between observed weather
variables and predicted weather elements by a dynamic model. MOS was first developed by
Glahn and Lowry (1972) at the National Weather Service (NWS) Techniques Development
Laboratory (now the Meteorological Development Laboratory), and has been used by NWS
forecasters and other meteorologists for over 30 years. MOS has several advantages over forecasts
produced directly by dynamic models. In particular, MOS corrects for systematic model biases,
partially accounts for some phase errors, predicts sensible weather elements not directly forecast
by current dynamic models (e.g., visibility), produces reliable probability forecasts (e.g.,
probability of precipitation), and provides information on model predictability (Carter et al.
1989). This is achieved by producing a forecast that leans toward the sample mean with increasing
forecast projection (Glahn and Lowry 1972). One disadvantage of MOS is that changes to the
dynamic model, which are common with today’s modeling systems, can alter model biases,
reducing the accuracy of MOS equations that were developed with forecasts from the older model
configuration (Dallavalle 1998).
MOS uses forward stepwise multiple linear regression to produce an equation relating
predictors with a predictand, where the predictand is the variable to be forecast (e.g.,
temperature), and the predictors are any available quantities that may be related to the predictand.
Possible predictors include variables from dynamic model forecasts, climate variables, and recent
surface observations. The multiple linear regression allows systematic model biases and phase
errors, as well as the local climatology, to be corrected and automatically built into the MOS
equations. MOS equations can include predictors not readily available to the perfect prog method,
such as vertical velocity. Because of these advantages, MOS has proven to be more successful
than perfect prog for most applications (Klein and Glahn 1974).
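The bias-correction advantage can be seen in a small numerical sketch. This is illustrative only; the variables, numbers, and use of NumPy are ours, not from the thesis. A perfect-prog equation developed entirely on observed data inherits a model's systematic bias when applied to model output, while a MOS equation, developed against the model output itself, absorbs the bias into its coefficients.

```python
import numpy as np

# Synthetic data: "truth" temperatures and an observed predictor
# (a thickness-like field), plus a model forecast of that predictor
# carrying a constant +1 unit systematic bias.
rng = np.random.default_rng(1)
truth = rng.normal(0.0, 5.0, size=200)                     # observed temperatures
thick_obs = 0.5 * truth + rng.normal(0.0, 0.5, size=200)   # observed predictor
thick_mod = thick_obs + 1.0                                # biased model forecast

# Perfect prog: regress observations on the OBSERVED predictor,
# then apply the equation to model output -> the bias passes through.
b_pp, a_pp = np.polyfit(thick_obs, truth, 1)   # slope, intercept
pp_forecast = a_pp + b_pp * thick_mod

# MOS: regress observations on the MODEL predictor -> the bias is
# absorbed into the fitted intercept and slope.
b_mos, a_mos = np.polyfit(thick_mod, truth, 1)
mos_forecast = a_mos + b_mos * thick_mod
```

With this setup the perfect-prog forecast carries a mean error of roughly the regression slope times the model bias, while the MOS forecast is unbiased on the development sample by construction.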
Over the past decade, enhancements in observational networks and numerical modeling
capabilities have improved forecasting over mountainous regions, such as northern Utah where
the 2002 Olympic Winter Games will be held. Statistical techniques using the high-density
MesoWest surface observation network (Horel et al. 2002) and University of Utah real-time
Mesoscale Model generation 5 (MM5) modeling system (Onton et al. 2001) offer the potential to
further improve weather forecasting for this region. This thesis describes how MesoWest
observations and real-time MM5 forecasts were used to develop MOS forecast guidance
(hereafter MM5-MOS) for the outdoor Olympic venues and other weather-sensitive locations
surrounded by fine-scale topographic features that are not adequately resolved by any current
dynamic models (Fig. 1.1; Table 1.1). This work has resulted in point-specific forecast guidance
for many sites not covered by the Nested Grid Model (NGM) MOS and Aviation Model (AVN)
MOS. The thesis also examines several questions pertaining to the application of MOS for
weather forecasting including:
• How much does MOS improve upon raw model output from a mesoscale model in a region of
complex terrain?
• How skillful is MOS based on a continuously evolving mesoscale model compared to that
based on a frozen model (e.g., NGM MOS)?
Figure 1.1. Locations and aerial photos of MM5 MOS sites (red: 3-letter MesoWest identifiers) in the vicinity of Olympic outdoor venues: (a) plan view; (b) Snowbasin Ski Area; (c) Utah Olympic Park; (d) Park City Mountain Resort; (e) Deer Valley Resort; and (f) Soldier Hollow. Photos by D. Quinney and venue pictures courtesy of the Salt Lake Olympic Committee.
STATION  LOCATION  LATITUDE  LONGITUDE  ELEVATION (m)
CLK Canyons - Lookout Chair 40.683 -111.574 2529
CDC Cedar City - Muni Airport 37.7 -113.6 896
CUPD Jordanelle Dam 40.6 -111.42 1855
C99 Canyons - Peak 9990 40.659 -111.594 3045
DCC Deer Creek Dam Chute 40.431 -111.540 2033
DVE Deer Valley - Burns Chair 40.6228 -111.4520 2235
DVB Deer Valley - Bald Eagle Chair 40.6236 -111.4809 2591
EKO Elko, Nevada 40.87 -115.73 1608
ENV Wendover, Nevada 40.73 -114.03 1292
FWP Farnsworth Peak 40.659 -112.202 2797
GJT Grand Junction, Colorado 39.12 -108.53 1475
HIF Ogden/Hill Air Force Base 41.12 -111.97 1288
KNC Knudsens Corner 40.6384 -111.8136 1402
LGU Logan Cache Valley Airport 41.78 -111.574 1358
MBY Deer Valley Mt. Baldy 40.6089 -111.4802 2849
MS9 Tooele MS9 40.29 -112.37 1545
OGD Ogden - Hinckley Muni Airport 41.194 -112.0167 1362
PCB Park City - Base 40.6518 -111.5113 2000
PCS Park City - Eagle Chair 40.6774 -111.5180 2610
PVU Provo Muni Airport 40.2240 -111.7252 1369
SBB Snow Basin - Allens Peak 41.2052 -111.8829 2835
SBW Snow Basin - John Paul Lift 41.2031 -111.8805 2670
SBE Snow Basin - Base 41.21151 -111.8582 1925
SLC Salt Lake City Intl. Airport 40.78 -111.97 1288
SGU Saint George 37.08 -113.60 896
SND Sundance - Arrowhead Chair 40.3685 -111.5934 2515
UT7 UDOT - Bluffdale 40.4754 -111.9047 1433
UT5 UDOT- Mouth Parley Canyon 40.7122 -111.7252 1369
WBB University of Utah - top of WBB building 40.766 -111.877 1497
WBU Olympic Sports Park 40.7166 -111.5596 1925
WMP Wasatch State Park - Upper Bear 40.4790 -111.5002 1713
WM2 Wasatch State Park - Lower Bear 40.4806 -111.4923 1689
Table 1.1. Station ID, location, and elevation of all 2001-2002 MM5 MOS sites.
• When predictors are provided by a state-of-the-art mesoscale model, does including prior
observations and geoclimatic data in MOS equations result in a significant improvement in
skill?
• Does MM5-MOS account for unusual climatic conditions?
The remainder of this thesis is organized as follows. Chapter 2 reviews the data and
methods used to develop MM5-MOS for the 2002 Olympic Winter Games. Chapter 3 examines
the accuracy of MM5-MOS and examines the questions posed above. Discussion and conclusions
are then presented in Chapter 4.
CHAPTER 2
DATA AND METHODS
Numerical Modeling System
Development of MM5 MOS involved the use of numerical weather prediction (NWP)
model forecasts and surface observations from Olympic venues and transportation corridors. The
NWP model used was the nonhydrostatic Pennsylvania State University - National Center for
Atmospheric Research MM5 (Grell et al. 1995), which was run in real time twice daily (0000 and
1200 UTC initialization times) at the University of Utah. The model featured an outer domain
with 36-km horizontal grid spacing that covered the western United States and eastern Pacific
Ocean, and a two-way interactive nested domain that was centered over the state of Utah and
featured 12-km horizontal grid spacing (Fig. 2.1). The complex topography of the Intermountain
West necessitates the use of such a high-resolution mesoscale model, although even greater
resolution is needed to fully resolve local topographic features. Twenty-seven sigma levels were
used in the vertical.
MM5 initial and boundary conditions were created by interpolating NCEP Eta Model forecasts to
the model grid. Major parameterizations included the Kain–Fritsch cumulus scheme (Kain and
Fritsch 1990), the simple-ice microphysics of Dudhia (1989), the so-called MRF planetary
boundary layer scheme (Hong and Pan 1996), Dudhia (1989) radiation, a five-layer soil model,
and an upper-radiative boundary condition (Klemp and Durran 1983).
Figure 2.1. Domains, terrain, and grid points of the University of Utah real-time MM5 model.
Although the physical parameterizations and domain configurations of the University of
Utah MM5 did not change during the study period, the model was upgraded from MM5 version 2
to MM5 version 3 in September 2000. The newer version included some minor changes to the
aforementioned parameterizations. These minor changes did not appear to have a substantial
impact on MM5 MOS accuracy, although it is possible that performance would be improved
slightly if one reran the previous 3 years with a frozen model and redeveloped the MM5 MOS
equations.
MM5 MOS Development
Following Glahn and Lowry (1972) and Glahn (1985), the development of MM5 MOS
involved the use of forward-stepwise multiple linear regression (MLR) to produce a relationship
between a dependent variable (i.e., predictand) that is being predicted (e.g., surface temperature),
and independent variables (i.e., predictors) that might be related to the predictand. Predictors used
to develop MM5 MOS included MM5 model forecast data, geoclimatic information, and surface
observations.
To develop a forecast equation for a given predictand, site, and forecast hour, potential
MM5 predictors were forecasts of temperature, dewpoint, relative humidity, geopotential height,
vertical velocity, and zonal, meridional, and total wind speed at the lowest half-sigma level
(except for geopotential height and vertical velocity), 800, 700, 600, and 500 mb for that forecast
hour at the site. Surface pressure, 3-hour accumulated precipitation, surface–700-mb lapse rate,
and mean surface–500-mb relative humidity were also used. Bilinear interpolation was used to
interpolate forecast variables from the MM5 grid to the forecast site.
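The interpolation step can be sketched as follows. This is a generic bilinear-interpolation illustration, not the thesis code; the grid-cell corner values and the station's fractional offsets within the cell are hypothetical.

```python
def bilinear(f00, f10, f01, f11, x_frac, y_frac):
    """Interpolate a gridded field to a point inside a grid cell.

    f00, f10, f01, f11 are the field values at the SW, SE, NW, and NE
    corners; x_frac and y_frac are the point's fractional positions
    (0..1) across the cell in the x and y directions.
    """
    bottom = f00 * (1.0 - x_frac) + f10 * x_frac   # along the southern edge
    top    = f01 * (1.0 - x_frac) + f11 * x_frac   # along the northern edge
    return bottom * (1.0 - y_frac) + top * y_frac  # blend between the edges

# Example: a station 30% of the way east and 70% north within a cell,
# with made-up temperatures (K) at the four surrounding grid points.
t_site = bilinear(271.2, 272.0, 270.5, 271.1, 0.3, 0.7)
```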
Potential surface observation predictors were temperature, dewpoint, relative humidity,
surface pressure, sea-level pressure, altimeter setting, 1-hour accumulated precipitation, and
zonal, meridional, and total wind speed at two times: 3 hours after the nominal MM5
initialization time (0300 or 1500 UTC), and the most recent observation corresponding to the time
of day for which the forecast is valid. At several sites, some of the above observations were not
collected and therefore were not used for MOS development. Geoclimatic predictors, which were
included to partly account for seasonal trends within the year (sun angle and daylight hours),
included sine and cosine of the normalized day of the year. Combined, 64 potential MM5 model
parameters, surface observation, and geoclimatic predictors were considered for the development
of each MM5 MOS equation.
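The seasonal geoclimatic predictors can be computed as follows (a sketch; the 365-day normalization is our assumption, and the example days are illustrative):

```python
import math

def seasonal_predictors(day_of_year, days_in_year=365):
    """Sine and cosine of the normalized day of year, encoding the
    annual cycle (sun angle, daylight hours) as a smooth, periodic
    pair of predictors for the regression."""
    theta = 2.0 * math.pi * (day_of_year / days_in_year)
    return math.sin(theta), math.cos(theta)

# Mid-February (day 46) and mid-August (day 227) fall on roughly
# opposite sides of the annual cycle.
s_feb, c_feb = seasonal_predictors(46)
s_aug, c_aug = seasonal_predictors(227)
```

Using the sine/cosine pair rather than the raw day number keeps the predictor continuous across the 31 December to 1 January boundary.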
Predictands were surface temperature, dewpoint, relative humidity, wind speed, zonal
wind component, meridional wind component, and sea level pressure. Observations of these
variables at Olympic Venues and along transportation corridors were collected by MesoWest, a
collection of cooperating mesonets in the western United States (Horel et al. 2002). MesoWest,
which is managed by the NOAA Cooperative Institute for Regional Prediction at the University of
Utah, collects observations provided by federal and state agencies and commercial groups. Since
each station was designed to meet the specific needs of its operating agency, there was
considerable diversity in the configuration and type of instrumentation used at MM5 MOS sites.
Quality control was used to eliminate obvious observational errors, but attempts were not made to
identify smaller errors that may result from instrument bias or siting.
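A gross-error check of the kind described can be sketched as follows; the range limits are hypothetical, not MesoWest's actual quality-control thresholds.

```python
# Hypothetical plausibility limits for screening obvious observational
# errors (units: deg C, percent, m/s). These are illustrative values.
LIMITS = {
    "temperature": (-60.0, 50.0),
    "relative_humidity": (0.0, 100.0),
    "wind_speed": (0.0, 75.0),
}

def passes_range_check(variable, value):
    """Return True if the observation is physically plausible."""
    lo, hi = LIMITS[variable]
    return lo <= value <= hi
```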
Figure 1.1 illustrates the locations of MM5 MOS sites in the Olympic region. These sites
were selected based on data availability and forecast needs. Of particular concern was the
availability of a quasi-continuous record of observations for a period of at least three winters. For
example, MM5 MOS was not developed for some sites at the Utah Winter Sports Park due to
changes in the location of observing stations during the construction of the Olympic facilities.
Precipitation was not used as a predictand because of a lack of reliable automated precipitation
observations from most sites in the Wasatch Mountains. Quality controlling such automated
observations is a challenge and has only recently been completed for relatively short periods
covering single events or a month (Slemmer 1998; Cheng 2001).
As mentioned above, MM5 MOS was not developed for observing sites where a long
record of observations was not available. There are, however, techniques that have been applied to
develop MOS from shorter observational records (e.g., Dallavalle 1998). For example, the number
of predictors used for MOS equations could be kept small to avoid overfitting the dependent data
(Dallavalle 1998). Observations from several stations within a region could be bundled together to
develop a single forecast equation (Gibson et al. 1997). Such an approach, however, cannot adjust
for local effects (Marglares and Carter 1986), which are substantial in the complex terrain of the
Olympic region. As a result, this project did not attempt to develop MM5 MOS at stations with a
limited period of observations.
Separate MM5 MOS equations were developed for the cool season (1 Nov – 31 March)
and warm season (1 Apr – 31 Oct). For most locations, 3 years of observations and model
forecasts were available for each season, beginning July 1998 and ending March 2001. Most
Olympic sites collected data only during the winter, and results presented in this thesis focus on
that season. The time period used to develop MM5 MOS was thus relatively short compared to
that used to develop NGM MOS, but was similar to what is currently being used for AVN and
MRF MOS development (R. Allen, personal communication, 2001; Carter et al. 1989; Dallavalle 2000;
Erickson 2000).
To perform the MLR, both predictands and predictors for each individual station were
organized into a tabular text file and imported into the JMP statistical software package, which
was developed by the SAS Institute (Sall et al. 2001). JMP was
then used to perform the forward-stepwise MLR, relate predictors and predictands, and generate a
forecast equation for each predictand. Separate equations were developed for each site, each
model initialization time (0000 UTC and 1200 UTC), and each forecast hour at 3-hour intervals,
resulting in 538 MM5 MOS forecast equations per site. These equations were then solved, using
MM5 forecast data, observations, and geoclimatic information, to generate a MM5 MOS forecast.
A minimal amount of post-processing was done to the final forecast output. For example, relative
humidity was constrained to be between 0% and 100% even though the forecast equations may
produce values above 100% or below 0%. Wind direction was derived from zonal and meridional
wind speed equations, while wind speed was based on a separate wind speed equation. Wind
speed was constrained to have a value greater than or equal to zero. MM5 MOS forecasts were
displayed in tables and graphical time series available via the Internet. Examples of these tables
and graphs can be found in Fig. 2.2.
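The post-processing steps can be sketched as follows. This is an assumed implementation, not the thesis code; it shows the clamping of relative humidity and wind speed and the derivation of wind direction from the zonal (u) and meridional (v) components, using the meteorological from-direction convention.

```python
import math

def postprocess(rh, wind_speed, u, v):
    """Apply the minimal post-processing described in the text:
    constrain RH to 0-100%, force wind speed non-negative, and derive
    wind direction from the u and v wind forecasts."""
    rh = min(max(rh, 0.0), 100.0)          # clamp RH to [0, 100] %
    wind_speed = max(wind_speed, 0.0)      # no negative speeds
    # Meteorological convention: direction the wind blows FROM,
    # in degrees clockwise from north.
    direction = math.degrees(math.atan2(-u, -v)) % 360.0
    return rh, wind_speed, direction
```

For example, a southerly wind (u = 0, v > 0) yields a direction of 180 degrees.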
The JMP MLR process works in the following manner. The first step determines which
one of the 64 predictors, when used in a linear regression equation, explains the greatest amount
of variance in the predictand. The second step chooses another predictor that, together with the
first one selected, produces the greatest improvement in the variance explained. The process is
repeated, with additional predictors added to the multiple linear regression equation, until a
stopping criterion is met. The JMP stopping criterion is based on the Wilks' Lambda, Pillai's Trace,
Hotelling-Lawley Trace, and Roy's Max Root statistical tests (Sall et al. 2001). This process
typically yielded a multivariable linear regression equation consisting of 15–25 predictors.
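The forward selection loop can be sketched as follows. This is an illustration rather than a reproduction of JMP's algorithm: the multivariate stopping tests named above are replaced here by a simple minimum-gain threshold on explained variance (R²).

```python
import numpy as np

def forward_stepwise(X, y, max_predictors=25, min_gain=1e-3):
    """Forward stepwise MLR: repeatedly add the predictor column of X
    that most increases R^2 against predictand y, stopping when the
    gain falls below min_gain (a stand-in for JMP's statistical tests)."""
    n, p = X.shape
    ss_tot = float((y - y.mean()) @ (y - y.mean()))
    chosen, best_r2 = [], 0.0
    while len(chosen) < max_predictors:
        best_j, best_new_r2 = None, best_r2
        for j in range(p):
            if j in chosen:
                continue
            # Fit with an intercept plus the candidate predictor set.
            A = np.column_stack([np.ones(n), X[:, chosen + [j]]])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            r2 = 1.0 - float(resid @ resid) / ss_tot
            if r2 > best_new_r2:
                best_j, best_new_r2 = j, r2
        if best_j is None or best_new_r2 - best_r2 < min_gain:
            break  # no predictor improves the fit enough
        chosen.append(best_j)
        best_r2 = best_new_r2
    return chosen, best_r2
```

On synthetic data in which the predictand depends on only two of several candidate columns, the loop recovers exactly those columns.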
For example, the forward stepwise MLR process chose 22 of 64 potential predictors to
create an MLR equation for the SLC temperature at forecast hour 12 of the 0000 UTC initialized
Figure 2.2. The MM5 MOS (a) text and (b) graphical output give the forecaster a more detailed look at the MM5 MOS output. The output is created twice daily for both forecast runs. Even though the forecasts are made only at 3-hour intervals, hourly forecasts are provided by interpolating between the forecast hours. The output is available for all available MM5 MOS sites. The text option also gives the MM5 raw model 1-hour accumulated precipitation along with the total accumulated precipitation.
MM5 (Table 2.1). The first forward step chose the +3-hour observed temperature as the primary
predictor. If the MOS equation was based only on this predictor, the MOS equation would have
been a single-variable regression equation of the following form:

T_F12 = a + b · T_O3h    (2.1)
where T_F12 is the MOS temperature forecast at forecast hour 12, a is a constant, b is a coefficient,
and T_O3h is the +3-hour observed temperature. For the 1200 UTC 20 Feb 2001 initialized MM5
run, this would yield a 12-hour temperature forecast at SLC of

T_F12 = -1.749 + 1.72 · 0.8115 = -0.35 °C    (2.2)
MM5 MOS, by taking advantage of a larger number of predictors, typically produces a more
accurate forecast. A multivariable regression equation would take the form
T_F12 = a + Σ_{i=1..n} b_i · P_i    (2.3)
where P_i are the predictors and b_i their respective coefficients, as presented in Table 2.1.
Using the values presented in Table 2.1 yields a forecast of -0.46°C, closer to the observed value
of -0.62°C.
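The arithmetic can be checked directly. This is a sketch: the predictor values in the multivariable example below are made up for illustration, since Table 2.1 lists only the coefficients, not the predictor values that produced the -0.46 °C forecast.

```python
# Single-predictor check (Eq. 2.2): constant plus coefficient times
# the +3-hour observed temperature.
a = -1.749
tf12_single = a + 1.72 * 0.8115        # approximately -0.35 deg C

# General form (Eq. 2.3): the constant plus a dot product of
# coefficients and predictor values. The first three coefficients
# below are from Table 2.1; the predictor VALUES are hypothetical.
coeffs     = [0.5492483, 0.4330212, -0.056551]
predictors = [1.5, -0.8, 861.0]        # illustrative values only
tf12_multi = 45.530842 + sum(b_i * p_i for b_i, p_i in zip(coeffs, predictors))
```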
The fit of each equation can be expressed using several statistics, including the variance
value, which measures the percentage of variability in the predictand that is explained by the
particular arrangement of predictors found in the regression equation (Panofsky and Brier 1965).
In the case of MOS temperature forecast equations for the MM5 initialized at 0000 UTC, variance
values ranged from 0.98 for the best fit to 0.84 for the least skillful temperature equation, with
values decreasing with increasing forecast projection. The multivariable regression equation
based on predictors and coefficients from Table 2.1 has a variance of 0.91, which means that 91%
Predictor Coefficient
Intercept / Constant 45.530842
+3-hour observed temperature 0.5492483
Model forecasted wind speed at 800 mb 0.0639843
Model forecasted relative humidity at 700 mb -0.001869
Model forecasted surface temperature 0.4330212
Model forecasted V component of the wind at 800 mb 0.0192816
+3-hour observed surface pressure (PMSL) -0.056551
24 hour previously observed U component of the wind -0.088776
+3-hour observed U component of wind -0.111513
Model forecasted V component of the wind at 700 mb -0.071577
Model forecasted wind speed at 500 mb 0.0436772
Model forecasted W component of the wind at 800 mb -0.110376
Model forecasted W component of the wind at 700 mb 0.0656301
24-hour previously observed relative humidity 0.0114045
Model forecasted relative humidity at 600 mb -0.003955
Model forecasted dewpoint at 500 mb -0.050737
Model forecasted V component of the wind at the surface 0.176322
Model forecasted U component of the wind at the surface 0.0740645
Model forecasted 3-hour total precipitation -0.5616162
Model forecasted surface relative humidity 0.0348168
Model forecasted dewpoint temperature at 800 mb -0.143036
Model forecasted Surface to 500 mb mean relative humidity 0.0611134
Model forecasted wind speed at 700 mb -0.001628
Table 2.1. Results of regression analysis of 12-hour forecast temperature at SLC from the 0000 UTC model initialization using all 64 predictors yields an output of 22 predictors that best fit the observed values.
of the variance is explained by the chosen predictors. A scatter plot of observed versus predicted
temperature is shown in Fig. 2.3. Regression analysis of 12-hour forecast temperature using only
the +3-hour observed temperature as a predictor yields a variance of 0.80. Thus, 80% of the 12-
hour forecast temperature variance was explained using only the most important predictor.
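The variance-explained statistic used throughout this discussion can be computed as follows (a standard R² calculation, sketched here for clarity rather than taken from the thesis):

```python
def variance_explained(obs, pred):
    """R^2: fraction of the predictand's variance explained by the
    regression, i.e. 1 minus the ratio of residual to total
    sum of squares."""
    obs_mean = sum(obs) / len(obs)
    ss_tot = sum((o - obs_mean) ** 2 for o in obs)       # total variability
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))  # unexplained part
    return 1.0 - ss_res / ss_tot
```

A perfect fit gives 1.0; predicting the sample mean everywhere gives 0.0, mirroring the 0.98 to 0.84 range reported above.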
Validation Methods
Results presented in this thesis are based on validation of MM5 MOS against an
independent sample of observations that were not used for equation development. This involved
developing MM5 MOS equations using observations and MM5 forecasts only on days following
the 10th day of each winter month. Forecasts based on these equations were then produced and
validated against observations from the first 10 days of each month. The results from four sites in
the Olympic region are presented in this thesis. The first is the Salt Lake City International Airport
(SLC, 1288 m), which is located near the base of the broadly sloped Salt Lake Valley, just
southeast of the Great Salt Lake (Fig. 1.1a). This site was selected because it is the only site in the
Olympic region where MM5 MOS can be compared with NGM and AVN MOS forecasts.1 SLC is
also the lowest site for which MM5 MOS was developed. The second site is Wasatch Mountain
State Park (WMP, 1713 m), which is also located near the bottom of a valley and is the site of the
Olympic cross country skiing and biathlon events (Fig. 1.1f). Both SLC and WMP experience a
relatively large diurnal temperature range. The third is a mid-mountain site located near the top of
the Burns chairlift at Deer Valley Ski Area (DVE, 2235 m, Fig. 1.1e). It is well sheltered and
surrounded by aspen trees. The final verification site is near the start of the women’s downhill
course at the Snowbasin ski area (SBW, 2670 m). Located on a ridge, this is one of the highest
Olympic MM5 MOS sites (Fig. 1.1b). Each of the above sites presents its own forecasting
1. AVN MOS was available beginning 1 November 2000.
Figure 2.3. Regression analysis of 12-hour forecast temperature using the list of predictors in Table 2.1.
challenge but, by spanning a range of elevations and local terrain characteristics, should be
broadly representative of the general performance of MM5 MOS throughout the Olympic region.
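The independent-sample split described at the start of this section can be sketched as follows; the specific month used in the example is illustrative.

```python
from datetime import date

def is_validation_day(d):
    """Days 1-10 of each month are withheld for independent
    verification; equations are developed on the remaining days,
    as described in the text."""
    return d.day <= 10

# Example: January 2001 split into development and validation days.
days = [date(2001, 1, n) for n in range(1, 32)]
valid = [d for d in days if is_validation_day(d)]
train = [d for d in days if not is_validation_day(d)]
```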
Two measures of forecast skill are presented in this thesis: bias error and mean absolute
error (MAE). Bias error is defined as:
Bias = (1/N) Σ_{i=1..N} (F_i - O_i)    (2.4)
where F is the MM5 MOS forecast, O is the observed value, and N is the number of forecasts.
MAE is defined as:
MAE = (1/N) Σ_{i=1..N} |F_i - O_i|    (2.5)
and measures the typical size of MM5 MOS errors. Probability of detection was used to evaluate
wind direction and measures the percentage of forecasts with errors less than or equal to 30°.
Unless otherwise specified, results presented in this thesis represent the bias, MAE, or
probability of detection averaged for all forecast hours 6–36 (every 3 hours) and both 0000 and
1200 UTC forecast cycles. During the 150-day independent data period, 280 MM5, 324 NGM
MOS, and 223-262 MM5 MOS forecasts were available. The range of MM5 MOS forecasts
available is due to the fact that some Olympic sites did not collect data during November,
resulting in fewer forecasts, and in some cases prior observations were not available for use in the
MM5 MOS forecast equation.
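The three verification measures can be sketched as follows. This is our own minimal implementation of the definitions above; the circular-difference handling for wind direction is an assumption about how the 30° tolerance is applied across the 360°/0° boundary.

```python
def bias(forecasts, observations):
    """Mean error (Eq. 2.4): positive values indicate over-forecasting."""
    return sum(f - o for f, o in zip(forecasts, observations)) / len(forecasts)

def mae(forecasts, observations):
    """Mean absolute error (Eq. 2.5): typical size of the errors."""
    return sum(abs(f - o) for f, o in zip(forecasts, observations)) / len(forecasts)

def wind_dir_pod(forecasts, observations, tol=30.0):
    """Probability of detection for wind direction: fraction of
    forecasts whose direction error, taken around the circle,
    is within tol degrees."""
    hits = 0
    for f, o in zip(forecasts, observations):
        err = abs(f - o) % 360.0
        if min(err, 360.0 - err) <= tol:
            hits += 1
    return hits / len(forecasts)
```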
CHAPTER 3
MOS VALIDATION
As described in Chapter 1, this study seeks to answer several questions concerning the use
of MOS for weather forecasting including:
1. How much does MOS improve upon raw model output from a mesoscale model in a
region of complex terrain?
2. How skillful is MOS based on a continuously evolving mesoscale model compared to
that based on a frozen model (e.g., NGM MOS)?
3. When predictors are provided by a state-of-the-art mesoscale model, does including
prior observations and geoclimatic data in MOS equations result in a significant improvement in
skill?
4. Does MM5-MOS account for unusual climatic conditions?
To answer these questions, this chapter presents an evaluation and comparison of the skill of
MM5 MOS, the raw model output (MM5 RAW), NGM MOS, and AVN MOS.
Impact of Predictors on MOS
To examine the impact of prior observations and geoclimatic data on MM5 MOS, Fig. 3.1
presents MAEs for MM5 MOS equations developed using only MM5 model predictors
(MODEL), MM5 model and geoclimatic predictors (GEO), MM5 model and prior observation
Figure 3.1. Verification results of the predictands using the four predictor groups at the four verification sites: (a) temperature (°C) MAE, (b) dewpoint temperature (°C) MAE, (c) relative humidity (%) MAE, (d) wind speed (m s-1) MAE, and (e) probability of detection of wind direction within 30° of observed.
predictors (OBS), and a combination of all three predictors (ALL). The MAEs are based on
validation against an independent data set as described in Chapter 2.
The temperature MAEs for the two valley sites, SLC and WMP, show that ALL and OBS
performed the best, with ALL exhibiting slightly lower MAEs than OBS. MAEs for GEO and
MODEL were much larger than ALL and OBS, with MODEL performing the worst. At the mid to
high elevation sites, DVE and SBW, MAEs for all four predictor groups were very similar, with
MAE values varying by only 0.04 °C at DVE and 0.05 °C at SBW.
For dewpoint, ALL and OBS produced lower MAEs than GEO and MODEL at all four
verification sites (Fig. 3.1b). ALL generally produced slightly better skill scores than OBS,
whereas the worst performance was exhibited by MODEL. The MAEs for relative humidity
showed that, similar to the dewpoint MAEs, ALL and OBS were more skillful than GEO and
MODEL (cf. Figs. 3.1b,c). The MAEs for ALL and OBS were very similar. At WMP, the ALL
group verified slightly better than OBS, with MAEs of 10.22% and 10.25%, respectively, whereas
at SLC, DVE, and SBW, OBS MAEs were slightly lower than those for ALL. Although GEO and
MODEL produced similar results at the four sites, MODEL consistently produced the largest
MAEs.
The magnitude of wind speed MAEs varied from station to station, but the four predictor
groups featured similar MAEs at a given site (Fig. 3.1d). The largest variability was found at SBW,
where the difference between the MAE for OBS and MODEL was only 0.12 m s-1. Thus, the
addition of prior observations and geoclimatic predictors to model predictors appears to have had
little impact for this variable. The results for probability of detection of wind direction at SLC
demonstrate that ALL and GEO performed about the same, while OBS performed the best
(Fig. 3.1e). At WMP, GEO and MODEL performed much as at SLC, with OBS again
performing the best. At DVE, OBS and ALL were similar, but OBS showed the best
results. For SBW, MODEL had the best results, suggesting that the geoclimatic and
observation predictors slightly hampered the raw model output at this site.
The results presented above illustrate that the use of prior observation predictors generally
improves the performance of MOS temperature, dew point, and relative humidity forecasts. A
significant improvement for wind speed and direction forecasts was not observed. This is likely
due to the fact that the observed wind speed and direction in mountainous areas often are variable,
particularly at light wind speeds. Based on these results, it was decided to utilize a combination of
model, observed, and geoclimatic predictors (i.e., ALL) for developing Olympic MOS equations.
Results presented for the remainder of this paper are based on these equations.
Comparison of MM5 MOS to MM5 RAW, NGM MOS and AVN MOS
This section examines two major questions concerning MM5 MOS: how much does it
improve upon the output of a high-resolution mesoscale model in complex terrain, and how
skillful is it relative to NGM MOS and AVN MOS? Since SLC is the only NGM MOS site in the
Olympic region, to answer these questions we compare MM5 MOS MAEs at the four Olympic
validation sites (SLC, WMP, DVE, and SBW) to that of NGM MOS and AVN MOS at SLC.
MM5 MOS and MM5 RAW MAEs at SLC are also compared. Since AVN MOS has only recently
been developed, it was available for only the most recent winter (2000-2001). As a result, Fig. 3.2
shows MAEs based on validation using the full independent data set (no AVN MOS), while Fig.
3.3 shows a validation for only the most recent winter, when AVN MOS was available. AVG MM5
MOS in the figures represents the MAE averaged over all four verification sites.
The temperature MAE shows that at SLC, MM5 MOS improved substantially upon MM5
RAW (Fig. 3.2a). The MM5 MOS MAEs (1.87 °C) were very similar to those of NGM MOS
Figure 3.2. Verification results for the selected predictands using the full independent data set at the four selected sites: (a) temperature (°C) MAE, (b) dewpoint temperature (°C) MAE, (c) relative humidity (%) MAE, (d) wind speed (ms-1) MAE, and (e) probability of detection of wind direction within 30°. MM5 RAW and NGM MOS are for SLC. AVG MM5 MOS represents the average MAE for MM5 MOS at the four verification sites.
Figure 3.3. Verification results for the selected predictands using the full independent data set, but only for the most recent winter (2000-2001), at the four selected sites: (a) temperature (°C) MAE, (b) dewpoint temperature (°C) MAE, (c) relative humidity (%) MAE, (d) wind speed (ms-1) MAE, and (e) probability of detection of wind direction within 30°. MM5 RAW, NGM MOS, and AVN MOS are for SLC. AVG MM5 MOS represents the average MAE for MM5 MOS at the four verification sites.
(1.84 °C), while the AVG MM5 MOS MAEs were slightly better (1.64 °C). Characteristics of the
diurnal temperature cycle, which are related to local topographic features, appear to have a
significant impact on the magnitude of MAEs at the four verification sites. For example, SLC and
WMP observed larger MAEs, which could be due to difficulties forecasting the diurnal
temperature cycle. Lower MAE values are observed at DVE and SBW, which, because of their
higher elevation, typically observe smaller diurnal cycles.
Dewpoint MAEs increased with elevation, with the largest MAEs found at mid to high
elevation sites DVE and SBW (Fig. 3.2b). At SLC, MM5 MOS improved substantially over MM5
RAW, and featured a similar MAE compared to NGM MOS (1.71 °C and 1.72 °C, respectively).
The increase in MAEs with elevation could be the result of lower specific humidity values at
higher elevations, making dewpoint a more sensitive calculation. Figure 3.2c shows that all four
sites observed similar relative humidity MAEs, with SBW featuring the smallest MAE. The
similar relative humidity results at the four verification sites suggest that the variability of
relative humidity with elevation is not as consistent as it is for temperature and dewpoint. MM5
MOS improved significantly over MM5 RAW (10.75% vs. 18.55%). NGM MOS, however,
exhibited slightly better MAEs than MM5 MOS at SLC or AVG MM5 MOS (10.23% vs. 10.75%
and 10.45%, respectively).
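The suggested elevation dependence of dewpoint errors can be illustrated with a quick calculation that is not from the thesis: using the standard inverse Magnus (Bolton) relation between vapor pressure and dewpoint, a fixed absolute error in vapor pressure maps to a larger dewpoint error when the air is dry, as it tends to be at the higher-elevation sites.

```python
# Back-of-the-envelope check of why dewpoint is a "more sensitive
# calculation" at low specific humidity. Inverse Magnus/Bolton constants
# (6.112 hPa, 17.67, 243.5 degC) are standard.
import math

def dewpoint_c(e_hpa: float) -> float:
    """Dewpoint (degC) from vapor pressure (hPa), inverse Magnus form."""
    x = math.log(e_hpa / 6.112)
    return 243.5 * x / (17.67 - x)

for e in (8.0, 2.0):  # moist valley air vs. dry high-elevation air
    d_td = dewpoint_c(e + 0.5) - dewpoint_c(e)  # same 0.5 hPa error in e
    print(f"e = {e:4.1f} hPa -> 0.5 hPa error shifts Td by {d_td:.2f} degC")
```

The same 0.5 hPa perturbation shifts the dewpoint roughly three times as much at 2 hPa as at 8 hPa, consistent with the larger dewpoint MAEs found at DVE and SBW.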
The MAE for wind speed shows larger errors at SLC and SBW, with DVE having the
lowest MAE (0.64 ms-1) (Fig. 3.2d). This could be due to DVE being in a sheltered location where
wind speeds are typically small. At SLC, MM5 MOS (1.66 ms-1 MAE) does not perform much
better than MM5 RAW (1.84 ms-1 MAE); however, MM5 MOS at WMP, DVE, and SBW showed
improved results compared to the raw model output (not shown). Figure 3.2e illustrates the
probability of detection of wind direction within 30 degrees. The MM5 MOS performed slightly
better at SLC than at the other verification sites and was least successful at predicting the wind
direction at SBW (only 34.9% of the time). This could be due to the placement of the SBW
station, which is located on a ridge where winds often swirl and create a difficult forecasting
scenario. Overall, MM5 MOS outperformed MM5 RAW at SLC, and produced slightly better
results than NGM MOS.
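For reference, the wind-direction statistic used above can be computed with a circular difference; the following is a minimal sketch inferred from the description in the text, not the actual verification code.

```python
# Probability of detection for wind direction: the fraction of forecasts
# whose circular angular difference from the observed direction is within
# a tolerance (30 degrees here, as in Figs. 3.1e, 3.2e, and 3.3e).
import numpy as np

def direction_pod(fcst_deg, obs_deg, tol=30.0):
    """Fraction of forecasts within `tol` degrees of the observed direction."""
    fcst = np.asarray(fcst_deg, dtype=float)
    obs = np.asarray(obs_deg, dtype=float)
    diff = np.abs(fcst - obs) % 360.0
    diff = np.minimum(diff, 360.0 - diff)   # wrap: 350 vs. 10 -> 20 degrees
    return float(np.mean(diff <= tol))

print(direction_pod([350, 180, 90], [10, 200, 300]))  # 2 of 3 hits
```

The wrap-around step matters: without it, a 350° forecast against a 10° observation would count as a 340° miss rather than a 20° hit.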
Figure 3.3 presents the validation using only data from the recent winter (2000-2001) to
allow for comparison with AVN MOS. Like the results shown for the full independent dataset
(Fig. 3.2), temperature MAEs were slightly lower at the higher elevation sites than at the valley
sites (Fig. 3.3a). At SLC, the performance of MM5 MOS (1.96 °C MAE) was similar to that of the
NGM MOS (2.03 °C MAE) and AVN MOS (1.84 °C MAE), and improved substantially over
MM5 RAW. For this one-year period, AVG MM5 MOS exhibited a warm bias (not shown), which likely
is related to the fact that the first two years used to develop MM5 MOS were characterized by
above normal temperatures, while the most recent winter was characterized by near-normal
temperatures. Another possible reason for the warm bias was a prolonged temperature inversion at
the valley sites, which will be discussed in more detail later in this chapter.
For dewpoint, the MAE is slightly lower at SLC than at the higher elevation sites, with
MM5 MOS (1.60 °C MAE) at SLC performing worse than AVN MOS (1.55 °C MAE) and NGM
MOS (1.48 °C MAE) (Fig. 3.3b). Of the four verification sites, the MM5 MOS had the largest
error at DVE (2.11 °C MAE). The relative humidity MAE for MM5 MOS (10.46%) at SLC was
about half of what was observed for MM5 RAW (22.93%) (Fig. 3.3c). The other three MM5 MOS
verification sites show similar large improvement over their MM5 raw output (not shown). The
MM5 MOS performed the best at WMP (8.33% MAE), which was not the case when verification
was computed over all winters in Fig. 3.2c. The MM5 MOS (10.46% MAE) demonstrated MAE
values similar to those of the AVN MOS (9.97% MAE) and NGM MOS (11.72% MAE)
products at SLC.
Figure 3.3d illustrates the wind speed MAE with much the same results that are shown in
Fig. 3.2d. The MM5 MOS wind speed MAE (1.49 ms-1) surpassed both the NGM (1.61 ms-1) and
AVN MOS (1.73 ms-1) MAEs. Once again, as observed in Fig. 3.2d, the wind speed MAE values
were lower at WMP (0.63 ms-1) and DVE (0.60 ms-1) than at SLC and SBW. It is assumed that,
if the verification statistics were recalculated after discarding light wind speeds, the results at these
sites would be similar to those at sites with stronger winds. The probability of detection
of wind direction (Fig. 3.3e) was highest at DVE (48.75%) although this was not the case in Fig.
3.2e. Figure 3.3e shows that the wind direction at SBW was the most difficult to predict. The MM5
MOS (38.97%) was a slight improvement over MM5 RAW (35.42%) and the NGM (38.74%) and
AVN MOS (36.87%) products.
The above results illustrate that MM5 MOS exhibits MAEs that are very similar to NGM
MOS and AVN MOS. For some variables at SLC, such as temperature, AVN MOS produced the
lowest MAEs for winter 2000-2001, while for others, such as wind speed, MM5 MOS was most
accurate. Comparison of MM5 MOS errors averaged over the four Olympic verification sites also
suggested comparable performance to NGM and AVN MOS. Thus, it appears that MM5 MOS
provides a product with similar performance characteristics to NGM and AVN MOS, but for a
large number of sites in the Olympic region for which NGM and AVN MOS are not available.
MM5 MOS Comparison by Forecast Hour
Using the full independent data set, Figs. 3.4 - 3.8 show the MAE and bias statistics for
each forecast hour of the 0000 and 1200 UTC MM5 cycles. Viewing the statistics individually for
each model initialization hour reveals diurnal patterns that might exist in the
forecasts. Since it was observed in Figs. 3.2 and 3.3 that the MM5 MOS can improve upon the raw
model output, MM5 RAW MAEs are not shown in Figs. 3.4 - 3.8. However, the MM5 RAW bias
errors are shown in Figs. 3.4 - 3.8 to illustrate how MM5 MOS removed most of the model's
biases. Also shown in Figs. 3.4 - 3.8 are the MAE and bias for NGM MOS at SLC, which allows
for a direct comparison to the MM5 MOS.
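The stratification used in Figs. 3.4 - 3.8 amounts to grouping forecast-minus-observed errors by forecast hour before computing MAE and bias. A sketch of that grouping follows; the errors are synthetic (the station data are not reproduced here), with spread that grows with projection to mimic the gradual error growth noted below.

```python
# Group errors by forecast projection and compute MAE and bias per hour,
# as done for each model cycle in Figs. 3.4 - 3.8. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
fhr = np.repeat(np.arange(0, 37, 3), 60)       # forecast hours 0, 3, ..., 36
err = rng.normal(0.3, 1.0 + 0.02 * fhr)        # forecast minus observed

mae_by_hour = {int(h): float(np.mean(np.abs(err[fhr == h])))
               for h in np.unique(fhr)}
bias_by_hour = {int(h): float(np.mean(err[fhr == h]))
                for h in np.unique(fhr)}

for h in sorted(mae_by_hour):
    print(f"f{h:02d}  MAE={mae_by_hour[h]:.2f}  bias={bias_by_hour[h]:+.2f}")
```

Plotting these per-hour MAE and bias curves for each verification site, separately for the 0000 and 1200 UTC cycles, reproduces the layout of the figures discussed next.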
Figure 3.4 shows the temperature MAE and biases. Close examination of the forecasts
shows a cyclical pattern in the MM5 MOS MAE that seems to correspond to diurnal patterns and
local effects. For the 0000 UTC MM5 runs, the temperature MAE is highest at forecast hour 21,
which is in the afternoon (Fig. 3.4a). For the 1200 UTC cycle, temperature MAEs show little
diurnal signal (Fig. 3.4b). In both cases, gradual error growth with increasing forecast projection
was observed. The bias errors for both 0000 and 1200 UTC reveal that the MM5 MOS does a
good job of correcting the MM5 RAW bias at SLC, removing the model warm bias at night and
cold bias during the day (Fig. 3.4c,d). The largest temperature bias errors occur at WMP
(Fig. 3.4c,d), although they are generally less than 0.5 °C.
The dewpoint MAE (Fig. 3.5) shows a large increase in error with forecast projection.
Results from the 0000 UTC cycle (Fig. 3.5a) also illustrate a rapid increase in
MAE beginning at forecast hour 12 for SBW and after forecast hour 18 for DVE and WMP. The
values at SLC show a smaller increase after forecast hour 15, followed by a larger decrease at forecast
hour 24 and then a sudden rise in the early evening hours. At WMP, the dewpoint bias for forecast
hour 18 of the 1200 UTC cycle shows a 1.5 °C rise, but it drops back to nearly zero at forecast hour
21. Even with further analysis of this forecast equation, it is unclear why this 1.5 °C jump
occurred at forecast hour 18. There are several situations like this that arise with other
Figure 3.4. Temperature (°C) MAE by forecast hour for each verification site for the (a) 0000 UTC cycle and (b) 1200 UTC cycle. Temperature (°C) bias by forecast hour for each verification site for the (c) 0000 UTC cycle and (d) 1200 UTC cycle. MM5 RAW is only shown for the temperature bias.
Figure 3.5. Dewpoint temperature (°C) MAE by forecast hour for each verification site for the (a) 0000 UTC cycle and (b) 1200 UTC cycle. Dewpoint temperature (°C) bias by forecast hour for each verification site for the (c) 0000 UTC cycle and (d) 1200 UTC cycle. MM5 RAW is only shown for the dewpoint temperature bias.
Figure 3.6. Relative humidity (%) MAE by forecast hour for each verification site for the (a) 0000 UTC cycle and (b) 1200 UTC cycle. Relative humidity (%) bias by forecast hour for each verification site for the (c) 0000 UTC cycle and (d) 1200 UTC cycle. MM5 RAW is only shown for the relative humidity bias.
Figure 3.7. Wind speed (ms-1) MAE by forecast hour for each verification site for the (a) 0000 UTC cycle and (b) 1200 UTC cycle. Wind speed (ms-1) bias by forecast hour for each verification site for the (c) 0000 UTC cycle and (d) 1200 UTC cycle. MM5 RAW is only shown for the wind speed bias.
Figure 3.8. Probability of detection of wind direction within 30 degrees by forecast hour for each verification site for the (a) 0000 UTC cycle and (b) 1200 UTC cycle.
predictands and forecast hours that also are unexplained. The large MM5 RAW dry bias at SLC
was largely corrected by MM5 MOS.
The relative humidity statistics are shown in Fig. 3.6. The relative humidity MAE for
MM5 MOS for the 0000 UTC cycle (Fig. 3.6a) reaches a maximum during the warmest part of the
day, then drops until forecast hour 27. This trend is not as evident in the 1200 UTC cycle (Fig.
3.6b). The relative humidity MAE for the MM5 and NGM MOS rise almost linearly after forecast
hour 6 in Figs. 3.6a,b. The rising MAE values level off after forecast hour 18 for the 0000 UTC
cycle (Fig. 3.6a) and after forecast hour 12 for the 1200 UTC cycle (Fig. 3.6b). The relative humidity
bias (Fig. 3.6c,d) seems to show a cyclical pattern much like the temperature bias. The 0000 UTC
cycle (Fig. 3.6c) clearly shows the diurnal pattern, with the pattern most noticeable at SBW and
the NGM at SLC.
The wind speed statistics are shown in Fig. 3.7. The MAEs (Fig. 3.7a,b) exhibit little growth
throughout the forecasts, with WMP having the largest variability. This could be due to errors
from the weak nocturnal slope flows that are usually present there. Once again, an unusual spike
is present at WMP for the bias at forecast hour 15 (Fig. 3.7d). The NGM MOS has the largest
biases, with a tendency to overpredict wind speed. Figure 3.8 demonstrates the probability of
detection of wind direction. The lowest skill occurs at forecast hour 27 for the 0000 UTC cycle
(Fig. 3.8a) and at forecast hour 15 for the 1200 UTC cycle. Both of these times are for the early
evening hours. Otherwise, there is little growth in the probability of detection of winds within 30
degrees.
In summary, temperature, dewpoint, and relative humidity exhibit gradual error growth with
forecast projection, which is due to increased uncertainty in the underlying model at longer forecast
ranges. Wind speed and direction errors tend to be stable throughout the forecasts. The results above
also indicate that MOS can significantly reduce the raw model output biases. The
variability by forecast hour could be due to a number of factors; however, most of the higher errors
for temperature, dewpoint, and relative humidity are observed in the afternoon. This could be
because the decoupling of the boundary layer from the free air is largest at this time of day. At
some of the observed sites wind direction and speed could also influence the errors in the
temperature and moisture variables.
A Challenging Forecast Situation for MOS
MOS has been known to perform poorly under unusual weather patterns (Klein and
Hammon 1975). As shown earlier, MM5 MOS temperature forecasts were generally more
accurate for higher elevation stations. The lower accuracy of MM5 MOS at SLC and WMP was
due partly to a prolonged inversion event that occurred from 26 December 2000 to 11 January 2001.
Because 1-10 January fell within the independent data validation period, only 6 days of this event
were included in the equation development. Figures 3.9 - 3.11 focus on the recent winter season
by computing the temperature MAEs and biases from 1 December 2000 to 12 April 2001. In these
figures, each month's MAEs and biases, as well as the totals, are computed both with and without
the inversion period, which shows the impact the inversion had on MOS
MAEs. Figure 3.9 illustrates the MM5 RAW and MM5 MOS statistics at SLC. The largest MM5
RAW (8.05 °C MAE) and MM5 MOS (3.55 °C MAE) errors are observed during the inversion
period, with MM5 MOS producing better results. The biases during this period also show that
MM5 RAW and MM5 MOS exhibited a substantial warm bias as the model kept removing the
inversion, particularly during the day. The MM5 RAW bias for January was 8.01 °C while MM5
MOS produced a warm bias of 3.31 °C.
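The with/without-inversion comparison in Figs. 3.9 - 3.11 simply recomputes the verification after excluding the 1-10 January 2001 valid dates. A sketch of that exclusion follows; the forecast/observation triples are invented for illustration and the large warm error on the inversion day is contrived to mimic the behavior described above.

```python
# Sketch of excluding the 1-10 January 2001 inversion period from the
# verification. Records are synthetic (valid date, forecast, observation).
from datetime import date

records = [
    (date(2000, 12, 28), -1.0, -3.5),
    (date(2001, 1, 5), 2.0, -6.0),    # inversion day: large warm error
    (date(2001, 1, 20), -4.0, -4.5),
]
inv_start, inv_end = date(2001, 1, 1), date(2001, 1, 10)

def mae(rows):
    """Mean absolute error over (date, forecast, observation) rows."""
    return sum(abs(f - o) for _, f, o in rows) / len(rows)

with_inv = mae(records)
without_inv = mae([r for r in records
                   if not (inv_start <= r[0] <= inv_end)])
print(f"MAE with inversion {with_inv:.2f}, without {without_inv:.2f}")
```

As in the figures, dropping the inversion days lowers the aggregate MAE, isolating how much of the seasonal error total the single event contributed.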
[Figure 3.9 appears here. Annotated totals: MM5 RAW temperature MAE 3.34 °C with / 2.21 °C without the inversion and bias 2.22 °C with / 0.83 °C without; MM5 MOS temperature MAE 1.96 °C with / 1.62 °C without and bias 0.98 °C with / 0.49 °C without.]
Figure 3.9. SLC MM5 RAW temperature (a) MAE and (b) bias, and SLC MM5 MOS temperature (c) MAE and (d) bias. Hourly output (dots), daily average (thin line), and the 10-day average (thick line and value) are shown for each independent test period. The inversion occurred during the period from 1 January - 10 January 2001. The verification was done using the full independent data set and combined the 0000 UTC and 1200 UTC cycles. Total MAE and bias from the selected verification period with and without the inversion period are also included.
[Figure 3.10 appears here. Annotated totals: AVN MOS temperature MAE 1.84 °C with / 1.51 °C without the inversion and bias 0.85 °C with / 0.52 °C without; NGM MOS temperature MAE 2.03 °C with / 1.55 °C without and bias 1.11 °C with / 0.48 °C without.]
Figure 3.10. SLC AVN MOS temperature (a) MAE and (b) bias, and SLC NGM MOS temperature (c) MAE and (d) bias. Hourly output (dots), daily average (thin line), and the 10-day average (thick line and value) are shown for each independent test period. The inversion occurred during the period from 1 January - 10 January 2001. The verification was done using the full independent data set and combined the 0000 UTC and 1200 UTC cycles. Total MAE and bias from the selected verification period with and without the inversion period are also included.
Figure 3.10 shows the AVN and NGM MOS statistics and indicates that the NGM MOS
outperformed the AVN MOS during the 1-10 January inversion period. The MM5 MOS (3.55 °C
MAE) had lower errors during the inversion than the newer AVN MOS (4.13 °C MAE) but not as
low as the NGM MOS (3.21 °C MAE). This could be due to the NGM MOS using a larger sample
size that includes more inversion periods. It was hypothesized that a larger MOS training period
that included more inversions would result in smaller MM5 and AVN MOS errors.
The results in Fig. 3.11 illustrate how MM5 RAW and MM5 MOS performed at WMP.
Just like at SLC, the MM5 MOS at WMP was substantially more accurate than MM5 RAW
during the 1-10 January period. It is believed that, had this inversion not occurred, the
temperature MAE and bias for both verification periods (Figs. 3.2 and 3.3) at SLC and WMP
would be lower and similar to the MAE and bias values observed at SBW and DVE. For example,
removing the 1-10 January 2001 inversion period reduces the overall MAEs from Fig. 3.2
at SLC and WMP to 1.53 °C and 1.79 °C, respectively, similar to the MAEs observed at DVE and
SBW (Fig. 3.2).
Another interesting note is that in April the MM5 MOS (2.20 °C MAE) was
worse than MM5 RAW (1.56 °C MAE). This could be due to the changing of seasons, with the
development data being based on colder weather regimes than are typically observed during April.
Summary
The above results revealed that it is possible to create MM5 MOS guidance from
mesoscale model output that improves upon the raw model output, and that this guidance is useful for point-
specific forecasts on small spatial scales in complex terrain. MM5 MOS also provides forecast
guidance at locations where NGM and AVN MOS are not available. The skill of the MM5 MOS
products was generally similar to that of the NGM MOS and AVN MOS products.
[Figure 3.11 appears here. Annotated totals: WMP MM5 RAW temperature MAE 4.28 °C with / 2.77 °C without the inversion and bias 1.68 °C with / -0.55 °C without; WMP MM5 MOS temperature MAE 1.98 °C with / 1.79 °C without and bias 0.21 °C with / -0.29 °C without.]
Figure 3.11. WMP MM5 RAW temperature (a) MAE and (b) bias, and WMP MM5 MOS temperature (c) MAE and (d) bias. Hourly output (dots), daily average (thin line), and the 10-day average (thick line and value) are shown for each independent test period. The inversion occurred during the period from 1 January - 10 January 2001. The verification was done using the full independent data set and combined the 0000 UTC and 1200 UTC cycles. Total MAE and bias from the selected verification period with and without the inversion period are also included.
CHAPTER 4
SUMMARY AND FUTURE WORK
Summary of Results
This study has evaluated Model Output Statistics (MOS) derived from a mesoscale model
(MM5 MOS) in a region of complex terrain. Verification was performed at four sites that are
representative of other MM5 MOS sites that are important for weather prediction during the 2002
winter Olympics. MM5 MOS equations were developed using MM5 grid point data, surface
observations, and geoclimatic data as predictors. Verification of each group of predictors
demonstrated that MOS equations using only model predictors improve upon the raw model output. The
results further illustrated that the surface observations had a bigger impact upon accuracy than
geoclimatic data. The most accurate results were obtained when MOS equations included
geoclimatic data and surface observations as predictors.
The accuracy of MOS at two lower elevation valley sites, one mid-elevation site, and one high
elevation site was intercompared along with the raw model output (MM5 RAW). This
intercomparison showed that MM5 MOS substantially improved upon the raw model output.
MM5 MOS temperature forecasts were most accurate at the higher elevation sites, which was likely
due to a prolonged inversion and the larger diurnal temperature cycle observed at the lower
elevations. The moisture predictands were similar at all sites, with dewpoint errors being slightly
higher at the mid to high elevation sites. Wind speed and direction errors varied depending on
station location. Where the observed wind speeds were stronger, there were higher errors, while at
sites where the observed wind was variable, the probability of detection was lower for the MM5
MOS products.
The verification also compared the performance of MM5 MOS to NGM and AVN MOS
products at SLC. The performance of MM5 MOS at SLC was similar to that of the operational
NGM and AVN MOS products and in some cases the MM5 MOS products at the other three
verification sites outperformed the NGM and AVN MOS products at SLC. Thus, the primary
advantage of using MM5 MOS is that it provides useful guidance at locations where NGM and
AVN MOS are unavailable.
MM5 MOS, as well as NGM and AVN MOS, was shown to perform poorly during a
prolonged inversion event at the low elevation sites. Nevertheless, MM5 MOS did improve over
the MM5 RAW during the event. MM5 MOS also outperformed AVN MOS products and had
errors similar to that of NGM MOS. It was hypothesized that including more inversion events in
the development sample would have resulted in smaller MM5 MOS errors.
Based on the promising results presented in this thesis, MM5 MOS has been developed
and expanded for use during the upcoming winter season (2001-2002), which encompasses the
2002 Winter Olympic period. The MM5 MOS equations were based on data from all three
winter seasons to allow for a larger training sample. Primary (model, surface observation, and
geoclimatic predictors) and secondary (only model and geoclimatic predictors) equations were
developed for each MM5 MOS site to allow MOS forecasts to be produced if surface observations
are not available. Thirty-two MM5 MOS sites are available for the upcoming winter season
(Table 1.1), including 27 sites in the Olympic forecasting region (Figure 1.1). The addition of
MM5 MOS products should improve forecasts for the five outdoor Olympic venues, as well as
other weather-critical locations, prove beneficial for the Olympic forecast team, make
Olympic events more enjoyable for the estimated 1.5 million spectators coming to Utah in
February 2002, and provide a legacy forecast product for use after the Winter Games.
Future Work
Several different statistical techniques are currently being tested and developed by the
MDL and several other research institutions. This study applied a technique that has been used in
the past (MOS). Even though MM5 MOS seemed to produce accurate results, several
things can still be done to improve the product. Below are several ideas that could be applied
toward improving MM5 MOS.
The MOS technique can account for systematic model biases, as well as reduce the errors
of numerical model output with increasing projection (Klein and Glahn 1974). Because of the
relationship between MOS and the underlying model, changes to the underlying model can reduce
the accuracy of MOS (Allen 2001). In the present study, MOS was developed using output from
an evolving model. Improved results could be obtained in a couple of ways. First, if changes are
made to the model, the original model could be kept in service so that MOS can utilize the longer,
more stable model sample. Second, when changes to the original model are complete, the new
version of the model can be rerun for several years back to develop a longer sample. This sample
could also be improved if the model were run on unique historical cases. This rerunning of the
model has the advantage of creating a developmental database that will also allow the forecaster
to understand how the model reacts to a spectrum of meteorological events.
In addition to the 36 and 12 km resolutions, a new 4 km version of the MM5 was added to
the University of Utah's suite of products in the fall of 2001. With this addition, it will be possible
to test the impact of resolution on MM5 MOS by rerunning the 4 km version of MM5 several
years back to establish a data set to develop MM5 MOS equations. These equations could then be
compared to the 12 km MM5 MOS equations that were used in this study.
One disadvantage of MOS is that the equations are tuned to the sample predictors and
predictands used. For example, if the developmental sample was taken during a relatively dry
period, the equations might not perform as well during a significant wet period. In the study
presented above, much of the data set had warmer than average temperatures and drier than
normal conditions. This seems to have had a small influence on the MAEs and biases from the
four verification sites, as evident from the warm bias. Until a more stable data set is developed
using a wider variety of conditions, these biases will persist.
The above study used only a limited number of predictors compared to the more complex
Eta, MRF, NGM, and AVN MOS products developed by MDL (R. Allen 2001, personal communication). It
would be possible to improve the MM5 MOS forecasts by screening an expanded list of predictors
and developing separate equations for each of the four seasons instead of the two seasons used in
this study. It was shown in this study that geoclimatic predictors do not have much impact on the
MM5 MOS output; however, other predictors (e.g., snow cover) could be easily added to see if
they improve MM5 MOS accuracy.
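The predictor screening suggested above is conventionally done by forward selection, greedily adding the predictor that most reduces the residual error. The sketch below illustrates that procedure with invented data and column indices (none of it is from the thesis), as one hedged way an expanded predictor list could be screened.

```python
# Forward stepwise predictor screening: at each step, add the column that
# most reduces the residual sum of squares of the regression. Synthetic data.
import numpy as np

def forward_select(X, y, max_terms=3):
    """Greedily choose up to max_terms predictor columns of X for y."""
    n, p = X.shape
    chosen, resid = [], y - y.mean()
    for _ in range(max_terms):
        best, best_rss = None, np.sum(resid ** 2)
        for j in range(p):
            if j in chosen:
                continue
            cols = np.column_stack([np.ones(n)] +
                                   [X[:, k] for k in chosen + [j]])
            b, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = np.sum((y - cols @ b) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        if best is None:            # no candidate improves the fit
            break
        chosen.append(best)
        cols = np.column_stack([np.ones(n)] + [X[:, k] for k in chosen])
        b, *_ = np.linalg.lstsq(cols, y, rcond=None)
        resid = y - cols @ b
    return chosen

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 1] - 1.0 * X[:, 4] + rng.normal(0.0, 0.3, 200)
print(forward_select(X, y))         # columns 1 and 4 picked first
```

In practice the stopping rule would be a significance or variance-reduction threshold rather than a fixed term count, and candidate predictors such as snow cover would simply be appended as extra columns of X.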
The University of Utah has received requests to develop MM5 MOS guidance for more
elements and more stations. The first question that arises is whether a sufficient sample of
observational data exists for the elements sought and the site requested. Users occasionally ask for
MM5 MOS guidance at sites with a short observing record. This limits the accuracy of MM5
MOS; if, however, the data are adequate they must be processed to ensure quality and then
archived. Irregularities must be accounted for, and suitable prediction equations must then be derived using
MLR. The above process currently takes considerable time to complete. This process could be
automated so that MM5 MOS can be developed for all MesoWest sites with an adequate data
record. Some type of neural network might also be used to develop new equations using data that
is available only at the time of the forecast. Another improvement would be to implement MM5
MOS into the western Advanced Weather Interactive Processing System (AWIPS) by
making the forecasts available in the same alphanumeric message and binary formats as those of the NGM,
AVN, and MRF MOS products.
Within the next few years, progress in objective weather forecasting will continue using a
combination of numerical and statistical models. Observations and numerical model output are a
perishable commodity. Therefore, considerable planning and organization are necessary to ensure
the development of future MM5 MOS products. It was demonstrated in this study that careful use
of the MOS approach is required in this situation. It is envisioned that a large amount of testing,
development, coordination, and implementation will be done to improve MM5 MOS in the
coming years, and that the Department of Meteorology at the University of Utah will continue to monitor its
performance and provide forecasters with information on the effective use of this guidance.
REFERENCES
Allen, L. R., 2001: Observational data and MOS: The challenges in creating high-quality guidance. Preprints, Eighteenth Conf. on Weather Analysis and Forecasting, Fort Lauderdale, FL, Amer. Meteor. Soc., 322-325.
Carter, G. M., 1986: Moving towards a more responsive statistical guidance system. Preprints, Eleventh Conf. on Weather Analysis and Forecasting, Kansas City, MO, Amer. Meteor. Soc., 39-45.
Carter, G. M., J. P. Dallavalle, and H. R. Glahn, 1989: Statistical forecasts based on the National Meteorological Center's numerical weather prediction system. Wea. Forecasting, 4, 401-412.
Cheng, L., 2001: Validation of quantitative precipitation forecasts during the Intermountain Precipitation Experiment. M.S. thesis, Dept. of Meteorology, University of Utah, 138 pp. [Available from Dept. of Meteorology, University of Utah, 145 South 1460 East Room 209, Salt Lake City, UT 84112.]
Dallavalle, J. P., 1998: A perspective on the use of model output statistics in objective weather forecasting. Preprints, Sixteenth Conf. on Weather Analysis and Forecasting, Norfolk, VA, Amer. Meteor. Soc., 479-482.
Dallavalle, J. P., and M. C. Erickson, 2000: AVN-based MOS guidance - The alphanumeric messages. NWS Technical Procedures Bulletin No. 463, NOAA, U.S. Dept. of Commerce, 9 pp.
Dudhia, J., 1989: Numerical study of convection observed during the Winter Monsoon Experiment using a mesoscale two-dimensional model. J. Atmos. Sci., 46, 3077-3107.
Erickson, M. C., and J. P. Dallavalle, 2000: MRF-based MOS guidance - The alphanumeric messages. NWS Technical Procedures Bulletin No. 460, NOAA, U.S. Dept. of Commerce, 9 pp.
Gibson, C. V., M. E. Gerber, and S. Ruiyu, 1997: Developing MOS equations for Remote Automated Weather Stations. Preprints, Second Symposium on Fire and Forest Meteorology, Phoenix, AZ, Amer. Meteor. Soc., 111-114.
Glahn, H. R., 1985: Statistical weather forecasting. Probability, Statistics, and Decision Making in the Atmospheric Sciences, A. H. Murphy and R. W. Katz, Eds., Westview Press, 289-335.
____, and D. A. Lowry, 1972: The use of model output statistics (MOS) in objective weather forecasting. J. Appl. Meteor., 11, 1203-1211.
Grell, G. A., J. Dudhia, and D. R. Stauffer, 1995: A description of the fifth-generation Penn State/NCAR Mesoscale Model (MM5). NCAR Tech. Note NCAR/TN-398+STR, 122 pp. [Available from UCAR Communications, P.O. Box 3000, Boulder, CO 80307.]
Hong, S.-Y., and H.-L. Pan, 1996: Nonlocal boundary layer vertical diffusion in a medium-range forecast model. Mon. Wea. Rev., 124, 2322-2339.
Horel, J., and Coauthors, 2002: MesoWest: Cooperative mesonets in the western United States. Submitted to Bull. Amer. Meteor. Soc., fall 2001.
Kain, J. S., and J. M. Fritsch, 1990: A one-dimensional entraining/detraining plume model and its application in convective parameterization. J. Atmos. Sci., 47, 2784-2802.
Klein, W. H., B. M. Lewis, and L. Enger, 1959: Objective prediction of five-day mean temperatures during winter. J. Meteor., 16, 672-682.
Klein, W. H., and H. R. Glahn, 1974: Forecasting local weather by means of model output statistics. Bull. Amer. Meteor. Soc., 55, 1217-1227.
Klein, W. H., and G. A. Hammon, 1975: Maximum/minimum temperature forecasts based on model output statistics. Mon. Wea. Rev., 103, 796-806.
Klemp, J. B., and D. R. Durran, 1983: An upper boundary condition permitting internal gravity wave radiation in numerical mesoscale models. Mon. Wea. Rev., 111, 430-444.
Maglaras, G. L., and G. M. Carter, 1986: How to use MOS guidance effectively. Preprints, Eleventh Conf. on Weather Analysis and Forecasting, Kansas City, MO, Amer. Meteor. Soc., 17-22.
Onton, D. J., and W. J. Steenburgh, 2001: Diagnostic and sensitivity studies of the 7 December 1998 Great Salt Lake-effect snowstorm. Mon. Wea. Rev., 129, 1318-1338.
Panofsky, H. A., and G. W. Brier, 1965: Some Applications of Statistics to Meteorology. The Pennsylvania State University, 224 pp.
Sall, J., A. Lehman, and L. Creighton, 2001: JMP Start Statistics. SAS Institute Inc., Duxbury/Thomson Learning, Pacific Grove, CA.
Slemmer, J. W., 1998: Characteristics of winter snowstorms near Salt Lake City as deduced from surface and radar observations. M.S. thesis, Dept. of Meteorology, University of Utah, 138 pp. [Available from Dept. of Meteorology, University of Utah, 145 South 1460 East Room 209, Salt Lake City, UT 84112.]