
MEASURING THE IMPACTS OF ENERGY EFFICIENCY MEASURES IN INSTITUTIONAL BUILDINGS WITH BILLING DATA: A REVIEW OF METHODOLOGICAL ISSUES

Vincent Schueler
Washington State Energy Office

In the past 10 years, the Institutional Conservation Program (ICP) and the Bonneville Power Administration's Institutional Buildings Programs (IBP) have spawned a number of efforts to quantify the impact of energy efficiency retrofits in schools, hospitals, and government buildings using utility billing data. Several methodological strategies have been employed, ranging from control groups to regression modelling. Each has experienced serious difficulty in isolating the effects of energy efficiency programs from factors outside the control of the program.

The first section of this paper reviews the results of 10 impact analyses of institutional building retrofit programs. The second section summarizes methodological complications involving:

• Attrition bias

• Weather normalization

• Isolating program effects from site specific changes in building use, and

• Isolating net program impacts from changes in energy use that would have occurred in the absence of the program

The third section examines the strengths and weaknesses of possible approaches to ameliorating these complications. These include statistical bias reduction, case study analysis, and adjusted engineering estimation.

INTRODUCTION

In the past 10 years, the Institutional Conservation Program and the Bonneville Power Administration's Institutional Buildings Programs have spawned more than a dozen efforts to quantify the impacts of energy retrofits in schools, hospitals, and local government buildings using electricity and fuel consumption data. This paper reviews 10 evaluations in the institutional sector based on statistical analysis of utility records. The analytic methods used for these evaluations were extensions of the model used for conservation retrofits in the residential sector (comparison of the changes in pre- and post-retrofit weather adjusted consumption data). These extensions have not gone smoothly because of the diversity of the institutional building stock, of the measures installed, of energy using equipment, and of the exogenous effects that need to be accounted for. At the same time, treatment groups are smaller and the length of time between pre- and post-installation measurement is often longer. While the techniques used to measure the impacts of institutional retrofit programs have grown considerably more sophisticated since the first national impact evaluation of ICP (Synectics 1983), the problem of measuring and attributing the actual impacts of

Program Evaluation 6. 155


conservation retrofits in institutional buildings remains difficult to resolve.

This will be illustrated in a review of 10 impact evaluations of the ICP and the IBP. Eight of the studies were done as part of the Department of Energy's Institutional Conservation Program, which was initiated in 1979. The ICP provided partial funding for audits and a 50 percent match for energy conservation measures in schools, hospitals, and universities. In most states ICP was targeted towards non-electric measures. A second source of impact evaluations on institutional buildings is the Bonneville Power Administration's Institutional Buildings Program. The IBP was modelled loosely after the ICP. Unlike the ICP, the IBP had higher incentives (75 to 85 percent), paid only for electrical measures, and was open to all local, state, and federal facilities. The list of evaluations reviewed here is not exhaustive, but is representative of major approaches used. It omits studies of retrofits in institutional buildings outside of the ICP and the IBP (Griffen 1985; Gardiner et al. 1984).

The first section describes the range of impact findings for these studies. Several complicated issues affect the direct measurement of program impacts in institutional buildings using billing data. In the second section we examine four of them:

• Sampling and sample attrition: implications of self-selection and non-response bias

• Weather normalization

• Isolating program effects from site specific noise (changes in building use and occupancy)

• Isolating "net program impacts" from changes in energy use that would have occurred in the absence of the program

In the final section, we examine three approaches for resolving these methodological complications: statistical bias reduction, case study analysis, and adjusted engineering estimates.

FINDINGS OF 10 ICP/IBP IMPACT STUDIES

The findings from the 10 studies are compared in Table 1. Direct comparisons of results are complicated by inconsistencies in reporting formats and major differences in analytical approaches. For eight of the studies, the reduction in energy use is reported as the mean percentage change in area-normalized measured energy use or EUI (Energy Use Intensity). The Idaho (1985) study reported the percentage difference in aggregate energy use, and Jackson's (1989) findings are not area-normalized.

The choice of whether changes in the consumption of all fuels or only the targeted fuel are reported also obscures comparisons between studies. The Jackson (1989) and Miller and Vedadi (1989) studies examine only gas consumption; the IBP studies (Davis 1987; Keating and Blachman 1987) focus on electrical consumption; and the remaining ICP studies report changes in total consumption. Thus, the results of Jackson's (1989) analysis of measures implemented to reduce domestic hot water and space heating gas consumption in gas heated schools (a 28 percent reduction) may not be inconsistent with the results of Vine et al. (1988), which looked at all fuels (a 5 percent reduction). The bases used by the two studies are clearly very different.

Schools

While all the studies reported that energy use declined after retrofits, the range of reported savings was large. For schools, measured changes in energy use ranged between 2 and 29 percent. ICP evaluations completed through 1986 on the first three cycles of the program generally reported higher savings than studies completed more recently. Vine et al. (1988) argues that while substantial savings of more than 20 percent may be possible for individual buildings, it is likely that these early studies overstated program results because the samples used in these studies were generally small, included self-selection bias, and did not have control groups. Vine et al. (1988) reported that energy use for both 381 ICP participants and 1,149 non-participants in Minnesota declined by 12 percent between 1978 and 1981, the analysis period for the early studies. This suggests that the high estimates of reduced energy consumption in the earlier studies reflected a systematic decrease in energy use for all institutional


Table 1. Comparison of ICP/IBP Impact Evaluations

Author (Location)          Building    Sample  Reduced        Comments
                           Type        Size    Energy Use

ICP
Synectics (1983)           Schools     96      22% ± NA*      Ten state sample;
(National)                 Hospitals   16      8% ± NA        case analysis not
                                                              weather adjusted

Quintrell (1985)           Schools     23      29% ± 18%      Survey sample;
(Maine)                    Hospitals   10      10% ± 8%       weather adjusted

Idaho Water Res (1985)     Both        41      14% ± NA       Aggregate consumption;
(Idaho)                                                       weather adjusted

Olle and Howlett (1986)    Schools     235     18% ± NA       Weather adjusted
(Wisconsin)                Hospitals   29      13% ± NA       Not weather adjusted

Jackson (1989)             Schools     24      28% ± 19%      Natural gas only;
(Montana)                                                     weather adjusted

Schwartz (1989)            Schools     143     9% ± 20%       First year savings only;
(Ohio)                     Hospitals   44      14% ± 16%      weather adjusted

Miller Vedadi (1989)       Schools             2% ± 4%        Natural gas only;
(Utah)                                                        weather adjusted

Vine et al. (1988)         Schools     381     5% ± 8-14%**   Aggregate analysis;
(Minnesota)                                                   control group;
                                                              weather adjusted

IBP
Keating Blachman (1987)    All                 13% ± 14%      Regression adjustment
(Pacific Northwest)                                           attempted; weather
                                                              adjusted, electric only

Davis (1987)               All                 19% ± 16%      Weighted by pre-
(Washington)                                                  consumption;
                                                              weather adjusted

*  NA = not reported. Otherwise the standard deviation is reported.
** Percentage of noise in data, noise = (maximum - minimum)/average annual
   consumption for the study period.



buildings, which was not captured because of the absence of control groups.

Hospitals

The range of improvements in total EUIs for hospitals (between 8 and 14 percent) was somewhat narrower than that reported for schools. Although the magnitude of percentage changes in hospital EUIs reported in Table 1 were smaller than those for schools, absolute energy savings in hospital buildings were greater than those in school buildings because of higher base EUIs for hospitals (Synectics 1983; Schwartz 1989; Olle and Howlett 1986; Davis 1987).

Changes in total hospital EUIs are also masked by increasing electric loads, which were generally not targeted in ICP. In three ICP studies where electric and non-electric EUIs were examined separately, electric use either decreased at a much lower rate than other fuels (Quintrell 1986) or increased. In the Olle and Howlett (1986) and Schwartz (1989) studies, electricity use increased by 6 and 7 percent respectively over the 3 years studied. Olle and Howlett and Schwartz linked the increased electric loads to loads from new electric equipment added to keep pace with changes in equipment.

Persistence of Savings

Of the 10 studies, only Olle and Howlett (1986), Vine et al. (1988), and Schwartz (1989) used longitudinal data. Each looked at savings several years after installation. All three found that savings eroded. Zenger and Glenn (1988) found that savings for Utah ICP participants did not persist: non-weather corrected energy savings for 115 Utah ICP participants declined from 9.6 to 3.3 percent of base use between the first and third years after installation.

An alternate performance index is the ratio of actual changes in energy use to savings predicted by engineering estimates. The five studies reviewed in Table 2 indicate that on the average between 59 and 80 percent of the predicted change in energy use materialized in measured data. For individual buildings, engineering estimates were poor predictors of actual changes in energy use. Vine et al. (1988) found a range of ratios from -500 percent to +450 percent and a standard deviation of ±157 percent for 198 schools, after outliers were removed. Davis (1987) reported a range of -88 percent to +528 percent and a standard deviation of 113 percent in actual/predicted savings ratios for 40 Washington IBP participants. These findings were consistent with Olle and Howlett (1986) and Jackson (1989).

Birdsall's (1987) analyses of a national sample of 120 ICP Technical Assistance studies suggests that the poor agreement between predicted and actual savings is not primarily a function of poor engineering calculations, but a function of problems in ECM design, installation, and operation coupled with unanticipated non-ECM-related changes in energy use. Since this study looked only at the engineering calculations and did not reexamine the actual buildings, it was less clear whether the buildings were properly modelled or appropriate measures were recommended. However, in a reanalysis of 39 IBP projects that did involve a site visit, Kunkle (1989) showed that when non-ECM changes in building energy use were factored out of consumption data, the average ratio between predicted and actual energy use increased from 58 to 72 percent.
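The "ratio of means" and "mean of ratios" comments in Table 2 describe two different ways of aggregating realization rates, and the two can diverge substantially when individual building ratios are as noisy as those reported above. A minimal sketch with made-up numbers (not from any of the studies reviewed):

```python
# Two ways to aggregate actual vs. predicted savings across buildings.
# The values below are made up for illustration; they are not from the
# studies in Table 2.
actual = [120, 40, 15]      # measured annual savings per building (MMBtu)
predicted = [150, 80, 10]   # engineering (audit) estimates (MMBtu)

# Ratio of means: total actual over total predicted; weights large
# buildings more heavily.
ratio_of_means = sum(actual) / sum(predicted)

# Mean of ratios: average of per-building realization rates; each
# building counts equally, so small buildings with extreme ratios
# can dominate.
ratios = [a / p for a, p in zip(actual, predicted)]
mean_of_ratios = sum(ratios) / len(ratios)

print(f"ratio of means: {ratio_of_means:.0%}")   # 175/240, about 73%
print(f"mean of ratios: {mean_of_ratios:.0%}")   # (0.80+0.50+1.50)/3, about 93%
```

The 20-point gap here comes entirely from the small third building; the same effect plausibly contributes to the spread of means in Table 2.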

FOUR ISSUES IN ANALYSIS

Four issues were identified in the review: attrition bias, weather normalization, site specific confounding changes in energy use, and net impacts.

Table 3 summarizes the methodological strategies employed to address weather correction; controlling for site specific non-ECM changes in energy use; isolating net impacts; and whether consumption (reported by institutions), operation and facility data (reported in the TAS), ECM installation, and site specific changes were verified. The specific techniques are discussed in the subsequent text.

Sample Attrition and Non-Response Bias

The samples in the impact studies analyzed were not generally random or large. They were determined by the availability and cleanliness of the data. As


Table 2. Comparison of Actual and Predicted Energy Savings

Author                       Study       Median  Mean  Comment

Olle and Howlett (1986)      Wisconsin   NA      66%   Ratio of means
Jackson (1989)               Montana     NA      77%   Ratio of means
Keating and Blachman (1987)  IBP - PNW   41%     59%   Mean of ratios
Davis (1987)                 IBP - WA    69%     82%   Mean of ratios
Vine et al. (1988)           Minnesota   NA      72%   Mean of ratios

Table 3. Summary of Methodological Approaches

Study                  Weather          Control Site  Net         Data
                       Normalization    Changes       Impacts     Verified

ICP
Synectics (1983)       None             None          No          Site visit
(National)
Quintrell (1985)       Relative ratio   None          No          No
(Maine)
Idaho Wtr. (1985)      Relative ratio   None          Survey      No
(Idaho)
Olle Howlett (1986)    Relative ratio   Screen        Survey      Yes - some
(Wisconsin)                                                       site visits
Jackson (1989)         Custom           Screen        No          Yes
(Montana)              regression       outliers
Schwartz (1989)        Relative ratio   Screen        No          Yes - some
(Ohio)                                                            site visits
Miller Vedadi (1989)   Regression       Screen        No          No
(Utah)
Vine et al. (1988)     Normal ratio,    Screen        Control     No
(Minnesota)            aggregate                      group

IBP
Keating and Blachman   Normal ratio     Screen        Regression  Yes - phone
(1987) (Pacific NW)
Davis (1987)           Regression       Screen        No          Yes - phone
(ENACT)



shown in Table 4, sample losses of 40 to 60 percent (before building/ECM changes are accounted for) were common. With building/ECM changes accounted for, attrition rates were as high as 75 percent. Sample losses were classified in three general areas for the studies reviewed.

The first screen is for a complete set of energy use data. A minimum of 1 year of pre- and 1 year of post-retrofit data was required for the studies. Both the ICP and the IBP relied on self-reported consumption data. These institutions did not have a strong motivation for providing accurate and complete data. Sample points were also lost because of long lead times for measure installation. Consequently, most analyses are heavily weighted to the earliest participants in the program.

Once a complete set of utility records was available, a second series of screens for utility data problems was typically used. These problems include master metering, uncertain timing of measure installation, reporting and recording errors, bulk fuel deliveries, and outliers. Retrofits in more complex buildings are affected most. For example, college and university retrofits were not examined in any of the state ICP studies because these facilities typically have central heating plants and are not metered at the building level.

A final set of screens was used in five of the studies to identify case-specific changes in the buildings and/or their use or operation that would mask the effects of the measures (e.g., renovations, heating plant change-outs).
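The three successive screens described above behave as a filtering pipeline in which each stage removes part of the sample frame. A minimal sketch (record fields and screen criteria are hypothetical, not from any of the studies):

```python
# Sketch of the sequential sample screens described in the text.
# Record fields and thresholds are hypothetical, not from any study.
buildings = [
    {"id": 1, "months_pre": 12, "months_post": 12, "master_metered": False, "major_renovation": False},
    {"id": 2, "months_pre": 8,  "months_post": 12, "master_metered": False, "major_renovation": False},
    {"id": 3, "months_pre": 12, "months_post": 12, "master_metered": True,  "major_renovation": False},
    {"id": 4, "months_pre": 12, "months_post": 12, "master_metered": False, "major_renovation": True},
]

def survival(sample, screens):
    """Apply each screen in order, reporting the surviving fraction of the frame."""
    frame = len(sample)
    for name, keep in screens:
        sample = [b for b in sample if keep(b)]
        print(f"after {name}: {len(sample) / frame:.0%}")
    return sample

screens = [
    # Screen 1: at least a year of pre- and post-retrofit billing data.
    ("data availability", lambda b: b["months_pre"] >= 12 and b["months_post"] >= 12),
    # Screen 2: utility-data problems such as master metering.
    ("data cleanliness", lambda b: not b["master_metered"]),
    # Screen 3: site-specific changes that would mask measure effects.
    ("site changes", lambda b: not b["major_renovation"]),
]

final = survival(buildings, screens)
print([b["id"] for b in final])   # only building 1 survives all screens
```

Because each screen removes a different kind of building, the survivors are not a random subsample, which is the source of the attrition bias discussed next.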

The attrition rates in Table 4 may result in biased estimates of performance. The effects of attrition bias are not well understood, nor have they been addressed in the evaluations reviewed. Of the 10 studies reviewed, only Davis (1987) and Vine et al. (1988) compared the final sample to attrition groups. The comparisons were limited to building and measure types, and average floor areas. Both found evidence that attrition selected towards less complicated buildings and measures, which might bias results downward.

In addition to measures and building types, the comparison of sample and attrition groups should


include an examination of behavioral factors (e.g., sophistication and commitment of the client with respect to energy management). Several studies have established a strong link between the energy management capabilities of an institution and the success of energy conservation measures (Kunkle 1989; Synectics 1983; and Duerr Evaluation Resources 1987).

Weather Normalization

Weather normalizing consumption data in institutional buildings is problematic. There is no commonly accepted, validated standard normalization method. Early ICP studies either did not normalize for weather or simply divided total fuel use by annual heating degree days in the pre- and post-retrofit periods (relative ratio method). Normal ratio (or "slash and burn") methods use a multiplier based on the ratio of long-term average heating degree days to heating degree days for a particular year. When HDD prorations are applied to total fuel use, which typically includes uses not dependent on weather (e.g., lighting and equipment loads), ratio methods overcorrect. This is a particular problem in buildings with large non-weather-sensitive loads (e.g., hospitals). The overcorrection can be minimized by applying the ratio only to space heating fuels (where there is a separate space heat fuel) or to an estimated space heating fraction of total energy use (if end-use distributions are available).
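The ratio adjustment and its overcorrection problem can be sketched as follows (all numbers are illustrative, not drawn from the studies):

```python
# Normal-ratio weather adjustment: scale consumption by the ratio of
# long-term average heating degree days to the HDD observed in the
# billing year. All numbers are illustrative.
hdd_normal = 5000.0   # long-term average annual HDD for the site
hdd_actual = 4500.0   # HDD in the (mild) billing year
total_use = 10_000.0  # annual fuel use, MMBtu
heat_fraction = 0.6   # estimated space-heating share of total use

ratio = hdd_normal / hdd_actual

# Naive: prorate ALL fuel use, including weather-independent loads
# (lighting, equipment) -- this overcorrects in a mild year.
naive = total_use * ratio

# Better: apply the ratio only to the estimated space-heating fraction.
adjusted = total_use * heat_fraction * ratio + total_use * (1 - heat_fraction)

print(f"naive: {naive:.0f} MMBtu")       # naive: 11111 MMBtu
print(f"adjusted: {adjusted:.0f} MMBtu") # adjusted: 10667 MMBtu
```

In this example the naive proration inflates the baseload by the full 11 percent weather ratio; the larger the non-weather-sensitive share, the larger the spurious adjustment.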

Because of stringent data requirements, inability to deal with bulk-delivered fuels, and confounding seasonal effects (e.g., summer break in schools), regression methods have been used sparingly for weather normalizing utility data in institutional buildings. When using regression based weather normalization techniques in institutional samples there is a trade-off between ensuring reliability of the normalization (based on the quality of linear fit (R²) and the standard error) and maintaining sample sizes. To maintain reasonable sample sizes, the two authors who did use regression techniques to normalize weather effects (Miller and Vedadi 1989; Jackson 1989) accepted R² values as low as .60 (well below the standard used for residential retrofits, R² > .90). An alternative to dropping cases from the sample that do not meet reliability tests is to


Table 4. Sample Survival Rates for IBP/ICP Impact Analyses

                                           Percent After Screen For
Author                    Bldng    Sample  Data     Data    Site
                          Type     Frame   Avail.   Clean   Changes

Synectics (1983)          Mixed    150     96%      75%     N/A
(National)
Quintrell (1985)          School   68      62%      40%     N/A
(Maine)                   Hosp.    19      84%      58%     N/A
Olle and Howlett (1986)   Schools  354     66%              N/A
(Wisconsin)               Hosp.    54      54%              N/A
Vine et al. (1988)        Schools  500     75-91%   45-67%  N/A
(Minnesota)
Miller and Vedadi (1989)  Schools  306     52%      22%     19%
(Utah)
Jackson (1989)            Schools  60      40%
(Montana)
Schwartz (1989)           Schools  229     69%
(Ohio)                    Hosp.    61      72%              N/A
Keating and Blachman      Mixed    111     63%      46%
(1987 - Pacific NW)
Davis (1987)              Mixed    121     33%
(Washington)

weather correct only those cases that meet certain criteria.
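The regression approach and its reliability screen can be sketched as a simple baseload-plus-HDD fit (the monthly data and the cutoff value here are illustrative):

```python
# Sketch of regression-based weather normalization with a reliability
# screen: fit monthly use against heating degree days, and keep the
# site only if the fit is good enough. Data and cutoff are illustrative.
hdd = [820, 700, 560, 300, 120, 10, 0, 5, 90, 310, 560, 760]   # monthly HDD
use = [95, 84, 70, 44, 26, 15, 14, 15, 22, 45, 68, 90]         # monthly MMBtu

n = len(hdd)
mx, my = sum(hdd) / n, sum(use) / n
sxx = sum((x - mx) ** 2 for x in hdd)
sxy = sum((x - mx) * (y - my) for x, y in zip(hdd, use))
slope = sxy / sxx              # MMBtu per HDD (weather-sensitive load)
intercept = my - slope * mx    # baseload (non-weather-sensitive)

ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(hdd, use))
ss_tot = sum((y - my) ** 2 for y in use)
r2 = 1 - ss_res / ss_tot

R2_CUTOFF = 0.60   # the reviewed studies accepted fits this weak
if r2 >= R2_CUTOFF:
    hdd_normal = 4500.0   # long-term annual HDD (illustrative)
    normalized_annual = intercept * 12 + slope * hdd_normal
    print(f"R2 = {r2:.2f}, normalized use = {normalized_annual:.0f} MMBtu")
else:
    print(f"R2 = {r2:.2f}: dropped from sample")
```

The trade-off in the text is visible in the cutoff: raising `R2_CUTOFF` toward the residential standard of .90 improves the reliability of each normalization but drops more buildings from the sample.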

Site Specific Changes in Building Use

All 10 studies analyzed reported problems isolating the effects of the measures from background fluctuations in energy use not related to the program. The first ICP study (Synectics 1983) stated that changes in use interact with energy conservation so that true results are virtually impossible to determine. Vine et al. (1988) noted that "Variations in annual energy use for a given institution can be quite large and mask the energy savings of ECMs."

Vine et al. (1988) calculated the percentage of noise (measured as the difference between maximum and minimum annual energy use for a period, divided by average annual use over that period) in consumption data for 255 schools after they had been retrofitted. They reported that the most probable range of year to year variation in energy use was 8 percent to 14 percent (actual distributions were much broader), and therefore unless measure savings



were at least 15 percent to 25 percent, their effects might be masked. In a comparison of engineering estimates of predicted savings to estimates of site specific non-ECM changes in consumption for 57 of Washington's largest (in grant size) IBP projects, Kunkle (1990) showed that the magnitude of non-measure changes in energy use was comparable to predicted savings in half the buildings studied.
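Vine et al.'s noise index is a simple spread-over-mean calculation; a sketch with illustrative numbers:

```python
# Vine et al.'s noise index for a single building: the spread of annual
# energy use divided by its average. Values below are illustrative.
annual_use = [980, 1050, 905, 1010]   # annual MMBtu over four years

average = sum(annual_use) / len(annual_use)
noise = (max(annual_use) - min(annual_use)) / average
print(f"noise = {noise:.0%}")  # (1050 - 905) / 986.25, about 15%

# A measure whose predicted savings fall below this band is unlikely
# to be detectable in this building's billing data.
```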

The range of possible site specific confounding effects in institutional buildings is wide. They include changes in the building use, operation, and physical characteristics, and installation of other measures. These confounding factors are often not immediately discernible without an on-site visit. Since the likelihood of significant change increases with time, the problem of site-specific confounding effects is particularly acute in longitudinal studies.

Two strategies are available for controlling for site specific changes in building use and operation. The more common method is to screen out cases where significant changes are apparent. This screening, even if applied vigilantly, contributes to sample attrition. A second, related problem is that screening may not be done systematically. Screening relies on self-reports by participants who are not trained to recognize what really matters. When confounding cases are identified through outlier analysis, an assumption is made that cases that lie within the expected range of values (where energy use changes are close to what was predicted) can be left in because the measures are performing as expected, when in fact there may be off-setting changes. Unless sources of confounding effects are identified and eliminated for all cases, close agreement between actual and predicted savings may be the result of non-ECM changes. Without a consistent set of rules for screening cases, there is a risk of biasing results towards the analyst's expectations.

The second strategy, used by Vine et al. (1988), compares changes in total EUIs over multi-year periods before and after installation rather than building-by-building comparisons. Separate distributions of consumption in the pre- and post-installation periods are developed and compared. This approach works best when consumption is normalized for square footage and time. If samples are large and the distribution of non-measure site specific changes in energy use is random, aggregate analysis may help to average out the noise and ameliorate the effects of outliers in the analysis.

Isolating "Net Impacts"

Of the 10 studies reviewed, only Keating and Blachman (1987) and Vine et al. (1988) addressed net impacts. In addition, the Olle and Howlett (1986) and the Idaho Department of Water Resources (1985) studies estimated free-ridership using a simple survey. The most successful attempt was the Vine et al. (1988) study of Minnesota schools, which used a control group. In this study 381 Minnesota ICP participants were compared to 1,149 non-participants. Because both the participant and control samples were large and had comparable distributions by floor area, enrollment, and EUI, there was reasonable confidence that the control group was a valid one. The availability of a large and well specified control group in Minnesota was unique. As penetration rates for ICP and state initiatives for institutional buildings increase, it will be more difficult to find control groups.
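The control-group logic amounts to subtracting the background change observed in non-participants from the participants' gross change, a difference-in-differences comparison. A minimal sketch with made-up numbers (not from the Minnesota study):

```python
# Net impact via a comparison group: subtract the background change in
# energy use (seen in non-participants) from the participants' change.
# Numbers are illustrative, not from the Minnesota study.
part_pre, part_post = 100.0, 83.0   # participants' mean EUI (kBtu/sqft)
ctrl_pre, ctrl_post = 100.0, 95.0   # non-participants' mean EUI

gross_change = (part_pre - part_post) / part_pre   # reduction among participants
background = (ctrl_pre - ctrl_post) / ctrl_pre     # would have happened anyway
net_program_effect = gross_change - background

print(f"gross: {gross_change:.0%}, net: {net_program_effect:.0%}")  # gross: 17%, net: 12%
```

Without the control group, the 5 percent background decline (of the kind Vine et al. documented for all Minnesota institutional buildings between 1978 and 1981) would be credited to the program.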

In programs targeted to several types of institutional buildings (as in the case of IBP), specifying and matching defensible control groups is much more difficult. Because of this, Keating and Blachman (1987) developed an alternative approach that involved regressing non-ECM factors (e.g., energy prices) on changes in post-installation energy consumption. Ultimately this approach was discarded because of difficulty operationalizing key variables and capturing site specific changes in the regression model. At present, practical methods for isolating net impacts in the institutional sector in other than the most homogeneous samples are not well developed.

THREE ALTERNATIVES

Because of the volatility and diversity of institutional buildings, the challenges in using utility billing data in impact evaluations are formidable. Three methods for addressing these challenges are discussed.


Statistical Bias Reduction

There are statistical and econometric techniques available to address problems with attrition bias and truncated samples. If the nature of the bias is known, samples can be stratified and program savings can be adjusted with weighted averages. However, Blasnik (1989) notes that in using this technique to address attrition bias there may be problems properly identifying the true nature of the bias and, if a control group is used, applying a parallel method to the control sample.

Econometric techniques have been developed to estimate self-selection bias, which by extension could be used to address attrition bias. Typically, these methods involve developing a model to estimate the probability of participating in a program and incorporating the probability term into a multivariate regression model of energy savings. While these techniques have been successfully applied to commercial and industrial programs (see Violette and Ozog), they require large samples and extensive data on characteristics differentiating the attrition group from the study sample. These data are often not available.

Calibrated Engineering Estimates

An engineering estimate of potential savings is a prerequisite for most retrofits in institutional buildings. By themselves, engineering estimates are not reliable for determining actual energy savings because of the inability of calculation techniques to capture complex interactions between measures and building systems and the difficulty of anticipating future changes in building use and operation that may affect the estimate. Engineering estimates also tend to overstate energy savings because they estimate technical potential: the savings assuming close to perfect installation and operation of the measures. Two approaches are available for making better use of engineering estimates.

One approach, used by the U.S. Department of Energy to estimate national savings for the program, is to reanalyze a subsample of engineering studies to identify any systematic biases and use this information to develop correction factors for estimates. In the U.S. Department of Energy study, the base buildings described in a sample of 100 technical studies were reanalyzed and calibrated to utility data. Measure savings were re-estimated and correlated with the original energy savings estimated in the technical studies to derive the correction factor. As this process did not involve a site visit to verify audit measurements, measure performance, or identify changes in building use and occupancy, the engineering estimates are only being corrected for biases in the calculations themselves. However, it is possible to extend this general methodology to develop correction factors for other biases (e.g., discounting energy savings for measure classes that require extensive human interaction, such as energy management systems). This type of correction for energy savings is valid only for aggregate data and not for correcting estimates for a specific institution. It is a useful technique in the absence of comprehensive measured data.

Engineering models can also be used in conjunction with utility data to re-estimate energy savings in buildings where major changes obscure measured savings. In this process, the pre-retrofit building is modeled and calibrated to utility billing data and local weather. The building is also modelled with long-term average weather to develop a ratio for weather normalization. A post-retrofit model reflecting ECM changes is developed and compared to the pre-retrofit model. The final model can be adjusted by changing model parameters to describe actual measure installation and operation, and facility characteristics in the post-retrofit period, to reconcile the modelled energy savings with measured data. The adjusted savings estimates can then be statistically aggregated as needed. The modeling and calibration process is technically complicated and time intensive. A software package, the Retrofit Energy Savings Estimation Model (RESEM), is being developed and field tested to automate this process (Carroll et al. 1989).

While it holds some promise, this is an untested approach; calibrated engineering models have not yet been applied and validated on a large scale. Also, data requirements can be high. At a minimum, it requires a site visit and a simple walk-through audit to identify changes in use and occupancy schedules, building operation schedules, function,

Page 10: MEASURING THE IMPACTS OF ENERGY EFFICIENCY MEASURES ... · MEASURING THE IMPACTS OF ENERGY EFFICIENCY MEASURES IN INSTITUTIONAL BUILDINGS WITH BILLING DATA: A REVIEW OF METHODOLOGICAL

equipment, fuel switching, lighting, HVAC equip­ment, building envelope, and floor spaces A finalconcern is that applying and calibrating engineeringmodels to actual buildings relies heavily on thejudgment of the modeler. Because the modelingprocess remains as much of an art as a science,ensuring consistent application of adjustedengineering estimates in different studies may bedifficult
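The stratify-and-reweight adjustment described under statistical bias reduction can be sketched in a few lines: the surviving sample is post-stratified, and each stratum's mean savings is weighted by that stratum's share of the original participant population rather than its share of the survivors. The building types, counts, and savings figures below are hypothetical.

```python
# Illustrative sketch: adjust mean program savings for attrition bias by
# post-stratifying the surviving sample and reweighting each stratum to
# its share of the original participant population. Hypothetical data.

def stratified_savings(strata):
    """strata: list of (population_count, [observed_savings...]) tuples,
    one per stratum (e.g. building type). Returns the population-weighted
    mean savings, using each stratum's population share as its weight."""
    total = sum(n for n, _ in strata)
    adjusted = 0.0
    for n, observed in strata:
        stratum_mean = sum(observed) / len(observed)
        adjusted += (n / total) * stratum_mean
    return adjusted

# Example: schools are over-represented among attrition survivors, so a
# naive mean over all surviving cases would overweight them.
schools   = (40, [120.0, 100.0, 110.0, 130.0])   # kBtu/ft2 savings
hospitals = (60, [60.0, 80.0])                   # few survivors

naive = sum(schools[1] + hospitals[1]) / 6           # simple mean: 100.0
weighted = stratified_savings([schools, hospitals])  # 0.4*115 + 0.6*70 = 88.0
```

The gap between the naive and reweighted figures is exactly the attrition bias this technique removes, which is why misidentifying the true strata, as Blasnik (1989) warns, misstates the correction.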
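The correction-factor approach to engineering estimates reduces, in its simplest form, to a realization rate: compare measured and predicted savings in the reanalyzed subsample, then deflate the remaining raw estimates by the resulting ratio. This is a minimal sketch with hypothetical figures, not the Department of Energy's actual procedure.

```python
# Illustrative sketch: derive a correction factor (realization rate) from
# a reanalyzed subsample, then apply it to the raw engineering estimates
# for the rest of the program. All figures are hypothetical.

def realization_rate(measured, predicted):
    """Ratio of total measured to total predicted savings in the audited
    subsample; values below 1.0 mean the audits overstated achievable
    savings (e.g. by assuming close to perfect installation)."""
    return sum(measured) / sum(predicted)

# Audited subsample: billing-data savings vs. original audit estimates.
measured  = [850.0, 400.0, 600.0]    # MMBtu/yr, from consumption data
predicted = [1000.0, 800.0, 700.0]   # MMBtu/yr, from audit calculations

rate = realization_rate(measured, predicted)   # 1850 / 2500 = 0.74
program_estimate = 40_000.0                    # raw engineering total
corrected = rate * program_estimate            # 29,600 MMBtu/yr
```

The method stands or falls on whether the reanalyzed subsample is representative; a subsample drawn from the same biased survivors discussed earlier simply transfers the attrition problem into the correction factor.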

Getting Site-Specific Data

Before savings can be claimed, the analyst must have some assurance that the measures installed are operating as expected, as well as an understanding of what other factors may be influencing changes in energy use. Without this knowledge the analyst may be reduced to divining performance in the entrails of consumption data, because the samples can potentially be biased and true program effects are likely to be obscured by uncontrolled effects.

Case study analysis was devised to address this issue. Case studies are not new: the Synectics study used a case study procedure, and site visits were done independently as part of evaluations of the ICP in Maine (Quintrell 1985) and Utah (Zenger and Glenn 1988). See Kunkle (1989) for a description of the protocols used in a case study analysis of Washington IBP participants. The relationship between impact and case study analyses is similar to the relationship between surveys and focus groups in market research: case studies develop in-depth information on a small number of cases, and therefore are invaluable for gaining insights and developing hypotheses about what does and does not work. Few large-scale case studies have been conducted because of cost considerations. The Washington State case studies required an investment of 40 to 80 hours to complete each case. As samples are small or are selected to highlight successes or failures, findings on energy savings are not statistically generalizable to the program as a whole.

Although case studies are not practical for large-scale verification of program effects, statistical analyses of billing data can be strengthened with project-specific data obtained from site visits. Although some information may be available from project files and engineering studies (for example, floor areas, space heat fuels, installed measures), data should be verified on-site to ensure reliability. In addition, the following data collection is recommended:

• Collect energy use data directly from utilities. To facilitate weather normalization, obtain end-use distributions by fuel type.

• Inspect measures at least 1 year after installation to verify installation dates and whether they have been installed and are being operated as expected.

• Include an assessment of the energy management capability of participating facilities. Identify non-program investments in energy efficiency.

• Develop and document criteria for screening sample points with major building and energy use changes (for example, renovations, non-program energy efficiency investments).

• Compare basic building characteristics, measure types and costs, and energy management sophistication in the final and screened samples to identify potential sources of bias. It is also instructive to compare early and late program participants.
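The weather-normalization step behind the first recommendation can be sketched with a simple degree-day regression: fit monthly consumption against heating degree days (HDD) for the pre- and post-retrofit periods, then evaluate both fits at a common typical-year HDD total. This is a stripped-down version of the weather-adjustment models used in residential billing analysis; the consumption and degree-day figures are hypothetical.

```python
# Illustrative sketch of weather normalization with billing data:
# regress monthly consumption on heating degree days for the pre- and
# post-retrofit periods, then compare both models under typical-year
# weather. All figures are hypothetical.

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (base load a, HDD slope b)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    a = ybar - b * xbar
    return a, b

def normalized_annual_use(hdd, use, typical_hdd):
    """Annualized consumption at typical-year weather: 12 months of
    base load plus the HDD slope times the typical-year HDD total."""
    a, b = fit_line(hdd, use)
    return 12 * a + b * typical_hdd

# Monthly HDD and consumption (e.g. hundreds of therms) by period.
hdd      = [900, 700, 500, 200, 50, 0, 0, 0, 100, 400, 700, 850]
pre_use  = [5.0 + 0.01 * h for h in hdd]    # base load 5, slope 0.01
post_use = [4.0 + 0.007 * h for h in hdd]   # lower base load and slope

typical_hdd = 4500.0
pre  = normalized_annual_use(hdd, pre_use, typical_hdd)   # 60 + 45 = 105
post = normalized_annual_use(hdd, post_use, typical_hdd)  # 48 + 31.5 = 79.5
savings = pre - post                                      # 25.5
```

Evaluating both periods at the same typical-year weather is what separates retrofit savings from a mild winter; the end-use distributions by fuel type indicate how much of each fuel should respond to degree days at all.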

CONCLUSIONS

Because of the diversity and volatility in the institutional sector, evaluations of institutional retrofit programs have yielded inconsistent results and methodological complications involving attrition bias, weather normalization, isolating net effects, and controlling for site-specific confounding changes in energy use. Although some of these analytical complications can be ameliorated through statistical bias reduction techniques (assuming sufficient data and sample sizes), or bypassed through adjusted engineering estimates (assuming a good model), neither approach is completely satisfactory in the situation most analysts will face--small, very heterogeneous samples. At this point there are no easy solutions for the problem of statistically isolating net savings. Yet despite these complications, empirical measurement of energy savings using statistical analysis of consumption data is necessary to demonstrate that energy efficiency works in institutional buildings. These analyses can be strengthened considerably with project-specific data obtained from site visits.

REFERENCES

Birdsall, B., B. Lebot, R. Kammerud, and B. Hatfield. 1987. An Examination of Technical Audit Calculations for Schools and Hospitals. LBL-23537, Lawrence Berkeley Laboratory, Berkeley, California.

Blasnik, M. 1989. "Attrition Bias in Fuel Savings Evaluations of Low Income Energy Conservation Programs." In Energy Program Evaluation: Conservation and Resource Management - The Fourth International Conference in Chicago. Chicago, Illinois.

Carroll, W., B. Birdsall, R. Hitchcock, and R. Kammerud. 1989. RESEM: An Evaluation Tool for Energy Retrofits in Institutional Buildings. LBL-27207, Lawrence Berkeley Laboratory, Berkeley, California.

Davis, B. 1987. Institutional Buildings Program in Washington State: An Analysis of Program Impacts. WAOENG 87, Washington State Energy Office, Olympia, Washington.

Department of Energy. An Estimate of Aggregate Energy Savings Due to the ICP Program. DOE/SF-H2, U.S. Department of Energy, Washington, D.C.

Duerr Evaluation Resources. 1987. An Assessment of Program Elements Necessary for the Long-Term Effectiveness of School Energy Management Programs. California Energy Commission.

Gardiner, M., and J. Harris. 1984. "Measured Results of Energy Conservation Retrofits in Non-Residential Buildings: An Update of the BECA-CR Data Base." In Doing Better: Setting An Agenda for the Second Decade, Volume D: New and Existing Commercial Buildings. American Council for an Energy Efficient Economy, Washington, D.C.

Energy Management in California's Public Institutions: An Assessment of the Schools and Hospitals Program. California Energy Commission.

Idaho Department of Water Resources. 1985. Evaluation of the Schools/Hospitals Program Cycles 1 and 2. Boise, Idaho.

Jackson, M. 1989. Evaluation of Energy Conservation Measures in Montana Public Schools. Montana Department of Natural Resources, Helena, Montana.

Keating, K., and S. Blachman. 1987. "In Search of An Impact: An Evaluation of an Institutional Buildings Program." In Energy Conservation Program Evaluation: Practical Methods, Useful Results - Proceedings from the Third International Conference, Volume 2. Chicago, Illinois.

Kunkle, R. 1989. "Case Study Analysis of the IBP in Washington State." In Energy Program Evaluation: Conservation and Resource Management - The Fourth International Conference in Chicago. Chicago, Illinois.

Kunkle, R. 1990. Case Study Analysis of the IBP in Washington State: Final Report. Washington State Energy Office, Olympia, Washington. In process.

Miller, J., and A. Vedadi. 1989. "Evaluation of the Utah Institutional Conservation Program: Preliminary Results." In Energy Program Evaluation: Conservation and Resource Management - The Fourth International Conference in Chicago. Chicago, Illinois.

Olle, M., and J. Howlett. 1986. Energy Management in Wisconsin Schools and Hospitals: 1985 ICP Performance Review. Wisconsin Division of State Energy and Coastal Management, Madison, Wisconsin.

Quintrell, J. 1985. Evaluation of the Institutional Conservation Program in Maine. Office of Energy Resources, Augusta, Maine.

Schwartz, B. 1989. An Impact Evaluation of the State of Ohio's Institutional Conservation Program. Ohio Department of Development, Office of Energy Conservation.



Synectics Group. 1983. An Evaluation of the Institutional Conservation Program: Results of On-Site Analysis, Final Report. DOE/CS64571-1, U.S. Department of Energy, Washington, D.C.

Vine, E., B. Hatfield, B. Lebot, R. Kammerud, and W. Carroll. 1988. Energy Use in Minnesota Schools: Aggregate Performance and Perspectives on Energy Savings. LBL-24052, Lawrence Berkeley Laboratory, Berkeley, California.

Violette, D., and M. Ozog. 1989. "Correction for Self-selection Bias: Theory and Application." In Energy Program Evaluation: Conservation and Resource Management - The Fourth International Conference in Chicago. Chicago, Illinois.

Zenger, J., and M. Glenn. 1988. Tier 1 Final Report. Utah State Energy Office/Idaho Department of Water Resources.