
STATE OF CALIFORNIA Edmund G. Brown Jr., Governor

PUBLIC UTILITIES COMMISSION
505 VAN NESS AVENUE
SAN FRANCISCO, CA 94102-3298

Commission Staff Report

Lessons Learned From Summer 2012 Southern California Investor Owned Utilities' Demand Response Programs

May 1, 2013

Performance of 2012 Demand Response programs of San Diego Gas and Electric Company and Southern California Edison Company: report on lessons learned, staff analysis, and recommendations for 2013-2014 program revisions in compliance with Ordering Paragraph 31 of Decision 13-04-017.


ACKNOWLEDGEMENT

The following Commission staff contributed to this report:

Bruce Kaneshiro
Scarlett Liang-Uejio
Tim Drew
Rajan Mutialu
Dorris Chow
Paula Gruendling
Taaru Chawla
Jennifer Caron
Alan Meck


TABLE OF CONTENTS

EXECUTIVE SUMMARY

Chapter 1: Introduction
I. 2012 Summer Reliability and Demand Response Programs
II. Energy Division November 16, 2012 Letter and the Staff Report

Chapter 2: Demand Response Program Load Impact
I. Summary of Staff Analysis and Recommendations
II. Different DR Load Impact Estimates
III. Comparison of DR Daily Forecast and Ex Post Results
IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)

Chapter 3: Demand Response Program Operations
I. Summary of Staff Analysis and Recommendations
II. 2012 DR Program Trigger Criteria and Event Triggers
III. DR Events vs. Peaker Plant Service Hours
IV. Peaker Plant Comparison
V. Conclusions

Chapter 4: Residential Demand Response Programs
I. Summary of Staff Analysis and Recommendations
II. Residential Peak Time Rebate (PTR)
III. Residential Air Conditioning (AC) Cycling

Chapter 5: Non-Residential Demand Response Programs
I. Summary of Staff Analysis and Recommendations
II. Background and Summary of Utility Data
III. Commercial Air Conditioning (AC) Cycling
IV. SCE's Auto DR
V. SDG&E's Demand Bidding Program (DBP)

Chapter 6: Flex Alert Effectiveness
I. Summary of Staff Analysis and Recommendations
II. Background
III. Utility Experience with Flex Alert
IV. Customer Experience
V. The Future of Flex Alert
VI. DR Program Ex Post Load Impact Results on the Flex Alert Days

Chapter 7: Energy Price Spikes
I. Summary of Staff Analysis and Recommendations
II. Definition of Price Spikes
III. DR Programs and Price Spikes
IV. Conclusion

Chapter 8: Coordination with the CAISO
I. Staff Recommendations
II. DR Reporting Requirements in Summer 2012
III. DR Reporting Requirements for 2013-2014

Appendix A: Highlight of 2012 Summer Weather & Load Conditions
Appendix B: Energy Division November 16, 2012 Letter
Appendix C: Descriptions of DR Load Impact Estimates
Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW)
Appendix E: SCE 2012 DR Program Load Impact by Event (MW)
Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW)
Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)
Appendix H: SCE 2012 DR Program Overview
Appendix I: SDG&E DR Program Overview
Appendix J: SCE Historical DR Event Hours
Appendix K: SCE Historical Number of DR Events
Appendix L: Summary of SCE's Reasons for the 2012 DR Triggers
Appendix M: SDG&E Historical DR Event Hours
Appendix N: SDG&E Historical Number of DR Events
Appendix O: Utilities' Peaker Plant Total Permissible vs. Actual Service Hours
Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days
Appendix Q: CAISO Energy Price Spikes
Appendix R: Utilities' Demand Response Reporting Requirements
Appendix S: Additional Information


EXECUTIVE SUMMARY

This report is prepared by Energy Division in compliance with Ordering Paragraph 31 of D.13-04-017. The purpose of this report is to provide the lessons learned from the 2012 Demand Response (DR) programs operated by San Diego Gas and Electric Company (SDG&E) and Southern California Edison Company (SCE) (Utilities), and to recommend program or operational revisions, including continuing, adding, or eliminating DR programs. The report's highlighted conclusions and recommendations appear below; the individual chapters contain the complete set of recommendations.

In summary, Energy Division makes the following overarching conclusions about the Utilities' DR programs:

Forecast vs. Ex Post: While a few DR programs met or even exceeded their daily forecast when triggered, on average the ex post results for all program events diverge from the daily forecast by a considerable degree. The majority of programs either provided a 'mixed' performance (the program both over- and under-performed relative to its forecast) or were poor performers (consistently coming up short relative to their forecasts). Of particular note are the Utilities' Peak Time Rebate program1 and SCE's Summer Discount Plan.2 (Chapter 2)

The divergence between the ex post results and the daily forecasts can be traced to a variety of causes, such as inadequate forecasting methods employed by the Utilities, program design flaws, non-performance by program participants, and/or program operations. A complete explanation of the reasons for divergence across all programs, however, was not possible within the scope and timing of this report. (Chapter 2)

2012 RA vs. Ex Post: Comparing the ex post results to the 2012 Resource Adequacy (RA) forecast is not a good indicator of how well a DR program performs. RA forecasts are intended for resource planning needs. Ex post load impacts reflect demand reductions obtained in response to operational needs at the time the program is triggered. Resource planning and operational planning have different conditions and serve different purposes. (Chapter 2)

DR vs. Peaker Plants: The Utilities used their DR programs fewer times and hours than the programs' limits (each program is limited to a certain number of hours or events). In contrast, the Utilities dispatched their peaker power plants far more frequently in 2012 in comparison to 2006-2011 historical averages. (Chapter 3)

Energy Price Spikes: DR programs are not currently designed to effectively mitigate price spikes in the CAISO's energy market. On many days a DR event was called and no price spikes occurred, and conversely there were days where price spikes occurred and DR events were not called. The timing and scope of this report did not permit a quantification of the cost of unmitigated price spikes to ratepayers, but in theory, avoidance of these spikes would benefit ratepayers. (Chapter 7)

1 SCE's marketing name for Peak Time Rebate is "Save Power Day"; SDG&E calls it "Reduce Your Use".
2 Air conditioning (AC) cycling.

Energy Division also makes the following program-specific conclusions about the Utilities' DR programs:

SCE's AC Cycling Program Forecasting: SCE's 2012 forecasting methodology for its air conditioning (AC) Cycling program (the DR program that SCE triggered the most in 2012) cannot be relied upon to effectively predict actual program load reductions. (Chapter 2)

SCE's AC Cycling Dispatch Strategy: SCE's sub-group dispatch strategy for its AC Cycling Program (also called Summer Discount Plan) created adverse 'rebound' effects, thereby reducing the effectiveness of the program during critical hot weather days, e.g., 1-in-10 weather. (Chapter 2)

SDG&E's Demand Bidding Program: SDG&E's Demand Bidding Program produced on average 5 MW of load reduction when triggered, although the US Navy did not participate. The US Navy claimed certain program terms and conditions precluded it from participating in the 2012 program. The Commission's decision to modify the program to a 30-minute trigger may further limit the US Navy's ability to participate. (Chapter 5)

Peak Time Rebate Awareness: SCE and SDG&E customers who received utility notification of Peak Time Rebate (PTR) events had higher awareness of the program when compared to customers who were not notified by the utility. More importantly, customers who opted into receiving PTR alerts significantly reduced load. All other customers in the program provided minimal load reduction. (Chapter 4)

Peak Time Rebate Free Ridership: The Utilities' PTR program has a potentially large 'free ridership' problem, where customers receive incentives without significantly reducing load. SCE paid $22 million (85% of total PTR incentives in 2012) in PTR bill credits to customers whose load impact was not considered for forecast or ex post purposes. 94% of SDG&E's 2012 PTR incentives ($10 million) were paid to customers who did not provide significant load reduction. The inaccuracy of the settlement methodology (in comparison to the ex post results) is the main reason for the 'free ridership' problem. The default nature of the program (everyone is automatically eligible for the incentives) aggravates the problem. (Chapter 4)

Flex Alert: There is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign. Attribution of savings from Flex Alert is complicated by the fact that load reduction from the Utilities' DR programs on the two days Flex Alert was triggered in 2012 contributed to reduced system peak load. A load impact evaluation of Flex Alert is planned for 2013. (Chapter 6)

DR Reports: The Utilities' DR daily and weekly reports were useful to the CAISO and the Commission for purposes of up-to-date monitoring of DR resources throughout the summer. (Chapter 8)

In light of the above findings, Energy Division recommends the following:

DR Evaluation: The Commission should require further evaluation of Utility DR program operations in comparison to Utility operation of peaker plants for the purpose of ensuring Utility compliance with the Loading Order. (Chapter 3)

Forecast Methods Generally: The Utilities' daily forecasting methods for all DR programs (especially AC cycling and other poor performers) should undergo meaningful and immediate improvements so that the day-ahead forecasting becomes an effective and reliable tool for grid operators and scheduling coordinators. (Chapter 2)

Forecasting for SCE's AC Cycling Program: SCE should improve forecasting methods for its residential AC Cycling Program with input from agencies and stakeholders. SCE should also pilot more than one forecasting method for the program in 2013. (Chapter 2)

Forecasting for SDG&E Programs: SDG&E's forecasting methods for its AC Cycling Program (Summer Saver) could be improved by running a test event and by including a correlation variable that accounts for customer fatigue. SDG&E's Capacity Bidding Program forecasting could be improved by including a weather variable. (Chapter 2)

SCE's Outreach for Commercial AC Cycling: Through its outreach and marketing efforts, SCE should clearly communicate the new features of its commercial AC cycling program to avoid customer dissatisfaction and dropout. (Chapter 5)

Auto DR: Future studies are necessary to explore the load impacts of Auto DR. (Chapter 5)

SDG&E's Demand Bidding Program: SDG&E should work collaboratively with the US Navy to design a program to meet the unique needs of the Navy. Key attributes to consider are a day-ahead trigger, aggregation of 8 billable meters, and a minimum bid requirement of 3 megawatts (MW). (Chapter 5)

Peak Time Rebate Design Changes: The Utilities' residential PTR program should be changed from a default program to an opt-in program, so that bill credits are paid only to customers who opt in. (Chapter 4)

SCE's AC Cycling Dispatch Strategy: SCE should reconsider its current strategy of calling groups of residential AC cycling customers in sequential one-hour cycling events. Alternatively, if SCE retains its current strategy, it should modify the program's incentive structure so that customers who are willing to have their AC units cycled for an entire event (as opposed to just one hour) are compensated more than those who can tolerate only one hour of cycling. (Chapter 4)

DR Reports: The Utilities (and Pacific Gas & Electric) should submit daily and weekly DR reports to the CAISO and the Commission for the summers of 2013 and 2014. They should follow the same format and data requirements as in the 2012 reports, unless otherwise directed by the Commission or Commission staff. (Chapter 8)


Chapter 1: Introduction

I. 2012 Summer Reliability and Demand Response Programs

San Onofre Nuclear Generating Station (SONGS) Units 2 and 3 were taken out of service in January 2012. By March 2012, the Commission determined that the outage of SONGS' two units could extend through summer 2012. Working closely with the Governor's Office, the California Independent System Operator (CAISO), and the California Energy Commission (CEC), the Commission took immediate mitigation actions to ensure that the lights stayed on in California with the loss of 2,200 MW of capacity provided by SONGS.3

In addition to adding new generation resources,4 an important action was to further incorporate the Utilities' Demand Response (DR) programs into the CAISO's contingency planning and daily grid operations during the summer. This included mapping the Utilities' DR programs to grid contingency plans and developing new daily and weekly DR reporting requirements. The Commission also moved swiftly to approve three new DR programs for summer 2012: SDG&E's Peak Time Rebate (PTR) for commercial customers and Demand Bidding Program (DBP); and SCE's 10-for-10 conservation program for non-residential customers.5

Because of the intensive interagency mitigation effort and relatively cool weather, California grid reliability was not compromised in spite of the SONGS outage. Nevertheless, southern California experienced several heat waves in August and September, with the highest temperature reaching 109°F in SDG&E's service area and 100°F for SCE on September 14.6 The CAISO issued two Flex Alerts: on August 10 and 14. The Utilities triggered all of their DR programs at least once and some on multiple occasions.

Throughout the summer, Energy Division (ED) staff monitored the Utilities' DR program events on a daily basis and provided weekly briefings to the Governor's Office, the CAISO, and the CEC. Staff observed that, for many event days, the load impact forecasts provided by the Utilities to the CAISO and the Commission in their daily DR reports were inconsistent with the results submitted seven days after each event (referred to as the "7-Day report"). In some cases, the Utilities reported much lower load reduction results than they originally forecasted. In addition, load impact forecasts provided by the Utilities throughout the summer were lower than the capacity counted for the 2012 Resource Adequacy (RA) Requirement. This raised a question as to whether the Commission might have overestimated DR load impact for RA purposes or, rather, whether the Utilities might have under-utilized their DR programs.

Sometime in mid-summer, the Utilities began to experience price spikes in the CAISO's wholesale energy market. Questions were raised as to whether the DR programs could be used to mitigate price spikes and, if so, whether they should be.

3 http://www.songscommunity.com/value.asp
4 Retired Huntington Beach Units 3 and 4 were brought back on line temporarily.
5 Resolutions E-4502 and E-4511.
6 A 1-in-10 (or 10% probability) weather condition in any given year.


Some of the Utilities' DR programs were triggered for as many as 23 events over the five summer months, and many were triggered on two or three consecutive days. Appendix A highlights the DR program load impact on the three hottest days and the three days when SDG&E and SCE experienced the highest system peak load. Staff observed that SDG&E's system peak correlated with temperature and the biggest DR load reduction happened on the hottest day.

SCE's system peak load, on the other hand, did not consistently correlate with weather. SCE's system load reached its annual peak at a temperature of 90°F, 10°F cooler than the hottest day in its service territory. Counter-intuitively, the DR program load impact delivered on a cooler day was actually higher than the amount delivered on the hottest day. This led to questions about how the Utilities make decisions to trigger DR programs and whether aspects of the customers' experience, such as expectations and fatigue, have an effect.

In August, the CAISO issued two Flex Alerts when it determined there was a reliability risk due to insufficient supply to meet demand. As expected, the Utilities triggered relatively large amounts of DR on both days. The CAISO reported that the actual peak load was significantly lower than its hours-ahead forecasts and attributed the load drop to the Flex Alert events. This parallel dispatch situation raises important questions regarding the effectiveness of the Flex Alert when overlapped with the Utilities' DR program events and how customers perceived these statewide alerts versus local utility DR notifications.

Based on the above experience, the Commission concluded that staff should evaluate DR program performance and other lessons learned in order to seek answers to these and other questions. Such lessons could help the Commission determine the extent of DR program reliability and usefulness and, in turn, the extent to which DR resources can be counted on in CAISO markets and operations.

II. Energy Division November 16, 2012 Letter and the Staff Report

On November 16, 2012, the Energy Division sent a letter (Energy Division Letter) to the Utilities directing the Utilities to 1) file an application proposing DR program improvements for 2013 and 2014 to mitigate the SONGS outage and 2) provide data and responses to a set of questions on lessons learned from 2012 DR programs. The questions were developed based on the Utilities' 2012 demand response experience and fell into six categories:

1. DR Program Performance, which includes load impact and program operations,
2. CAISO Market, covering price spikes and market analysis,
3. Customer Experience,
4. Coordination with the CAISO and Utility Operations,
5. Emergency DR Program Dispatch Order, and
6. Flex Alert Effectiveness.

The Energy Division Letter is attached in Appendix B of this report.


On December 21, 2012, the Utilities filed separate applications for the approval of the DR program revisions for 2013 and 2014.7 The Utilities submitted data and responses to the questions attached to the Energy Division Letter and to subsequent assigned Administrative Law Judge (ALJ) rulings for developing the record.8 Decision (D.) 13-04-017 approved certain DR program improvements for 2013-2014 and directed the Commission staff to develop a report on the lessons learned from the DR programs in 2012.

This report is based on a snapshot of data and studies available at the time (e.g., ex post load impact data, utility responses to Energy Division data requests). Ongoing and future evaluations (e.g., the Flex Alert load impact analysis per D.13-04-021) will shed further light on the issues raised in this report.

One point of emphasis in this report is the extent to which the current DR programs delivered their forecasted savings when they were triggered by the utilities. It is important to understand that there is a range of factors that can affect whether a program delivers its forecasted savings targets. Some of these factors can be controlled through good program design, operation, and forecasting methodologies. Other factors that can impact program performance are exogenous or outside the utilities' control, such as temperature, participant enrollment fluctuations, and behavioral or technological changes by the participants.

While this report contains certain findings and recommendations for DR programs, we caution against sweeping conclusions or generalizations about DR programs based on this report. The point of this report is to find ways to improve existing DR programs so that they are more useful to grid operators, utilities, ratepayers, and participants.

7 A.12-12-016 (SDG&E) and A.12-12-017 (SCE).
8 On January 18, 2013 and February 21, 2013.


Chapter 2: Demand Response Program Load Impact

I. Summary of Staff Analysis and Recommendations

SCE

Most of the program event ex post results diverge from the daily forecast by a considerable degree. The daily forecast should be more consistent with the ex post results in order for the day-ahead forecasting to be valid and useful for grid operators. Staff recommends that the daily forecasting methods for all programs undergo meaningful and substantial improvements, including more thorough and transparent documentation and vetting through relevant agencies and stakeholders.

The Summer Discount Plan (Residential AC Cycling) program forecasting methods in particular require review by a broad panel of agencies and stakeholders. Staff also recommends that SCE pilot more than one forecasting method and conduct interim protocol-based load impact evaluations to identify the most reliable forecasting methods throughout the 2013 summer season.

SCE should also be required to address Summer Discount Plan program operation issues before the 2013 summer peak season begins, if possible. Specifically, the strategy of calling groups of customers for sequential one-hour cycling events, rather than calling all the customers for the duration of the full event (or other potential strategies), needs to be reconsidered before the program is further deployed. As discussed in detail later in this chapter, this strategy resulted in load increases during the latter hours of events, thereby reducing the overall effectiveness of the program.

SDG&E

Similar to SCE, many of SDG&E's program event ex post results also diverge from the daily forecast by a considerable degree. The Demand Bidding Program daily forecast was accurate and reliable in predicting ex post results, while the Summer Saver and Capacity Bidding Day-Ahead and Day-Of program daily forecasts did not accurately or reliably predict ex post results. The Peak Time Rebate Residential daily forecast was not accurate in predicting ex post results, consistently overestimating them by approximately 80%. The Critical Peak Pricing and Base Interruptible program forecasts did not accurately or reliably predict ex post results, but consistently under-predicted ex post load impacts. Due to a weak price signal and inelastic customer demand, the PTR commercial program ex post results were not significant. The CPP-E was discontinued as of December 31, 2012.

Staff recommends (1) including only customers that opt in to receive e-mail or text alerts in the PTR residential daily forecast model; (2) running a test event to measure the percent load impact per customer in order to improve CPP daily forecast estimates; (3) including a correlation variable in the Summer Saver daily forecast model to account for customer fatigue during successive event days; and (4) including a weather variable in the CBP daily forecast model in order to have parity with the ex post regression model. A sketch of recommendation (4) follows.
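To illustrate recommendation (4), the sketch below fits a day-ahead load impact model that includes a weather (temperature) term, which is the kind of parity with the ex post regression that staff has in mind. The data arrays, variable names, and single-variable model form are illustrative assumptions only; the Utilities' actual daily forecast and ex post models are more detailed.

```python
# Illustrative sketch, not SDG&E's actual model: a day-ahead Capacity Bidding
# Program forecast regression that includes a weather variable. The impact and
# temperature values below are hypothetical placeholders.
import numpy as np

impact_mw = np.array([8.5, 10.2, 12.0, 9.1, 11.4])   # hypothetical past event impacts (MW)
temp_f = np.array([88.0, 93.0, 97.0, 90.0, 95.0])     # forecast daily max temperature (degF)

X = np.column_stack([np.ones_like(temp_f), temp_f])   # intercept + weather term
coef, *_ = np.linalg.lstsq(X, impact_mw, rcond=None)  # ordinary least squares fit

# Day-ahead forecast for a 96 degF day under this toy model.
forecast = coef @ np.array([1.0, 96.0])
print(f"day-ahead CBP impact forecast at 96 degF: {forecast:.1f} MW")
```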


II. Different DR Load Impact Estimates

DR program load impacts are forecasted or estimated at different times for different purposes. The following table summarizes the five different DR load impact estimates that are discussed in this chapter. Detailed descriptions and methodologies for each DR program measurement are provided in Appendix C.

Table 1: DR Load Impact Estimates

Ex Ante for RA (e.g., 2012 RA)
General Description: A year-ahead monthly ex ante load impact potential attributed to each individual program under a 1-in-2 weather condition.
Purpose: To determine the RA counting against the Load Serving Entity's system and local capacity requirements.

Daily Forecast
General Description: The Utilities' daily estimate of hourly load impact from DR programs during an event period.
Purpose: To provide the CAISO, CPUC, and CEC the hourly MW provided by DR programs on each event day.

7-Day Report
General Description: The Utilities' preliminary estimate of hourly load reduction results from each triggered DR program.
Purpose: To report to the CAISO the load reduction data from the triggered DR programs seven days after each DR event.

Ex Post Results
General Description: The Utilities' most accurate measurement of the load impact results from all of the DR programs triggered in a year. The ex post results are calculated using comprehensive regression models.
Purpose: To report to the CPUC the actual results of the DR events.

Settlement
General Description: A measurement of customers' load reduction from their specific reference load using a baseline method.
Purpose: To calculate customers' incentive payments for billing purposes.

In this proceeding, the Utilities provided the above DR load impact estimates for their DR programs, which are shown in Appendices D to G.
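As a concrete illustration of how a baseline-based settlement differs from a regression-based ex post estimate, the sketch below settles one customer's event-day usage against a simple average-of-recent-days baseline. The baseline rule, the credit rate, and the usage figures are illustrative assumptions, not the Utilities' adopted settlement methods.

```python
# Illustrative sketch only: a baseline-method settlement for a single customer.
# The "highest 3 of the last 5 non-event days" rule and the $0.75/kWh credit
# rate are hypothetical; actual tariff baselines and rates differ by program.

def baseline_kwh(recent_non_event_days_kwh, n_days=3):
    """Reference load: average of the customer's n highest recent non-event days."""
    highest = sorted(recent_non_event_days_kwh, reverse=True)[:n_days]
    return sum(highest) / len(highest)

def settlement_credit(baseline, event_usage_kwh, credit_per_kwh=0.75):
    """Credit paid on measured reduction below baseline; no penalty for increases."""
    reduction = max(baseline - event_usage_kwh, 0.0)
    return reduction * credit_per_kwh

recent_days = [3.1, 2.8, 3.4, 2.9, 3.0]          # kWh during the event window on prior days
base = baseline_kwh(recent_days)                  # about 3.17 kWh
credit = settlement_credit(base, event_usage_kwh=2.0)
print(f"baseline {base:.2f} kWh, credit ${credit:.2f}")
```

Because an individual baseline can sit well above a customer's true counterfactual load, a customer can earn a credit without changing behavior, which is the 'free ridership' issue discussed in Chapter 4.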

III. Comparison of DR Daily Forecast and Ex Post Results

A. Overall Program Performance

The following section draws on data provided by the Utilities on March 4, 2013,9 in response to the February 21, 2013 ALJ ruling, which compares event-day forecasts (daily forecast or day-ahead forecast) to the event-day ex post load reduction estimates. Detailed data and methodological descriptions relevant to this chapter are provided in Appendices C and G. Subsequent to its March 4 filing, SCE updated its ex post results for some of the DR program events in its April 2 Load Impact Report but did not update its March 4 filing accordingly. However, in most cases, the April 2, 2013 updated ex post results are even lower than the March 4 preliminary data (e.g., for AC cycling). Therefore, if the updated data were used, it would further support staff's findings.

9 SCE-03 and SDG&E-03.


On average, the ex post results for all program events diverge from the daily forecast by a considerable degree. While some program events were forecasted more accurately and consistently than others, Energy Division staff's overall conclusion is that the daily forecasting methods for all programs require meaningful and immediate improvements in order for the day-ahead forecasting to become an effective and reliable tool for grid operators.

Some of the divergence between the ex post results and the daily forecast estimates can possibly be explained by inadequate program design and program operations. This section focuses on the observed differences between the ex post and the daily forecast with an eye towards identifying improvements for day-ahead forecasting, and thus does not cover all potential program improvements. Furthermore, many program design and operational improvements that could lead to better ex post results may not be evident by simply inspecting the daily forecast and ex post data.

The ex post analysis methods are guided by Commission-adopted load impact protocols10 and the study results are carefully documented in reports prepared by independent consultants managed by SCE staff. However, there are currently no comparable standards and processes guiding the methods for daily forecasting. Indeed, during the course of preparing this report, Energy Division staff became aware that the day-ahead forecasting methods are far from transparent, and in some cases lack the robust analysis that is expected of the Utilities. These problems may be somewhat understandable, however, since the daily reports were only formally instituted in 2012.

While this report is highly critical of the implementation of the day-ahead forecasting, it is important to recognize that the 2012 DR events as a whole did indeed reduce participants' loads, and some of the program load reductions were consistent with or better than the day-ahead forecast. To that end, staff has categorized the demand response programs into three categories (good, mixed, and poor performance) based on how well the program events performed relative to the day-ahead forecasts.

SCE

Programs that performed well yielded load impacts that were consistent with or better than the day-ahead forecast. The Base Interruptible Program (BIP) and the Day-Of Capacity Bidding Program events produced load reductions that were on par with the forecasts. It is worth noting, however, that BIP, the single largest program, was triggered on only one occasion in 2012, and this was a test event.

Program events with mixed performance were not consistent with the day-ahead forecast, but sometimes exceeded the forecast. Staff includes the Day-Ahead Capacity Bidding, Demand Bidding, and Residential Summer Discount Plan program events in this category because these program events did indeed occasionally exceed the day-ahead forecasts by a significant margin. These programs are discussed in greater detail elsewhere in this section and report. While considered to be mid-performing programs, they do have many important issues that deserve attention.

10 Decision 08-04-050.


Program events that were consistently below the forecast are considered to be poor-performing programs. All of the Critical Peak Pricing, Peak Time Rebate, Demand Response Contracts, Commercial Summer Discount Plan, and Agricultural Pumping Interruptible program events triggered during 2012 produced load reductions that were lower than forecasted.

Table 2: SCE's DR Overall Performance
(Daily Forecast and Ex Post are averaged MW over all events; Difference and % give the range from low to high.)

Programs | No. of DR Events | Daily Forecast | Ex Post | Difference | %

Good Performance:
Capacity Bidding Program - Day Of | 14 | 12 | 16 | > 2 | > 17%
Base Interruptible Program | 1 | 514 | 573 | 59 | 12%

Mixed Performance:
Capacity Bidding Program - Day Ahead | 12 | 0.08 | 0.03 | -0.29 to 0.08 | -315% to 86%
Demand Bidding Program | 8 | 84 | 76 | -33 to 16 | -40% to 21%
Summer Discount Plan (AC Cycling) Res. | 23 | 280 | 184 | -603 to 92 | -100% to 58%

Poor Performance:
Critical Peak Pricing | 12 | 50 | 37 | < -5 | < -11%
Peak Time Rebate | 7 | 108 | 20 | < -11 | < -11%
Demand Response Contracts | 3 | 230 | 148 | < -70 | < -34%
Summer Discount Plan (AC Cycling) Com. | 2 | 5 | 3 | -2 | -35%
Agricultural Pumping Interruptible | 2 | 48 | 21 | < -19 | < -52%

SDG&E

Utilizing the same criteria used for evaluating SCE's DR programs, the Base Interruptible Program and the Critical Peak Pricing Program were categorized as good performers; the Capacity Bidding Day-Ahead, Capacity Bidding Day-Of, Demand Bidding, and Summer Saver (AC Cycling) programs were categorized as mixed performers; and the Critical Peak Pricing Emergency and residential Peak Time Rebate programs were categorized as poor performers. As stated above, DR program design and operation characteristics also need to be taken into account for a complete evaluation of DR program performance.


Table 3: SDG&E's DR Overall Performance
(Daily Forecast and Ex Post are averaged MW over all events; Difference and % give the range from low to high.)

Programs | Number of Events | Daily Forecast | Ex Post | Difference | %

Good Performance:
Base Interruptible Program | 1 | 0.3 | 0.8 | 0.5 | 167%
Critical Peak Pricing | 7 | 15 | 18 | > 2.4 | > 3.1%

Mixed Performance:
Capacity Bidding Program - Day Ahead | 7 | 8 | 6 | -4.9 to 0.1 | -32% to 12.2%
Capacity Bidding Program - Day Of | 5 | 12 | 10 | -3.2 to 0.7 | -27.4% to 6.0%
Demand Bidding Program | 3 | 5 | 5 | -0.4 to 0.1 | -8.0% to 8.0%
Summer Saver (AC Cycling) | 8 | 20 | 17 | -12.3 to 3.5 | -64.0% to 38.7%

Poor Performance:
Peak Time Rebate Residential | 7 | 19 | 4 | < -24 | < -73.6%
Critical Peak Pricing - Emergency | 2 | 2 | 1 | < -0.7 | < -53.3%

B. Program Performance During Critical Event Days

The critical event days of August 10th, 13th, 14th, and September 14th were selected as a focus because they occurred on Flex Alert days, the service area system peak day, or the hottest days of the year. These are all conditions when demand response resources are most critical.

August 10, 2012

SCE

Two SCE programs were called on August 10th, a Flex Alert day. The programs triggered during that event were the Demand Bidding Program and the Save Power Day (also known as the Peak Time Rebate program). The load reductions achieved during the Demand Bidding Program event surpassed the forecast by 12%, while the Save Power Day event was below the forecast by 11%.

Table 4: SCE's August 10, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Demand Bidding Program | 85.59 | 95.82 | 10.23 | 11.95%
Save Power Day11 | 107.24 | 95.85 | -11.39 | -10.62%
Total | 192.83 | 191.67 | -1.16 |

11 SCE did not provide a daily forecast for this event, so the comparison for this event is done with the 7-day report rather than the daily forecast.



SDG&E

Three DR programs were called on August 10th. The Capacity Bidding Day-Ahead program load reduction exceeded the forecast by 1%. Conversely, the Summer Saver and residential Peak Time Rebate ex post load reductions fell short of their forecasts by 32% and 75%, respectively.

Table 5: SDG&E August 10, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Capacity Bidding Day Ahead | 7.50 | 7.60 | 0.10 | 1.33%
Summer Saver (AC Cycling) | 27.20 | 18.50 | -8.70 | -32.00%
Residential Peak Time Rebate | 12.60 | 3.20 | -9.40 | -74.60%
Total | 47.30 | 29.30 | -18.00 |

August 13, 2012

SCE

August 13, 2012 was the system peak day for the SCE service area, with a peak load of 22,428 MW. As shown in Table 6 below, the Critical Peak Pricing program, a dynamic pricing program for commercial and industrial customers over 200 kW, and the Day-Of Capacity Bidding Program were triggered during this day. Again, the Capacity Bidding Program exceeded the forecast by a few MW. The Critical Peak Pricing program event had satisfactory performance, falling short of the forecast by 15%.

Table 6: SCE's August 13, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Critical Peak Pricing | 50.54 | 42.96 | -7.58 | -15.00%
Capacity Bidding Program (Day Of) | 12.30 | 15.70 | 3.40 | 27.60%
Total | 62.84 | 58.66 | -4.18 |

SDG&E

All three DR programs triggered on August 13th, the Capacity Bidding Day-Of, Summer Saver (AC Cycling), and Critical Peak Pricing Emergency programs, had ex post load impacts that were below the daily forecast predictions by 27%, 45%, and 48%, respectively.


Table 7: SDG&E's August 13, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Capacity Bidding - Day Of | 11.70 | 8.50 | -3.20 | -27.33%
Summer Saver (AC Cycling) | 33.30 | 21.40 | -11.90 | -45.35%
Critical Peak Pricing Emergency | 2.30 | 1.20 | -1.10 | -47.83%
Total | 47.30 | 31.10 | -16.20 |

August 14, 2012

SCE

August 14, 2012 was another Flex Alert day, during which seven events were called, using a variety of DR programs. As shown in Table 8 below, all the events combined were forecasted to reduce loads by 570 MW. However, the ex post load impact evaluations found that the actual load reductions were short of the total forecast by 155 MW. 60% of the 155 MW shortfall is attributed to the Demand Response Contract program. The Agriculture Pumping Interruptible program event was short of the event forecast by 52%. Only the Capacity Bidding Program exceeded the forecasted load reduction, but this only made up 4% of the Demand Response Contract program forecast, and thus was insufficient to cover the overall event day shortfall. It is worth noting that the Demand Response Contract and Capacity Bidding Programs share something in common in that they are both commercial aggregator programs. The reason for the difference in performance between these programs requires further study. It should be noted that SCE's Demand Response Contracts expired on December 31, 2012 and have since been replaced by new contracts that expire at the end of 2014.12

Table 8: SCE's August 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Demand Response Contracts | 275.00 | 182.05 | -92.95 | -33.80%
Demand Bidding Program | 94.09 | 61.76 | -32.33 | -34.36%
Agriculture Pumping Interruptible | 36.00 | 17.29 | -18.72 | -51.99%
Summer Discount Plan (Res) Group 1 | 130.40 | 119.40 | -11.00 | -8.44%
Capacity Bidding Program (Day Of) | 12.30 | 17.82 | 5.52 | 44.86%
Summer Discount Plan (Res) Reliability | 17.42 | 13.50 | -3.92 | -22.49%
Summer Discount Plan (Com) | 4.77 | 3.10 | -1.67 | -35.04%
Total | 569.98 | 414.91 | -155.07 |

12 D.13-01-024, http://docs.cpuc.ca.gov/PublishedDocs/Published/G000/M046/K233/46233814.PDF


SDG&E

Four DR programs, Demand Bidding, Critical Peak Pricing, Capacity Bidding Day Ahead, and residential Peak Time Rebate, were called on August 14th. While the Demand Bidding and Capacity Bidding Program ex post load impacts closely matched the daily forecast, the Critical Peak Pricing and residential Peak Time Rebate did not. Since the Critical Peak Pricing and residential Peak Time Rebate programs are large-scale residential programs, it is possible that the difference between the forecast and ex post load impacts reflects widely varying customer behavior during DR events.

Table 9: SDG&E's August 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Demand Bidding Program | 5.00 | 5.10 | 0.10 | 2.00%
Critical Peak Pricing | 14.30 | 25.90 | 11.60 | 81.12%
Capacity Bidding Program (Day Ahead) | 7.50 | 7.50 | 0.00 | 0.00%
Residential Peak Time Rebate | 12.50 | 1.10 | -11.40 | -91.20%
Total | 39.30 | 39.60 | 0.30 |

September 14, 2012

SCE

September 14, 2012 was the hottest day of the year in both the SCE and SDG&E service areas (see Table 10 below). Understandably, SCE triggered its Summer Discount Plan (residential AC Cycling) programs during this day. The Capacity Bidding Program was also triggered, with performance comparable to the other Capacity Bidding Program events on critical days discussed above.

The September 14 residential Summer Discount Plan events consisted of three separate customer groups sequentially triggered for one-hour events. All three one-hour events fell considerably short of the forecasted load reductions.

Table 10: SCE's September 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Summer Discount Plan (Residential) Groups 5 and 6 | 135.61 | 20.70 | -114.91 | -84.74%
Summer Discount Plan (Residential) Groups 1 and 2 | 110.89 | 37.80 | -73.09 | -65.91%
Capacity Bidding Program (Day Of) | 11.90 | 16.21 | 4.31 | 36.18%
Summer Discount Plan (Residential) Groups 3 and 4 | 99.32 | 17.80 | -81.52 | -82.08%
Total | 357.72 | 92.51 | -265.22 |


SDG&E

On September 14, 2012, the peak temperature in SDG&E's service territory was 109 degrees. The Demand Bidding, Summer Saver, and Base Interruptible Program ex post load impacts were above the daily forecast, in a range between 8% and 167%. Since the absolute value of the Base Interruptible Program load impact is roughly 1 MW, a small increase or decrease in the daily forecast prediction can result in high variability in the percent difference between these two figures. Conversely, the Capacity Bidding Day-Of and Day-Ahead Programs and the Critical Peak Pricing Emergency Program ex post load impacts were below the daily forecast, in a range between 12% and 44%.

Table 11: SDG&E's September 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Capacity Bidding Program (Day Of) | 9.00 | 5.70 | -3.30 | -36.67%
Capacity Bidding Program (Day Ahead) | 12.10 | 10.60 | -1.50 | -12.40%
Demand Bidding Program | 5.00 | 5.40 | 0.40 | 8.00%
Summer Saver (AC Cycling) | 15.50 | 22.50 | 7.00 | 45.16%
Base Interruptible Program | 0.30 | 0.80 | 0.50 | 166.70%
Critical Peak Pricing Emergency | 1.60 | 0.90 | -0.70 | -43.75%
Total | 43.50 | 45.90 | 2.40 |

C. Detailed Program Analysis

The following section discusses programs and events that produced the load reductions forecasted by the daily reports, as well as programs that failed to produce the forecasted load reductions. For this purpose, all programs and events that came within ±10% of the forecasted load reductions are considered to be consistent with the daily forecast, and all programs and events whose ex post results were more than 50% above or below the forecasted load reductions are considered to have failed to produce the forecasted load reductions. The sketch below illustrates this screening rule.
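The classification can be expressed directly in terms of the percent difference D = (B - A) / A used in the event tables. The sketch below is only an illustration of that rule; the example values are taken from Tables 12 and 13.

```python
# Illustrative sketch of the screening rule described above. Events within
# +/-10% of the daily forecast are treated as consistent with it; events more
# than 50% above or below it are treated as having missed the forecast.

def percent_difference(forecast_mw, ex_post_mw):
    """D = (B - A) / A, expressed in percent."""
    return (ex_post_mw - forecast_mw) / forecast_mw * 100.0

def classify(forecast_mw, ex_post_mw):
    d = percent_difference(forecast_mw, ex_post_mw)
    if abs(d) <= 10.0:
        return "consistent with the daily forecast"
    if abs(d) > 50.0:
        return "failed to produce the forecasted reduction"
    return "between the two thresholds"

# Summer Discount Plan 08/14/12 (Table 12): about -8.4% -> consistent
print(classify(130.40, 119.40))
# Summer Discount Plan 06/20/12 (Table 13): about -99.6% -> failed
print(classify(128.01, 0.50))
```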

SCE

There were a total of 104 separate events in the SCE service area in 2012. Only ten of these events produced load reductions consistent with those forecasted in the daily reports. As shown in Table 12 below, all of these events produced fairly sizable load reductions, ranging from 59 to 130 MW, with the exception of one Capacity Bidding Program event, which produced a very small load reduction.



Table 12: SCE's DR Events with Ex Post Results within ±10% of the Daily Forecast

Program Name | Event Date | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Summer Discount Plan (Residential) | 08/14/12 | 130.40 | 119.40 | -11.00 | -8.44%
Summer Discount Plan (Residential) | 08/29/12 | 82.56 | 80.30 | -2.26 | -2.74%
Summer Discount Plan (Residential) | 08/01/12 | 58.60 | 57.10 | -1.50 | -2.56%
Summer Discount Plan (Residential) | 08/15/12 | 77.77 | 77.50 | -0.27 | -0.35%
Demand Bidding Program | 10/17/12 | 79.05 | 79.25 | 0.20 | 0.26%
Demand Bidding Program | 10/01/12 | 78.75 | 79.78 | 1.03 | 1.31%
Summer Discount Plan (Residential) | 08/09/12 | 118.06 | 121.20 | 3.14 | 2.66%
Summer Discount Plan (Residential) | 08/28/12 | 83.86 | 88.20 | 4.34 | 5.18%
Capacity Bidding Program (Day Ahead) | 07/31/12 | 0.0700 | 0.0740 | 0.00 | 5.71%
Demand Bidding Program | 08/08/12 | 85.59 | 92.95 | 7.36 | 8.60%

Of the 104 events in 2012, thirty (or about 29%) of the events were more than 50% off of the day-ahead forecast. Five of these events produced load reductions that were greater than the forecast, while the remaining 25 were lower than the forecast. The three events with the highest percentage difference below the forecast were very small Day-Ahead Capacity Bidding Program events, and thus are not considered the most critical problem. Twenty-one of the remaining events were Summer Discount Plan (AC Cycling) events, and these varied markedly off the forecast.


Table 13: SCE's DR Events with Ex Post Results greater than ±50% of the Daily Forecast

Program Name | Event Date | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C=B-A) | % Difference Forecast & Ex Post (D=C/A)
Capacity Bidding Program (Day Ahead) | 10/01/12 | 0.09 | -0.20 | -0.29 | -315.22%
Capacity Bidding Program (Day Ahead) | 10/02/12 | 0.09 | -0.10 | -0.20 | -213.04%
Capacity Bidding Program (Day Ahead) | 10/05/12 | 0.09 | -0.07 | -0.16 | -170.65%
Save Power Days / Peak Time Rebates | 09/07/12 | 108.66 | -23.11 | -131.77 | -121.27%
Summer Discount Plan (Residential) | 06/20/12 | 128.01 | 0.50 | -127.51 | -99.61%
Save Power Days / Peak Time Rebates | 09/10/12 | 108.52 | 1.65 | -106.87 | -98.48%
Summer Discount Plan (Residential) | 09/14/12 | 135.61 | 20.70 | -114.91 | -84.74%
Summer Discount Plan (Residential) | 07/10/12 | 263.67 | 44.70 | -218.97 | -83.05%
Summer Discount Plan (Residential) | 09/14/12 | 99.32 | 17.80 | -81.52 | -82.08%
Summer Discount Plan (Residential) | 06/29/12 | 178.26 | 33.30 | -144.96 | -81.32%
Summer Discount Plan (Residential) | 09/20/12 | 77.39 | 14.60 | -62.79 | -81.14%
Summer Discount Plan (Residential) | 06/29/12 | 178.26 | 35.80 | -142.46 | -79.92%
Summer Discount Plan (Residential) | 07/10/12 | 263.67 | 66.60 | -197.07 | -74.74%
Summer Discount Plan (Residential) | 10/02/12 | 298.91 | 86.20 | -212.71 | -71.16%
Summer Discount Plan (Residential) | 07/10/12 | 263.67 | 76.70 | -186.97 | -70.91%
Summer Discount Plan (Residential) | 09/20/12 | 65.53 | 21.10 | -44.43 | -67.80%
Summer Discount Plan (Residential) | 09/20/12 | 65.73 | 21.90 | -43.83 | -66.68%
Summer Discount Plan (Residential) | 09/14/12 | 110.89 | 37.80 | -73.09 | -65.91%
Summer Discount Plan (Residential) | 08/22/12 | 115.03 | 42.40 | -72.63 | -63.14%
Agriculture Pumping Interruptible | 09/26/12 | 60.56 | 24.00 | -36.56 | -60.36%
Summer Discount Plan (Residential) | 09/21/12 | 168.96 | 69.10 | -99.86 | -59.10%
Summer Discount Plan (Residential) | 09/28/12 | 55.06 | 24.50 | -30.56 | -55.50%
Agriculture Pumping Interruptible | 08/14/12 | 36.00 | 17.29 | -18.72 | -51.99%
Summer Discount Plan (Residential) | 10/17/12 | 127.25 | 62.30 | -64.95 | -51.04%
Summer Discount Plan (Residential) | 10/17/12 | 146.77 | 72.30 | -74.47 | -50.74%
Summer Discount Plan (Residential) | 08/17/12 | 101.30 | 153.00 | 51.70 | 51.04%
Capacity Bidding Program (Day Ahead) | 10/29/12 | 0.09 | 0.15 | 0.06 | 59.78%
Summer Discount Plan (Residential) | 08/17/12 | 58.00 | 98.30 | 40.30 | 69.48%
Capacity Bidding Program (Day Ahead) | 10/18/12 | 0.09 | 0.17 | 0.08 | 85.87%
Summer Discount Plan (Residential) | 09/10/12 | 18.98 | 68.40 | 49.42 | 260.42%

Summer Discount Plan

The Summer Discount Plan event variability ranges from 121% below the forecast (with a load increase rather than a load reduction) to 260% above the forecast. Overall, the AC Cycling program represents the most variance13 of all the SCE DR programs. When all of the variances for individual events are aggregated, the AC Cycling program represents 49% of the total variance. The Pearson Product-Moment Correlation between the daily forecast and the ex post load impacts is 0.21, representing a very weak positive correlation.

13 Variance in this context specifically refers to the absolute difference between the daily forecast and the event-day ex post load reductions.


The Pearson correlation between the average event temperature14 and the event-level variance (difference between the daily forecast and the event-day ex post load reductions) is 0.37, representing a moderately weak correlation. In everyday language this means that SCE's 2012 Summer Discount Plan forecast method cannot be relied upon to effectively predict the actual program load reductions. In addition, there appears to be little relationship between the event-day temperature and the difference between the daily forecast and the event-day ex post load reductions, potentially ruling out temperature as an explanatory factor for the difference.
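For readers who want to reproduce this type of check, the sketch below computes a Pearson correlation between daily forecast and ex post MW using numpy. The six forecast/ex post pairs are Summer Discount Plan events copied from Table 13; because they are only a subset of the 2012 events, the result is illustrative and will not match the 0.21 figure staff computed over the full event set.

```python
# Illustrative correlation check between daily forecast and ex post load impacts.
# Input pairs are a subset of Summer Discount Plan events from Table 13; the
# full-portfolio statistics cited in the text use all 2012 events.
import numpy as np

daily_forecast_mw = np.array([128.01, 135.61, 263.67, 99.32, 178.26, 18.98])
ex_post_mw = np.array([0.50, 20.70, 44.70, 17.80, 33.30, 68.40])

r = np.corrcoef(daily_forecast_mw, ex_post_mw)[0, 1]
print(f"Pearson correlation over this subset of SDP events: {r:.2f}")
```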

The Summer Discount Plan was (by far) the most often triggered program in SCE's 2012 DR portfolio. There were 23 separate events, including two early test events.15 Most of the 23 events were split into 3 customer segments such that each group of customers was triggered for only a portion (i.e., one hour) of each event (typically lasting three hours). Three events, on 9/14, 9/20, and 9/28, deployed 6 customer segmentations. SCE operated the program in this manner to avoid cycling their customers' air conditioners for more than one hour at a time.16 The purpose of this strategy is so customers will be minimally impacted by the loss of one hour of AC services, compared to multiple continuous hours, and in theory the utility would still be able to reduce load when needed.

As shown in Table 14 below, the implementation of this strategy, however, resulted in a rebound effect from the groups curtailed in event hours 1 and 2 that added load in hours 2 and 3, as AC units ran at above-normal capacity to return the participants' buildings to the original temperature set points.17 The net effect was to dampen the average hourly load impact for the entire event period, as illustrated in Table 14. It is possible that the daily forecasts were prepared assuming that all customers would be curtailed at the same time over the entire duration of the event. In such a case, the average hourly load reductions would likely have been larger because all customers would be simultaneously curtailed and the rebound effect would be delayed until after the event was over. This issue is further illustrated in Chapter 2, Section IV, "Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)".

Table 14: SCE's Hourly Load Impact from a Sept 14 Summer Discount Plan Event (MW)

Event Hour Ending | Event Hours w/ Rebound (16, 17, 18) | Post-Event Rebound (19, 20) | Event Hour Average
15 | 39.6 | 25.1 | 17.0
16 | 27.1 | 27.0 | 39.6
17 | 21.3 | 49.6 | 37.8
Hour Total | 39.6 | 2.0 | 22.7 | 89.2 | 37.8 | 6.3
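The arithmetic behind the dampening effect can be sketched in a few lines: each group delivers its reduction only in its own curtailment hour, while its rebound lands inside the remaining event hours. The per-group shed and rebound values below are hypothetical round numbers chosen only to illustrate the mechanism, not SCE's measured impacts.

```python
# Hypothetical illustration of why sequential one-hour dispatch dampens the
# event-average load impact: each group sheds 40 MW in its curtailed hour and
# rebounds by 15 MW in the following hour as AC units catch up.

SHED_MW, REBOUND_MW = 40.0, 15.0
event_hours = [16, 17, 18]                       # hour-ending labels for a 3-hour event
group_curtail_hour = {"G1": 16, "G2": 17, "G3": 18}

net_impact = {}
for hour in event_hours:
    shed = sum(SHED_MW for h in group_curtail_hour.values() if h == hour)
    rebound = sum(REBOUND_MW for h in group_curtail_hour.values() if h == hour - 1)
    net_impact[hour] = shed - rebound            # MW reduction net of rebound in that hour

average = sum(net_impact.values()) / len(event_hours)
print(net_impact)    # {16: 40.0, 17: 25.0, 18: 25.0}
print(average)       # 30.0 MW event-hour average, versus 40 MW if rebound fell outside the event
```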

14 SCE Final 2012 Ex Post and Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
15 The last two events in late October were not included in the ex post analysis.
16 SCE-01 Testimony at 11.
17 SCE Final 2012 Ex Post and Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.


Another potential explanation for the suboptimal performance could be customers exercising the override option in their enrollment contracts with SCE. However, SCE's A.12-12-016 testimony18 indicates that the proportion of customers with an override option is fairly small (consisting of about 1% of the customers enrolled in SDP) and that these customers rarely exercise the override option. Finally, it is possible that transitioning Summer Discount Plan from an emergency program to a price-responsive program could have introduced some additional uncertainties that aren't adequately captured by the current forecasting methods. Regardless of the explanation for the unexpectedly low load reductions during these events, it is critical that SCE improve the day-ahead forecast for the SDP program as a whole.

Energy Division staff reviewed SCE's method for forecasting the Summer Discount Plan program.19 The methodology, provided in Appendix C, is described in a 1986 internal SCE memorandum and consists of a simple algorithm which estimates the load reduction per ton of AC based on the forecasted temperature. The equation coefficients were determined by a 1985 load reduction study that SCE staff could not locate when requested to do so by Energy Division staff. Without the 1985 load reduction study, Energy Division staff could not fully evaluate the forecasting methodology. SCE did provide a revised algorithm which modifies the equation structure, but the underlying methods for estimating those coefficients as yet remain unexplained.
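For orientation, the sketch below shows the general shape of a per-ton, temperature-driven forecast of this kind: enrolled AC tonnage multiplied by a kW-per-ton reduction that is a simple function of forecast temperature. The coefficients, the linear functional form, and the enrollment figure are placeholders; the actual 1985 coefficients and the memorandum's exact equation were not available to staff.

```python
# Hypothetical sketch of a per-ton, temperature-based AC cycling forecast.
# The coefficients A0 and A1 and the 300,000-ton enrollment are placeholders,
# not the values in SCE's 1986 memorandum or the 1985 load reduction study.

A0, A1 = -1.2, 0.02   # assumed intercept and slope of kW shed per ton per degF

def kw_shed_per_ton(forecast_temp_f):
    """Assumed linear relationship between forecast temperature and per-ton shed."""
    return max(A0 + A1 * forecast_temp_f, 0.0)

def sdp_daily_forecast_mw(enrolled_tons, forecast_temp_f):
    return enrolled_tons * kw_shed_per_ton(forecast_temp_f) / 1000.0

# Example: 300,000 enrolled tons on a 95 degF forecast day -> 0.7 kW/ton -> 210 MW.
print(sdp_daily_forecast_mw(300_000, 95))
```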

This evidence suggests that there is a critical flaw in either the way the Summer Discount Plan events are forecasted or in the operation of the program, or both. The lack of a reliable day-ahead forecasting method is a major weakness that undermines the ability to fully consider AC Cycling in the CAISO grid operations. Even if the Utilities' DR resources are eventually to be bid into the CAISO market, which they currently are not, ED recommends that SCE immediately document the forecasting methods to be used for the 2013 season and thoroughly vet the methods with CPUC and CAISO staff and relevant stakeholders to ensure the proposed forecasting methods are reasonable and reliable. Throughout the 2013 summer season (and longer if necessary), SCE should consider piloting more than one forecasting method, which should be tested using small ex post load impact evaluations to identify the most reliable forecasting methods.

Base Interruptible Program

The Base Interruptible Program was triggered only once during the entire 2012 season, and this was a test event. This single event produced 573 MW of load reductions on September 26. The load reductions for this event were 59 MW more than the day-ahead forecast. It is worth noting that the single Base Interruptible event was more than three times the load reduction of any other SCE program event during 2012, and it was not triggered on one of the critical event days discussed earlier in this section.

The Commission should explore a policy requiring more frequent deployments of this program since it appears to have significant, yet underutilized, potential.

18 SCE-01 Testimony at 11, Lines 3-5.
19 See Appendix S.


Capacity Bidding Program

The Capacity Bidding Program Day-Ahead events produced an average load reduction of 0.03 MW across all events. With the exception of three events in October (which were associated with negative load reductions in the ex post analysis), most events produced the relatively small load reductions forecasted by the daily report. None of the Capacity Bidding Program Day-Ahead events occurred in August or September, when the load reductions are typically most needed.

By comparison, all of SCE�’s Capacity Bidding Program Day Of events exceeded theforecasted load reductions, by an average of 32%. The average load reduction for the CapacityBidding Program Day Of events was 15.9 MW, over 500 times the load reductions produced byDay Ahead events.

This evidence suggests that, unlike the Day Of program, the Day Ahead Capacity BiddingProgram may not be serving a useful function in SCE�’s DR portfolio.

Demand Bidding Program

The Demand Bidding contracts were called on eight occasions during the summer of 2012. Of these eight events, five occurred in August. The first two August events, on August 8 and August 10, resulted in load reductions that exceeded the daily forecast by an average of 10%. The third and fourth events, on August 14 and August 16, were 34% short of the forecasted load reductions, and the fifth event on August 29 was 40% below forecast, suggesting that a decline in customer participation over successive events could be explored as a potential factor in the diminishing returns.

Demand Response Contracts (DRC) – Nominated

Somewhat surprisingly, there were only two events for which Demand Response Contracts were called. The ex post load reductions for these two events were both around 35% below the daily forecast. Energy Division was not able to examine why this program performed so poorly. As noted earlier, SCE's DRCs expired on December 31, 2012, and have since been replaced by new contracts approved by the Commission.

Save Power Days / Peak Time Rebates (PTR) – Price Responsive

Daily forecasts were not provided by SCE for the four PTR events that occurred in August; thus comparisons between the daily forecast and ex post results are possible for only the two events on September 7 and September 10. Both of the September events were forecasted to reduce loads by 109 MW. Ex post results, however, indicate that the PTR events had no impact at all. In fact, the September 7 event was correlated with a fairly significant load increase of 23.11 MW.

Ex post load reductions were estimated for the four August PTR events, for which day-ahead estimates were not provided by SCE. As a proxy for the daily forecast, the 7-day reports were used. As shown in Table 15 below, estimated load reductions were between 107 and 108 MW, while the ex post load reductions ranged between 0.02 and 96 MW.


Table 15: SCE's Peak Time Rebate (MW)

Event Day     7-Day Report    Ex Post
8/10/2012     107.24 MW       95.85 MW
8/16/2012     107.61 MW       24.43 MW
8/29/2012     108.51 MW       21.93 MW
8/31/2012     108.73 MW        0.02 MW

Given the considerable variability in ex post results for the PTR program events, the day-ahead forecasting and event reporting will need significant revision to account for these discrepancies. If the PTR program is going to continue, staff recommends that SCE prepare a proposal for a viable forecast and submit it for staff review.

SDG&E

There were a total of 46 DR program events triggered on 14 event days in SDG&E's service area from June 2012 to October 2012. Daily forecasts for twelve DR program events were within ±10% of ex post load impacts. As depicted in Table 16, moderate load reductions ranging from 5 to 17 MW were produced when these events were triggered. Three programs delivered accurate results with a moderate degree of consistency: Demand Bidding Program, Critical Peak Pricing, and Capacity Bidding Program Day Of.

Table 16: SDG&E's DR Events with Ex Post Results within ±10% of the Daily Forecast

Program Name                           Event Date   Daily Forecast MW   Ex Post MW   Difference Forecast & Ex Post MW   % Difference Between Forecast & Ex Post
Demand Bidding Program                 10/2/2012    5                   4.6          0.4                                8.00%
Capacity Bidding Program (Day Of)      8/8/2012     11.7                11           0.7                                5.98%
Capacity Bidding Program (Day Ahead)   8/9/2012     7.5                 7.5          0                                  0.00%
Capacity Bidding Program (Day Ahead)   8/14/2012    7.5                 7.5          0                                  0.00%
Capacity Bidding Program (Day Ahead)   8/10/2012    7.5                 7.6          0.1                                1.33%
Demand Bidding Program                 8/14/2012    5                   5.1          0.1                                2.00%
Summer Saver (AC Cycling)              9/15/2012    8.6                 8.8          0.2                                2.33%
Critical Peak Pricing                  10/2/2012    16                  16.5         0.5                                3.13%
Critical Peak Pricing                  8/21/2012    16.5                17.2         0.7                                4.24%
Critical Peak Pricing                  9/15/2012    13.7                14.5         0.8                                5.84%
Demand Bidding Program                 9/14/2012    5                   5.4          0.4                                8.00%
Critical Peak Pricing                  8/30/2012    16.2                17.8         1.6                                9.88%

A total of 19 DR program events had ex post load impacts that differed from the daily forecasts by more than ±50%, as depicted in Table 17. In particular, the residential and commercial Peak Time Rebate program ex post load impacts deviated from the daily forecasts by greater than 70%. According to SDG&E, the commercial Peak Time Rebate ex post load impacts were deemed to be not statistically significant. On this basis, SDG&E reported zero load impacts for this program.


Table 17: SDG&E's DR Events with Ex Post Results Greater than ±50% of the Daily Forecast

Program Name                           Event Date   Daily Forecast MW   Ex Post MW   Difference Forecast & Ex Post MW   % Difference Between Forecast & Ex Post
                                                    (A)                 (B)          (C = B − A)                        (D = C/A)
Commercial Peak Time Rebate            8/9/2012     1.2                 0            1.2                                100.00%
Commercial Peak Time Rebate            8/10/2012    1.1                 0            1.1                                100.00%
Commercial Peak Time Rebate            8/11/2012    0.8                 0            0.8                                100.00%
Commercial Peak Time Rebate            8/14/2012    1.2                 0            1.2                                100.00%
Commercial Peak Time Rebate            8/21/2012    1.2                 0            1.2                                100.00%
Commercial Peak Time Rebate            9/15/2012    0.9                 0            0.9                                100.00%
Residential Peak Time Rebate           8/14/2012    12.5                1.1          11.4                               91.20%
Residential Peak Time Rebate           8/21/2012    25                  3            22                                 88.00%
Residential Peak Time Rebate           8/11/2012    12.2                1.7          10.5                               86.07%
Residential Peak Time Rebate           8/9/2012     13.1                3.3          9.8                                74.81%
Residential Peak Time Rebate           8/10/2012    12.6                3.2          9.4                                74.60%
Residential Peak Time Rebate           9/15/2012    32.3                8.3          24                                 74.30%
Residential Peak Time Rebate           7/20/2012    23.9                6.3          17.6                               73.64%
Capacity Bidding Program (Day Ahead)   10/1/2012    9                   4.1          4.9                                54.44%
Capacity Bidding Program (Day Ahead)   10/2/2012    9                   4.2          4.8                                53.33%
Summer Saver (AC Cycling)              9/14/2012    15.5                22.5         7                                  45.16%
Critical Peak Pricing                  8/11/2012    11.7                18.4         6.7                                57.26%
Critical Peak Pricing                  8/14/2012    14.3                25.9         11.6                               81.12%
Base Interruptible Program             9/14/2012    0.3                 0.8          0.5                                166.67%

Capacity Bidding Program Day Ahead (CBP DA)

The percent difference between the CBP DA daily forecasts and ex post results ranged from −32% to 12% (Table 3). Based upon this assessment, the daily forecasts for CBP DA were not accurate or consistent predictors of ex post results.

Since the CBP DA daily forecast model does not have a variable that accounts for weather, and the ex post models do, this methodological difference could account for the variability between the two load impact measures. Another factor that could affect this difference is the percent load impact per customer. Although customers submit load impact bids prior to each DR event, the actual load reduction on the event day may not coincide with the projected load reduction.

If weather affects event-day load reduction by CBP customers, the addition of a weather variable to the daily forecast model could increase its accuracy. In order to address uncertainty in the percent load reduction per CBP customer, DR test events could be scheduled to measure this value on event-like days.
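One way to act on this recommendation would be to refit the day-ahead model on historical event days with a forecast-temperature regressor added. The sketch below is a minimal illustration under assumed data; the column values, variable names, and functional form are hypothetical and are not SDG&E's actual CBP model.

```python
# Minimal sketch (hypothetical data and specification, not SDG&E's model):
# fit a day-ahead CBP load-impact model that includes a weather regressor,
# using ordinary least squares on past event-day observations.

import numpy as np

# Illustrative historical event-day data
nominated_mw  = np.array([7.5, 7.5, 9.0, 9.0, 11.7])      # aggregate customer bids (MW)
forecast_temp = np.array([88.0, 92.0, 79.0, 81.0, 90.0])  # day-ahead temperature (deg F)
ex_post_mw    = np.array([7.6, 7.5, 4.1, 4.2, 11.0])      # evaluated load impacts (MW)

# Design matrix: intercept, nominated MW, and the proposed weather variable
X = np.column_stack([np.ones_like(nominated_mw), nominated_mw, forecast_temp])
coef, *_ = np.linalg.lstsq(X, ex_post_mw, rcond=None)

def daily_forecast_mw(nominated: float, temp_f: float) -> float:
    """Predict the next day's CBP load impact from the nomination and forecast temperature."""
    return float(coef[0] + coef[1] * nominated + coef[2] * temp_f)

print(daily_forecast_mw(9.0, 93.0))
```

Test events on event-like days, as suggested above, would supply additional observations for a fit of this kind.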

Capacity Bidding Program Day Of (CBP DO)

Similar to the CBP DA program, the CBP DO daily forecasts were neither accurate nor consistent predictors of ex post results, based upon the range of the difference between the two load impact measures, −27.4% to 6.0% (Table 2). As stated above, inclusion of a weather variable in the daily forecast model and measurement of the percent load reduction per customer during test events could increase the accuracy and consistency of the daily forecast model in predicting ex post load impacts.

Demand Bidding Program (DBP)

The percent difference between the DBP daily forecasts and ex post load impacts ranged from −8.0% to 8.0% (Table 3) for the three DBP events that were called during the summer. Based upon this result, the DBP daily forecast accurately and consistently predicted ex post load impacts.

One caveat for making a general assessment of the DBP forecast model is that only one customer provided load reduction bids for the DR summer events. In order to make such an assessment, it would be advisable to examine forecast and load impact data from at least 5 to 10 event days.

Commercial Peak Time Rebate

SDG&E reported zero ex post load impacts for this program in its March 4th filing. According to SDG&E, zero values do not imply that no load reduction occurred but that the load impacts were not statistically significant.20 Therefore, a comparison of daily forecasts and ex post load impacts could not be performed.

Based upon conversations with SDG&E, the lack of effectiveness of the commercial Peak Time Rebate program could be attributed to a weak price signal and inelastic customer demand during event periods. SDG&E would be advised to discontinue the commercial Peak Time Rebate program.

Residential Peak Time Rebate

The percent difference between daily forecast and ex post load impacts ranged from −91.2% to −73.6% (Table 3). This implies that the residential Peak Time Rebate program daily forecast is not an accurate predictor of ex post load impact. However, the residential Peak Time Rebate program daily forecast consistently over-predicted the ex post results.

Since the ex post methodology only modeled load impacts for customers who signed up to receive e-mail or text alerts and the daily forecast model does not, it is possible that the accuracy of the daily forecast model could improve if there were parity between the two methodologies. Including only residential Peak Time Rebate opt-in customers in the daily forecast model may resolve the discrepancy. As an alternative solution, since the daily forecast consistently over-predicted the ex post results, SDG&E might consider derating daily forecasts by a factor of 0.7 to 0.9 when estimating ex post load impacts.

Summer Saver (AC Cycling)

The range of the percent difference between daily forecast and ex post load impacts, −64.0% to 38.7%, presented in Table 3 indicates that the daily forecast is not an accurate or consistent predictor of ex post load impacts.

20 SCE 03 at 21.


It should be noted that both the residential and commercial Summer Saver ex post methodologies (respectively, a randomized experiment and a panel vs. customer regression) differed from prior years due to the availability of smart meter data21. This could account for the difference between daily forecast and ex post results. In addition, both ex post methodologies utilized control and treatment groups, whereas the daily forecast methodologies did not. Based on this assessment, it would be advisable to examine how the daily forecast and ex post models could be harmonized.

Based upon a conversation with SDG&E, a temperature-squared variable is utilized in the daily forecast model. Compared to SCE's current AC cycling daily forecast model, SDG&E's daily forecast model includes an additional measure of accuracy. However, in order to better predict customer behavior on successive event days or prolonged event hours, SDG&E might consider including an autocorrelation variable in the daily forecast model.
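To make the suggestion concrete, the following sketch shows how a temperature-squared term and a simple autocorrelation (lagged-impact) adjustment for consecutive event days could both enter a daily forecast. The coefficients and function are illustrative placeholders, not SDG&E's actual Summer Saver specification.

```python
# Illustrative placeholder only -- not SDG&E's actual Summer Saver model.
# Shows a quadratic temperature response plus a lagged term that discounts the
# forecast when the prior day was also an event day (customer fatigue).

def summer_saver_forecast_mw(temp_f: float,
                             prior_day_impact_mw: float,
                             was_event_yesterday: bool,
                             b0: float = -44.1,   # placeholder coefficients
                             b1: float = 0.2,
                             b2: float = 0.005,
                             rho: float = 0.2) -> float:
    weather_term = b0 + b1 * temp_f + b2 * temp_f ** 2     # temperature and temperature-squared
    fatigue_term = -rho * prior_day_impact_mw if was_event_yesterday else 0.0
    return max(0.0, weather_term + fatigue_term)

# Second consecutive event day at 92 deg F, 16 MW delivered the day before
print(summer_saver_forecast_mw(92.0, 16.0, True))
```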

Critical Peak Pricing

The percent difference between the daily forecast and ex post results ranged from 3.1% to 81.1%. This is the only program where the ex post results consistently outperformed the daily forecast predictions.

According to SDG&E, the percent load impacts for the Critical Peak Pricing program in 2012 were lower in comparison to 2011, which led to an underestimation in the daily forecast22. Critical Peak Pricing has approximately 1,000 customers and, as SDG&E claims, any variation in the percent load reduction per customer could lead to high variation in the aggregate impact estimates. This would also be the case for large-scale residential DR programs, including Peak Time Rebate and Summer Saver (AC Cycling).

SDG&E also claims that measurement error might account for differences between load impact category values. However, no explanation is provided to elucidate how the measurement error occurred (e.g., since Smart Meters were not fully deployed in SDG&E's territory during Summer 2012, measured load reductions obtained from analog meters were not accurate).

Base Interruptible Program

The percent difference between the daily forecast and ex post load impact for the Base Interruptible Program was 166.7%.

Since two large Base Interruptible Program customers dropped out of the program, SDG&E was not able to accurately forecast the load impact from the remaining customers. It is possible that further analysis with additional Base Interruptible Program load impact data might shed light on the accuracy of the daily forecasting methods.

21 SDG&E load impact Filing Executive Summary, April 2, 2012 at 31.
22 SGE-03 at 19.


Critical Peak Pricing – Emergency

Due to decreasing customer subscription to this tariff, the CPP-E program was discontinued as of December 31, 2012.23

D. Summary of Recommendations

Given the divergence between the daily forecast estimates and ex post load impact results, staff makes the following recommendations:

The daily forecasting methods for all programs must be improved.

The daily forecasting methods should be better documented and should be developed with relevant agencies and stakeholders.

SCE should test a number of different forecasting methods for the Summer Discount Plan program.

SCE should change the Summer Discount Plan program strategy of calling groups of customers for sequential one-hour cycling events.

SDG&E should include only opt-in customers in the residential PTR daily forecast model.

SDG&E should run a test event to improve CPP daily forecast estimates.

SDG&E should account for customer behavior during successive event days in the Summer Saver daily forecast model.

SDG&E should include a weather variable in the CBP forecast model.

IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)

A. Summary of the Staff Analysis and Recommendations

Comparing the 2012 ex post results with the 2012 RA forecast is not an accurate method of determining how the DR programs performed. The RA load forecast represents the maximum capacity DR can provide under a set of conditions for resource planning needs. Ex post load impact reflects the demand reduction obtained during actual events in response to operational planning needs. Resource planning and operational planning are different in terms of conditions (i.e., event hours, participation, and temperature) and purposes.

However, in summer 2012, the Utilities' DR programs were not utilized to their full capacity even under extremely hot weather conditions. This raises the question of the usefulness of the current RA forecast and whether the RA forecast should be changed to reflect a set of conditions corresponding to operational needs, including the utilities' day-to-day resource availability limitations and DR dispatch strategies for optimal customer experience. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and input assumptions (i.e., growth rate, dropout rate) used for forecasting RA.

23 At 61, SDG&E load impact Filing Executive Summary, April 2nd


B. Background

The 2012 RA forecast represents the maximum capacity DR can provide under a set of conditions for resource planning needs. The conditions entail a 1-in-2 weather year24, portfolio level, full participation, a five-hour event window (1 p.m. to 6 p.m.), and an enrollment forecast assumption.

The 2012 ex post load impacts reflect the demand reductions obtained during actual events in response to operational needs. Operational needs on the event day may not require the full capacity of DR because conditions do not warrant it. Utilities have the discretion to call a few DR programs with shorter event hours or a smaller group of participants based on their generation and DR resource dispatch strategies.25 This means an ex post impact may reflect only a one-hour event window versus an RA forecast based on a five-hour event window. Similarly, the ex post impact may reflect only a segment of a program's participants versus the RA forecast that assumes the program's entire set of participants, and it may reflect a lower temperature than the higher temperature of the 1-in-2 weather year condition assumed in the RA forecast.

C. Staff Analysis

Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of assessing how well a program performs against its forecast.

The table below contains the August monthly average load impact for the 2012 Resource Adequacy (RA) forecast as filed in the spring of 2011 and the ex post results that occurred in 2012. There are stark differences between what the Utilities forecasted a year ahead (RA) and what the results were (Ex Post). On average for the month of August, the variability ranges from 485% (over-performance) to 95% (under-performance) for SCE and from 58% to 97% for SDG&E. The main reason for the discrepancy is that the RA data is used to assist in resource planning, which means it is characterized as a five-hour event in which all customers are called for the entire period (1-6 p.m.) for the summer. Ex post results, however, reflect the impact from actual DR operations, which can be a one-hour event in which some (not all) customers are called for a short period of time. Other factors that contributed to the discrepancy include temperature, enrollment, and dual participation.

24 Represents the monthly peak day temperature for an average year. Exhibit SGE-03, Page 14.
25 SGE-06, Page 6.


Table 18: SCE Demand Response Load Impact – 2012 Resource Adequacy vs. 2012 Ex Post

August Average (MW)

Program Name                                          RA Forecast26   Ex Post27   Difference RA vs. Ex Post   % Difference RA vs. Ex Post
                                                      (A)             (B)         (C = B − A)                 (D = C/A)
Demand Bidding Program                                12              72          60                          485%
Demand Response Contracts                             105             182         77                          74%
Base Interruptible Program28                          548             573         25                          5%
Capacity Bidding Program Day Of                       19              17          2                           11%
Summer Advantage Incentive/Critical Peak Pricing      69              39          30                          44%
Agricultural Pumping Interruptible                    40              17          22                          57%
Summer Discount Plan/AC Cycling – Residential         500             212         288                         58%
Save Power Days / Peak Time Rebates                   266             36          230                         87%
Capacity Bidding Program Day Ahead29                  1               0           1                           94%
Summer Discount Plan/AC Cycling – Commercial          62              3           59                          95%

Table 19: SDG&E Demand Response Load Impact – 2012 Resource Adequacy vs. 2012 Ex Post

August Average (MW)

Program Name                            RA Forecast30   Ex Post31   Difference RA vs. Ex Post   % Difference RA vs. Ex Post
                                        (A)             (B)         (C = B − A)                 (D = C/A)
Critical Peak Pricing Default           12              19          7                           58%
Summer Saver / AC Cycling               15              19          4                           27%
Capacity Bidding Program Day Ahead      10              8           2                           20%
Capacity Bidding Program Day Of         22              10          12                          55%
Base Interruptible Program32            11              0.84        10.16                       92%
Reduce Your Use / Peak Time Rebates     69              2           67                          97%
Demand Bidding Program                  n/a33           5           n/a                         n/a
Critical Peak Pricing Emergency         n/a             1           n/a                         n/a

26 Exhibit SCE-03, Table 1.
27 Exhibit SCE-03, Table 1.
28 Number based on September average because there were no events for the month of August.
29 Number based on July average because there were no events for the month of August or September.
30 Exhibit SDG-03, Table 1.
31 Exhibit SDG-03, Table 1.
32 Number based on September average because there were no events for the month of August.
33 DBP was not approved until the year after the 2012 RA forecast was filed.


Forecasting DR estimates for resource planning needs is different from forecasting for operational needs.

Unlike resource planning needs, operational needs on the event day may not require the full capacity of DR because the conditions do not warrant it or the Utilities deployed 'optimal' dispatch strategies for customer experience. Utilities have the discretion to call shorter event hours or a smaller group of participants if the system is adequately resourced for that day. As discussed in Chapter 3, peaker or other generation resources may have been dispatched instead of DR even though such operation would be contrary to the Loading Order.34 For example, SCE can divide its residential Summer Discount Plan participants into three groups and dispatch each group for one hour of an event, resulting in three consecutive one-hour events (see chart below). Approximately one-third of the customers can be curtailed in any given hour. Rebound from the groups curtailed in event hours 1 and 2 can reduce the net impact in hours 2 and 3, lowering the average hourly impact for the entire event period. As a result, the average impact per hour can be roughly 100 MW for operational needs. The following figures illustrate the rebound effects from SCE's sub-group dispatch strategy for its AC cycling.

Figure 1

Source: SCE April 11, 2013 PowerPoint Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing

34 http://www.cpuc.ca.gov/NR/rdonlyres/58ADCD6A-7FE6-4B32-8C70-7C85CB31EBE7/0/2008_EAP_UPDATE.PDF.


However, for the RA forecast, resource planning needs require the full capacity of DR. For example, SCE assumed all residential Summer Discount Plan participants would be curtailed at the same time to represent the full program capabilities of a reliability event (see chart below). Subsequent hourly impacts can be larger because all customers are curtailed at once and the rebound effect is delayed until the end of the entire event window. As a result, the average impact per hour for the RA forecast can be roughly 300 MW, which is roughly three times greater than the ex post impact in an hour.

Figure 2

Source: SCE April 11, 2013 PowerPoint Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing
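The arithmetic behind these two averages can be made explicit with a stylized calculation. The figures below are illustrative assumptions chosen to mirror the rough 100 MW vs. 300 MW comparison above; they are not SCE's actual dispatch or rebound data.

```python
# Stylized illustration (assumed numbers, not SCE data) of why staggered
# sub-group dispatch with rebound yields a lower average hourly impact than
# curtailing the whole Summer Discount Plan at once.

full_program_mw = 300.0          # assumed impact if all participants are curtailed together
group_mw = full_program_mw / 3   # each of three sub-groups delivers roughly one-third
rebound_mw = 10.0                # assumed snap-back from a group after its hour ends

# Operational (staggered) dispatch: one group per hour; rebound from the group
# curtailed in the previous hour offsets part of the next hour's reduction.
hourly_impacts = [group_mw, group_mw - rebound_mw, group_mw - rebound_mw]
operational_avg = sum(hourly_impacts) / len(hourly_impacts)

# Resource adequacy framing: all groups curtailed simultaneously for the full
# window, with the rebound deferred until after the event ends.
ra_avg = full_program_mw

print(f"staggered average ~{operational_avg:.0f} MW vs. simultaneous average ~{ra_avg:.0f} MW")
```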

The opposite extreme could occur, where the ex post result is higher than the RA forecast. In the case of SCE's Demand Bidding Program, the average ex post result is 72 MW, which is six times the RA forecast of 12 MW (see Table 18). Dual participation was the major contributor to the discrepancy. For customers enrolled in two programs, such as the Base Interruptible Program and the Demand Bidding Program, the RA forecast counts the MW in only one program (Base Interruptible Program) to avoid double counting.35 Had the two programs been called on the same day, the ex post would have shown a much lower amount for the Demand Bidding Program.

35 Portfolio level.


September 14, 2012 was considered a hot day (a 1-in-10 weather year condition36); however, SCE still did not dispatch its entire set of residential Summer Discount Plan participants. Instead, SCE dispatched only a portion of its participants for one hour at a time, resulting in five consecutive one-hour events. On average, SCE received only 6.3 MW37 for the event, a significant underperformance in comparison to the RA forecast of 519 MW.38 This raises the question: if SCE chose not to dispatch all of its Summer Discount Plan participants in the same event hour during a 1-in-10 weather year condition, under what circumstances will SCE dispatch its Summer Discount Plan to its full program capacity? The usefulness of the RA forecast is in question if the utility does not test a DR program to its full capacity. Should the RA forecast process be amended to include another Ex Ante forecast that is based on operational needs, including optimal customer experience, and if so, what would that entail?

D. Conclusion and Recommendations

Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of determining DR program performance because the ex post results are in response to operational needs, which can be entirely different from resource planning needs. However, in 2012 the RA forecast was not tested to its full capacity. This raises the question of whether the RA forecast should be changed to reflect both planning needs and operational needs. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and the input assumptions (i.e., growth rate, drop-off rate) used for forecasting RA. This working group should meet annually in December/January and agree on a set of input assumptions used for forecasting DR estimates.

36 Represents the monthly peak temperatures for the highest year out of a 10-year span. Exhibit SGE-03, Page 14.
37 Christensen Associates Energy Consulting, 2012 Load Impact Evaluation of Southern California Edison's Residential Summer Discount Plan (SDP) Program, April 1, 2013, Table 4-3d.
38 Exhibit SCE-03, Table 1, 2012 RA for the month of September.


Chapter 3: Demand Response Program Operations

I. Summary of Staff Analysis and Recommendations

The 2006 to 2011 data show that the Utilities historically triggered their DR programs far below the program limits in terms of number of events and hours. Even with the SONGS outage, the Utilities did not trigger their DR programs more frequently in summer 2012, as had been anticipated. Almost all of the Utilities' 2012 DR program events and hours fall within the historical averages or below the historical maximum. However, staff was surprised to find that the Utilities dispatched their peaker power plants (peaker plants) three to four times more frequently in 2012 than the historical averages. The peaker plant service hours were closer to the plants' emission allowances than the DR events were to the program limits.

Staff observed a trend in which some DR program events decreased from 2006 to 2012 while peaker service hours increased over the same period. This trend raises a concern that the Utilities under-utilized DR programs and over-relied on peaker plants. Under the "Loading Order", DR is a preferred resource and is intended to avoid the building and dispatching of peaker plants.

Due to time constraints and the lack of additional information, staff was unable to fully address this question and the reasons behind these trends in this report. Therefore, staff recommends that, in future DR program Measurement and Evaluation studies, the Commission evaluate DR program operations and designs in comparison with peaker plant operations to ensure the utilities' compliance with the Loading Order.

Specifically, the staff recommends that the Commission:

1. Require the Utilities to provide both DR event and peaker plant data, along with explanations for the disparity between historical DR event hours and peaker plant service hours, in future DR evaluations and the next DR budget applications. The Utilities should include the DR and peaker plant hourly data and explain why they did not trigger DR programs during any of the hours when the peaker plants were dispatched. This information will inform future DR program designs and improve DR usefulness.

2. Require that DR historical operations be reflected in the input assumptions for the Ex Ante forecast and the evaluation of program cost-effectiveness.

3. Address the Loading Order policy in DR planning and operation, and the utilization of peaker plants, in the next DR Rulemaking and the Utilities' energy cost recovery proceedings.

II. 2012 DR Program Trigger Criteria and Event Triggers

Appendices H and I summarize the Utilities' 2012 DR program trigger criteria and event triggers. The DR program trigger criteria consist of a list of conditions that are largely self-explanatory depending on the type of program; e.g., Emergency Program triggers are based on system contingencies, while non-Emergency Program triggers also include high temperature, heat rate (economic), and resource limitations. The 2012 event triggers were the actual conditions that led to the Utilities' decisions to call DR events.

While the DR trigger criteria provide some general idea of how DR programs are triggered, there is a lack of transparent information on the Utilities' DR operations, e.g., when and how the Utilities made decisions to trigger a DR program. It is necessary to evaluate DR performance not only from a load impact perspective, but also from a DR operations perspective, to determine DR reliability and usefulness as a resource. Staff analyzed the 2006-2012 DR event data and gained some understanding of how the Utilities have utilized DR programs and how useful the programs were.

III. DR Events Vs. Peaker Plant Service Hours

How do the numbers compare to the 2012 limits and historically?

As shown in Appendices J and K, SCE has a few DR programs with an unlimited number of events or hours: Demand Bidding Program, Save Power Days (Peak Time Rebate), and Summer Discount Plan – Commercial (Enhanced). Others have various event/hour limits ranging from 24 hours/month to 180 hours/year or 15 events/year.39

Among the DR programs with an event limit, most did not attain the maximum number of events and/or hours, except for SCE's Summer Advantage Incentive (Critical Peak Pricing).40 In summer 2012, SCE triggered 12 events for its Critical Peak Pricing, which is within the range of 9 to 15 events/year. Other DR programs' event hours were well below the limits. For example, SCE's residential Summer Discount Plan (AC cycling) was the second most frequently triggered program, with 23 DR events and 24 event hours in 2012, still far below its 180-hour event limit despite the SONGS outage. The Base Interruptible Program (BIP) had only one test event, for two hours, in 2012.

In addition, SCE's DR program event hours were either within the programs' historical ranges or below the 2006-2011 maximum, except for Agricultural Pumping Interruptible, with 7 hours in 2012 as compared to 0 to 2 hours from 2006 to 2011.

What were the reasons for the differences between the 2012 DR event numbers and hours and the event limits?

SCE explained that the reasons for the differences between the 2012 DR event numbers and hours vary for each program, as summarized in Appendix L.41 The reasons can be characterized for the three types of DR programs as: 1) trigger conditions, 2) optimal dispatches, and 3) no nomination.

As discussed above, DR program operations are based on the trigger criteria set for each program. For the non-Emergency Programs, SCE indicated that optimizing performance and minimizing customer fatigue is an additional factor considered in its decision to trigger a DR program. SCE's optimal dispatch strategy may have resulted in DR events and hours far below the maximum hours and events for the programs. For example, SCE's Summer Discount Plan is available for 180 hours annually; however, customers would probably never expect this program to be triggered for close to 180 hours, based on their experience with the program to date. As shown in Appendices M and N, staff finds a similar trend with SDG&E's DR event data.

39 SCE-02, Appendix E, Table 2-A at E-4 and E-5.
40 Id.
41 SCE-02, Appendix E, at E-6 and E-7.

IV. Peaker Plant Comparison

Most of SCE's non-Emergency Programs include resource limitation as a program trigger. Therefore, in theory, one would expect that SCE would trigger DR programs before dispatching its peaker plants, in accordance with the Loading Order. In light of the SONGS outage, the Commission anticipated more SCE and SDG&E DR events in 2012, yet SCE dispatched peaker plants substantially more than DR programs (compared to their historical averages, as discussed below).

How do the historical DR events compare to the utilities' peaker plants?

SCE provided the permit and service hours for four of its own peaker plants, three of which were located in the SONGS-affected areas, as shown in Appendix O.42 SCE historically dispatched its peaker plants about 9% to 16% of the permissible service hours annually. As shown in the table below, during the same period, SCE triggered its non-Emergency DR programs 11 to 106 hours on average. In 2012, however, SCE dispatched its peaker plants three to four times more than the historical average, while SCE's 2012 DR event hours were less than the historical range. SDG&E's peaker plant and DR event data show a similar trend to SCE's. For example, SDG&E's Miramar plant ran 4,805 hours out of its 5,000-hour emission allowance. In contrast, its Critical Peak Pricing program, with the most triggered hours, was dispatched 49 hours out of its 126-hour annual limit.

Table 20: DR Event Hours vs. Peaker Plant Service Hours

                         2006-2011 Range      2012
SCE:
  Peaker Plants          96 – 129 Hours       405 – 465 Hours
  Non-Emergency DR       11 – 106 Hours       2 – 64 Hours
SDG&E:
  Peaker Plants          436 – 1,715 Hrs.     974 – 4,805 Hrs.
  Non-Emergency DR       19 – 39 Hrs.         14 – 49 Hrs.

In addition, staff observed that the Utilities' highest DR event hours occurred in 2006 and 2007 during the summer heat storms, but the highest peaker plant hours occurred in 2012. This data suggests that the Utilities under-utilized DR programs and over-relied on their peaker plants, which is inconsistent with the Loading Order.

42 SCE 01, Appendix C, Tables 9 and 10 at Page 17.


In its comments on the 2013-2014 DR Proposed Decision, SCE disagreed with the suggestion of "under-utilization" of DR programs based on the 2012 DR events. SCE argued that "(s)imply because SCE did not dispatch all of the programs' available hours does not mean the programs should have been dispatched more…Optimal utilization (of DR) ensures the necessary amount of load drop to enable a reliable grid…"43 SCE should explain why it dispatched its peaker plants substantially more last summer instead of DR, and whether SCE's optimal dispatch of DR, or the trigger criteria or program designs, resulted in SCE's increased reliance on peaker plants.

Due to the time constraints and the absence of the Utilities' explanations, staff is unable to comprehensively address this issue in this report. The Utilities' data warrant further evaluation to ensure the usefulness of the DR resource as a replacement for peaker plants and compliance with the Loading Order.

V. Conclusions

Consistent with D.13-04-017, staff finds that most of SCE's DR programs did not attain the maximum number of events and/or hours, except for SCE's Critical Peak Pricing. The Utilities' total numbers of DR events and hours in 2012 were within the historical averages, but far from the program limits. In contrast, staff found that SCE-owned and SCE-contracted peaker plants were dispatched far more in 2012 in comparison with the historical averages. Some peakers were much closer to their emission allowances than the DR hours were to their operating limits. Staff reaches a similar conclusion with SDG&E's DR programs in comparison with its peaker plants.

If the Utilities have historically never triggered their DR programs close to the available hours, there is a concern about how realistic these limits are. There is a reliability risk if the Utilities are relying on a DR resource that has never been used to its full capacity. In addition, the DR cost-effectiveness evaluation should reflect the historical operations. Staff recommends that the Commission address this issue in future DR evaluation and budget approval proceedings.

43 SCE Opening Comment filed on April 4, at 4-5.


Chapter 4: Residential Demand Response Programs

I. Summary of Staff Analysis and Recommendations

Analysis of residential programs included Peak Time Rebate (PTR) and AC Cycling. Overall, customers seem satisfied with the programs based on utility reports and surveys. However, staff encountered problems with program design and operation that need to be addressed to improve the reliability and effectiveness of the programs.

For PTR, staff found that customers who received utility notification of events have higher awareness of the program when compared to customers who were not notified by the utility or received indirect notification such as mass media alerts. More importantly, data for both utilities show that customers who opted into receiving alerts were the only group that significantly reduced load. For both utilities, customers defaulted through MyAccount into receiving alerts did not reduce load significantly. However, the entire eligible customer class qualifies for bill credits, which resulted in a problem of 'free ridership.' Both utilities should modify PTR from a default to an opt-in program, where only customers opting to receive event alerts would qualify for bill credits.

For SCE's Residential AC Cycling, staff found that the current group dispatch strategy is resulting in a rebound effect. The rebound effect reduces the actual load reduction the program is capable of producing. Staff recommends that SCE (1) align the maximum program event duration with customer preference for shorter events to improve the forecast, and (2) reconsider its incentive structure to favor participation in longer event durations.

Finally, both utilities should take advantage of the AMI infrastructure and related enabling technology that could improve program delivery, reliability, and customer experience.

II. Residential Peak Time Rebate (PTR)

A. Overall Customer Experience

For both utilities, customers were generally satisfied with the program. For SCE, customers seem satisfied with the level of incentives and the time between notification and event; however, customers would like more information regarding the program and bill credits. SDG&E's customers reported overall satisfaction with the program but, similar to SCE's customers, would benefit from more information and outreach.

The level of awareness for both utilities seems higher amongst customers who chose to sign up to receive notifications. This is reflected in the overall load reduction verified by ex post data: only customers who signed up for event notification significantly reduced load.

For PTR, neither utility noticed evidence of customer fatigue, but this does not mean it did not occur; just that it was not noticeable.


B. SCE's Peak Time Rebate/Save Power Day

1) Summary

Customers who received utility notification of events have higher awareness of the program when compared to customers who were not notified by the utility. More importantly, customers who opted into receiving alerts were the only group that significantly reduced load. Customers defaulted through MyAccount into receiving alerts, and the remaining customers not directly notified by the utility, did not reduce load significantly. SCE considered only customers who received alerts in its forecast and ex post verification. However, the entire eligible customer class qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up for alerts, seems to indicate more willingness to reduce load. This factor should be considered in program design. Staff identified an issue with 'free ridership,' where customers are paid even though they did not significantly reduce any load. Staff recommends changing PTR from a default program to an opt-in program, paying bill credits only to customers who opt in to participate.

2) Background

D.09-08-028 approved Save Power Day, SCE's Peak Time Rebate (PTR) rate. The decision approved bill credits of $0.75 per kWh reduced, with an additional $0.50 per kWh for customers with enabling technology.

This is a default program for residential customers with a smart meter and has been available since 2012. The program provides incentives to eligible Bundled Service Customers who reduce a measurable amount of energy consumption below their Customer Specific Reference Level (CSRL) during PTR Events.44,45

The utility may call events throughout the year on any day, excluding weekends and holidays. Events will take place between 2 p.m. and 6 p.m. on days an event is called. Participants receive a day-ahead notification of the event. Bill credits will be paid in each billing cycle based on the sum of events called and usage reduction during the period.46 Bill credits will be recovered from the respective customer class through the Energy Resource Recovery Account (ERRA).
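The CSRL baseline drives the settlement arithmetic: the tariff (see footnote 45) averages the customer's 2-6 p.m. usage over the three highest of the five preceding non-event, non-holiday weekdays and credits only measurable reductions below that level. The sketch below is a simplified illustration of that calculation with hypothetical usage data; the helper names are ours, and the $0.75/kWh rate is the base credit without the enabling-technology adder.

```python
# Simplified illustration of the Save Power Day baseline and credit arithmetic
# (see footnote 45). Usage values and helper names are hypothetical; the actual
# settlement process involves additional tariff rules.

def csrl_kwh(prior_weekday_usage_kwh: list) -> float:
    """Customer Specific Reference Level: average 2-6 p.m. usage of the three
    highest-usage days among the five preceding non-event, non-holiday weekdays."""
    top_three = sorted(prior_weekday_usage_kwh, reverse=True)[:3]
    return sum(top_three) / 3

def ptr_credit_dollars(event_usage_kwh: float,
                       prior_weekday_usage_kwh: list,
                       rate_per_kwh: float = 0.75) -> float:
    """Bill credit for one event: pay only for measurable reduction below the CSRL."""
    reduction = max(0.0, csrl_kwh(prior_weekday_usage_kwh) - event_usage_kwh)
    return reduction * rate_per_kwh

# Five preceding weekdays of 2-6 p.m. usage (kWh), then 4.0 kWh used during the event
print(ptr_credit_dollars(4.0, [6.2, 5.8, 7.1, 6.5, 5.0]))   # -> 1.95
```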

During 2012, SCE started defaulting customers on MyAccount into receiving email notifications, with the remaining customers not directly notified by the utility. Alternatively, customers may choose to opt in to receive alerts. As of November 30th, approximately 4 million customers were on PTR and 824,000 were signed up to receive notifications (via MyAccount).47 According to SCE, approximately 60,000 customers opted in to receive alerts during the 2012 summer months.48

44 SCE Schedule D – Domestic Service, sheet 3.
45 CSRL: "peak average usage level" is the customer's average kWh usage during the 2:00 p.m. to 6:00 p.m. time period of the three (3) highest kWh usage days of the five (5) non-event, non-holiday weekdays immediately preceding the PTR Event. The CSRL is used to determine the customer's kWh reduction for each PTR Event in order to calculate the rebate.
46 SCE Schedule D – Domestic Service, D.09-08-028 Att. C at 7.
47 SCE-01 Testimony at 27, lines 11, 18-19.

3) Lessons Learned

In support of its 2013-2014 Application, SCE provided data to highlight lessons learned from the 2012 program year.

Customer awareness
Awareness of the program is higher amongst the group of customers whom the utility notified of events: 66% of notified respondents were familiar with the program, but only 43% were familiar in the group not notified49. When prompted for awareness of events, the same pattern is noticeable: 72% of respondents in the group receiving notifications who were aware of the program claimed awareness of specific events, compared to 40% in the group not receiving notifications. When including customers already aware and those prompted with information about the program, 55% of the notified group was aware, but only 23% of the non-notified respondents were aware.50

Customer satisfaction
There was no information regarding customer perception of the fairness of savings/incentive levels in SCE's data; however, customers seem to link participation with an expectation of savings, as 80% of respondents identified earning bill credits as important for participation51. Moreover, participants seem to be willing to participate even in the face of low savings.52

Event notification
The majority of respondents aware of the program found out about events via utility notification (over 60% for the opt-in group). Close to 23% of respondents in the overall population found out about events in the news.53

According to results of the customer surveys, about 90% of customers notified of the event, and about 56% of customers not notified but aware of the event, were happy with the amount of time between notification and event54. It appears that a day-ahead strategy could be adequate; however, customers were not prompted regarding a preference for a day-of reminder, so it is not clear from the lessons learned whether this could increase awareness and response. SCE requested to add a day-of notification in its 2013-2014 Program Augmentation Application, which the Commission denied due to lack of evidence of need.55

48 Email communication with SCE (4/5/2013).
49 SCE-02 Appendix A at 3. It is important to note that the surveys only represented results for two groups: customers notified by the utility and customers who were not notified. Defaulted customers and customers not defaulted into receiving notifications from the utility were bundled together under notified customers.
50 SCE-02 Appendix A at 4.
51 SCE-02 Appendix B at 24.
52 SCE-02 Appendix B at 36.
53 SCE-02 Appendix A at 5.
54 SCE-02 Appendix A – Save Power Day Incentive/Peak Time Rebate Post Event Customer Survey at 15.
55 D.13-04-017, at 28.


Customer preference
Another survey showed that customers would benefit from more information about the program, most specifically in terms of expectations of savings. The majority of customers would prefer to be notified by email, and they believe that a reminder at the beginning of the summer would help them be more ready to participate.56

Program utilization
PTR has no limit on the number of events called, with a maximum of 4 hours per event. SCE called 7 events, for 28 total event hours, in 2012 and did not observe evidence of customer fatigue. The trigger criterion was temperature for all events.57 Although SCE explains the need to balance usefulness with preservation of the resource58, the program appears underutilized in 2012. Still, this is the first year of the program.

Other findings
SCE states that third-party providers such as telecommunication companies, cable companies, security providers, retailers, and manufacturers of thermostats or providers of home automation services are potential partners to reach untapped load reduction potential in the residential sector59. As part of its 2013-2014 Program Augmentation Application, SCE proposed a pilot to explore this market, and the Commission has approved funding for this pilot.60

4) Analysis of settlement and ex post data

Ex post load impact
SCE only calculated ex post data for customers notified of events; it did not verify ex post load impact for customers not notified by the utility. This indicates that SCE did not expect this group to reduce load significantly. SCE's 2012 Load Impact Report found that customers who opted into event notifications reduced a statistically significant average of 0.07 kWh per hour.61 The same report found that customers defaulted into receiving notifications did not produce a statistically significant load impact.62

Incomplete data does not allow staff to verify with certainty the differences in load reduction between all participant groups (opt-in, defaulted into notification, and the remainder of the population). However, staff looked at all the data SCE provided for evidence of what is most likely happening.

56 SCE-02 Appendix B – Save Power Days Research Study Results at 39.
57 SCE-03 – March 4, 2013 – Appendix B Table 4; SCE-01, Appendix C at 14.
58 SCE-01, Appendix C at 14.
59 Email communication with SCE, April 10, 2013.
60 D.13-04-017, OP 19.
61 '2012 Load Impact Evaluation of Southern California Edison's Peak Time Rebate Program', Christensen Associates Energy Consulting (4/1/2013) at 1. This figure is slightly lower than the 0.097 kW reported in SCE-03, March 4, 2013 at 22.
62 Id. at 24.


It is interesting to note that for the first four events, defaulted customers did reduce load, although not significantly, but for the last three events their load in fact increased. In contrast, the opt-in group, to varying degrees, reduced load for all events. The ex post results varied considerably between events, even though temperatures were fairly constant and not extreme. It would be worth investigating the cause of this variability and how understanding it could improve ex post results and the reliability of the program. A more detailed analysis of impact can be found in the sections above.

Table 21: 2012 Ex Post Load Impact by Group (MW) (Average Event Hour)

Columns: (a) Customers who opted into alerts; (b) Customers defaulted into email alerts, excluding opt-in alerts; (c) Customers not notified directly of events; (d) Temperature.

Event Date    (a)       (b)       (c)    (d)
7/12/12       N/A       N/A       N/A    80
8/10/12       39.60     56.25     N/A    89
8/16/12       11.17     13.25     N/A    89
8/29/12       21.22     0.71      N/A    92
8/31/12       6.37      -6.35     N/A    86
9/7/12        0.17      -23.28    N/A    84
9/10/12       6.04      -4.39     N/A    89

Source: Email communication with SCE (3/25/2013); SCE-01 Appendix C Table 1

Settlement data analysis
In 2012, SCE paid a total of $27,349,008 in incentives to PTR residential customers.63 SCE provided full settlement data, which shows evidence of a potentially large 'free-ridership' problem, where customers receive incentives without significantly reducing load.

63 Email communication with SCE (4/5/2013)


Table 22: Settlement Load Reductions (MW) (Average Event Hour)

Columns: (a) Customers who opted into alerts; (b) Customers defaulted into email alerts, excluding opt-in alerts; (c) Customers not notified directly of events; (d) Event Settlement.

Event Date             (a)       (b)        (c)          (d)
7/12/12                85.9      140.08     1,613        1,839
8/10/12                55.94     134.68     827          1,018
8/16/12                87.99     233.35     1,499        1,821
8/29/12                37.19     84.36      579          700
8/31/12                52.95     132.79     981          1,166
9/7/12                 60.85     165.71     1,105        1,332
9/10/12                61.9      139.65     1,049        1,250
Average (MW)           63.2      147.2      1,093        1,304
%                      4.9%      11.3%      83.9%        100.0%
Average Participants   60,190    160,430    1,265,544    1,486,165
%                      4%        11%        85%          100%

Source: Email communication with SCE (4/5/2013)

According to settlement data, 84% of bill credits were paid to customers whose load impact was not considered for forecast or ex post purposes. In addition, 11% of incentives were paid to customers who were defaulted into receiving notifications and did not produce a statistically significant load impact.64 This means that, in fact, 95% of all incentives were paid to customers who either were not expected to reduce load significantly or did not do so.

64 '2012 Load Impact Evaluation of Southern California Edison's Peak Time Rebate Program', Christensen Associates Energy Consulting (4/1/2013) at 24.


Table 23: 2012 PTR Incentives Paid

Columns: (a) Customers who opted into alerts (ex post MW); (b) Customers defaulted into email alerts, excluding opt-in alerts (ex post MW reduction); (c) Customers not notified by SCE; Total.

Event Date   (a)           (b)           (c)            Total
7/12/12      $254,572      $419,794      $4,836,197     $5,510,563
8/10/12      $166,245      $403,752      $2,480,819     $3,050,816
8/16/12      $261,825      $699,568      $4,495,547     $5,456,940
8/29/12      $110,681      $252,931      $1,734,182     $2,097,794
8/31/12      $157,557      $398,093      $2,939,474     $3,495,124
9/7/12       $181,406      $496,648      $3,312,785     $3,990,840
9/10/12      $184,349      $418,665      $3,143,816     $3,746,830
Total        $1,316,635    $3,089,451    $22,942,822    $27,348,908
%            5%            11%           84%            100%

Source: Email communication with SCE (4/5/2013)

As there is no ex post data for customers not directly notified by the utility (i.e., those who neither opted in to receive notifications nor were defaulted into them), it is not possible to verify their actual impact or whether it would be significant. However, based on the fact that not even defaulted customers reduced load significantly, and on findings from SDG&E (see next section), it is fair to assume that results for that group would not be significant.

Incentives and capacity cost
It is possible to notice a difference in the cost of capacity between the group that opted in to receive notifications and the group defaulted into receiving notifications. In this report, staff normally uses the average event-hour reductions; but because SCE's ex post results show such variability, staff will use the average hourly impact across all events as a simple way of showing that the average capacity produced by the defaulted group is nearly six times more expensive than the average capacity produced by the opt-in group.


Table 24: 2012 PTR Cost of Capacity

Columns: (a) Customers who opted into alerts (MW); (b) Incentives paid to that group per event; (c) Customers defaulted into email alerts, excluding opt-in alerts (MW); (d) Incentives paid to that group per event.

Event Date         (a)       (b)           (c)       (d)
7/12/12*           N/A       $254,572      N/A       $419,794
8/10/12            39.60     $166,245      56.25     $403,752
8/16/12            11.17     $261,825      13.25     $699,568
8/29/12            21.22     $110,681      0.71      $252,931
8/31/12            6.37      $157,557      -6.35     $398,093
9/7/12             0.17      $181,406      -23.28    $496,648
9/10/12            6.04      $184,349      -4.39     $418,665
Average            14.10                   6.03
Total                        $1,062,064              $2,669,657
Cost of Capacity             $75.34                  $442.62

Source: Email communication with SCE (4/5/2013). Staff did not include the 7/12/12 event in the calculation, as there is no ex post data for this event.
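The cost-of-capacity row in Table 24 appears to be the total incentives divided by the group's average hourly impact, expressed in dollars per kW; under that assumption the figures can be reproduced approximately (small differences are presumably rounding in the averages).

```python
# Back-of-the-envelope check of Table 24's cost-of-capacity row, assuming it is
# total incentives divided by the group's average hourly impact, in $/kW.

def cost_per_kw(total_incentive_dollars: float, avg_impact_mw: float) -> float:
    return total_incentive_dollars / (avg_impact_mw * 1000)   # MW -> kW

print(round(cost_per_kw(1_062_064, 14.10), 2))   # opt-in group: ~75.32 ($75.34 in Table 24)
print(round(cost_per_kw(2_669_657, 6.03), 2))    # defaulted group: ~442.73 ($442.62 in Table 24)
```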

5) Findings

Based on analysis of program design, settlement and ex post load impact, and customer participation data for the summer of 2012, staff has found the following:

The program, as approved in the decision, pays the same amount of incentives for all customers enrolled in the program. There is an additional incentive for customers who have enabling technology.

There are differences in performance, awareness, and willingness to reduce load between customers who were notified directly by the utility and customers who were not.

Customers are overall satisfied with the notification mode, timing, and level of incentives.

There is not enough information to determine if customer fatigue is an issue.

Ex post analysis shows that customers who opted into alerts significantly reduced their load in comparison to customers who were only defaulted into alerts. This indicates that customer willingness to participate (indicated by the action of signing up for alerts) may help improve load reduction.

Incomplete ex post load impact results are available only for customers notified by the utility – both those who signed up for and those defaulted into receiving alerts. No results were available for the entire population.

It is not possible to verify whether incentives paid to non-notified customers resulted in significant load reduction, but the fact that SCE does not include this group in its forecast and ex post results indicates that their load impact is not significant.


There is a potential 'free-ridership' issue in SCE's PTR.

C. SDG&E's Peak Time Rebate/Reduce Your Use

1) Summary

Overall, customers are satisfied with the program. There is a difference, however, in awareness and load reduction between customers who opted into receiving alerts and the rest of the population. Only customers who opted into receiving utility notification significantly reduced load. However, the entire population qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up for alerts, seems to indicate more willingness to reduce load. Staff identified an issue with 'free ridership,' where customers are paid even though they did not significantly reduce any load. Staff recommends changing PTR from a default program to an opt-in program, paying bill credits only to customers who opt in to participate.

2) Background

D.08-02-034 approved the Reduce Your Use program, SDG&E's Peak Time Rebate (PTR) rate, the first dynamic rate of such design approved by the Commission65. The program has been available since the summer of 2012, with a pilot in 2011.

The program is implemented as proposed: 'A two level PTR incentive with a higher level payment for customers who reduce electric usage below an established CRL [customer reference level]66 with enabling demand response technology, and a lower level payment to customers without such technology.'67

Customers receive a bill credit of $0.75 per kWh with an additional credit of $0.50 per kWh for customers with enabling technology. SDG&E's tariff lists programmable communicating thermostats (PCTs), AC cycling, and pool pump cycling as examples of technologies eligible for the $0.50 per kWh additional incentive.68 The Commission has approved the addition of In-Home Displays (IHDs) to the list of enabling technologies in SDG&E's tariff.69

The utility may call events throughout the year, without limit on the number of events called. Events will take place between 11 a.m. and 6 p.m. on days an event is called, and participants receive a day-ahead notification of the event. Bill credits will be paid in each billing cycle based on the sum of events called and usage reduction during the period. Bill credits will be recovered from the respective customer class through the Energy Resource Recovery Account (ERRA).70

65 SCE's Save Power Day program was approved in 2009 in D.09-08-028.
66 Defined as the 'total consumption for the PTR event period averaged over the three (3) highest days from within the immediately preceding five (5) similar non-holiday week days prior to the event. The highest days are defined to be the days with the highest total consumption between 11 a.m. and 6 p.m. The similar days will exclude weekends, holidays, other PTR event days, and will exclude other demand response program event days for customers participating in multiple demand response programs.' SDG&E PTR Tariff.
67 D.08-02-034 at 22.
68 The SDG&E PTR tariff defines enabling technologies as those 'initiated via a signal from the Utility, either directly to the customer or the customer's device, or via a third party provider to the customer or the customer's device that will reduce electric energy end use for specific electric equipment or appliances, is included in a designated Utility demand response program, and that is acceptable to and approved by the Utility, subject to the verification of processes necessary to safeguard confidential and proprietary Utility and customer information.'
69 D.13-04-017, OP 22.

The utility can call only one event per day with a maximum of 7 hours.

3) Lessons Learned

In support of its 2013-2014 Application, SDG&E provided data to highlight lessons learned from the 2012 program year. For PTR, SDG&E conducted three post-event surveys.

Customer Awareness

Results of the surveys showed differences in the level of awareness between the three main groups71 of customers participating in PTR: customers who actively opted into day-ahead event notifications (opt-in), customers registered on MyAccount and receiving event notifications (default), and customers not directly notified by the utility but notified via mass media (no MyAccount). In general, the opt-in group demonstrated the highest level of awareness of the PTR events. About 83% of the opt-in group was aware of the program concept – events and bill credit – compared to 43% of respondents in the defaulted group and 40% in the no-MyAccount group.72

Customer Satisfaction

Customers are generally satisfied with the amount of incentives paid.73 Customers also seem generally satisfied with the number of notifications, although respondents did indicate that more promotion and information about the program would be beneficial.74 SDG&E indicated that it is working to resolve the notification issues encountered in 2012 as well as working to improve customer education on using online tools.75 Overall, customers responded positively to the program.

Program Utilization

In the summer of 2012, SDG&E called 7 events, for a total of 49 event hours, and all events were called due to temperature.76 Given that this program has no limit on the number of events, the program seems underutilized. However, SDG&E states that even if a temperature trigger point is reached, the program may not necessarily be called, as system need is assessed internally. This approach also takes into consideration customer experience.77

Customer Fatigue

SDG&E states that it is difficult to determine whether customer fatigue is an issue, but ex post results show that when the program was called on three consecutive days in August, the load impact was lowest on the last day.78 Temperature does not appear to be a factor, as the day with the lowest reduction had a temperature similar to the two preceding days. Still, the result does not seem conclusive.

70 SDG&E GRC Phase 2 Settlement at 8.
71 In its post-event surveys SDG&E segmented customers into more than the groups analyzed in this report, but to simplify the analysis, staff looked only at the three main groups of participants.
72 SGE-02 February 4th, 2013, Attachment 6 (Table 5).
73 SGE-02 – Revised Appendix X at 20.
74 SGE-02 Feb 4th, 2013, Att. 5 Table 13, Att. 6 Table 9.
75 SGE-02, Revised Attachment 1 – Revised Appendix X at 19.
76 SGE-02, Revised Attachment 1 – Revised Appendix X, Table 11.
77 SGE-02, Revised Attachment 1 – Revised Appendix X at 14.



Table 25: Customer Fatigue79

Event Date | Average Event-Hour Reduction (MW) | Temperature (°F)
8/9/12     | 3.2                               | 88
8/10/12    | 3.1                               | 92
8/11/12    | 1.7                               | 91

Enabling Technology

Enabling technology seems to improve load reduction, as preliminary results show that customers with an In-Home Display (IHD) saved 5% to 8% on average during events, while customers without one saved between 0% and 2%.80

Effort to Reduce Usage During Events

As part of its post-event surveys, SDG&E investigated what actions customers would take on event days and the level of effort made to respond. While the actions were hypothetical, i.e., they do not reflect reported actions taken, respondents in all three groups seem aware of possible actions to reduce load.

For instance, 38% of opt-in respondents, and around 30% of both the MyAccount and no-MyAccount groups, said they could unplug electronics. 41% of the opt-in group, 23% of the MyAccount group, and 19% of the no-MyAccount group would turn off AC. When prompted about the effort made to reduce usage during the August 14th event, 33% of opt-in respondents indicated having made 'a lot more effort than usual', compared to around 10% each for the MyAccount and no-MyAccount respondents. 54% of the opt-in respondents and around 40% each of the MyAccount and no-MyAccount groups said they made somewhat of an effort. Finally, 13% of the opt-in, 50% of the MyAccount, and 44% of the no-MyAccount respondents made no more or less effort than usual to reduce load.81

The results seem to indicate that respondents in all groups, irrespective of IOU notification, may have made an effort to reduce load and did know what options they had to do so. Still, ex post load reduction shows that only the opt-in group, about 6% of the entire population, significantly reduced load, contradicting assumptions that mass media or defaulting customers into email alerts could generate significant reduction.

78 SGE-02, Revised Attachment 1 – Revised Appendix X at 11.
79 Source: SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03 March 4th, Table 3.
80 SGE-02 February 4th, 2013 at 5, Lines 5-7.
81 SGE-02 February 4th, 2013, Attachment 6 (Tables 11 and 12).


4) Analysis of Settlement and Ex Post Data

Ex Post Load Impact

Awareness of the program and willingness to participate (in the form of signing up to receive alerts) seem to be important factors in load reduction. This is supported by analysis of ex post data. The opt-in group was the only group to produce statistically significant load reductions.82

Table 26: Ex Post Load Reductions83
(Average Event-Hour MW)

Event Date | Customers who opted into alerts (a) | Customers on MyAccount excluding opt-in alerts (b) | Customers not on MyAccount excluding opt-in alerts (c) | Temperature, °F (d)
7/20/12    | 6.1 | 0 | 0 | 87
8/9/12     | 3.2 | 0 | 0 | 88
8/10/12    | 3.1 | 0 | 0 | 92
8/11/12    | 1.7 | 0 | 0 | 91
8/14/12    | 1.1 | 0 | 0 | 88
8/21/12    | 3.0 | 0 | 0 | 83
9/15/12    | 8.2 | 0 | 0 | 104

Settlement Analysis

Based on the average hour load reduction used for the settlement calculation, 94% of incentives were paid to customers either defaulted to receive email alerts on MyAccount or not on MyAccount, and 6% were paid to customers who opted into alerts.84 When compared to ex post data, only customers who opted into alerts, or about 4% of the total population enrolled in PTR, significantly reduced load.85,86 This points to an issue of 'free ridership', where customers receive incentives without significantly reducing load.

82 SGE-01a at 3.
83 Source: SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03 March 4th, Table 3.
84 SGE-02, Attachment 1, Revised Appendix X, Table 3, and SGE-03 March 4, 2013, Table 3.
85 SGE-02, Attachment 1, Revised Appendix X at 4 and Table 3.
86 'For PTR residential and small commercial the participants represent the customers who proactively opted into alerts and the enrollment number represents all the customers who were eligible to receive a bill credit. Fifty percent of residential customers are enrolled in MyAccount and received an e-mail alert.' SGE-02, Attachment 1, Revised Appendix X at 4.


Table 27: Settlement Load Reductions (MW)87
(Average Event Hour)

Event Date           | Customers who opted into alerts (a) | Customers on MyAccount excluding opt-in alerts (default) (b) | Customers not on MyAccount excluding opt-in alerts (c) | Event Settlement (d)
7/20/12              | 10.0   | 79.0    | 71.2    | 160.1
8/9/12               | 13.7   | 100.2   | 89.0    | 202.8
8/10/12              | 12.6   | 97.1    | 87.0    | 197.0
8/11/12              | 12.7   | 117.3   | 101.2   | 231.1
8/14/12              | 14.8   | 118.4   | 106.7   | 240.0
8/21/12              | 29.9   | 270.3   | 258.4   | 559.0
9/15/12              | 17.3   | 151.1   | 129.8   | 298.0
Average (MW)         | 15.9   | 133.3   | 120.5   | 269.7
%                    | 5.9%   | 49.4%   | 44.7%   | 100.0%
Average Participants | 45,268 | 562,982 | 608,250 | 1,171,232
%                    | 4%     | 48%     | 52%     | 100%

Incentives and Capacity Cost

In 2012, SDG&E paid out $10,134,879 in incentives to PTR residential customers.88 Assuming the estimated MW reported to the CAISO (7-day Report), the program's maximum expected capacity was an average event-hour impact of 45.8 MW.89 This implies a capacity cost of approximately $221/kW. According to ex post data, the actual capacity delivered was an average event-hour impact of 8.2 MW, resulting in a cost of capacity of $1,232.7/kW. This cost will be recovered from the residential class of customers.
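The capacity-cost figures above are simply incentives paid divided by the average event-hour capacity. Below is a minimal sketch of that arithmetic using the dollar and MW figures reported in this section (the ex post calculation uses the $10,108,082 incentive total shown in Table 30); the function name is illustrative only.

```python
def cost_of_capacity_per_kw(total_incentives_dollars, avg_event_hour_mw):
    """Cost of capacity ($/kW) = incentives paid / (average event-hour MW x 1,000 kW/MW)."""
    return total_incentives_dollars / (avg_event_hour_mw * 1000.0)

# Forecast basis: 2012 incentives vs. the 45.8 MW reported to the CAISO (7-day Report).
print(round(cost_of_capacity_per_kw(10_134_879, 45.8)))    # ~221 $/kW
# Ex post basis: incentive total from Table 30 vs. the 8.2 MW average event-hour impact.
print(round(cost_of_capacity_per_kw(10_108_082, 8.2), 2))  # ~1,232.69 $/kW
```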

5) Findings

Based on analysis of program design, settlement and ex post load impact data, and customer participation data for the summer of 2012, staff has found the following:

- The program, as approved in a Commission decision, pays the same amount of incentives for all customers enrolled in the program. There is an additional incentive for customers who have enabling technology.

87 Source: Adapted from SGE-02, Attachment 1, Revised Appendix X, Table 2-5; SGE-03 March 4th, Table 3.
88 SDG&E AL 2420-E at 2.
89 SGE-02, Attachment 1, Revised Appendix X, Table 3.


- There are differences in performance, awareness, and willingness to reduce load between the three main groups of participants: customers who opted to sign up to receive alerts, customers defaulted into MyAccount to receive event email alerts, and customers not yet on MyAccount who are not directly notified by the IOU and find out about events via mass media.

- There is not enough information to determine if customer fatigue is an issue.
- Ex post load impact results show that only customers who signed up to receive alerts significantly reduced load. 94% of incentives paid did not result in significant load reduction.
- 'Free ridership' is an issue in SDG&E's PTR, where the majority of incentives were paid to customers who did not significantly reduce load.
- Based on incentives paid during the summer of 2012, the cost of capacity is five times higher when adjusting the forecasted load impact by the ex post load impact.

D. Staff Recommended Changes for PTR

It is clear that 'free ridership' is an issue that needs to be addressed. It is an issue when forecasting load reduction – the forecasted impact would be much higher than what could be verified – and it results in additional costs to ratepayers. While 'free ridership' in most cases is a baseline and settlement methodological issue, it could be partially alleviated by changes in program design.

Incentives should reward and encourage customer engagement. Therefore, staff recommends changing PTR from a default program to an opt-in program, eliminating incentives paid to customers not actively choosing to receive event alerts and keeping the current incentive levels for customers who sign up to receive alerts and use enabling technologies. Staff suggests the following incentive structure:

Table 28: Proposed Program Structure

Group                                              | $/kWh
Opt in to receive alerts                           | 0.75
Opt in to receive alerts and enabling technologies | 1.25
Not opted in                                       | Not a participant in the program

This approach to PTR would ensure that customers are rewarded for the level of action they are prepared to take. If this proposed level of incentives had been in place in 2012, it could have reduced the amount of incentives paid by about 95%, as shown below.90

90 To simplify the calculation, staff ignored the additional $0.50/kWh for enabling technology. These incentives would be paid in addition to the $0.75/kWh.


Table 29: Illustration of Staff Proposed Changes for SCE

Current Incentive Structure
Group     | Incentive Level ($/kWh) | Capacity (MW ex post*) | Total incentive paid ($) | Cost of capacity ($/kW)
All       | 0.75                    | 95.8                   | 27,349,009               | 285.48

Proposed Incentive Structure
Opt in    | 0.75                    | 95.8                   | 1,328,160                | 13.86
No opt in | 0                       |                        |                          |
Potential reduction: 95%

* Ex post for the entire program

Table 30: Illustration of Staff Proposed Changes for SDG&E

Current Incentive Structure
Group     | Incentive Level ($/kWh) | Capacity (MW ex post*) | Total incentive paid ($) | Cost of capacity ($/kW)
All       | 0.75                    | 8.2                    | 10,108,082               | 1,232.69

Proposed Incentive Structure
Opt in    | 0.75                    | 8.2                    | 582,750                  | 71.07
No opt in | 0                       |                        |                          |
Potential reduction: 94%

* Ex post for the entire program
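The figures in Tables 29 and 30 follow directly from the incentive totals: under the proposed structure only the opt-in group's payments remain, so the potential reduction is one minus the opt-in share of payments. Below is a minimal sketch reproducing the SDG&E figures from Table 30 (the SCE figures in Table 29 work the same way); the function name is illustrative only.

```python
def incentive_structure_comparison(total_paid, opt_in_paid, ex_post_mw):
    """Compare current vs. proposed (opt-in only) incentive payments, as in Tables 29 and 30."""
    ex_post_kw = ex_post_mw * 1000.0
    return {
        "current_cost_per_kw": total_paid / ex_post_kw,
        "proposed_cost_per_kw": opt_in_paid / ex_post_kw,
        "potential_reduction": 1 - opt_in_paid / total_paid,
    }

# SDG&E (Table 30): $10,108,082 paid in total, $582,750 of it to opt-in customers, 8.2 MW ex post.
summary = incentive_structure_comparison(10_108_082, 582_750, 8.2)
print(summary)  # ~1,232.69 $/kW current, ~71.07 $/kW proposed, ~94% potential reduction
```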

While issues of baseline and settlement methodology are out of the scope of this analysis and would demand a much more in-depth investigation, it is possible to attempt to alleviate the impact of free ridership by limiting PTR bill credits to customers who opt in to participate.

Utilities should focus on encouraging customers to adopt enabling technologies. Perhaps some of the resources saved by having a three-tier structure of incentives could be used to subsidize enabling technologies that enable direct load control. Also, utilities should explore alternatives for service delivery, such as third-party entities. SCE found that the interest of third parties is shifting towards the residential sector, and such opportunities should be seriously explored.

Finally, utilities should track, as part of their ex post verification efforts, whether the presence of enabling technologies significantly improves load reduction and whether there is a difference between the technologies used. In addition, utilities should investigate whether customer fatigue is an issue, especially in view of the SONGS outage potentially increasing the triggering of PTR events.


III. Residential Air Conditioning (AC) Cycling

A. Overall Customer Experience

Customers were generally satisfied with the program. For SCE, 2012 was the year the program transitioned from an emergency to a price trigger. SCE reports customers have kept a positive view of the program and regarded incentives as an important part of participating in the program. Customers did report that they prefer shorter and more frequent events as opposed to longer events.

SDG&E also reports overall customer satisfaction but points out that the majority of customer complaints were due to uncomfortable temperatures caused by the unit cycling on and off. In addition, SDG&E reports that customers were satisfied with the level of incentives.

Neither utility reported customer fatigue, although SDG&E had events on three consecutive days and load reduction dropped. However, without analyzing other factors that could have contributed to the lower load impact, such as humidity and customer perceptions of discomfort, it is not possible to say with certainty whether fatigue occurred.

B. SCE's Summer Discount Plan

1) Summary

SCE's AC Cycling program changed its event trigger structure from emergency to price. Customers seem satisfied with the current program design. Staff has identified that the program has an issue of 'rebound effect' and recommends that the program design be changed to include an additional level of incentive that would cater to customers willing to cycle their unit for the entire event duration, as discussed below.

2) Background

As part of D.11-11-002, SCE agreed to transition the Residential Summer Discount Plan (Res SDP) from an emergency to a price trigger and to bid Res SDP's load into the CAISO market for dispatch. D.11-11-002 authorized revisions to SCE's program to enable the changes agreed to in a settlement.91

As currently designed, Res SDP offers an annual incentive for customers who wish to participate in the program. The program offers two choices for cycling duration and gives customers the choice to override an event up to five times per year in exchange for slightly lower incentives. Incentives are calculated according to the size of the equipment, cycling duration, and override option:92

91 D.11-11-002 at 2-4.
92 SCE Schedule D-SDP, Sheet 1; SCE-01 Testimony, Table II-2.


Table 31: SCE Residential AC Cycling Incentives

Option          | Incentive per Summer Saver day per ton    | 100% cycling maximum savings (based on 4.5-ton unit) | 50% cycling maximum savings (based on 4.5-ton unit)
Standard Option | $0.36 (100% cycling); $0.18 (50% cycling) | $200                                                 | $100
Override Option | $0.18 (100% cycling); $0.09 (50% cycling) | $100                                                 | $50

SCE's Res SDP program has approximately 307,000 customers, with an expected load reduction of 466 MW.93 Events can be dispatched year-round with a maximum of 180 hours, and each event can last up to six hours. In 2012, SCE paid a total of $51,882,087 in incentives.

3) Lessons Learned

The 2012 summer season proved to be a transition year for this program. Customers had to transition from an expectation of little service reduction to expecting several disruptions throughout the year. Overall, SCE asserts that customers continue to have a positive view of the program.

Lessons learned from the transition in 2012 showed that bill savings are an important element for customer participation. The majority of customers opted for the Standard Option, preferring savings to override capability, and the ones who chose the override option rarely used it.94

Only 1.5% of customers who left the program did so due to the program changes. Preliminary findings of customer surveys found that customers prefer shorter events, even if more frequent. SCE experimented with different event durations and found that as events got longer, customer dissatisfaction increased.

In 2012, SCE triggered 23 events for a total of 24 hours, for reasons of temperature, CAISO emergency, and evaluation. Because the program changed its trigger condition and design in 2012, a historical comparison would not be accurate, but the data show that Res SDP was called more often than in previous years.95

C. SDG&E's Summer Saver

1) Summary

Customers seem satisfied with the program. The program performed in line with past years. Staff does not recommend any changes to the program design.

93 SCE Schedule D-SDP, Sheet 1; SCE-01 Testimony at 9, Lines 21-23. Load impact based on ex ante estimates from the Commission Monthly Report (12/21/2012).
94 SCE-01 Testimony at 11, Lines 3-5.
95 SCE-03 March 4, 2013, Appendix B, Table 4.


2) Background

The Summer Saver program is a 15-year, long-term contract-based procurement run by Comverge.96 Comverge is responsible for installing, removing, and servicing the AC cycling device.

Summer Saver is a direct load control program in which a device is installed on the premises to cycle the AC unit when an event is called. It has day-of notification, meaning customers receive event notification on the day of the event. The program runs May through October. Customers are eligible for annual incentives for participation based on the cycling option, the size of the unit, and the participation period:97

Table 32: Summer Saver Incentives

Cycling Option | Residential    | Business
30%            | N/A            | $9.00 per ton
50%            | $11.50 per ton | $15.00 per ton
100%           | $38.00 per ton | N/A

The Summer Saver program had around 28,500 residential and commercial customers enrolled in 2012.98 The majority of participants are residential customers – 23,948 in 2012 – and this distribution has been fairly consistent since 2009.99

The program has an event limit of 15 events or 120 event hours per year. The utility can call one event per day, and events run for a minimum of 2 hours and a maximum of 4 hours. Events can be called any time from 12 p.m. to 8 p.m. on event days. In 2012, the utility called 8 events, or 29 event hours, an average of 3.6 hours per event. Events are called based on temperature and system load.100

3) Lessons Learned

Residential customers were responsible for 84% of the load reduction during the 2012 summer. SDG&E paid $2.5 million in incentives to residential customers for an 18.6 MW average event-hour reduction.

The majority of customer complaints were about uncomfortable temperatures due to the AC cycling.101 Overall, customers seem satisfied with the level of incentives, as SDG&E reported that less than 1% of customers who left did so due to unfair incentives.102

96 http://www.comverge.com/residential-consumer/find-a-program
97 http://www.sdge.com/save-money/demand-response/summer-saver-program and email communication with SDG&E (3/4/2013).
98 SGE-02, Attachment 1, Revised Appendix X, Table 2.
99 Email communication with SDG&E (4/4/2013).
100 SGE-02, Attachment 1, Revised Appendix X, Table 8.
101 SGE-02, Attachment 1, Revised Appendix X at 19.
102 SGE-02, Attachment 1, Revised Appendix X at 20.


SDG&E did not report evidence of customer fatigue for Summer Saver, although it recognizes that this does not mean fatigue does not occur, just that it is not measurable.103 Ex post load impact results showed that when the program was called on three consecutive days there was a drop in the load reduction. However, SDG&E states that there is not enough information to suggest that this is a result of fatigue. Humidity or outside temperature being lower on the last day than on the previous day, among other factors, could have contributed to the lower load reduction on the last day.

Table 33: AC Cycling Customer Fatigue104
(Ex post average over event period, MW)

Date    | Res  | Res+Com | Temperature (°F)
9/13/12 | 12.0 | 12.6    | 81
9/14/12 | 18.6 | 22.5    | 109
9/15/12 | 8.2  | 8.8     | 104

The frequency of events called has been fairly consistent throughout the program's availability (with a few exceptions, like 2008), and in 2012 the program was called in line with the historical average. But when compared to the program design, it seems underutilized. Still, there is a higher incidence of events relative to event hours, suggesting events are more frequent but shorter.

103 SGE-01, Direct Testimony of Michelle Costello at 10.
104 SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03 March 4th, Table 2; email communication (4/3/2013).


Table 34: SDG&E Summer Saver Historical Comparison of Number of Events and Event Hours105

Year    | Event hours (year) | Event hours called | Number of events (year) | Events called
2006    | 120                | 24                 | 15                      | 8
2007    | 120                | 43                 | 15                      | 12
2008    | 120                | 8                  | 15                      | 2
2009    | 120                | 30                 | 15                      | 7
2010    | 120                | 44                 | 15                      | 11
2011    | 120                | 22                 | 15                      | 6
2012    | 120                | 29                 | 15                      | 8
Average | 120                | 29                 | 15                      | 8
Average historical performance compared to design |  | 24% |  | 51%
2012 compared to historical average               |  | According to average |  | According to average
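The utilization percentages in the last rows of Table 34 are the 2006-2012 averages expressed as a share of the program design limits (120 event hours and 15 events per year). A brief sketch using the values from the table:

```python
# Events and event hours called each year, taken from Table 34.
events_called = {2006: 8, 2007: 12, 2008: 2, 2009: 7, 2010: 11, 2011: 6, 2012: 8}
hours_called = {2006: 24, 2007: 43, 2008: 8, 2009: 30, 2010: 44, 2011: 22, 2012: 29}

avg_hours = sum(hours_called.values()) / len(hours_called)     # ~29 event hours per year
avg_events = sum(events_called.values()) / len(events_called)  # ~8 events per year

print(f"{avg_hours / 120:.0%} of the 120 design event hours")  # ~24%
print(f"{avg_events / 15:.0%} of the 15 design events")        # ~51%
```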

D. Staff Recommended Changes for AC Cycling

Staff does not have any recommended changes to the program design for SDG&E at this point. SDG&E's is a mature program, and customers seem fairly satisfied with the offerings.

SCE's program trigger recently changed from emergency to price, and customers seem satisfied with the program overall. However, last summer SCE deployed a new dispatch strategy in which it divided customers into three to six subgroups, with one hour per event per subgroup, instead of triggering the whole group for the entire event duration. While such a strategy is optimal for customers' comfort, as discussed in Chapter 2 it caused a 'rebound effect'.106 Program design should help correct this issue. First, the program as designed states that events can last up to six hours, even though customers seem to prefer shorter event durations and dissatisfaction went up as event duration increased.107 Also, SCE counts a total of six hours per event for RA purposes. SCE needs to review the program proposal to reflect customer preference – if customers do not favor being cycled for six hours, the program should not propose such a long event duration.

105 Based on SGE-02, Attachment 1, Revised Appendix X, Table 11.
106 'Effects of an event in subsequent hours, when electricity usage may exceed the curtailed customers' reference load, as air conditioners work to return residences to original temperature set points.' 2012 Load Impact Evaluation of Southern California Edison's Residential Summer Discount Plan (SDP) Program at 13.
107 Staff does not have more detailed information on customer preference or what would be the ideal event duration before customers drop off the program.



Moreover, SCE should explore new ways of delivering the program, e.g., using temperature control via a PCT instead of a switch on the equipment that cycles the unit off and on. This could allow for longer event durations while maintaining customer engagement, as the unit would never be off completely.108 In fact, both SDG&E and SCE should take advantage of AMI infrastructure and related enabling technology that could improve program delivery, reliability, and customer experience.

108 D.13-04-017 at 27 states that innovative approaches using PCTs and OpenADR could enable shorter event durations. At the time, the Commission did not have data that reflected the rebound effect, which may discourage short event durations. This issue should be taken into consideration when designing the approved pilot.


Chapter 5: Non-Residential Demand Response Programs

I. Summary of Staff Analysis and Recommendations

The analysis of customer experience for DR programs for commercial customers focuses on three key commercial programs: AC Cycling, Auto DR, and the Demand Bidding Program (DBP). Staff recommends that the new features of SCE's AC Cycling program be clearly communicated to customers through outreach and marketing to avoid customer dissatisfaction and dropout. In addition, Staff finds that there is limited evidence that the Auto DR program coupled with the Critical Peak Pricing (CPP) rate provides greater load impact than the load impacts obtained by customers on the CPP rate alone. As a result, Staff recommends that future studies continue to explore the load impacts of Auto DR.

Staff recommends SDG&E and the Navy collaboratively design a Navy-only DBP program to meet the unique needs of the Navy. Key attributes of the program would include a day-ahead trigger, aggregation of 8 billable meters, and a minimum bid requirement of 3 MW.

II. Background and Summary of Utility Data

In response to the Energy Division letter, SCE and SDG&E provided data on the commercial customer experience and commercial customer participation in the non-residential DR programs. Customer enrollment and participation numbers during events by program were provided, as well as the load impacts that those customers produced. In addition, SCE and SDG&E provided qualitative information on the commercial customer experience of the DR programs, including how customers felt about the incentives offered, whether customers were fatigued by consecutive DR events, and whether customers felt that too many DR events were called. SCE and SDG&E also provided information on the efficacy and customer experience of DR event notification.

Overall, SDG&E reported that the customer experience was positive and that it tried to deliver notification to drop load earlier than required (for both commercial and residential customers).

Among the various non-residential program offerings, SDG&E offers a Capacity Bidding Program (CBP) in which participants can choose between a day-ahead and a day-of program. Participants are required to reduce their usage by 20 kW or more. Program participants receive a capacity payment and an energy payment for the hours of reduction. However, the program also carries penalties when the reduction is less than 50% of the pledged amount. The customer feedback for this program came from aggregators,109 who suggested that increasing the incentives could potentially increase enrollment in CBP.

SDG&E offers a Demand Bidding Program and has two non-residential customers enrolled in this program. In 2012, the Demand Bidding Program was offered on a day-ahead basis with incentives to customers for reducing their demand during an event. In response to SDG&E's questions about incentives, the DBP customers indicated that the incentives were not high enough. The Commission adopted SDG&E's proposal to change this program from a day-ahead to a day-of, 30-minute trigger program.110

109 An aggregator is an entity that aggregates or combines customer load and makes it available for interruption.



Another SDG&E non-residential program offering is the Peak Time Rebate (PTR) program (Commercial). On event days, participating customers are asked to reduce their electricity use during the event duration. Customers can sign up to be notified of events in advance. Commercial customers signed up for alerts at a much lower rate than residential customers and also provided less load reduction. Most likely, this is due to a limited ability to reduce load between 11 a.m. and 6 p.m.

SDG&E included three post-event PTR surveys, which provided results on residential and small commercial customers' experiences with PTR events. Key trends were:

- Small commercial customers were generally aware of Reduce Your Use days. However, event-specific awareness was lower.
- Small commercial customers indicated that they face different challenges than residential customers in responding to PTR events. Feedback on program improvement from commercial customers included the following comments: commercial customers stated that they were not able to reduce more and were already doing what they could; small commercial customers indicated that they would benefit from advance notification; and finally, they stated that responding to events would affect their business operations or customer comfort.

General program feedback from SDG&E indicated that estimating the effects of customer fatigue on load impacts is difficult. When event days are called in a row, there are so many varying factors, such as events being called on different days of the week and varying temperatures on event days, that it is difficult to determine whether the change in load impact is due to multiple event days or other influencing factors. SDG&E describes its experience with PTR events called in quick succession and indicates that preliminary load impacts were lowest on the last day. This may be due to customer fatigue.

For the other programs, the load impacts did not show evidence of customer fatigue. Again, this does not conclusively show that customer fatigue was absent; it was simply not measurable relative to other variations in load impacts between events.

SCE launched a Summer Readiness Campaign in April 2012 in order to prepare customers for the upcoming summer. Overall, the customer experience was responsive and positive.

SCE offers many non-residential DR programs similar to the programs offered by SDG&E. These programs include an AC Cycling program, which offers customers various AC cycling options whereby the utility can directly turn off the customer's air conditioner when needed. Customers receive a credit based on several factors, including the program cycling options that they choose and the AC unit cooling capacity. SCE has proposed changes to this program and has requested the ability to call events not just for emergency reasons, but also when prices are high.

110 D.13-04-017 at 15.



Another non-residential program offered is the Auto DR program. The program provides incentives to offset the cost of purchasing and installing technology to help the customer automatically manage their load. The customer determines their load control strategy and pre-sets it in the technology. With the technology in place, the program automates the process of reducing demand by sending a signal from a central system to the customer's automated control system, which then automatically reduces usage during the program event duration.

During the 2012 Summer Event Season, the Demand Response Help Desk (for large business customer DR programs) received 1,410 calls. 21% of those calls were related to program events; however, none of the callers indicated that there were too many events or that the incentive payments were inadequate. Non-event calls (79%) pertained to program eligibility, questions about enrollment, assistance with online tools, and other general program information.

In mid-August, SCE conducted market research to gauge customer awareness of enrollment campaigns and SCE messaging, customer actions in response to the campaigns, and attitudes towards SCE and energy conservation. Overall, most of the residential and small business customers heard the campaign message and attempted to reduce their usage. Most of the customers understood the need to conserve energy over the summer and attempted to do so. Customer awareness was raised, and the campaign prompted some customers to enroll in SCE programs. Customer attitudes about reliability and avoiding outages remained strong.

SCE did not observe customer fatigue during the 2012 event season for DR programs in general. The programs are able to avoid multiple consecutive days of events because of flexibility in their dispatch triggers.

III. Commercial Air Conditioning (AC) Cycling

A. SCE's Summer Discount Plan

In December 2012, SCE conducted a pilot telephone survey on several programs, including the Summer Discount Plan (SDP) program.111 The overall sample size was 200 business participants, though the sample size varied by the question being asked. Satisfaction with the program was moderate, with only 72% of the participants aware that their business was enrolled in SDP. The Decision (D.13-04-017) approved SCE's proposed changes to the SDP Commercial program, and we examine the commercial customer experience, as presented through this survey, in detail.

Overall, a large percentage (81%) of participants felt that the program was worthwhile. Of the three main touch points identified, billing was the key one, and customers were moderately satisfied with this touch point. Relatively few participants (18%) had reasonably high familiarity with the program details. Customers who had high familiarity tended to be more satisfied. Customers who received targeted SDP communication were more familiar and more satisfied with the program.

111 Service Delivery Satisfaction Recalibration: Summer Discount Plan 2012 Pilot Survey.


Most of the business participants were SCE customers at home (86%) and were predominantly male (63%).

There were three main touch points, which had a significant impact on satisfaction. Billing, enrollment, and events were key drivers, with billing being the highest-priority driver and events being the lowest-priority driver. Customers had difficulty identifying the discount and did not think the discount was fair given the effort it took to participate. For enrollment, customers were primarily concerned with delays in the device installation. Customer reasons for event dissatisfaction were, specifically, the time, day, and frequency of events, as well as a perception of fairness.

Satisfaction with billing was 78%, which was considered moderate compared to other SDP touch points. 83% received paper bills, while 19% received electronic bills. 17% could not easily find the discount on the SDP bill. The top two comments on the SDP bill were to provide a separate line item for the discount and to offer a bigger discount or lower rate. The bill currently includes a separate line item for the discount, but customers are not able to find it and need to be reminded that it is there. Problems with event attributes were low (8%).

Satisfaction with enrollment was modest (78%). 7% experienced problems with enrollment, with most of the problems related to confusion (amount of the discount, expected savings) or delays (waiting for the device to be installed, multiple visits, and multiple phone calls).

The more satisfied customers were the ones who were aware that:

1. They receive a discount regardless of events.
2. The indicator light identifies an event in progress.
3. Events occur between June 1 and October 1.
4. The maximum duration of an event is 6 hours.

Relatively few participants (18%) had reasonably high familiarity with the program details. Most business participants were aware of the 100% and 50% cycling options. Awareness of the methods (indicator light, SCE.com) for determining whether the device was currently cycling the AC off was below half of respondents. Only 22% of the respondents knew the correct start and end months of the program; many of the other customers did not know or did not provide correct answers to the question. Those who were correct tended to be more satisfied with the program.

Only 12% of the respondents identified the correct 6-hour maximum that an event can last. 21% said that there was no limit, and they were less likely to be satisfied with the program. The 57% of respondents who knew that they receive discounts whether or not events are called were more likely to be satisfied as a result. The number of events did not impact satisfaction.

When investigating the reasons for program satisfaction, 36% of the respondents were happy with the program and 17% responded that a good discount was provided. 19% of the comments were negative; 11% of this feedback was related to financial reasons, such as the bill being too high, the bill increasing, or the discount being small. Bad customer service was another negative, at 5%.


Around a third of the customers provided suggestions for improving the SDP. In this feedback, financial comments were paramount, with the following reasons being cited:

- Lower rates (5%)
- Bigger discount (5%)
- Better communication (4%)

A large percentage of participants (77%) did not know the discount amount. Participants who are most satisfied are likely to know the discount dollar amount.

Participants with moderate to high familiarity were more likely to have received recent communication from SCE. All types of communication boosted familiarity, though the written method was the dominant form.

B. SDG&E's Summer Saver Program

The findings in this section are based on KEMA's process evaluation of the 2008 Summer Saver program.112 At the time of the evaluation, the program had 4,500 commercial participants (and an even greater number of residential participants). Commercial customers can choose between 30% and 50% cycling options and between a 5-day and a 7-day option. To the extent that commercial customer experience was provided, it is cited in this report. Otherwise, general feedback is cited.

Since this information is dated, we used it primarily for general feedback on the Summer Saver Program at SDG&E and as a means of comparing the AC Cycling programs of SCE and SDG&E.

A key conclusion of the report was that improving the program marketing and informational materials would reduce program dropouts and attract more interested customers. Better information about cycling frequency could have resulted in less dissatisfaction and dropout. Discomfort and program cycling were most often the top reasons for dropout. Better marketing could have reached customers who are interested in the program. The report recommended customizing marketing messages to customer subgroups. Surveys of Summer Saver participants and non-participants discovered that bill credit messages had greater appeal to lower-income customers, while environmental messages had greater appeal to higher-income customers.

With regard to cycling options, the report recommended reducing program complexity by reducing the number of cycling options. A related cycling recommendation was not to increase the cycling frequency. Currently the program cycles 10-12 times a year, and participants indicated that they were uncomfortable during Summer Saver control events. The key reason participants joined the program was the financial incentives.

112 Process Evaluation of SDG&E Summer Saver Program, March 19, 2009.


Key Lessons Learned from the AC Cycling Programs

When comparing the feedback received for the AC Cycling programs of SCE and SDG&E, a clear recommendation emerges – marketing, clear communication, and managing expectations are key facets of the program. When customers know what to expect, they tend to be more satisfied with the program. SCE customers with high familiarity with program attributes fared better and were more satisfied. SDG&E could also improve marketing and information materials to reduce dropouts and attract interested new customers.113 Clarifying the program and making it less complex is important to attract and retain customers. Targeting those messages, by subgroups in the case of SDG&E, is another useful method of attracting customers based on their values and priorities, whether they are financial or environmental.

D.13-04-017 approved SCE's proposal to modify its commercial program from a reliability-based DR program to a program that can be dispatched for economic purposes. The new trigger will allow the program to be called when there are high wholesale market prices, which occur during times of extreme temperature or when system demand is higher than expected. Additionally, SCE will consolidate the Base and Enhanced commercial programs into one program with different features, and proposes that the SDP be made available year-round. The key program changes are outlined below:

Table 35: SDP Commercial Key Program Changes

Program Element           | Current Design                          | Approved Design
Curtailment Event Trigger | Emergency Only                          | Emergency and Economic
Program Availability      | Events can occur June 1 – September 30  | Events can be called year-round with a maximum of 180 event hours during a calendar year.
Event Duration            | 6 hours                                 | Multiple events may occur in a single day, with varying durations. Maximum 6-hour interruption in a day.
Customer Cycling Options  | 30%, 40%, 50%, and 100%                 | 30%, 50%, 100%

With the movement to an economic-based program and the new program features, it is paramount that the marketing campaign clearly explain the changes, such as the duration of events, which is now expected to be shorter even though the program can be called year-round. Billing changes should also be made to assist customers in identifying discounts. SCE customer feedback on program improvement was largely financial. SCE's new program design has new incentive levels. The newly proposed enhanced program will pay a greater incentive per ton per month than the current enhanced program, and the new incentive should be communicated clearly to participants, whether through clear bill presentation, marketing efforts, or a combination of the two.

113 This survey is dated, so SDG&E may have made marketing modifications to alleviate some of the concerns presented above.


In 2012, the Summer Discount Plan Commercial was triggered once, for 5.6 hours.114 With its movement to an economic-based program, which can be called when wholesale market prices are high, it is likely to be called more frequently. The Capacity Bidding Program Day-Ahead (CBP-DA) had a heat rate trigger condition in 2012, and that program can be used as an example of how frequently a non-emergency-based program may be called. In 2012, the CBP-DA was called 12 times.

The proposed changes to SCE's AC Cycling program may provide the needed megawatts this summer and will also benefit customers, who are often financially motivated. However, these expected changes need to be communicated clearly to avoid customer dissatisfaction and possible customer dropout from the program. If the marketing program is managed carefully, SCE's Summer Discount Program for commercial customers can be a useful source of load impact for the summers of 2013 and 2014.

IV. SCE's Auto DR

Auto DR is a technology program whereby customers receive an incentive to install equipment that improves their ability to reduce load during a DR event. Auto DR is considered to provide a better load shed, as described in the Decision (D.12-04-045):

"Limited data suggests that ADR customers have a higher participation rate in DR programs and provide better load shed. Data also suggests that customers on dynamic rates perform better with ADR."

SCE's Auto DR customers pre-program their level of DR participation, and when a DR event is called, the Auto DR technology enables the facility to participate automatically. This method reduces the need for a manual response. All non-residential customers must have an interval meter and participate in an available price-responsive DR program. As of 2013, customers are paid 60% of the technology incentives upon installation, testing, and verification; 40% of the eligible incentives are paid according to participation in a DR program.115

By the end of 2012, SCE's Auto DR program funding was 100% subscribed.116 In the Application, SCE requested approval for an additional $5 million for Auto DR, which would be earmarked for projects in the Target Region. The majority of the funding ($4,200,000) was for the technology incentive payments.

The key questions that arose were what the customer experience was with the Auto DR program and how effective customers were in shedding load when events were called. In the Application, SCE does not provide a breakdown of DR programs by customers who participate in Auto DR. To understand customer experience better, we refer to other studies of the efficacy of Auto DR and customer feedback on the program.

CPP is a rate that sets a higher price for electricity during critical peak days. In return, the customer receives a reduction in the non-peak energy charge, demand charge, or both.

114 SCE-03.
115 SCE-03 at 25.
116 SCE-02 at 18.


A report on the non-residential CPP presents the estimated ex post load impacts of Technology Incentives and Auto DR participants, on average, for 2011 CPP events.117

SCE called 12 CPP events in 2011 between June and September. On average, each event had 3,006 participants. For SCE's CPP customers in 2011, the percentage load impact was 5.7% for the average event, and the average load impact was 11.6 kW. There were 35 CPP customers on Auto DR, and they provided a percentage load impact of 21%; their average load impact was 103 kW. Based on this information, customers on Auto DR and CPP provide a greater load reduction than customers on the CPP rate alone.

To further understand Auto DR and its potential to provide load impacts, we examine a study by EnerNOC on CPP.118

SCE and SDG&E offer Technology Assistance and Technology Incentives programs. SDG&E's Technical Assistance program provides customers with energy audit services to identify potential for energy cost reduction and encourage participation in DR and EE programs. The Technology Incentive program at SDG&E provides financial incentives and on-bill financing (interest-free) for customer adoption and installation of DR measures and enabling technologies.

The EnerNOC study outlines the barriers to responding to an event. The main barrier for the bottom performers, or the low-performing participants, is the lack of ability to reduce demand because of business needs, and the fact that responding to an event would negatively impact business functions. Examples of these limitations included a need to maintain a temperature to prevent produce from spoiling or to protect sensitive equipment from damage, or simply for staff comfort. A related barrier to response is a lack of knowledge on how to reduce load. Additional barriers included a lack of enabling technology; however, this was not identified as a top concern.

Most of the top responders, or the high-performing participants, are able to easily shift their processes or shut down heavy energy-using equipment and respond to events. However, few of the top responders use technology to automate their response.

The main barrier to responding to events is the ability to respond without suffering negative business consequences. For businesses that have the capacity to respond without being negatively impacted, technology is a possible solution that can be explored. Due to the small population size, only 16 technology-enabled customers were interviewed; the majority of those interviewed were SCE customers. This is a small sample size, and the feedback should be interpreted with caution. Half of these customers said that the technology was important for their response, and the other half stated that they would have stayed on the CPP rate without the technology. Four of the customers interviewed do not utilize the technology installed; 3 of these customers respond to events.

Select feedback includes:

"The load we shed is entirely enabled by the Auto DR technology" – SCE technology-enabled customer.

117 2011 California Statewide Non-Residential Critical Peak Pricing Evaluation, p. 41.
118 California Statewide CPP Research on Improving Customer Response, December 3, 2012.


Most of the bottom responders do not use technology to respond and are not aware of options in this regard.

From the quantitative data, Auto DR customers on the CPP program provide greater load impacts than customers on CPP without enabling technology. However, the data above, though limited due to the small sample size, can provide direction for continued research. With the additional Auto DR funding requested in SCE's application, the participant pool continues to grow. With this growth, it is possible to conduct better studies with more robust results. Studies can attempt to isolate the benefits provided by Auto DR, in particular the load impact that can be attributed to Auto DR and that, in its absence, could not have been achieved.

V. SDG&E's Demand Bidding Program (DBP)

The Commission approved SDG&E's Demand Bidding Program in the middle of summer 2012 as part of the mitigation efforts to address the SONGS outage.119 SDG&E called 3 day-ahead DBP events and obtained load reductions of 5.1 MW, 5.4 MW, and 4.6 MW. During the course of 2012, SDG&E spent $44,192 on this program, which was minimal compared to its residential PTR program costs of $10 million for only 4 MW of load reduction. The recent DR decision (D.13-04-017) approves SDG&E's proposed continuation of its Demand Bidding Program, modified from a day-ahead to a day-of, 30-minute product. The purpose of this modification was to align the program with the Energy Division Letter and provide programs with quick-response capability. In its comments, the Navy stated that the DBP with a 30-minute trigger would only permit participation from entities with automated demand response systems, in effect reducing participation. The Navy requests the continuation of the day-ahead program.

The Navy states that the 2012 DBP did not allow for the Navy's participation and that the change to a day-of, 30-minute program will further limit the Navy's ability to participate.

Instead, the Navy proposes a day-ahead program with some modifications. The Navy proposes that the customer be allowed to aggregate 8 billable meters. The second proposal is to lower the minimum bid requirement from 5 MWh to 3 MWh. The Navy states that it may not be able to produce 5 MWh at a single geographic location and cites its experience of August 2012, when, during a Demand Reduction test, the Navy shed 4 MWh from a multitude of shore facilities on three Navy installations.

SDG&E responded to the Navy's comments and explained why it believes the Navy did not participate in the 2012 DBP. SDG&E understands that the Navy will participate in an emergency program; however, the DBP is not a day-ahead emergency program.

In its response, SDG&E indicated its willingness to work with the Navy and create a demand response program to meet the Navy's unique needs.

119 Resolution E-4511, July 12, 2012.


Staff recommends that SDG&E and the Navy collaboratively develop a Navy-only DBP program, which would address the following issues raised in the Navy's comments on the 2013-2014 DR Proposed Decision:120

1. A day-ahead trigger to enable the Navy to appropriately plan for the event.
2. The ability to aggregate 8 billable meters.
3. A lower minimum bid requirement, from 5 MW to 3 MW.

Experience from the Demand Reduction test demonstrates that the Navy has the ability to reduce load and be a useful DR resource for SDG&E's system during the summer.

120 Filed on April 9, 2013, in A.12-12-016.


Chapter 6: Flex Alert Effectiveness

I. Summary of Staff Analysis and Recommendations

The Flex Alert campaign has not been evaluated since 2008. Earlier evaluations from 2004-2008 suggest that the impacts of an emergency alert have ranged from an estimated 45 MW to 282 MW. The utilities have identified areas to improve communication between the CAISO and the utilities when alerts are triggered and cancelled. The utilities also question whether customers are confused about the differences between a Flex Alert event and local Peak Time Rebate events. The utilities cite several reasons to consider transitioning Flex Alert from a utility-funded program to a CAISO-led and -funded program.

Staff finds that there is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign. Staff agrees with the utilities that an evaluation in the current program cycle is needed. Staff finds that there is merit to the utilities' proposal to terminate Flex Alert as a ratepayer-funded and utility-led activity after 2015. Rather than providing recommendations in this report, staff defers to the proceeding that is currently reviewing the utilities' statewide marketing, education and outreach applications (A.12-08-007, A.12-08-008, A.12-08-009, and A.12-08-010) and the Phase I Decision in that proceeding, D.13-04-021.

II. Background

"Flex Alert" is the current name of a statewide marketing campaign that asks Californians to conserve or shift electricity use when the CAISO determines that there is a risk that electricity supply may not be adequate to meet demand.121 This alert campaign is approved through CPUC decision, and the CPUC authorizes the three investor-owned electric utilities to provide the total budget. One utility acts as the lead utility and contracts with a marketing agency to develop TV and radio ads and purchase advertising time. The marketing agency purchases advertising slots throughout the state to run the ads during the summer season, when demand is likely to be highest and the grid is more likely to be constrained. The CAISO triggers an alert based on grid conditions and informs the utilities and the marketing agency. The marketing agency swaps out informational advertisements with emergency alert messages, calling a "Flex Alert" and asking Californians to do three things during a six-hour window of time on a specific day: 1) turn off unnecessary lights, 2) set air conditioners to 78 degrees, and 3) wait until after 7 p.m. to use major appliances. Individuals and businesses also have the opportunity to sign up to receive email or text messages notifying them that there will be an alert.

Flex Alert Performance in 2012

In 2012, two Flex Alert events were called, on August 10 and August 14. Initially, Flex Alerts were triggered on August 9 for August 10-12; however, the alerts for August 11 and 12 were later cancelled. A formal evaluation of Flex Alert was not conducted in 2012. The utilities did not conduct any analysis to estimate the impacts that resulted from either Flex Alert event.

121 From 2001-2004 the name for emergency alerts was Power Down, and from 2004-2007 they were referred to as Flex Your Power Now.



SDG&E was concerned that customers would not recall the difference between Reduce Your Use, which provides customers a bill credit, and Flex Alert, which provides no monetary benefits. In the event that customers did not understand the difference between Flex Alert and Reduce Your Use, SDG&E wanted to avoid the customer frustration that could occur when customers reduced their usage during a Flex Alert but were not paid for it. To mitigate confusion, SDG&E triggered its Reduce Your Use program on the same days when Flex Alerts were called. There were three days when the utility triggered a Reduce Your Use event when there was no Flex Alert. However, the utility claims that the weather on the three Reduce Your Use event days was atypical, and therefore the utility cannot determine the load reductions attributable to Flex Alert by comparing Flex Alert days with the days when Reduce Your Use was called and Flex Alert was not.122

SCE also states that with the limited data available the utility cannot determine the effect of a Flex Alert. SCE did a basic comparison of two days with similar conditions when the same DR programs were dispatched, one day with a Flex Alert and one day without, and concluded that Flex Alert could be counter-productive because SCE's total system load was higher on the day the Flex Alert was called.123

In comparison, there have been three evaluations of the alert campaign in its history: 2004-2005, 2006-2007, and 2008. The 2004-2005 evaluation did not estimate the impact of an alert event. The 2006-2007 evaluation reported that the system-wide demand response impact on Flex Alert days (including all other demand response programs that were called) ranged from 200 MW to 1,100 MW; the impact from Flex Alert was a portion of this total. The 2006-2007 evaluation estimated the load impacts associated with alert events, specifically from customers adjusting their air conditioner settings in response to the ads. Although the study estimated impacts ranging from 93 MW to 495 MW,124 in 2008 the consultant redid its analysis with revised assumptions and adjusted the estimate to between 45 MW and 75 MW.125 The 2008 evaluation estimated load impacts based on customers turning off lights and adjusting air conditioners. The study estimated that impacts from alert events in 2008, based on customers taking these two actions, ranged from 222 MW to 282 MW.126

There has been a long gap since Flex Alert was last evaluated in 2008. In 2009 and 2010 no Flex Alert events were triggered. In 2011 there was one event, but there was no evaluation. Given that Flex Alert has not been evaluated since 2008, and the utilities seem unable to draw any conclusions about load impacts attributable to Flex Alert events, it is reasonable to plan an evaluation for the summer of 2013. The Commission issued a Decision on the utilities' statewide marketing application on April 18, 2013. The Decision includes a directive to evaluate the program.127

122 SGE-02 at 25.
123 SCE-01 at 59.
124 2006-2007 Flex Your Power Now! Evaluation Report, Summit Blue Consulting, May 22, 2008, p. 126. A link to this report is provided in Appendix S.
125 2008 Flex Alert Campaign Evaluation Report, Summit Blue Consulting, December 10, 2008, p. 102. A link to this report is provided in Appendix S.
126 Id.

III. Utility Experience with Flex Alert

SCE identified three weaknesses with the implementation of Flex Alert. First, the utility states that challenges exist because neither a utility nor the PUC owns the trademark to the name Flex Alert. Second, the utilities did not receive advance notice from the CAISO when a Flex Alert was triggered or cancelled. Third, the inability of the CAISO to accurately forecast the duration of an alert resulted in confusion when an alert was cancelled.128

The utilities state that they were contacted by CAISO at the same time that the news media and the general public were informed about a Flex Alert. CAISO held weekly calls with the utilities to discuss weather forecasts and the likelihood a Flex Alert would be called. However, when an alert was triggered, the utilities learned of the event through a robo-call, automated email, or text message, which are the same methods used to inform residential customers and media outlets. The utilities would prefer to have advance notification so that they are able to strategically coordinate their own DR program initiation and to proactively communicate with customers.

The cancellation of the weekend alerts on the 11th and 12th also caused confusion. SDG&E claims that both internal staff and local media were confused about whether or not conservation and Reduce Your Use days were still necessary.129 SCE acknowledges that it is inefficient and costly to re-contact media outlets to cancel alerts. Flex Alert radio and television commercials continued to air throughout the weekend because the marketing agency was not able to give the media stations adequate time to switch the messages before the ads were locked in for the weekend. To add to the confusion, CAISO's website continued to indicate there was a Flex Alert even though the agency had issued a press release stating the weekend alert events were cancelled.130

Prior to the start of the 2013 Flex Alert season, the utilities, CAISO, and the marketing agency should discuss the weaknesses identified by the utilities from 2012. The organizations should use their expertise and the recommendations from past Flex Alert evaluations to identify methods to improve the timeliness of communication and ensure that implementation is as efficient and effective as possible.

IV. Customer Experience

Only one utility, SCE, conducted a survey in 2012 to determine customer awareness of and reaction to the 2012 Flex Alert campaign. Although SDG&E did not conduct a survey, the utility raised concerns that both customers and the media seem confused about the difference between Reduce Your Use events and Flex Alert.131 SCE found that 10 percent of surveyed customers were confused about the difference between the utility's Peak Time Rebate and Flex Alert.132

127 D.13-04-021, Ordering Paragraph 14.
128 SCE-01 at 60.
129 SGE-02, Attachment 1 at 26.
130 SCE-01 at 60.
131 SGE-02, Attachment 1 at 26.

SCE reported the following results from its survey of 400 customers:133

• Nearly 60% of residential customers reported hearing or seeing a Flex Alert advertisement
• 54% of small business customers reported hearing or seeing a Flex Alert message
• 25% of residential customers surveyed reported that they took steps to reduce electricity use on a Flex Alert day
• 21% of small business customers reported taking steps to reduce when a Flex Alert was called

Compared with past evaluations, customer recall of Flex Alert ads has increased from one evaluation period to the next. In 2004-2005, 12 percent could recall hearing or seeing an ad, compared to 15 percent in 2005-2006, and 23 percent in 2008.134 A formal evaluation in 2013 can help determine if the jump in awareness reflected in SCE's survey results is an accurate reflection of the trend. The 2013 evaluation should take into account the variety of mechanisms used to relay information about alerts to customers. For example, 2012 was the first year that the utilities conducted outreach through Community Based Organizations to help prepare customers for a Flex Alert event.

Another highlight from SCE's survey is that 25 percent of residential customers took action in response to an alert. This percentage is also an increase from past evaluation results. In past years, between 10 and 21 percent of residential customers reported taking action in response to the ads.135 However, SCE's survey failed to determine whether customers accurately understood the message that a Flex Alert is intended to convey. All three prior evaluations found that customers did not understand that they were supposed to adjust their behavior for just the day of the Flex Alert event. Instead, customers reported continuing to conserve during afternoon hours every day after the event had been called.136 While conservation has its own benefits, the purpose of a Flex Alert is for customers to shift load during a brief peak event. It will be important for the utilities, the CAISO, and the marketing agency to continue to strive to accurately relay this concept, and for an evaluation to determine if the right message is getting through to customers.

The utilities made one specific recommendation to improve the program in 2013-2014: they proposed to continue community outreach partnership efforts in 2013 and 2014 in the demand response proceeding. The Commission adopted a Decision on April 18, 2013, which approves these requests.

132 SCE-01 at 61.
133 SCE-01 at 61.
134 Process Evaluation of the 2004/2005 Flex Your Power Now! Statewide Marketing Campaign, Opinion Dynamics Corporation, July 24, 2006, p. 5; 2006-2007 Flex Your Power Now! Evaluation Report, Summit Blue Consulting, May 22, 2008, p. 90; 2008 Flex Alert Campaign Evaluation Report, Summit Blue Consulting, December 10, 2008, p. 83. Links to these reports are provided in Appendix S.
135 Id.
136 Id.

V. The Future of Flex Alert

SDG&E cites a passage from SCE's testimony in its Statewide Marketing Application which identifies several reasons that the Commission should consider having CAISO take full control of the statewide emergency alert campaign starting in 2015. SDG&E states that it supports the recommendation made by SCE. SCE's testimony states that since 2004 the utilities have funded alerts through ratepayer dollars. However, when alerts are called, the results benefit customers outside of the utilities' service territories as well, yet neither CAISO nor non-utility Load Serving Entities contribute to the funding. SCE also pointed out that from 2007-2011 only one alert was triggered. Increased growth in utility demand response programs has positively impacted grid reliability, the utility states. SCE found that balancing utility-specific regulatory constraints with the CAISO's desired scope of the program was challenging. As an example, CAISO requested to share emergency alert messaging with Baja Mexico to promote energy conservation in that region. SCE's testimony goes on to state that the utilities do not have the discretion of when to trigger the program. SCE recommended that, since CAISO triggers the program, the CAISO should assume total ownership of, and authority over, it. SCE requests that this recommendation be approved during 2013-2014 so that CAISO has the opportunity to seek funding in its GMC cost recovery.137

The Commission adopted a Decision on Phase 1 of the utilities' statewide marketing application on April 18, 2013. The Decision authorizes a total of $20 million to be spent on Flex Alerts between now and the end of 2014. The Decision also directs the program to be evaluated. The Decision includes a directive for the utilities to work with CAISO to develop a proposal for the transfer of the administration and funding of the Flex Alert program to the CAISO or another entity, effective in 2015. The Decision directs SCE to submit the proposal in the Statewide Marketing Proceeding by March 31, 2014.138

VI. DR Program Ex Post Load Impact Results on the Flex Alert Days

As shown in the tables below, all three utilities triggered a DR event for some of their DR programs during the two Flex Alert days, with a total of 739 MW of load reduction from 4:00-5:00 p.m. on August 10, 2012 and 432 MW from 3:00-4:00 p.m. on August 14, 2012. The CAISO reported that the actual system peak load during the peak hours between 3:00 p.m. and 5:00 p.m. was significantly lower than its forecasts and attributed the load drops to its Flex Alerts. However, the data suggest that some, potentially large, portion of the load reduction came from the DR programs. Appendix P shows the ex post load impact for each of the utilities' DR programs on the two Flex Alert days.

137 SGE-02, Attachment 1 at 28.
138 D.13-04-021 at 25-27.


Table 36: Utilities' DR Program Ex Post Load Impact on the Flex Alert Days139

Utility     Ex Post (MW), 3:00-4:00 p.m.    Ex Post (MW), 4:00-5:00 p.m.

8/10/12:
SCE         194                             185
SDG&E       8                               27
PG&E        459                             527
Total       661                             739

8/14/12:
SCE         394                             242
SDG&E       38                              38
PG&E        No Events                       No Events
Total       432                             280

139 Provided to staff through emails. Data source: the utilities' April 2, 2013 Load Impact Reports (links to the reports are provided in Appendix S).


Chapter 7: Energy Price Spikes

I. Summary of Staff Analysis and Recommendations

Because most DR programs are dispatched a day ahead or several hours ahead of events, it is difficult for the utilities to effectively use DR programs in response to real-time price spikes. There were many days where price spikes occurred but DR programs were not called, and conversely there were days where DR programs were called but no price spikes occurred. DR programs with 30-minute or 15-minute notice could respond to price spikes much more efficiently.

II. Definition of Price Spikes

For the purposes of this report, a price spike day was defined as any day where the average hourly real-time price hit $150/MWh or more in 3 or more hours from hour-ending (HE) 12 through HE18. This definition is designed to evaluate only those hours where DR could respond. By restricting the definition to HE12-HE18, the definition considers only those hours where DR can be called. By also restricting the definition to days where 3 or more hours were above $150/MWh, it eliminates days with momentary jumps in price that DR could not reasonably be expected to respond to.
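This screening rule can be expressed as a short calculation. The following is a minimal, illustrative sketch only: it assumes hourly average real-time prices are available as (date, hour-ending, price) records, and the function name and data layout are assumptions; the $150/MWh threshold, the HE12-HE18 window, and the three-hour minimum come from the definition above.

    from collections import defaultdict

    PRICE_THRESHOLD = 150.0   # $/MWh, from the definition above
    WINDOW = range(12, 19)    # hour-ending 12 through 18
    MIN_HOURS = 3             # hours at/above the threshold needed to flag a day

    def price_spike_days(hourly_prices):
        """Return the set of days meeting the price-spike definition used in this report.

        hourly_prices: iterable of (date, hour_ending, avg_price) tuples, where
        avg_price is the average real-time price for that hour in $/MWh.
        """
        hours_over = defaultdict(int)
        for date, hour_ending, avg_price in hourly_prices:
            if hour_ending in WINDOW and avg_price >= PRICE_THRESHOLD:
                hours_over[date] += 1
        return {date for date, count in hours_over.items() if count >= MIN_HOURS}

    # A day with three qualifying hours is flagged; a single momentary jump is not.
    sample = [("2012-08-10", 13, 162.0), ("2012-08-10", 14, 155.5),
              ("2012-08-10", 15, 210.0), ("2012-08-11", 16, 480.0)]
    print(price_spike_days(sample))  # {'2012-08-10'}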

III. DR Programs and Price Spikes

Using the definition above, SCE had 67 hours that averaged $150/MWh or more across the hour, with 7 days where 3 or more hours averaged $150/MWh or more across the hour. SDG&E had 126 hours that averaged $150/MWh or more across the hour, with 18 days where 3 or more hours averaged $150/MWh or more across the hour.

DR events overlapped real-time price spikes with varying success. SCE was successful on 2 out of 7 days, whereas SDG&E was successful on 4 out of 18 days.140

Table 37: Number of Days with Energy Price Spikes

                                                                              SCE    SDG&E
Days that DR events successfully overlapped price spike days
(3 or more hours of $150/MWh) between HE12-HE18                                 2        4
Number of price spike days (3 or more hours of $150/MWh) between HE12-HE18      7       18
Days that DR events were called                                                43       15
Days that DR events were called but without a price spike ($150/MWh)
occurring                                                                      31        6
Days with at least 1 price spike of $150/MWh                                   36       60

Most of the utility price-responsive DR programs are currently designed to be called a full day ahead of when the load reductions are needed. The existing programs therefore do not have real-time hourly prices as a trigger. They are triggered by other market indicators such as heat rates and forecasted temperature. According to SCE, price spikes occur with 2.5 minutes' notice, and any resource that could be used to mitigate price spikes would have to be already bid into CAISO's market awaiting CAISO dispatch instructions.141 DR programs are currently not bid into CAISO's market.

140 For a more complete chart, see Appendix Q.

To the extent that DR programs were triggered when price spikes occurred, it is outside the scope of this report to quantify the impact of DR programs on those price spikes. The quantification of those impacts would require some method of modeling what the prices would have been but for the load impacts of the DR programs.

In theory, DR should have had some impact on prices, given that DR events overlapped price spike days on a few occasions. Demand response on those days probably had some downward impact on the equilibrium price (i.e., mitigating the price spikes).

IV. Conclusion

DR programs are not able to address real-time price spikes because of their current design, and because the programs are not yet bid into CAISO markets. The Utilities should design new DR programs that enable them to mitigate real-time price spikes, in anticipation that these programs will be bid into CAISO markets.

141 A.12-12-017, SCE Exhibit 1, pp. 48-49.


Chapter 8: Coordination with the CAISO

I. Staff Recommendations

Because the Utilities' current DR programs are not integrated into the CAISO wholesale energy market, there is no market mechanism to inform the CAISO how much DR capacity exists in the system on a daily and hourly basis. Such information is important for the CAISO's operational consideration. The utilities' Weekly and Daily DR reports developed in summer 2012 are a valuable alternative for making their DR resources more visible to the CAISO. Staff appreciates the Utilities' efforts in the development and submission of the Daily and Weekly DR reports. Staff agrees with the CAISO that all three utilities should submit the Daily and Weekly DR reports in summers 2013 and 2014. The utilities' (including PG&E's142) DR reporting requirements for 2013-2014 are summarized in Appendix R.

II. DR Reporting Requirements in Summer 2012

As discussed above, prior to summer 2012, under the oversight of the Governor's Office, the Commission worked closely and intensively with the CAISO, the CEC, and the Utilities on the contingency planning to mitigate the potential effects of the SONGS outage and ensure system reliability throughout the summer. One of the initial steps was to identify the Utilities' DR resources available to address the five different types of system contingencies (such as transmission, voltage collapse, and generation deficiency), which is referred to as the mapping of the DR programs.

The next step was to develop a mechanism to inform the CAISO how much Day-Ahead and Day-Of DR capacity is available on a daily and hourly basis. Unlike other generation resources, DR is currently not integrated into the CAISO's wholesale energy market. Under the CAISO's DR Resource User Guide,143 the Utilities are required to submit the forecast and results only for the triggered DR programs. Therefore, if no DR program is triggered, the CAISO is blind to how much DR capacity exists in the system. With the exception of the Emergency Program, the DR programs are dispatched by the utilities, not the CAISO. This operation, as well as the reporting requirements set in the CAISO's guide since 2007, had not presented any problems in the past when the system had sufficient resources.

However, in light of the SONGS outage, CAISO emphasized the importance of daily communication on the Utilities' DR programs so the CAISO's grid operator could request the Utilities to dispatch their DR programs if and when they are needed. Working cooperatively with CAISO and the Commission staff, the Utilities developed and submitted the Daily DR reports from June 1, 2012 to October 31, 2012. The Utilities continued to submit the results of the DR events seven days after each event (referred to as the "7-Day Report") consistent with the CAISO's guidance. Staff provided to the Governor's Office the data from the Daily and the 7-Day reports in weekly briefings during summer 2012.

142 As staff guidance only, because PG&E is not subject to this proceeding.
143 DRAFT Version 1.0, August 30, 2007. http://www.caiso.com/1c4a/1c4a9ef949620.pdf.

III. DR Reporting Requirements for 2013-2014

In its 2013-2014 DR application, SCE proposed to eliminate the Weekly and Daily DR reporting requirements because it did not find that these reports provided value for SCE. SCE recommends transitioning back to the 2007 CAISO User Guide, but suggests that the CAISO update and publish it for all DR Providers.144 In its protest to SCE's application, the CAISO objects to SCE's proposal and requests that the Utilities resume the Daily DR reports after the winter season ends. The CAISO contends that "(t)he underlying purpose of the date forecasting and publication was to benefit the system operator rather than the IOUs themselves. The ISO finds good value in the daily demand response reports. Because the report mechanism, the ISO is no longer blind to how much DR capability exists in the system in a daily and hourly basis, if and when it is needed."145

Staff finds that these reports have value not only to the CAISO, but also to the Commission. Through the Daily and 7-Day reports, staff was able to monitor and provide timely DR status updates to the Governor's Office throughout the summer. There were a number of lessons learned that led to the development of the comprehensive questions on DR performance. Therefore, staff recommends the continuation for 2013-2014 of all of the DR reports submitted in 2012, as summarized in Appendix R.

144 A.12-12-017, SCE-1, at p. 54.
145 A.12-12-017, CAISO's Comments filed on January 18, 2013.


Appendix A: Highlight of 2012 Summer Weather & Load Conditions146

SCE

Date         Max Temp (°F)   Max RA Temp (°F)   DR Ex Post Load Impact (MW)   Peak Load (MW)
8/10/2012    89              93                 192                           22,282
8/13/2012    90              95                 59                            22,428
9/14/2012    100             97                 93                            21,799
10/1/2012    95              N/A                80                            21,355
10/17/2012   97              88                 270                           17,609

SDG&E

Date         Max Temp (°F)   Max RA Temp (°F)   Ex Post Load Impact (MW)   Peak Load (MW)
8/13/2012    91              88                 31                         4,266
8/17/2012    94              88                 23                         4,266
9/14/2012    109             96                 46                         4,592
9/15/2012    104             96                 32                         4,313
10/2/2012    98              96                 25                         4,146.3

146 Includes event days with the top three highest temperatures and peak loads.


Appendix B: Energy Division November 16, 2012 Letter

Provided in a separate PDF file


Appendix C: Descriptions of DR Load Impact Estimates

2012 RA

The 2012 Resource Adequacy (RA) load is a monthly forecast estimate of the load reduction attributed to individual DR programs under a 1-in-2 weather year condition. This value is utilized in load resource planning, and it is based on a year-ahead forecasted customer enrollment.

SCE's Methodology

In SCE's A.12-12-017 March 4th Response To ALJ's February 21, 2013 Ruling Requesting Applicants To Provide Additional Information, 2012 RA MW is based on SCE's ex ante load impact results under a 1-in-2 weather year condition, at the portfolio level, and average hourly impacts from 1 p.m. to 6 p.m. in May-Oct. and from 4 p.m. to 9 p.m. in Nov.-Apr.

The PTR, Residential, and Commercial Summer Discount Plan (AC Cycling) methodologies are outlined by the following steps:

1. Defining data sources
2. Estimating ex ante regressions and simulating reference loads by customer and scenario
3. Calculating percentage load impacts from ex post results
4. Applying percentage load impacts to the reference loads; and
5. Scaling the reference loads using enrollment forecasts147

SDG&E's Methodology

The 2012 Resource Adequacy MW is based on SDG&E's ex ante load impact results under a 1-in-2 weather year condition, at the portfolio level, and average hourly impacts from 1 p.m. to 6 p.m. in May-Oct. and from 4 p.m. to 9 p.m. in Nov.-Apr. The forecast is calculated in accordance with the load impact protocols.149 The forecast is calculated by multiplying (1) the historical load impact per participant as a function of weather and (2) SDG&E's forecast of the number of participants per program.

2012 SCE Resource Adequacy Protocols – Program Details148

Summer Discount Plan (AC Cycling) & Peak Time Rebate (PTR):
• Time of Day (hour)
• Day of week
• Variables for Monday and Friday
• Month
• Cooling degrees
• Heating degrees

147 Details of RA protocols obtained from SCE DRAFT 2012 Ex Post Ex Ante Load Impact for SCE's PTR, p. 16. http://www3.sce.com/law/cpucproceedings.nsf/vwOtherProceedings?Openview&Start=1&Count=25
148 Details of RA protocols obtained from SCE DRAFT 2012 Ex Post Ex Ante Load Impact for SCE's SDP. http://www3.sce.com/law/cpucproceedings.nsf/vwOtherProceedings?Openview&Start=1&Count=25
149 D.08-04-050.
150 Detailed information on RA protocols obtained from "San Diego Gas & Electric Company Response to Administrative Law Judge's Ruling Requesting Applicants to Provide Additional Information," p. 14, and communication with Kathryn Smith, SDG&E.

Load Impact Per Participant150


The first step in the process is the development of a regression model. The model used in the analysis includes the following input variables: temperature, day of week, month, and participant loads prior to the DR event (i.e., participant loads at 10 a.m.). A 1-in-2 weather year condition was used as an input variable in the regression model, and it represents the monthly peak day temperature for an average year. SDG&E utilized 2003-11 historical weather data to calculate monthly system peak temperatures. In the event that DR program enrollment, baselines, or the number of DR events changed significantly, data from prior years was utilized. Regression variable coefficients in the 2011 Ex Post model were utilized for the 2012 RA forecast model.

After the impact-per-participant regression model is developed, the model is re-run with average monthly peak temperature values. The output is the historical load impact per participant as a function of weather.

SDG&E's Forecast of the Number of Participants per Program

The forecasted number of participants per DR program is obtained by examining historical trends and program design changes.

2012 SDG&E Resource Adequacy Protocols – Program Details151

AC SAVER:
• 1-in-2 weather data for monthly system peak day
• Enrollment estimates by customer type (residential and commercial) and by cycling option (Res – 50%, 100% cycling; Com – 30%, 50% cycling).

BIP A:
• Time of Day, Day of Week, Month, Temperature (shape and trend variables (and interaction terms) designed to track variation in load across days of the week and hours of the day).
• Forecasted load in the absence of a DR event (i.e., the reference load)
• Participant's Firm Service Level
• Estimates of over- or under-performance
• TOU period variables (binary variables representing when the underlying TOU rates changed during the day and season)

CBP DA/DO:
• Simulated per-customer reference loads under 1-in-2 weather year condition and event type scenarios (e.g., typical event, or monthly system peak day)
• Estimates of reference loads and percentage load impacts, on a per-enrolled-customer basis, based on modified versions of the ex post load impact regressions.
• Estimated percentage load impacts combined with program enrollment forecasts from SDG&E to develop alternative forecasts of aggregate load impacts.
• Forecasts were developed at the program and program type (e.g., DA and DO) level.

CPP-D:
• Load impacts for existing CPP-D customers were prepared for 2010-2020 based on per-customer reference loads and load impact estimates from the ex post evaluation, and enrollment forecasts.
• The enrollment forecast for CPP-D is calculated using opt-out rates by NAICS.

CPP-E:
• Forecast is based on prior event data and accounts for temperature and customer growth.

151 Details of RA protocols obtained from "Executive Summary of the 2011 SDG&E Measurement and Evaluation Load Impact Reports," http://www.sdge.com/sites/default/files/regulatory/SDGE_PY2011_LoadImpactFiling_ExecutiveSummary%20final.pdf


PTR Com & Res:
• There are five major assumptions required to compute the expected PTR load reduction from residential customers: 1) the meter deployment rate, 2) the rebate price, 3) the participation rates, 4) the average load, and 5) the elasticity, which determines the percent impact per customer when combined with the prices.
• Average load is based upon SDG&E's load research and daily load profile data.
• Average daily energy use per hour in the peak and off-peak periods
• Elasticity of substitution between peak and off-peak energy use
• Average price during the peak and off-peak pricing periods
• Change in elasticity of substitution due to weather sensitivity
• Average cooling degrees per hour during the peak period
• Change in elasticity of substitution due to the presence of central air conditioning

2012 Adjusted RA

The DR load impact for 2012 Adjusted RA is a monthly estimate of the expected load reduction attributed to individual DR programs that accounts for current customer enrollment. This value is utilized in load resource planning.

SCE's Methodology

Adjusted RA is calculated by taking the 2012 RA value and dividing it by the 2012 RA enrollment to get the average RA load impact per customer. The average RA load impact per customer is then multiplied by the number of ex post customers that were dispatched. The adjusted RA value accounts for the difference between the number of customers forecasted for RA and the number of customers actually enrolled during the ex post events; i.e., the adjusted RA represents what RA would have been if SCE had had perfect knowledge of enrollment for 2012.
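As a worked illustration of this arithmetic (not SCE's actual tool), the short sketch below uses invented numbers; only the relationship, adjusted RA = (2012 RA / RA enrollment) x customers actually dispatched, reflects the description above.

    def adjusted_ra_mw(ra_mw, ra_enrollment, dispatched_customers):
        """Adjusted RA = average RA load impact per forecast customer x customers actually dispatched."""
        per_customer_mw = ra_mw / ra_enrollment
        return per_customer_mw * dispatched_customers

    # Invented figures: 100 MW of RA forecast over 200,000 forecast customers,
    # with 180,000 customers actually enrolled and dispatched during the ex post events.
    print(adjusted_ra_mw(100.0, 200_000, 180_000))  # 90.0 MW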

SDG&E's Methodology

The adjusted 2012 RA load forecast is obtained by multiplying the 2012 RA impact per customer by the number of currently enrolled customers. SDG&E did not adjust its 2012 RA load forecast for weather or other variables.

DR Daily Forecast and CAISO's 7-Day Report

The daily forecast is intended to provide an estimate of the expected hourly load reduction per DR program during an event period.

The CAISO's 7-Day Reports provide load reduction data that is calculated and reported to the CAISO seven days after a DR event.

SCE's Methodology

AC Cycling

SCE's daily forecast for the Summer Discount Plan is calculated using an algorithm derived from a 1985 AC cycling load reduction analysis report. The algorithm is a linear equation:

MW Reduction = [a + b x (T x k)] x t


Where:

T = Temperature forecasted for the following business day in Covina, CA

t = Air conditioner tonnage available for cycling

k = Temperature adjustment factor

a = Constant adjustment factor

b = Slope adjustment factor

When the temperature in Covina is below 70 degrees, the assumption is that no AC Cycling DR is available and thus no forecast is made. Specific values for a, b, and k are disclosed in a 1986 SCE internal memo for 4 SCE service area weather zones and for the 50% and 100% cycling strategies.152 Adjustments are made to the algorithm based on air conditioner tonnage available for cycling. This particular algorithm is only valid for event day temperatures between 90 and 116 degrees.
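A minimal sketch of how the legacy linear equation could be applied is shown below. The a, b, and k coefficients are placeholders (the actual values from the 1986 memo were not available to staff), and only the equation form, the 70-degree cutoff, and the 90-116 degree validity range are taken from the description above.

    def ac_cycling_forecast_mw(temp_f, tonnage, a, b, k):
        """MW Reduction = [a + b * (T * k)] * t, per SCE's legacy AC cycling algorithm.

        temp_f: forecast temperature for the following business day in Covina, CA (T)
        tonnage: air conditioner tonnage available for cycling (t)
        a, b, k: constant, slope, and temperature adjustment factors (zone- and
                 cycling-strategy-specific; placeholder values are used below)
        """
        if temp_f < 70:
            return 0.0  # no AC Cycling DR is assumed to be available below 70 degrees
        if not 90 <= temp_f <= 116:
            raise ValueError("the algorithm is only stated to be valid for 90-116 degree event days")
        return (a + b * (temp_f * k)) * tonnage

    # Placeholder coefficients, not SCE's actual values.
    print(ac_cycling_forecast_mw(temp_f=98, tonnage=1200, a=-0.05, b=0.001, k=1.0))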

As of this draft, the 1985 AC cycling load reduction analysis report has not been provided to ED staff. Consequently, ED staff has not been able to examine the specific slope, constant, and temperature adjustment values.

SCE used a modification of this algorithm to accommodate the hourly forecasts requested by the CAISO prior to August 28, 2012. The modified methodology calculates the program load reduction estimates using a temperature input of 100 degrees, scaled based on actual temperatures below 100 degrees. Towards the end of the summer, the legacy algorithm was built into a system where the temperatures could be applied by hour across the different zones requested by the CAISO.

SCE's 7-Day report for the Summer Discount Plan is calculated using the AC cycling load reduction algorithm with a temperature input based on actual temperatures in Covina, CA. When the temperature in Covina, CA is below 70 degrees, the assumption is that no AC Cycling DR is available. Adjustments are made based on enrollment and temperature.

SDG&E's 7-Day results reports for the AC Saver program are calculated using a one- or two-day baseline, with adjustments based on the same day or on historical days with the weather conditions most similar to the event day. The 7-Day report results provided to the CAISO are hourly, but the event day results average results from 1 p.m. to 6 p.m. for events including those hours, and average results over the event period for events not including all of the hours from 1 p.m. to 6 p.m.

Peak Time Rebate

SCE's daily forecast and 7-Day report for the Save Power Day peak time rebate program are calculated by multiplying the population of residential customers actively enrolled in Save Power Day event notification by a forecasted average load drop of 0.229 kW per participant.

SDG&E's methods for developing the daily forecast and 7-Day report for the residential peak time rebate program are the same as those described above for the AC Cycling program.

DR Contracts

SCE's daily forecast and 7-Day report for the DR Contracts program are calculated as the current month's contract capacity, with no adjustments made for enrollment, temperature, or other factors.

152 See Appendix S.


SDG&E's Methodology

Daily Forecast

The daily forecast is calculated in two steps.

The first step is the creation of a regression model that predicts the entire load of participating customers. Model input variables include temperature, day of week, and month. Temperature inputs utilized in the regression model are the monthly peak temperatures from the prior year. In some instances, the load forecast may be scaled up or down according to the number of currently enrolled participants and their impact on on-peak load. In some instances, if large customers leave a program, the load forecast regression is re-run with the participants that are still enrolled in the program.

The second step in the process is to multiply the estimated load of participating customers by a fixed percentage load reduction that is based upon ex post results from the previous year.
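The two steps could be combined as in the brief sketch below, assuming a fitted regression object that predicts the aggregate load of participating customers from event-day conditions. The interface, the toy model, and the 20 percent reduction figure are hypothetical; in practice the fixed percentage would come from the prior year's ex post results as described above.

    def sdge_daily_forecast_mw(predict_participant_load_mw, conditions,
                               ex_post_pct_reduction, enrollment_scale=1.0):
        """Step 1: predict the total load of participating customers with a regression model.
        Step 2: multiply by a fixed percentage load reduction based on prior-year ex post results."""
        reference_load_mw = predict_participant_load_mw(conditions) * enrollment_scale
        return reference_load_mw * ex_post_pct_reduction

    # Hypothetical stand-in for the fitted regression model (temperature, day of week, month).
    def toy_model(conditions):
        return 50.0 + 1.5 * max(conditions["temp_f"] - 75, 0)  # MW

    print(sdge_daily_forecast_mw(toy_model, {"temp_f": 95, "dow": "Tue", "month": 8}, 0.20))  # 16.0 MW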

CAISO's 7-Day Report

Load reductions detailed in CAISO's 7-Day Report are calculated as the difference between an estimated baseline and the measured load during DR event hours. SDG&E utilizes the 10 working days prior to an event to calculate an estimated baseline for its CPP, CBP, CPP-E, and BIP programs. For its residential programs, SDG&E utilizes 1 to 2 days to calculate its estimated baseline. The exception is that if the PTR event occurs on a Monday, then data from the prior work week (excluding event days) is used.

As of this draft, Energy Division staff has not inspected SDG&E's regression model, model inputs, or cases where comparisons and judgment were applied to scale forecasts up or down.

Ex Post Results and Settlement Data

Ex Post Results

Ex post results measure the MW delivered, using regression methods. Regression methods use an entire season's data and data across multiple events to improve the accuracy of impact estimates. They rely on historical information about customer loads and focus on understanding the relationship between load, or load impacts, during hours of interest and other predictor variables (i.e., temperature, population characteristics, resource effects, and observed loads in the hours preceding the DR event).

Whenever ex ante estimation is required, regression analysis is generally the preferred method because it can incorporate the impact of a wide variety of key drivers of DR.

DR load impact estimates are determined directly from the regression model. Decision 08-04-050 adopts protocols for estimating the impact of DR activities for resource planning.

The purpose of the ex post results is to inform DR Resource Planning and Program Design.

Settlement Data

Day matching is the primary approach used to calculate customer settlement for DR options involving large Commercial and Industrial customers. Settlements refer to the methods of paying customers for participating in DR programs, and settlement is an important component of DR program design and implementation. The need to produce estimates in a short time frame after an event for prompt payments limits the amount of data collected. Forecasting future impacts of DR events is limited because day matching does not collect data on influential variables (i.e., weather conditions, seasonal factors, customer population characteristics) that would cause impacts to vary in the future.

SCE Methodology

Load impact is calculated as the difference between the reference load (baseline) and the observed load (usage). The purpose of the settlement data is to calculate payment to customers.


Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW)153

with RA Measurement Hours (1-6 p.m.)

Month | 2012 RA (1) | 2012 Adjusted RA: Enrollment (2) | 2012 Adjusted RA: Enrollment & Weather (3) | Daily Forecast | 7-Day Report | Year End | Ex Post

Monthly Nominated Programs

Capacity Bidding Program (Day Ahead)
June | 1.19 | No Events | N/A | No Events | No Events | No Events | No Events
July | 1.24 | 0.60 | N/A | 0.07 | 0.07 | 0.08 | 0.07
August | 1.27 | No Events | N/A | No Events | No Events | No Events | No Events
September | 1.23 | No Events | N/A | No Events | No Events | No Events | No Events
October | 1.18 | 0.50 | N/A | 0.09 | 0.09 | 0.04 | 0.01

Capacity Bidding Program (Day Of)
June | 17.56 | No Events | N/A | No Events | No Events | No Events | No Events
July | 18.21 | 14.10 | N/A | 11.74 | 11.74 | 14.84 | 15.28
August | 18.63 | 12.85 | N/A | 12.30 | 12.30 | 15.38 | 16.66
September | 18.49 | 12.51 | N/A | 11.90 | 11.90 | 14.65 | 16.21
October | 17.25 | 11.47 | N/A | 11.72 | 11.72 | 15.02 | 14.78

Demand Bidding Program
June | 11.49 | No Events | N/A | No Events | No Events | No Events | No Events
July | 12.05 | 2.91 | N/A | 74.65 | 85.59 | 96.09 | 90.21
August | 12.39 | 3.02 | N/A | 88.35 | 77.14 | 76.81 | 72.43
September | 12.24 | No Events | N/A | No Events | No Events | No Events | No Events
October | 12.27 | 2.90 | N/A | 78.90 | 71.67 | 90.33 | 79.52

Demand Response Contracts (Day Ahead & Day Of)
June | 99.15 | No Events | N/A | No Events | No Events | No Events | No Events
July | 102.51 | No Events | N/A | No Events | No Events | No Events | No Events
August | 104.74 | 166.28 | N/A | 275.00 | 275.00 | 174.79 | 182.05
September | 103.56 | No Events | N/A | No Events | No Events | No Events | No Events
October | 100.22 | 139.10 | N/A | 185.00 | 185.00 | 122.11 | 114.90

Other Price Responsive Programs

Save Power Days / Peak Time Rebates
June | 207.89 | No Events | N/A | No Events | No Events | No Events | No Events
July | 256.82 | 74.02 | N/A | N/A | 58.76 | 58.76 | N/A
August | 265.60 | 120.13 | N/A | N/A | 108.02 | 108.02 | 35.56
September | 238.08 | 107.66 | N/A | 108.59 | 108.62 | 108.62 | 10.73
October | 202.43 | No Events | N/A | No Events | No Events | No Events | No Events

153 SCE 03, Table 1.


Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW) (Cont.)

with RA Measurement Hours (1-6 p.m.)

Month | 2012 RA (1) | 2012 Adjusted RA: Enrollment (2) | 2012 Adjusted RA: Enrollment & Weather (3) | Daily Forecast | 7-Day Report | Year End | Ex Post

Summer Advantage Incentive Program / Critical Peak Pricing (CPP)
June | 66.49 | 63.23 | N/A | 42.59 | 49.00 | 49.93 | 27.68
July | 69.31 | 65.64 | N/A | 52.40 | 51.72 | 61.65 | 39.95
August | 68.57 | 65.14 | N/A | 52.00 | 42.46 | 46.72 | 38.50
September | 65.08 | 61.66 | N/A | 46.76 | 40.09 | 42.69 | 35.90
October | 62.86 | No Events | N/A | No Events | No Events | No Events | No Events

Summer Discount Plan (Residential)
June | 462.15 | 168.00 | N/A | 161.51 | 137.87 | 137.87 | 69.60
July | 545.82 | 538.42 | N/A | 263.67 | 158.81 | 158.81 | 188.00
August | 500.00 | 454.95 | N/A | 227.94 | 162.46 | 150.95 | 211.90
September | 519.53 | 514.47 | N/A | 254.06 | 254.06 | 254.06 | 133.02
October | 0.00 | N/A | N/A | 292.62 | 292.62 | 292.62 | 101.95

Emergency Programs

Summer Discount Plan (Commercial)
June | 33.62 | No Events | N/A | No Events | No Events | No Events | No Events
July | 48.30 | No Events | N/A | No Events | No Events | No Events | No Events
August | 62.43 | 4.99 | N/A | 4.77 | 3.43 | 3.43 | 3.10
September | 53.70 | No Events | N/A | No Events | No Events | No Events | No Events
October | 0.00 | N/A | N/A | No Events | No Events | No Events | No Events

Agriculture Pumping Interruptible
June | 41.26 | No Events | N/A | No Events | No Events | No Events | No Events
July | 39.66 | No Events | N/A | No Events | No Events | No Events | No Events
August | 39.78 | 15.57 | N/A | 36.00 | 36.00 | 15.34 | 17.29
September | 37.71 | 23.73 | N/A | 60.56 | 60.56 | 28.39 | 24.00
October | 39.58 | No Events | N/A | No Events | No Events | No Events | No Events

Base Interruptible Program
June | 553.24 | No Events | N/A | No Events | No Events | No Events | No Events
July | 542.67 | No Events | N/A | No Events | No Events | No Events | No Events
August | 542.52 | No Events | N/A | No Events | No Events | No Events | No Events
September | 548.21 | 558.24 | N/A | 513.78 | 520.91 | 441.46 | 573.01


Appendix E: SCE 2012 DR Program Load Impact by Event (MW)
Daily Average by Event Hours

Event Date    Daily Forecast    7-Day Report    Year End    Ex Post

Monthly Nominated ProgramsCapacity Bidding Program (Day Ahead)7/23/2012 0.07 0.07 0.03 0.047/24/2012 0.07 0.07 0.08 0.097/25/2012 0.07 0.07 0.08 0.087/30/2012 0.07 0.07 0.11 0.067/31/2012 0.07 0.07 0.10 0.0710/1/2012 0.09 0.09 0.24 0.2010/2/2012 0.09 0.09 0.33 0.1010/3/2012 0.09 0.09 0.12 0.0510/5/2012 0.09 0.09 0.18 0.0710/17/2012 0.09 0.09 0.00 0.0710/18/2012 0.09 0.09 0.02 0.1710/29/2012 0.09 0.09 0.19 0.15

Capacity Bidding Program (Day Of)7/20/2012 11.74 11.74 14.84 15.288/7/2012 12.30 12.30 14.92 16.468/13/2012 12.30 12.30 15.22 15.708/14/2012 12.30 12.30 16.01 17.829/14/2012 11.90 11.90 14.65 16.2110/2/2012 11.72 11.72 14.24 15.8010/18/2012 11.72 11.72 15.80 13.76

Demand Bidding Program7/12/2012 74.65 85.59 96.09 90.218/8/2012 85.59 102.63 100.67 92.958/10/2012 85.59 94.84 98.76 95.828/14/2012 94.09 70.89 66.96 61.768/16/2012 94.35 55.50 56.16 62.708/29/2012 82.15 61.84 61.51 48.9410/1/2012 78.75 80.85 98.54 79.7810/17/2012 79.05 62.49 82.12 79.25

Demand Response Contracts (Day Ahead & Day Of)8/14/2012 275.00 275.00 174.79 182.0510/2/2012 185.00 185.00 122.11 114.90


Appendix E: SCE 2012 DR Program Load Impact by Event (MW) (Cont.)

Daily Average by Event Hours

Event Date    Daily Forecast    7-Day Report    Year End    Ex Post

Other Price Responsive ProgramsSave Power Days / Peak Time Rebates7/12/2012 N/A 58.76 58.768/10/2012 N/A 107.24 107.24 95.858/16/2012 N/A 107.61 107.61 24.438/29/2012 N/A 108.51 108.51 21.938/31/2012 N/A 108.73 108.73 0.029/7/2012 108.66 108.66 108.66 23.119/10/2012 108.52 108.57 108.57 1.65

Summer Advantage Incentive Program / Critical Peak Pricing (CPP)6/29/2012 42.59 49.00 49.93 27.687/12/2012 49.00 62.40 80.14 41.537/23/2012 55.79 41.05 43.17 38.368/7/2012 50.91 48.57 54.29 33.488/9/2012 50.91 53.07 59.96 39.148/13/2012 50.54 46.70 55.98 42.968/20/2012 53.21 44.04 44.52 45.198/27/2012 53.21 23.59 25.02 34.418/29/2012 53.21 38.79 40.55 35.859/10/2012 47.36 48.60 52.04 42.269/20/2012 47.36 26.92 30.09 27.429/28/2012 45.55 44.75 45.95 38.00

Summer Discount Plan (Residential)6/20/2012 Group 1 128.01 8.23 8.23 0.506/29/2012 Group 1 178.26 41.89 41.89 35.806/29/2012 Group 2 178.26 87.75 87.75 33.307/10/2012 Group 1 263.67 29.17 29.17 44.707/10/2012 Group 2 263.67 41.89 41.89 66.607/10/2012 Group 3 263.67 87.75 87.75 76.708/1/2012 Group 1 60.50 29.17 29.17 49.108/1/2012 Group 2 46.40 29.56 29.56 56.408/1/2012 Group 3 58.60 46.63 46.63 57.108/3/2012 Group 1 60.50 29.17 29.17 35.708/3/2012 Group 2 54.90 21.83 21.83 65.608/3/2012 Group 3 58.60 46.63 46.63 46.008/8/2012 Group 1 135.52 67.69 67.69 104.608/8/2012 Group 2 133.55 66.33 66.33 100.008/8/2012 Group 3 151.14 98.88 98.88 128.40


Appendix E: SCE 2012 DR Program Load Impact by Event (MW) (Cont.)

Daily Average by Event Hours

Event Date    Daily Forecast    7-Day Report    Year End    Ex Post

Other Price Responsive ProgramsSummer Discount Plan (Residential) (cont.)8/9/2012 Group 1 151.14 67.69 67.69 125.908/9/2012 Group 2 121.12 66.33 66.33 107.208/9/2012 Group 3 118.06 98.88 98.88 121.208/14/2012 Group 1 130.40 194.47 61.14 119.408/14/2012 Reliability 17.42 8.15 3.43 13.508/15/2012 Group 1 116.01 88.62 88.62 74.308/15/2012 Group 2 75.10 42.35 42.35 84.208/15/2012 Group 3 77.77 40.44 40.44 77.508/17/2012 Group 1 101.30 102.53 102.53 153.008/17/2012 Group 2 58.00 42.25 42.25 98.308/21/2012 Group 1 61.87 53.44 53.44 72.708/21/2012 Group 2 62.65 29.93 29.93 83.408/21/2012 Group 3 50.70 29.39 29.39 57.508/22/2012 Group 1 115.03 29.39 29.39 42.408/22/2012 Group 2 75.11 29.93 29.93 67.208/22/2012 Group 3 101.25 47.12 47.12 58.508/28/2012 Group 1 129.54 129.54 129.54 76.308/28/2012 Group 2 83.86 83.86 83.86 88.208/28/2012 Group 3 71.90 71.90 71.90 81.308/29/2012 Group 1 82.56 82.60 82.60 80.308/29/2012 Group 2 66.42 66.40 66.40 91.708/29/2012 Group 3 108.42 108.40 108.40 125.909/10/2012 Group 1 72.72 72.72 72.72 92.409/10/2012 Group 2 77.52 77.52 77.52 69.009/10/2012 Group 3 18.98 18.98 18.98 68.409/14/2012 Group 1

110.89 110.89 110.89 37.809/14/2012 Group 29/14/2012 Group 3

99.32 99.32 99.32 17.809/14/2012 Group 49/14/2012 Group 5

135.61 135.61 135.61 20.709/14/2012 Group 69/20/2012 Group 1

65.73 65.73 65.73 21.909/20/2012 Group 29/20/2012 Group 3

77.39 77.39 77.39 14.609/20/2012 Group 49/20/2012 Group 5 65.53 65.53 65.53 21.10


9/20/2012 Group 6

Appendix E: SCE 2012 DR Program Load Impact by Event (MW) (Cont.)

Daily Average by Event Hours

Event Date    Daily Forecast    7-Day Report    Year End    Ex Post

Other Price Responsive ProgramsSummer Discount Plan (Residential) (cont.)9/21/2012 Group 1 130.98 130.98 130.98 67.009/21/2012 Group 2 168.96 168.96 168.96 69.109/21/2012 Group 3 105.16 105.16 105.16 77.109/28/2012 Group 1

43.16 43.16 43.16 29.309/28/2012 Group 29/28/2012 Group 3

55.06 55.06 55.06 24.509/28/2012 Group 49/28/2012 Group 5

43.28 43.28 43.28 34.409/28/2012 Group 610/2/2012 Group 1 298.91 298.91 298.91 86.2010/2/2012 Group 2 198.32 198.32 198.32 130.9010/17/2012 Group 1 127.25 127.25 127.25 62.3010/17/2012 Group 2 146.77 146.77 146.77 72.3010/17/2012 Group 3 92.50 92.50 92.50 56.1010/18/2012 Group 1 154.37 154.37 154.37 N/A10/18/2012 Group 2 58.71 58.71 58.71 N/A10/26/2012 Group 1 38.65 38.65 38.65 N/A10/26/2012 Group 2 47.23 47.23 47.23 N/A10/26/2012 Group 3 7.77 7.77 7.77 N/A

Emergency ProgramsSummer Discount Plan (Commercial)8/14/2012 4.77 3.43 3.43 3.10

Agriculture Pumping Interruptible8/14/2012 36.00 36.00 15.34 17.299/26/2012 60.56 60.56 28.39 24.00

Base Interruptible Program9/26/2012 513.78 520.91 441.46 573.01


Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW)

with RA Measurement Hours (1-6 p.m.)

Program | Month | 2012 RA | 2012 Adjusted RA: Enrollment | 2012 Adjusted RA: Enrollment & Weather | Daily Forecast | 7-Day Report | Ex Post | Settlement

Emergency Programs
BIP A 6 10 3 N/A N/A N/A N/A N/A
BIP A 7 11 4 N/A N/A N/A N/A N/A
BIP A 8 10 3 N/A N/A N/A N/A N/A
BIP A 9 11 3 N/A 0.34 1.3 0.84 N/A
BIP A 10 10 3 N/A N/A N/A N/A N/A
Monthly Nominated
CBP DA 6 9 8 N/A N/A N/A N/A N/A

CBP DA 7 10 8 N/A N/A N/A N/A N/A

CBP DA 8 10 9 N/A 8 9 8 9

CBP DA 9 10 8 N/A 9 7 7 7

CBP DA 10 10 8 N/A 9 8 4 8

CBP DO 6 20 10 N/A N/A N/A N/A N/A

CBP DO 7 22 10 N/A N/A N/A N/A N/A

CBP DO 8 22 10 N/A 12 11 10 11

CBP DO 9 23 11 N/A 12 10 11 10

CBP DO 10 23 10 N/A 12 10 9 10

Price Responsive
ACSAVER 6 7 7 N/A N/A N/A N/A N/A

ACSAVER 7 12 12 N/A N/A N/A N/A N/A

ACSAVER 8 15 14 N/A 27 18 19 N/A

ACSAVER 9 17 18 N/A 13 12 15 N/A

ACSAVER 10 18 18 N/A 15 9 18 N/A

CPP 6 12 16 N/A N/A N/A N/A N/A
CPP 7 15 18 N/A N/A N/A N/A N/A
CPP 8 12 15 N/A 14 20 19 N/A
CPP 9 12 14 N/A 14 6 14 N/A

CPP 10 14 16 N/A 16 16 16 N/A

DBP 6 N/A N/A N/A N/A N/A N/A N/A

DBP 7 N/A N/A N/A N/A N/A N/A N/A

DBP 8 N/A N/A N/A 5 8 5 8

DBP 9 N/A N/A N/A 5 9 5 9
DBP 10 N/A N/A N/A 5 8 5 8

PTR Com 6 N/A N/A N/A N/A N/A N/A N/A

PTR Com 7 N/A N/A N/A 2 0 0 31

PTR Com 8 N/A N/A N/A 1 4 0 37

PTR Com 9 N/A N/A N/A 1 0 0 33

PTR Com 10 N/A N/A N/A N/A N/A N/A N/A

PTR Res 6 46 46 N/A N/A N/A N/A N/A

PTR Res 7 70 70 N/A 24 13 6 160

PTR Res 8 69 69 N/A 15 21 2 286

PTR Res 9 63 63 N/A 32 46 8 298

PTR Res 10 52 52 N/A N/A N/A N/A 286


Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)
Daily Average by Event Hours

Program Event Date Daily Forecast 7 Day Report Ex Post Settlement

Emergency ProgramsBIP A 9/14/2012 0.3 1.3 0.8 N/ACPPE 8/13/2012 2.3 1.5 1.2 N/ACPPE 9/14/2012 1.6 1.4 0.9 N/AMonthly NominatedCBP DA 8/9/2012 7.5 9.3 7.5 9.4

CBP DA 8/10/2012 7.5 9.5 7.6 9.5CBP DA 8/14/2012 7.5 8.3 7.5 8.5CBP DA 9/14/2012 9 5.8 5.7 5.9

CBP DA 9/17/2012 9 8 7.9 8.4

CBP DA 10/1/2012 9 7 4.1 7.3

CBP DA 10/2/2012 9 8 4.2 8.7

CBP DO 8/8/2012 11.7 11.2 11 11.5

CBP DO 8/13/2012 11.7 10.6 8.5 10.6

CBP DO 9/13/2012 12.1 10.5 10.6 10.7

CBP DO 9/14/2012 12.1 9.9 10.6 10.1

CBP DO 10/1/2012 12.1 9.5 9.2 9.5Price ResponsiveACSAVER 8/8/2012 26.3 13.7 14 N/A

ACSAVER 8/10/2012 27.2 19.8 18.5 N/A

ACSAVER 8/13/2012 33.3 18.2 21.4 N/A

ACSAVER 8/17/2012 19.3 20.6 22.7 N/A

ACSAVER 9/13/2012 16 12.8 12.6 N/A

ACSAVER 9/14/2012 15.5 21.5 22.5 N/AACSAVER 9/15/2012 8.6 3.1 8.8 N/A

ACSAVER 10/1/2012 14.5 9.2 18 N/ACPP 8/9/2012 13.5 20.9 15.9 N/A

CPP 8/11/2012 11.7 12.3 18.4 N/A

CPP 8/14/2012 14.3 27.1 25.9 N/A

CPP 8/21/2012 16.5 20 17.2 N/ACPP 8/30/2012 16.2 20.3 17.8 N/A

CPP 9/15/2012 13.7 5.5 14.5 N/ACPP 10/2/2012 16 16.1 16.5 N/A

PTR Com 7/20/2012 2 0.1 0 31.2

PTR Com 8/9/2012 1.2 0.3 0 27.4PTR Com 8/10/2012 1.1 8 0 37.5

PTR Com 8/11/2012 0.8 0 0 26.2

PTR Com 8/14/2012 1.2 4.8 0 29.8

PTR Com 8/21/2012 1.2 4.5 0 62

PTR Com 9/15/2012 0.9 0 0 32.8

PTR Res 7/20/2012 23.9 13.3 6.3 160.1PTR Res 8/9/2012 13.1 26.1 3.3 202.8PTR Res 8/10/2012 12.6 28.1 3.2 197PTR Res 8/11/2012 12.2 33.6 1.7 231.1PTR Res 8/14/2012 12.5 6.9 1.1 240PTR Res 8/21/2012 25 10 3 559PTR Res 9/15/2012 32.3 45.8 8.3 298


Appendix H: SCE 2012 DR Program Overview

Program Type ProgramSeason

AvailableAnnual

Events/Hours

AvailableMonthly

Events/Hours

AvailableWeekly

Events/Hours

AvailableDaily

Events/Hours

# of EventsTriggered/# of Hours

AvailableRemaining

AvailableTrigger Criteria

2012Trigger

Condition

AgriculturalPumpingInterruptible (API)

Day OfYear Round(excludingHolidays)

150 Hours 25 Events 4 Events 1 Event6 Hours Max

2 Events7.1 Hours 143 Hours

�• CAISO Stage 1 Alert�• CAISO Stage 2 Alert�• SCE Grid ControlCenter Discretion�• Measurement &Evaluation

�• SystemEmergency(San JoaquinValley)�• Measurement& Evaluation

Base InterruptibleProgram (BIP) Day Of

Year Round(excludingHolidays)

180 Hours 10 Events No Limit 1 Event6 Hours Max

1 Event2 Hours 178 Hours

�• CAISO Stage 1 Alert�• CAISO Stage 2 Alert�• SCE Grid ControlCenter Discretion�• Measurement &Evaluation

�• Measurement& Evaluation

Capacity BiddingProgram

DayAhead

May �– Oct(excludingHolidays)

No Limit 24 Hours Mon Fri1 Event8 Hours(11am �– 7pm)

12 EventsJuly �– 17HrsOct �– 22Hrs

May �– 24HrsJune �– 24HrsJuly �– 7 HrsAug �– 24 HrsSep �– 24 HrsOct �– 2 Hrs

�• High temperature�• Resourcelimitations�• A generating unitoutage�• Transmissionconstraints�• CAISO Alert orWarning�• SCE SystemEmergency�• Measurement &Evaluation

�• Heat Rate


Appendix H (Cont.)

SCE 2012 DR Program Overview (Cont.)

Program Type ProgramSeason

AvailableAnnual

Events/Hours

AvailableMonthly

Events/Hours

AvailableWeekly

Events/Hours

AvailableDaily

Events/Hours

# of EventsTriggered/# of Hours

AvailableRemaining

AvailableTrigger Criteria

2012Trigger

Condition

Capacity BiddingProgram Day Of

May �– Oct(excludingHolidays)

No Limit 24 Hours No Limit

1 Event4,6, or 8 houreventdurationoptions

7 EventsJuly �– 3 HrsAug �– 12 HrsSept �– 6 HrsOct �– 10 Hrs

May �– 24 HrsJun �– 24 HrsJul �– 21 HrsAug �– 12 HrsSep �– 18 HrsOct �– 14 Hrs

�• High temperature�• Resourcelimitations�• A generating unitoutage�• Transmissionconstraints�• CAISO Alert orWarning�• SCE SystemEmergency�• Measurement &Evaluation

�• Heat Rate

Demand BiddingProgram

DayAhead

Year Round(excludingHolidays)

No Limit No No LimitMon Fri

1 Event8 hours

8 Events64 Hours No Limit

�• CAISO Alert orWarning�• Day Ahead loadand/or Price Forecast�• Extreme or unusualtemperatureconditions�• SCE Procurementneeds�• Measurement &Evaluation

�• Heat Rate

DR Contracts DayAhead Varies Varies by

ContractVaries byContract

Varies byContract

Varies byContract

1 Event2 Hours

Varies byContract Varies by Contract �• Peak Load

Forecast

DR Contracts Day Of Varies Varies byContract

Varies byContract

Varies byContract

Varies byContract

2 Events5 Hours

Varies byContract Varies by Contract

�• Energy Prices�• Peak LoadForecast


Appendix H (Cont.)

SCE 2012 DR Program Overview (Cont.)

Program Type ProgramSeason

AvailableAnnual

Events/Hours

AvailableMonthly

Events/Hours

AvailableWeekly

Events/Hours

AvailableDaily

Events/Hours

# of EventsTriggered/# of Hours

AvailableRemaining

AvailableTrigger Criteria

2012Trigger

Condition

Save Power Day DayAhead

Year Round(excludingHolidays)

No Limit No Limit No Limit1 Event4 Hours(2pm �– 6pm)

7 Events28 Hours No Limit �• Temperature �•

Temperature

SummerAdvantageIncentive

Day OfJune �– Sep(excludingHolidays)

60 HoursMin: 9 EventsMax: 15Events

No Limit No Limit1 Event4 Hours(2pm �– 6pm)

12 Events48 Hours 3 Events

�• Temperature�• CAISO Alert orWarning�• SCE SystemForecast�• Extreme or unusualtemperatureconditions�• Day Ahead loadand/or PriceForecast

�• HighTemperature�• Peak LoadForecast�• Day Aheadload and/orPrice Forecast

Summer DiscountPlan Residential Day Of

Year Round(excludingHolidays)

UnlimitedEvents180 Hours

No Limit No LimitUnlimitedEvents6 Hours

23 Events24 Hours 156 Hours

�• CAISO Alert orWarning�• CAISO Discretion�• SCE Grid ControlCenter Discretion�• SCE EnergyOperations CenterDiscretion�• Measurement &Evaluation

�• CAISOEmergency�• Heat Rate�•Measurement& Evaluation

Summer DiscountPlan �– Commercial Day Of

Year Round(excludingHolidays)

Base �– 90HoursEnhanced �–Unlimited

No Limit No Limit 6 Hours 1 Event5.6 Hours No Limit

�• CAISO Stage 1 Alert�• CAISO Stage 2 Alert�• SCE Grid ControlCenter Discretion�• Measurement &Evaluation

�• CAISOEmergency


Appendix I: SDG&E DR Program Overview

Program Type Program SeasonAvailable AnnualEvents/Hours

AvailableMonthly

Events/Hours

Available WeeklyEvents/Hours

Available DailyEvents/Hours

# of EventsTriggered

AvailableRemaining

Trigger Criteria Trigger Condition

1 EventTemperature andsystem load

Always*Monday: 86 ; 3472MW

7 Hours*Tues Fri: 84 ; 3837MW

(11am 6pm)*Saturday: 86 ; 3837MW

May Oct 1 Event 7 Events Price:

Mon Fri Up to 8 HoursAug: 12 Hours(3 events)

Aug: 32 Hours *Mon Friday only

(11am 7pm)Sep: 8 Hours(2 events)

Sep: 36 Hours

*Market Price equalto or greater than15,000 btu/kWh heatrate

Oct: 8 Hours(2 events)

Oct: 36 Hours*Other Statewide orlocal systemconditions

May Oct 1 Event 5 Events Price:

Day Of Mon Fri Up to 8 HoursAug:7 Hours(2 events)

Aug: 37 Hours *Mon Friday only

(11am 7pm)Sep: 8 Hours(2 events)

Sep: 36 Hours

*Market Price equalto or greater than15,000 btu/kWh heatrate

Oct: 4 hours(1 event)

Oct: 40 Hours*Other Statewide orlocal systemconditions

1 Event 1 EventCAISO forecasts aStage 1

1 ComplianceTest

Up to 4 Hours 4 HoursCAISO declares aStage 2

2 Met triggercriteria

CAISO calls forinterruptible loadExtreme weather orsystem demands orat SDGE discretion.

116 HoursBase InterruptibileProgram (BIP)

Day Of 30minute

Year Round 120 Hours 10 Events

Capacity BiddingProgram (CBP)

No Limit 44 Hours No Limit

Mitigate potentialprice spikes andload forecastabolve 4000 MWand/or Real TimeLoad came inhigher than DayAhead forecast

Mitigate potentialprice spikes andload forecastabove 4000 MW

Critical PeakPricing Default

(CPP D)Day Ahead Year Round 18 Events No Limit No Limit 7 Events 11 Events

Met trigger criteriafor all 7 events

Capacity BiddingProgram (CBP)

Day Ahead No Limit 44 Hours No Limit


Appendix I: SDG&E DR Program Overview (Cont.)

Program Type Program SeasonAvailable AnnualEvents/Hours

AvailableMonthly

Events/Hours

Available WeeklyEvents/Hours

Available DailyEvents/Hours

# of EventsTriggered

AvailableRemaining

Trigger Criteria Trigger Condition

May Oct 15 Events 1 Event 8 EventsTemperature andsystem load

Holidays Excluded or Noon to 8 pmAug: 15 Hours(4 events)

Aug: 25 Hours*Monday Friday:3800 MW

120 HoursMin 2/Max 4

HoursSep: 10 Hours(3 events)

Sep: 30 Hours*Saturday SundayOptionalParticipation

Oct: 4 Hours(1 events)

Oct: 36 Hours *CAISO Stage 1 or 2

Annual 91 Hours*Local or systememergency

1 EventTemperature andsystem load

Always*Monday: 86 ; 3472MW

7 Hours*Tues Fri: 84 ; 3837MW

(11am 6pm)*Saturday: 86 ; 3837MW

Day Of 2 EventsAug:1 Event(5 Hours)

Terminates Dec31

30 minuteSep:1 Event(4 Hours)

Jul Dec 3 EventsCAISO 1,2,or 3Emergency

2012 only 14 Hours

Transmission orimminent systememergency or aswarranted by theutility

08/10/1308/14/13

Conditionswarranted byUtility

Flex Alerts inEffect

71 Hours

Local utilityemergency with

intent to avoid anyfirm load curtailment

CAISO calls for

Conditionswarranted byUtility

Demand Bidding Day Ahead No Limit No Limit No Limit No Limit N/A

Critical PeakPricing Emergency

(CPP E) Year Round 80 Hours 40 Hours 4 Events 1 Event

Mitigate potentialprice spikes andload forecastabolve 4000 MWand/or Real TimeLoad came inhigher than DayAhead forecast

Reduce Your Use Day Ahead Year Round No Limit No Limit No Limit 7 Events No LimitMet trigger criteriafor all 7 events

Summer Saver Day Of 40 Hours 3 Events


Appendix J: SCE Historical DR Event Hours

DR Programs Event LimitsMaxEvent

Duration2012

20062011

Average

20062011Max

2011 2010 2009 2008 2007 2006

Monthly NominatedCapacity Bidding Program Day Ahead (1 4) 24 Hrs./Mo 4 hrs. 39 53 72 48 47 72 47 53Capacity Bidding Program Day Ahead (2 6) 24 Hrs./Mo 6 hrs. 0 51 71 23 49 71 53 58Capacity Bidding Program Day Ahead (4 8) 24 Hrs./Mo 8 hrs. 0 14 42 0 0 28 42 0Capacity Bidding Program Day Of (1 4) 24 Hrs./Mo 4 hrs. 23 18 40 8 31 8 3 40Capacity Bidding Program Day Of (2 6) 24 Hrs./Mo 6 hrs. 33 12 40 8 40 8 3 0Capacity Bidding Program Day Of (4 8) 24 Hrs./Mo 8 hrs. 0 11 49 0 0 8 0 49Demand Bidding Program Unlimited 8 hrs. 64 106 172 40 72 116 101 172 136Demand Response Contracts Day Ahead Various 4 hrs. 2 19 71 8 8 6 71 0Demand Response Contracts Day Of Various 4 hrs. 12 11 16 14 16 6 7 14Other Price ResponsiveSave Power Days / Peak Time Rebates Unlimited 4 hrs. 28Summer Advantage Incentive / Critical PeakPricing (CPP) 15 Events/Yr. 4 hrs. 48 57 70 48 48 70 60

Summer Discount Plan Residential &Commercial Base

15 Events/Summer Season 6 hrs./day 15 38 0 22 5 0 38 24

Summer Discount Plan �– Residential &Commercial Enhanced

UnlimitedEvents/ Summer

Season6 hrs./day 15 39 0 22 9 0 39 18

Summer Discount Plan Commercial �– Base 15 Events/Summer Season 6 hrs./day

6Summer Discount Plan CommercialEnhanced

UnlimitedEvents/ Summer

Season6 hrs./day

6Summer Discount Plan �– Residential 180 Hours/Yr. 6 hrs./day 24EmergencyAgricultural Pumping Interruptible (API) 1/Day

4/Wk.25/Mo.

6 hrs./Day40 hrs./Mo150 hrs./Yr.

7 1 2 1 2 0 1 0 0

Base Interruptible Program (BIP) 1/Day10/Mo.

6 hrs./Day180 hrs./Yr. 2 1 3 2 0 2 0 0 3


Appendix K: SCE Historical Number of DR Events

DR Program | Event Limits | 2012 | 2006-2011 Avg. | 2006-2011 Max | 2011 | 2010 | 2009 | 2008 | 2007 | 2006

Monthly Nominated Programs
Capacity Bidding Program Day Ahead (1-4) | 24 Hrs./Mo. | 12 | 20 | 26 | 19 | 18 | 26 | 20 | 15 | -
Capacity Bidding Program Day Ahead (2-6) | 24 Hrs./Mo. | 0 | 16 | 22 | 10 | 16 | 22 | 19 | 13 | -
Capacity Bidding Program Day Ahead (4-8) | 24 Hrs./Mo. | 0 | 3 | 11 | 0 | 0 | 6 | 11 | 0 | -
Capacity Bidding Program Day Of (1-4) | 24 Hrs./Mo. | 7 | 5 | 11 | 3 | 9 | 2 | 2 | 11 | -
Capacity Bidding Program Day Of (2-6) | 24 Hrs./Mo. | 7 | 3 | 8 | 2 | 8 | 2 | 2 | 0 | -
Capacity Bidding Program Day Of (4-8) | 24 Hrs./Mo. | 0 | 2 | 9 | 0 | 0 | 2 | 0 | 9 | -
Demand Bidding Program | Unlimited | 8 | 14 | 22 | 5 | 9 | 15 | 15 | 22 | 17
Demand Response Contracts Day Ahead | Various | 1 | 5 | 18 | 2 | 2 | 1 | 18 | 0 | -
Demand Response Contracts Day Of | Various | 2 | 3 | 5 | 5 | 2 | 1 | 3 | 3 | -

Other Price Responsive
Save Power Days / Peak Time Rebates | Unlimited | 7 | - | - | - | - | - | - | - | -
Summer Advantage Incentive / Critical Peak Pricing (CPP) | 15 Events/Yr. | 12 | 12 | 12 | 12 | 12 | 12 | 12 | - | -
Summer Discount Plan Residential & Commercial - Base | 15 Events/Summer Season | - | 5 | 11 | 11 | 6 | 3 | 0 | 5 | 2
Summer Discount Plan Residential & Commercial - Enhanced | Unlimited Events/Summer Season | - | 8 | 22 | 10 | 22 | 5 | 0 | 6 | 2
Summer Discount Plan Commercial - Base | 15 Events/Summer Season | 1 | - | - | - | - | - | - | - | -
Summer Discount Plan Commercial - Enhanced | Unlimited Events/Summer Season | 1 | - | - | - | - | - | - | - | -
Summer Discount Plan - Residential | 180 Hours/Yr. | 23 | - | - | - | - | - | - | - | -

Emergency Programs
Agricultural Pumping Interruptible (API) | 1/Day, 4/Wk., 25/Mo. | 2 | 1 | 2 | 1 | 2 | 1 | 1 | 0 | 0
Base Interruptible Program (BIP) | 1/Day, 10/Mo. | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1


Appendix L: Summary of SCE's Reasons for the 2012 DR Triggers

DR Program Category | Programs | Reasons
Monthly Nominated | Capacity Bidding Program | No nomination or trigger conditions
Monthly Nominated | Demand Bidding Program | Trigger conditions plus SCE's discretion to optimize performance and minimize participant fatigue
Monthly Nominated | DR Contracts | Trigger conditions
Price Responsive | Save Power Day (PTR) | SCE discretion to optimize performance and minimize participant fatigue
Price Responsive | Summer Advantage Incentive (CPP) | Optimal dispatch
Price Responsive | Summer Discount Plan (SDP) - Res. | Transitioned to price trigger starting June 2012; remaining hours reserved for contingencies
Emergency | Agricultural Interruptible Program | Local transmission contingency
Emergency | Base Interruptible Program | No emergency; test event only


Appendix M: SDG&E Historical DR Event Hours154

DR Program | Event Limits | 2012 | 2006-2011 Avg. | 2006-2011 Max | 2011 | 2010 | 2009 | 2008 | 2007 | 2006

Monthly Nominated Programs
Capacity Bidding Program Day Ahead | 24 Hrs./Mo. | 24 | 19 | 38 | 19 | 28 | 24 | 4 | 38 | 0
Capacity Bidding Program Day Of | 24 Hrs./Mo. | 20 | 28 | 50 | 28 | 50 | 37 | 6 | 45 | 0

Price Responsive Programs
Peak Time Rebate | Unlimited | 49 | 32 | 32 | 32 | - | - | - | - | -
Critical Peak Pricing - Default | 98 Hrs. ('06-'07); 126 Hrs. ('08-'12) | 49 | 39 | 70 | 14 | 28 | 56 | 0 | 63 | 70
Demand Bidding Program | Unlimited | 14 | 29 | 41 | 41 | 16 | - | - | - | -
Summer Saver | 120 Hrs./Yr. | 30 | 29 | 44 | 22 | 44 | 30 | 8 | 43 | 24

Emergency Programs
Base Interruptible Program (BIP) | 120 Hrs./Yr. | 4 | 2 | 4 | 4 | 4 | 0 | 0 | 4 | 2
Critical Peak Pricing - Emergency | 80 Hrs./Yr. | 9 | 4 | 14 | 0 | 0 | 0 | 0 | 14 | 7

154 Source for the 2006-2012 data: SGE-02, Attachment 1, Revised Appendix X, Tables 8-11.


Appendix N: SDG&E Historical Number of DR Events155

DR Program | Event Limits | 2012 | 2006-2011 Avg. | 2006-2011 Max | 2011 | 2010 | 2009 | 2008 | 2007 | 2006

Monthly Nominated Programs
Capacity Bidding Program Day Ahead | Unlimited | 7 | 5 | 8 | 5 | 7 | 6 | 1 | 8 | 0
Capacity Bidding Program Day Of | Unlimited | 5 | 7 | 12 | 7 | 12 | 7 | 1 | 12 | 0

Price Responsive Programs
Peak Time Rebate | Unlimited | 7 | 5 | 5 | 5 | - | - | - | - | -
Critical Peak Pricing - Default | 12 ('06-'07); 18 ('08-'12) | 7 | 6 | 10 | 2 | 4 | 8 | 0 | 9 | 10
Demand Bidding Program | Unlimited | 3 | 7 | 9 | 9 | 4 | - | - | - | -
Summer Saver | 15/Yr. | 8 | 8 | 12 | 6 | 11 | 7 | 2 | 12 | 8

Emergency Programs
Base Interruptible Program (BIP) | 10/Mo. | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1
Critical Peak Pricing - Emergency | Unlimited | 2 | 1 | 3 | 0 | 0 | 0 | 0 | 3 | 2

155 Source for the 2006-2012 data: SGE-02, Attachment 1, Revised Appendix X, Tables 8-11.


Appendix O: Utilities' Peaker Plant Total Permissible vs. Actual Service Hours

SCE-Owned Peaker Plants (Within SONGS-Affected Areas)

Plant | Center | Barre | Grapeland | Mira Loma
Permissible Service Hours | 1096 | 955 | 1073 | 700
Actual Service Hours:
Sept.-Dec. 2007 | 93 | 123 | 87 | 104
Jan.-Dec. 2008 | 120 | 118 | 125 | 119
Jan.-Dec. 2009 | 93 | 83 | 46 | 70
Jan.-Dec. 2010 | 156 | 174 | 137 | 148
Jan.-Dec. 2011 | 163 | 149 | 85 | 127
2007-2011 Average | 125 | 129 | 96 | 114
% of Permitted | 11% | 14% | 9% | 16%
Jan.-Oct. 2012 | 459 | 465 | 403 | 413
% of Permitted | 42% | 49% | 38% | 59%
% of 2007-2011 Avg. | 367% | 359% | 420% | 364%

SDG&E-Owned Peaker Plants

Plant | Cuyamaca | El Cajon Energy Center | Miramar | Orange Grove
Permissible Service Hours | N/A | 2500 | 5000 | 6400
Actual Service Hours:
2006 | - | - | 200 | -
2007 | - | - | 250 | -
2008 | 373 | - | 671 | -
2009 | 625 | - | 1919 | -
2010 | 481 | 439 | 2946 | -
2011 | 667 | 433 | 4306 | -
Historical Average | 537 | 436 | 1715 | -
% of Permitted | N/A | 17% | 34% | N/A
2012 | 1621 | 974 | 4805 | 2148
% of Permitted | N/A | 39% | 96% | 34%
% of Historical Avg. | 302% | 223% | 280% | N/A
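
The derived rows in this appendix follow from simple arithmetic on each plant column. A minimal sketch in Python, using the Center plant column from the SCE table above:

    # Center peaker plant figures from the SCE table above.
    permissible_hours = 1096
    actual_hours_2007_2011 = {2007: 93, 2008: 120, 2009: 93, 2010: 156, 2011: 163}
    actual_hours_2012 = 459  # Jan.-Oct. 2012

    # Average the historical years, then express averages and 2012 usage as percentages.
    avg_2007_2011 = sum(actual_hours_2007_2011.values()) / len(actual_hours_2007_2011)

    print(round(avg_2007_2011))                            # 125  (2007-2011 Average)
    print(f"{avg_2007_2011 / permissible_hours:.0%}")      # 11%  (% of Permitted)
    print(f"{actual_hours_2012 / permissible_hours:.0%}")  # 42%  (% of Permitted, 2012)
    print(f"{actual_hours_2012 / avg_2007_2011:.0%}")      # 367% (% of 2007-2011 Avg.)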


Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days

Programs | Ex Post (MW), 3:00-4:00 p.m. | Ex Post (MW), 4:00-5:00 p.m.

8/10/12:
SCE
Demand Bidding Program | 107 | 107
Save Power Day/Peak Time Rebate | 87 | 78
Subtotal | 194 | 185
SDG&E
Capacity Bidding Program | 8 | 8
Summer Saver/AC Cycling (Res. & Com.) | - | 19
Subtotal | 8 | 27
PG&E
Capacity Bidding Program | 41 | 41
Aggregator Managed Program | 174 | 172
Peak Day Pricing/Critical Peak Pricing | 22 | 24
Peak Choice | 3 | 2
SmartAC | - | 65
Base Interruptible Program | 220 | 222
Subtotal | 459 | 527
TOTAL | 661 | 739

8/14/12:
SCE
Capacity Bidding Program | - | -
Demand Bidding Program | 72 | 71
Demand Response Contract | 184 | 180
Summer Discount Plan/AC Cycling (Res. & Com.) | 137 | 22
Agricultural Pumping Interruptible | - | 14
Subtotal | 394 | 242
SDG&E
Capacity Bidding Program | 8 | 8
Critical Peak Pricing | 24 | 25
Peak Time Rebate | 1 | 0
Demand Bidding Program | 6 | 5
Subtotal | 38 | 38
PG&E | No Events
TOTAL | 432 | 280
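
The Subtotal and TOTAL rows are sums of the program-level ex post MW in each hour. A minimal illustration in Python, using the SCE figures for 8/10/12 from the table above:

    # SCE program-level ex post MW for 8/10/12, keyed by hourly interval.
    sce_ex_post = {
        "Demand Bidding Program":          {"3-4 p.m.": 107, "4-5 p.m.": 107},
        "Save Power Day/Peak Time Rebate": {"3-4 p.m.": 87,  "4-5 p.m.": 78},
    }

    # Sum the programs within each hour to reproduce the utility subtotal row.
    subtotal = {
        hour: sum(program[hour] for program in sce_ex_post.values())
        for hour in ("3-4 p.m.", "4-5 p.m.")
    }
    print(subtotal)  # {'3-4 p.m.': 194, '4-5 p.m.': 185}, matching the SCE Subtotal row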


Appendix Q: CAISO Energy Price Spikes

SCE Price Spikes156

156 Source: SCE-03, SCE's Response to ALJ February 21, 2013 Ruling, Appendix B (Excel Data Tables in Response, Table 9).


SDG&E Price Spikes157

157 Source: SGE-02, SDG&E's Response to the ALJ February 4, 2013 Scoping Memo, Attachment 3.


Appendix R: Utilities' Demand Response Reporting Requirements158

(2013-2014)

1. DR Weekly Forecast

The utilities should continue to submit a 7-day (Monday to Sunday)159 DR forecast (MW) to the CAISO/CPUC_ED/CEC by noon every Monday, and should highlight the DR programs that they anticipate triggering.

Daily Value

For the DR programs that have different hourly forecasts, the utilities use slightly different methods to determine the daily value, as described below. If an averaging method is used, the daily value may be higher or lower than the MW in a given hour, such as the peak hours in the CAISO's demand forecast. Energy Division staff uses an averaging method over the actual event hours for its reports on historical DR events (a numerical sketch follows the table below).

Utility Methods for the Daily Value

Utility | Method for the Daily Value
SCE | Average over the available event hours in the tariffs, which vary from program to program.
SDG&E | Average over the program period: Day Ahead, 11 a.m.-6 p.m.; Day Of, 1 p.m.-6 p.m. (like RA).
PG&E | Average over the program period: BIP, 1:00-6:00 p.m.; PDP, 2:00-6:00 p.m. (no significant enrollment/load 12-2 p.m.); SmartRate, 2:00-6:00 p.m.; SmartAC, 1:00-6:00 p.m. For AMP, CBP, DBP, and PeakChoice, the hourly forecast does not vary; therefore, PG&E will continue to submit the same hourly forecast amount for the given month.

2. Daily DR Reporting to the CAISO (by 8 a.m., weekdays and weekends)

For the non-summer months (January 1 to April 30 and November 1 to December 31), the utilities should submit their Daily DR Reporting to the CAISO/CPUC_ED/CEC only when they intend to trigger a DR program for that day. In the submission email, please identify the triggered DR program(s). If there is no DR event, the utilities do not need to submit this report.

For the summer months (May 1 to October 31), the utilities should submit their Daily DR Reporting to the CAISO/CPUC_ED/CEC on a daily basis, as they did in 2012.

This report is based on a common template developed by the CAISO and is submitted as an Excel spreadsheet. In this report, the utilities provide the scheduled (as of 8 a.m.) and available MW for the current day and the next day for all of their event-based DR programs (both Day Ahead and Day Of) on an aggregated basis. SCE has also added the MW for each DR program; SDG&E added the MW only for the DR program(s) triggered for the day or the next day.
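
The CAISO template itself is not reproduced in this report. The sketch below only illustrates, with assumed field names and values, the kind of information the Daily DR Report carries: scheduled (as of 8 a.m.) and available MW for the current day and the next day, aggregated across event-based programs, with an optional per-program breakdown as described above.

    # Illustrative structure only; not the actual CAISO template.
    # All field names, dates, and MW values below are assumptions for illustration.
    daily_dr_report = {
        "report_date": "2013-07-15",
        "aggregate": {
            "today":    {"scheduled_mw_8am": 120, "available_mw": 450},
            "next_day": {"scheduled_mw_8am": 0,   "available_mw": 460},
        },
        # Per-program detail: SCE adds all programs; SDG&E adds only triggered programs.
        "by_program": {
            "Capacity Bidding Program": {
                "today": {"scheduled_mw_8am": 40, "available_mw": 90},
            },
        },
    }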

158 These requirements apply to SCE and SDG&E; they are staff guidance only for PG&E because PG&E is subject to this proceeding (A.12-12-016 et al.). However, staff includes the reporting requirements for PG&E as guidance consistent with what is required of SCE and SDG&E.

159 This changes SCE's and PG&E's current forecast period from Tuesday-Monday to Monday-Sunday.

Page 116: StaffReport_2012DRLessonsLearned

112

3. Updated Reporting to the CAISO/CPUC_ED/CEC (by COB on weekdays for DR events called after 8 a.m.)

PG&E:

PG&E continues to send the DR forecast for all triggered Day-Ahead and Day-Of events to the CAISO and CPUC throughout the day, in an Excel spreadsheet, as it did prior to summer 2012. These reports provide the forecasted MW for each DR program.

SCE:

SCE sends a revised Daily DR Report at the end of the event day to include the Day-Of events called after 8 a.m. and the forecasted MW by program. In the submission email, please identify the triggered DR program(s).

SDG&E:

SDG&E also sends a revised Daily DR Report at the end of the event day to include all DR events called after 8 a.m. and the forecasted MW by program.

4. Reports on DR Results to the CAISO/CPUC_ED/CEC (Seven Days After the Events)

All three utilities should continue to provide the DR results in an Excel spreadsheet seven days after each DR event (CAISO 7-Day Report). The 7-Day Report should also include the DR results to date for each year.160

The utilities should submit the Weekly DR Forecasts (No. 1) to the following email addresses:

Entity | Individual | Email Address
CAISO | John Goodin | [email protected]
CPUC | Bruce Kaneshiro | [email protected]
CPUC | Scarlett Liang-Uejio | [email protected]
CPUC | Dorris Chow | [email protected]
CPUC | Paula Gruendling | [email protected]
CEC | Margaret Sheridan | [email protected]

The utilities should submit the Daily DR Reports, revisions, and Results (No. 2 through No. 4) to the following email addresses:

Entity | Individual | Email Address
CAISO | Shift Supervisors | [email protected]
CAISO | Market Operations | [email protected]
CAISO | John Goodin | [email protected]
CAISO | Glen Perez | [email protected]
CAISO Market Monitoring | Keith Collins | [email protected]
CPUC | Scarlett Liang-Uejio | [email protected]
CPUC | Bruce Kaneshiro | [email protected]
CPUC | Dorris Chow | [email protected]
CPUC | Paula Gruendling | [email protected]
CEC | Margaret Sheridan | [email protected]

160 See SCE's 2012 7-Day Reports as an example.


Appendix S: Additional Information

Provided in separate PDF files