4. Discussion

Many of the comments appear to arise from parties' expectations that the EM&V protocols for 2006 and beyond would resemble the framework, and contain the level of specificity, of the pre-1998 EM&V protocols. Indeed, in D.05-04-051, the Commission expressed similar expectations when it described what the protocol document should include. In particular, the pre-1998 protocols contained specific tabular cross-walks between the type of program (or defined program groupings), the type and frequency of studies required, the specific set of "how to" protocols that would apply to each type of study, and the dates when the EM&V reports would be presented for public review. They also specified at the outset which measures would and would not require load impact or persistence studies for the purpose of truing up the performance basis, and established a technical committee to consider waivers to specific protocols (e.g., minimum sample size) on a case-by-case basis. The pre-1998 protocol document also set out a schedule for the filing of energy efficiency EM&V studies and established a forum for dispute resolution, the Annual Earnings Assessment Proceeding ("AEAP").

Because of the deterministic nature of these pre-1998 EM&V protocols, once the energy efficiency program plans were adopted in each program cycle, the EM&V study plans/budgets and resulting RFPs were essentially defined by the protocols. For example, per the protocol tables, all lighting programs were subject to a first-year load impact study every other program year, and retention/persistence studies were required in the fourth and ninth years after installation. The protocol tables also laid out the evaluator "how to" protocols for all load impact studies (including sample design and billing data protocols), with accompanying tables that specified the acceptable modeling approach, the basis for establishing hours of operation, approaches for adjusting for weather or for the effects of existing state or federal efficiency standards, and other study parameters. Therefore, once the utility program plans were adopted, the evaluation contract managers would use these tables to scope the work for their EM&V contractors and develop budgets without further debate over which studies would be undertaken for which measures, at what frequency, and under which evaluation protocols.

The protocol documents developed by Joint Staff and the TecMarket Works Team reflect a different approach to the protocols, based on their assessment of what makes the most sense for the size and nature of the post-2005 portfolio plans. During the pre-1998 era, most of the programs focused on providing financial incentives directly to participating customers. In contrast, the post-2005 portfolio plans contain a much wider variety of market strategies, including incentives to upstream market actors (e.g., retailers and wholesalers of energy-efficient equipment) and statewide marketing and outreach activities, among others. The proposed protocols recognize that the broader efforts that portfolio administrators will be undertaking in 2006 and beyond to capture energy efficiency potential do not lend themselves to the "one size fits all" deterministic EM&V protocols of the past. They also recognize that there may be ways to maximize the efficiency of evaluation efforts by aggregating programs at the technology level, where appropriate, rather than conducting individual studies for each program, as generally required under the pre-1998 protocols.

For the future, Joint Staff and the TecMarket Works Team have developed EM&V protocols that outline a process for setting evaluation priorities and budgets, and present a decision-tree approach to determining the applicable "how to" evaluator protocols once those priorities are established. More specifically, for impact studies, Joint Staff plans to review the administrators' portfolios and programs in order to establish evaluation groupings, and then decide which programs (or program components) will receive verification-only analysis, direct impact evaluation, or indirect impact evaluation.4 Joint Staff will then conduct a risk analysis in order to assign minimum rigor level requirements (with associated evaluator "how to" protocols), along with evaluation budgets, across the program evaluation groupings. The resulting evaluation plans and budget levels will then be used to develop the RFPs for evaluation activities during the 2006-2008 program cycle.
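For illustration only, the sketch below (in Python) shows the general shape of the two-stage decision logic described above: a first branch sorts each evaluation grouping into one of the three study categories defined in footnote 4, and a second branch maps a risk score to a minimum rigor level. The branching criteria, risk scores, and rigor cutoffs shown here are hypothetical placeholders, not criteria drawn from the protocols; the actual assignments will be established through Joint Staff's risk analysis.

    # Hypothetical sketch of the decision-tree logic; not drawn from the protocols.
    from dataclasses import dataclass

    @dataclass
    class EvaluationGroup:
        name: str
        claims_resource_savings: bool      # does the grouping claim energy/demand savings? (assumed criterion)
        savings_directly_measurable: bool  # can savings be estimated from installed measures?
        risk_score: float                  # hypothetical output of the risk analysis, 0 to 1

    def study_category(group: EvaluationGroup) -> str:
        # First branch: which of the three study categories (see footnote 4) applies?
        if not group.claims_resource_savings:
            return "verification-only"     # verify participation and program costs
        if group.savings_directly_measurable:
            return "direct impact"         # estimate savings from installed measures
        return "indirect impact"           # e.g., statewide marketing and outreach

    def minimum_rigor(group: EvaluationGroup) -> str:
        # Second branch: the risk analysis assigns a minimum rigor level (cutoffs invented here).
        if group.risk_score >= 0.7:
            return "enhanced rigor"
        if group.risk_score >= 0.3:
            return "standard rigor"
        return "basic rigor"

    # Example: a high-risk grouping whose savings can be estimated directly.
    lighting = EvaluationGroup("upstream lighting", True, True, 0.8)
    print(study_category(lighting), "/", minimum_rigor(lighting))
    # -> direct impact / enhanced rigor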

Given this EM&V protocol framework, I requested that Joint Staff further clarify the Process and Review Protocols in response to comments. Accordingly, Joint Staff augmented its draft protocol documents to include a process and schedule for obtaining public input on the various steps still needed to establish the EM&V study plans for 2006-2008. Joint Staff also added to the protocol documents a description of the process for the development and review of impact and market effects studies, including the study team approach and opportunity for public comment that was discussed in D.05-01-055.5 In addition, Joint Staff clarified the EM&V cycle by developing a document that identified when various parameters used to calculate the performance basis (measure installations, program costs, unit energy savings, etc.) would be verified and published in reports over the three-year program cycle, including a discussion of how each performance parameter would be updated.

Joint Staff and the TecMarket Works Team also responded to parties' comments on the Evaluator Protocols by eliminating superfluous text and references, adding concise summaries of the required protocols at the end of each chapter, and making other improvements. Additional suggestions for improvement were discussed during the workshop and in the post-workshop comments.6 Joint Staff and the TecMarket Works Team are considering those suggestions and incorporating additional revisions into this set of protocols in response. These revisions to the evaluator "how to" protocols are expected to be completed, and a ruling adopting the revised sections issued, in February 2006. Also in February, Joint Staff and the TecMarket Works Team will distribute to parties the remaining evaluation protocols, namely the Emerging Technology Evaluation Protocol, the Codes and Standards Evaluation Protocol, and the Effective Useful Life (persistence and technical degradation) Evaluation Protocol, and will collect parties' comments. Joint Staff will hold a public workshop on these remaining evaluation protocols in February, and will then consider and incorporate revisions based on the feedback received in written comments and at the workshop. This final set of protocols is expected to be adopted via Administrative Law Judge's Ruling in March 2006.

I attended both days of the workshop and led the discussion on the dispute resolution process. At the end of the workshop, Joint Staff and I summarized the additional revisions/clarifications that would be made to the Process Protocols in response to the workshop discussion. Those changes have been made to the following documents, which are attached to this ruling:

a) The Performance Basis Protocol, which identifies when Joint Staff and its consultants plan to verify various components (e.g., measure installations, program costs, unit energy savings) used to calculate the performance basis for each portfolio administrator for the 2006-2008 planning cycle. (Attachment 2.)

b) The Public Process Protocol for the risk analysis, priority assessment and study scoping that Joint Staff will be undertaking in the coming weeks for impact evaluation studies. (Attachment 3.)

c) The Study Review Protocol, which describes the process Joint Staff will use to develop and review impact and market effects studies after a contractor has been selected to conduct a specific set of evaluations. This protocol also identifies the Annual Earnings Assessment Proceeding as the forum for dispute resolution. (Attachment 4.)

During the workshop, we also summarized the protocols that remain to be developed by Joint Staff and/or its consultants and that will be presented for discussion at a subsequent workshop. The protocols still to be reviewed at a later workshop are indicated in italics in Attachment 1. We will move ahead with finalizing all of the EM&V protocols as early in 2006 as possible.

In particular, Joint Staff has informed me that the "Evaluation and Program Planning Cycle Integration" document identified in Attachment 1, which will consist of a side-by-side listing of the activities and timelines associated with the energy efficiency program planning cycle and the resource planning proceedings, will soon be posted to the Commission's website.7 This document is intended to alert all parties and evaluation contractors to the possible timeframe for hand-off of EM&V results to other proceedings, and Joint Staff will refine it over time as the timelines for each proceeding are further developed and/or revised.

IT IS RULED that the Process Protocols presented in Attachments 2, 3, and 4 of this ruling are adopted.

Dated January 11, 2006, at San Francisco, California.

CERTIFICATE OF SERVICE

I certify that I have by mail this day served a true copy of the original attached Administrative Law Judge's Ruling Adopting Protocols for Process and Review of Post-2005 Evaluation, Measurement and Verification Activities on all parties of record in this proceeding or their attorneys of record.

Dated January 11, 2006, at San Francisco, California.

NOTICE

Parties should notify the Process Office, Public Utilities Commission, 505 Van Ness Avenue, Room 2000, San Francisco, CA 94102, of any change of address to ensure that they continue to receive documents. You must indicate the proceeding number on the service list on which your name appears.

Gottstein Ruling Attachment 1

Gottstein Ruling Attachment 2

Gottstein Ruling Attachment 3

Gottstein Ruling Attachment 4

4 "Verification-only" analysis refers to the verification of program participation (types and numbers of measures installed) and program costs. "Direct impact" evaluation refers to the estimation of the savings from installed measures. The term "indirect impact evaluation" refers to those program-specific evaluations designed to measure the specific program goals that create an impact that is expected to eventually lead to energy and/or demand savings, but where these savings cannot be directly estimated. Statewide marketing and outreach, for example, is the type of program that would need to be evaluated using indirect impact evaluation.

5 D.05-01-055, pp. 108-111.

6 At the end of the discussion on Evaluator Protocols, workshop participants were provided the opportunity to present to Joint Staff and the TecMarket Works Team (by December 21, 2005) any further specific recommendations to modify/clarify the text in the "how to" protocol document. PG&E, SCE, and ORA provided post-workshop comments to Joint Staff.

7 EM&V-related documents in this proceeding are posted to the Commission's website at http://www.cpuc.ca.gov/eerulemaking.
