Workshop #3 focused specifically on the issue of performance basis for energy efficiency programs that do not directly procure energy resources, i.e., "non-resource" programs. More specifically, these programs work toward the goal of increasing the efficiency of energy use through energy information, marketing and outreach, education and training, and other approaches that do not directly involve or result in the installation of energy efficient equipment or measures at customer premises. As discussed at the workshop and in written comments, the performance basis must reflect the goal(s) of the particular information, marketing or outreach program. Workshop participants and Energy Division reached consensus on how to measure the performance basis for these types of programs, as follows:46
· Audits and Targeted Information Programs to Customers: The performance basis should measure net benefits based on program participants: a) being moved to take action through a resource program; b) taking an action themselves based on the audit/targeted education program; or c) doing both of the above.
· Codes and Standards Advocacy and Industry Standards Programs: The performance basis should be based on: a) the predicted savings in case study analyses or American Society for Testing and Materials (ASTM) standards (for programs developing standards) that are presented to decision makers; and b) how much of the recommended case study/ASTM savings are implemented in the adopted code or standard.
· Education/Training Programs: For schools, universities and other training programs, the performance basis should be based on: a) the attitude, awareness and knowledge of students; and b) reasonable impacts on energy savings, or the intention to act, based on students' actions.
· Advertising and Marketing: The performance basis should be based on: a) any direct energy savings impacts attributable to the activity; b) the intention to act, if no direct impacts are possible to measure; and c) the reach of the advertising/marketing activity, the frequency of the activity and the leveraging of ancillary resources that comes from the activity.
In addition, workshop participants agreed that a separate performance basis for telephone centers and websites should not be developed. Rather, these program activities should be considered as part of the administrative costs of the programs they support. They also reached consensus that the term "market transformation" should be dropped for the purpose of establishing performance basis, since the activities and program efforts that have been included under this term are now covered under resource programs and other program categories.
We adopt these consensus positions, with the expectation that Energy Division, with input from the public and after obtaining the necessary technical expertise (see Section 5 below), will further develop each performance basis to more specifically identify the outputs to be measured and the evaluation methodologies.
In their comments on the draft decision, NRDC, CCSF and others argue that there should not be a clear distinction between "resource programs" and "non-resource programs," because some of the program activities discussed above may actually lead a customer to a program that directly produces verified energy savings. For example, NRDC points out that there may be audit programs that include a direct install component. We agree that such a program has both "resource" and "non-resource" elements, and that the direct install component should be considered a resource program subject to the performance basis and EM&V protocols (including "true-up" requirements) associated with resource programs. Consequently, the verified savings associated with the resource program element should also count towards the goals. Furthermore, we place value on the non-resource program (in this example, the audit component) in the overall portfolio because of its ability to lead customers to the resource program (direct install).
However, as discussed in Section 4.2.4 above, the issue of whether to attribute the estimated energy savings associated with the Codes and Standards Advocacy program towards "resource program" achievements, or to use those estimates to adjust the IOUs' savings goals, is an issue that still needs to be explored in the context of further developing the performance basis and associated EM&V protocols for this program. Moreover, as reflected in Rule IV.9, what really distinguishes "resource programs" from "non-resource programs" is our ability to reasonably estimate and verify the resource savings attributable to programs that do not necessarily focus on the timing or type of resource needs of the utility. That is why our adopted Rules do not require these programs to be evaluated based on their cost-effectiveness, but rather, recognize that "factors and performance metrics other than the TRC and PAC Tests of cost-effectiveness" will need to be considered "when evaluating such program proposals for funding and when evaluating their results."47 (Rule IV.9.)
Therefore, while our Rules clearly recognize that non-resource programs can add considerable value to the overall performance of the portfolio (Rule IV.6), there is, and should continue to be, a clear distinction between "resource" and "non-resource" programs even though the non-resource program may lead a customer to a resource program. The resource program is subject to cost-effectiveness evaluation during the program planning process (although passing the Dual-Test for each program is not a threshold requirement). The non-resource program is not. In addition, resource programs are subject to ex post EM&V true-up requirements in order to verify performance and the associated net resource savings for resource planning purposes, including the achievement of projected load impacts. At this time, we do not know what EM&V protocols will be developed to assess the performance basis of the programs listed above, including the methods for estimating and verifying associated savings where those savings can be quantified.
Therefore, we believe it is reasonable and appropriate to continue to classify the programs described in this section as "non-resource" at this time. However, we are persuaded by the comments that Joint Staff should explore whether the Codes and Standards Advocacy Program should be reclassified as a resource program during the PY2006-PY2008 planning cycle. Joint Staff should present recommendations on this issue in its EM&V protocol submittals (see below), after carefully considering whether this program can be held to a level of review for cost-effectiveness and associated resource savings that provides credible and objective information on savings impacts, and whether the associated protocols can produce results that meet the needs of the ISO and resource planners.
If acceptable EM&V protocols for estimating and verifying the savings from this program can be developed and approved during the development of EM&V protocols in the coming months (see below), we will allow the IOUs to begin counting the savings from these programs towards savings goals during the PY2006-PY2008 program cycle. We direct Joint Staff to solicit input from the IOUs and other technical experts on this issue as soon as possible, so that Joint Staff can develop its recommendations and solicit public input on those recommendations during the expedited approval process described in Section 6 below.

46 Workshop Report on Future Commission Policies on Energy Efficiency Evaluation, Measurement and Verification, November 2, 2004, pp. 4-5.

47 In fact, we note that in response to the urging of several parties during workshops, the ALJ specifically removed the phrase "in addition to" (the TRC test) that appeared in an earlier version of Rule IV.9 to clarify how we will evaluate programs such as emerging technologies, statewide outreach and marketing, information-only programs and other activities where the link between program efforts and savings is either very difficult to discern or where the primary focus is to structurally change the marketplace.