The Reporting Requirements Manual (RRM) Working Group and the Standardization Team created a joint subcommittee (the Joint RRM/ST Cost Effectiveness Subcommittee) to ensure compliance with ordering paragraph (OP) 9 of Decision (D.) 01-12-020, December 11, 2001. OP 9 requires that the utilities:
"...evaluate the Low Income Energy Efficiency (LIEE) program and individual measures by calculating both the participant cost test and utility cost test, including in that calculation the non-energy related benefits developed by the RRM Working Group. The RRM Working Group and Standardization Project Team shall jointly develop recommendations, after obtaining public input, on:
· how each of these tests should be considered in making final measure selections, or in evaluating the overall effectiveness of LIEE programs from year to year or across utilities, and
· an explicit method for addressing the "gross" versus "net" costs and savings issue in measure and program evaluation.
The joint report shall include a discussion of the pros and cons of the various options considered."
Public workshops on these issues were held in San Francisco on March 26 and in San Diego on March 27, 2002. Where public input could not be reconciled with the position of the Joint RRM/ST Cost Effectiveness Subcommittee, this is noted in the relevant sections of the report.
For comparison purposes, the utility-level analysis treated the SoCalGas and SCE programs as a combined entity, since the two utilities serve roughly the same customers. Exhibit 1.1 provides the program-level results of the analysis.
Exhibit 1.1
Cost Effectiveness Test Results for the LIEE Program
Cost effectiveness is clearly an important element in the assessment of programs and measures. However, the Joint RRM/ST Cost Effectiveness Subcommittee believes that, especially in the LIEE arena, clear-cut rules on inclusion and exclusion of measures cannot always be made based solely on measure test results. Policy and social welfare considerations not fully captured by these cost-effectiveness tests are often the main guiding element in decisions to retain measures within low-income programs. Additionally, the benefits of many measures offered under the LIEE program (particularly weatherization measures) are strongly interactive, so it is very difficult, if not impossible, to disaggregate and assess their individual impacts.
The fact that the modified participant cost test (PCm) and utility cost test (UC) results are uniform across the state indicates that program offerings are comparable statewide when considered on an electric and gas utility service area basis. This led to the use of the average program PCm and UC test values for each utility as the threshold selection criteria for measure retention/exclusion.
The Joint RRM/ST Cost Effectiveness Subcommittee recommends a three-level methodology for assessment of LIEE program measures. Measure-level benefit-cost (B/C) ratios that include non-energy benefits (NEBs) should be used along with the following guidelines (a sketch of this screening logic follows the list):
1. Measures that have both a PCm and a UC greater than or equal to the average program PCm and UC for that utility should be included in the LIEE program. This applies to both existing and newly proposed measures.
2. Existing measures with only one of the two benefit-cost (B/C) ratios below the corresponding average program PCm or UC for that utility should be retained in the program. New measures meeting this criterion would not be accepted, because of the substantial effort required to integrate a new measure.
3. Existing and new measures with both the UC and PCm test results below the average program PCm and UC for that utility should be excluded from the LIEE program, unless a substantial argument can be made that significant NEBs are not currently being accounted for in the PCm and UC test values, or there are other policy or program considerations that require the measure to be retained.
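The guidelines above amount to a simple per-measure decision rule. The following Python sketch illustrates that rule only; the names (Measure, screen_measure) and data structure are hypothetical, and the PCm and UC ratios are assumed to come from the measure-level benefit-cost analysis (with NEBs) described elsewhere in this report.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    pcm: float    # modified participant cost test B/C ratio (includes NEBs)
    uc: float     # utility cost test B/C ratio (includes NEBs)
    is_new: bool  # newly proposed measure (True) or existing measure (False)

def screen_measure(m: Measure, avg_pcm: float, avg_uc: float) -> str:
    """Apply the three-level guidelines for one utility.

    avg_pcm and avg_uc are that utility's average program PCm and UC values,
    used as the threshold selection criteria for retention/exclusion.
    """
    passes_pcm = m.pcm >= avg_pcm
    passes_uc = m.uc >= avg_uc

    if passes_pcm and passes_uc:
        # Guideline 1: include, whether existing or newly proposed.
        return "include"
    if passes_pcm or passes_uc:
        # Guideline 2: exactly one ratio falls below the utility average.
        return "retain" if not m.is_new else "do not accept (new measure)"
    # Guideline 3: both ratios fall below the utility average; exclude unless
    # significant unaccounted NEBs or other policy/program considerations
    # justify retaining the measure.
    return "exclude unless justified"
```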
It is necessary to use the utility-specific values in order to fairly assess the programs offered by single-fuel utilities. If statewide values were used as the criteria, the SCE measures would pass handily while many SoCalGas measures would fail, despite the acceptable level of the program offering when SCE and SoCalGas are considered together. By using the utility-specific PCm and UC values, each utility is measured against its own criteria, and measures are not unduly eliminated from the combined SCE and SoCalGas service area.
Under this approach, the elimination of measures with low cost effectiveness will slowly raise the average program PCm and UC test values. To avoid a continually rising threshold, the program-level criteria would be held constant for two-year periods (with some exceptions).
Using these guidelines, a very broad look at the electric appliance, gas appliance, and weatherization measures shows that electric appliances often have both a PCm and a UC above the utility-specific thresholds. The electric appliances falling into category #1 are measures that are relatively easy to install and have the potential for high savings. For gas appliances, slightly more measures have neither cost-effectiveness test above the thresholds, which is consistent with the fact that gas measures tend to have lower impacts. The weatherization measures are labor-intensive to install yet provide relatively small impacts; as such, a high percentage of them fail both cost-effectiveness tests. Interestingly, no weatherization measure passes one test while failing the other.
The following conclusions and recommendations are made:
· Use a modified participant test to enable benefit-cost ratio comparisons between the participant and utility cost tests. The modified participant test ratio is the participant benefits divided by the utility program costs (see the sketch following this list).
· When addressing specific measures, adopt a three-level methodology using the average program PCm and UC for each utility as the measure screening criteria for that utility's measures.
· Caution should be used when comparing program-level cost effectiveness across utilities for a single year. Variations in the mix of measures provided, in gas versus electric savings, and in reported program costs make such comparisons problematic.
· Comparing the cost effectiveness of a single utility across different years requires an understanding of the underlying reasons for changes. Variations in the mixes of measures installed and the resident types targeted, combined with the associated changes in program costs and benefits, make comparisons difficult. However, an understanding of these differences can be useful in deciding the future program measure mix.
· When comparing program-level cost effectiveness across utilities, consider SCE and SoCalGas benefits and costs together to obtain a better utility-to-utility representation of customer and utility benefits versus costs.
· Use "gross" savings and costs for all measures in the LIEE program. (Note: In this context, "gross" savings means the total kWh difference between the new equipment and the existing equipment applied over the entire useful life of the new equipment.) The Joint RRM/ST Cost Effectiveness Subcommittee reviewed the PY2001 rapid deployment measures and concluded that the use of the "gross" costs and savings should be applied to them as well.
The remainder of this report provides details on the analysis and results.