4 Methodology

Ordering Paragraph 9 required the Joint RRM/ST Cost Effectiveness Subcommittee to address several issues.

The first two issues are discussed next. No specific methodologies were developed to address the last two issues; the recommendations for addressing them were based on public input and group discussions. Therefore, the recommendations on the use of cost effectiveness tests in making decisions and on "gross" versus "net" issues are presented in the results section (Section 4).

4.1 Cost Effectiveness Tests for LIEE Program

Ordering Paragraph 9 clearly stated which cost effectiveness tests were required - the participant cost test (PC) and the utility cost test (UC). As stated in the Decision:

"The Participant Cost Test (PC) measures benefits and costs from the perspective of the customer receiving the measures or services. This test compares the reduction in the customer's utility bill, plus any incentive paid by the utility, with the customer's out-of-pocket expenses. In the case of LIEE program measures, where there generally are no out-of-pocket expenses to the eligible customer, the PC basically measures the bill savings associated with the program or measure.

The Utility Cost Test (UC) measures the net change in a utility's revenue requirements resulting from the program. The benefits for this test are the avoided supply costs of energy and demand ("avoided costs") - the reduction in transmission, distribution, generation and capacity costs valued at marginal costs - for the periods when there is a load reduction. The costs for the UC test are the program costs incurred by the utility, including any financial incentives paid to the customer, and the increased supply costs for the periods in which load increased. "1

The "California Standard Practice Manual: Economic Analysis of Demand-Side Programs and Projects," October 2001, provides further specifics regarding these two tests. The formulas from the Standard Practice Manual are presented in Appendix F; for each of these tests, there is a net present value (NPV) formula, which is the benefits minus the costs, and a benefit-cost (B/C) ratio formula, which divides the benefits by the costs.2 The inputs and methods used to determine the results for each test are provided next.
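The structure of these two formulas can be sketched as a discounted-stream computation. The following is an illustrative sketch only, not the Standard Practice Manual formulas themselves (those appear in Appendix F); the benefit and cost streams, their timing, and the discount rate used in the example are hypothetical.

```python
# Illustrative sketch: both tests discount annual benefit and cost streams
# to present value, then either subtract them (NPV) or divide them (B/C).

def present_value(stream, discount_rate):
    """Discount an annual stream (years 1..n) to present value."""
    return sum(x / (1 + discount_rate) ** t for t, x in enumerate(stream, start=1))

def npv_and_bc(benefits, costs, discount_rate):
    pv_b = present_value(benefits, discount_rate)
    pv_c = present_value(costs, discount_rate)
    return pv_b - pv_c, (pv_b / pv_c if pv_c else float("inf"))

# Hypothetical example: five years of $100 benefits against a $300
# first-year cost, discounted at 8.15%.
npv, bc = npv_and_bc([100.0] * 5, [300.0, 0, 0, 0, 0], 0.0815)
```

The sketch treats all amounts as occurring at the ends of years 1 through n; the Manual's exact timing conventions govern in practice.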

D.01-12-020 accepted the NEBs proposed by the LIPPT report. All NEBs presented in the LIPPT report have been included in the calculations in this report. Appendix A presents a listing of the NEBs, a description of the NEB, the measures included for each NEB, and comments on which NEB is recommended for further study in the future.

4.1.1 Participant Cost Test

As stated in the description of the PC included in D.01-12-020, participant costs for the LIEE program are zero. This effectively removes the PC B/C ratio from consideration, as division by zero is undefined. Therefore, the PC simply defaults to the NPV formula discussed above, or the net present value of the benefits received by the customer.3 These benefits are the bill savings due to the installation of the program measures.

The work that created the LIPPT report also developed a spreadsheet model for calculating LIPPT values. The spreadsheet included all inputs needed to calculate the PC, both with and without NEBs, with one exception: the avoided costs for energy were used instead of the energy rates the customer actually faces. Because of how the spreadsheet was set up, simply substituting energy rates for avoided costs produced a bill savings value for the analysis in this report.

The energy rates used in this analysis for PY2000 and beyond are the same as those presented in the "Joint Utility Low Income Energy Efficiency Program Costs and Bill Savings Standardization Report" of March 2001. The rates by utility are shown in Exhibit 3.1.

Exhibit 3.1
Energy Rates Used in Participant Cost Tests

Utility                 PY 2000 kWh Rate    PY 2000 Therm Rate
PG&E                    0.1159              0.6537
SCE                     0.1040              NA
SDG&E                   0.1179              0.5926
SoCalGas                NA                  0.6110
All Subsequent Years    Previous Year * (1 + Escalation Rate)

The escalation rate was set to 3% per year with an 8.15% discount rate.4 Because these rates do not take into account recent rate increases, the bill savings over time are most likely conservative.
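The escalation and discounting mechanics above can be sketched as follows. The 3% escalation and 8.15% discount rates come from the report; the measure's annual savings, measure life, and use of the PG&E PY2000 kWh rate in the example are hypothetical.

```python
# Sketch of the PC test as an NPV of bill savings: annual kWh savings are
# valued at a rate that escalates each year from the PY2000 base, and each
# year's bill savings are discounted back to present value.

def pc_bill_savings_npv(annual_kwh_saved, py2000_rate, life_years,
                        escalation=0.03, discount=0.0815):
    npv = 0.0
    rate = py2000_rate
    for year in range(1, life_years + 1):
        npv += (annual_kwh_saved * rate) / (1 + discount) ** year
        rate *= 1 + escalation  # next year's rate = previous year * (1 + escalation)
    return npv

# Hypothetical measure: 200 kWh/yr savings, 10-year life, PG&E PY2000 rate.
savings = pc_bill_savings_npv(200.0, 0.1159, 10)
```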

D.01-12-020 recognized that it was not possible to compute a B/C ratio for LIEE participants, since the participants have no costs related to the installations. While both the PC and UC tests can be expressed as NPVs, those NPVs have little meaning in isolation. The Joint RRM/ST Cost Effectiveness Subcommittee discussed the difficulty of comparing the PC and UC tests if the PC test is simply an NPV dollar value while the UC test is both an NPV and a B/C ratio. As part of this discussion, and as directed by D.01-12-020, the group reviewed the relevant portions of D.92-09-080.5 That decision discusses the possibility of using utility costs to create a benefit-cost ratio, although it does not specifically address their use to create a modified participant cost test. On this basis, the subcommittee decided to also calculate a "modified" participant cost test (PCm), whereby the participant benefits are divided by the utility costs to provide a PCm B/C ratio. (As it turns out, this value is the ratio of the bill savings to the utility cost, which the utilities already calculate as part of bill savings reporting.) The utility costs used in the PCm are identical to those in the UC test.
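The PCm is a one-line computation; the sketch below is illustrative, and the benefit and cost values in the example are hypothetical.

```python
# Sketch of the "modified" participant cost test (PCm): participant bill
# savings (NPV) divided by the utility program costs, the same costs used
# in the UC test.

def pcm_ratio(participant_benefits_npv, utility_costs_npv):
    """PCm B/C ratio: participant benefits over utility costs."""
    if utility_costs_npv == 0:
        raise ValueError("utility costs must be nonzero to form a ratio")
    return participant_benefits_npv / utility_costs_npv

# Hypothetical values: $120 of bill savings against $100 of program costs.
ratio = pcm_ratio(120.0, 100.0)
```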

The Joint RRM/ST Cost Effectiveness Subcommittee feels that the creation of the PCm, or some similar ratio, is an important step toward being able to evaluate and rank measures in conjunction with the UC. Without a participant-related ratio, the comparison would be between two quantities of different types and vastly differing orders of magnitude.

It is important to note that the NEBs applied in this test are only those benefits that apply to the participants. For example, "fewer customer calls" and any other NEBs that accrue to the utility are not included in the participant cost test or modified participant cost test benefits.

4.1.2 Utility Cost Test

The utility cost test, as defined by the Standard Practice Manual, also has a net-present-value formula and a B/C ratio formula. In the UC test, the benefits for the utility are determined using the utility avoided costs rather than the energy rates used in the participant cost test. The avoided cost forecast as adopted by the Commission for PY2000,6 and used in this analysis to value electricity savings, was a statewide kWh value.7 It is anticipated that future efforts in this area will use the avoided cost values most recently adopted by the Commission.

The electric and gas avoided costs used in the determination of benefits for the UC test presented here include energy, transmission and distribution, and environmental externalities. The values used were $0.0452 per kWh for electricity and $0.3580 per therm for natural gas.

The utility costs used do not include incentives paid since no incentives are paid in this program. Likewise, there are no increased supply costs since this is not a fuel substitution or load shifting program. Therefore, the costs used in the UC are the program costs only.
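The UC computation described above can be sketched as follows. The avoided-cost values are those stated in the report; the savings quantities and program cost in the example are hypothetical, and lifecycle discounting is omitted for brevity.

```python
# Single-period sketch of the UC test for this program: benefits are energy
# savings valued at avoided cost; costs are program costs only (no
# incentives are paid, and there are no increased supply costs).

ELEC_AVOIDED = 0.0452   # $/kWh, PY2000 value from the report
GAS_AVOIDED = 0.3580    # $/therm, PY2000 value from the report

def uc_test(kwh_saved, therms_saved, program_cost):
    benefits = kwh_saved * ELEC_AVOIDED + therms_saved * GAS_AVOIDED
    npv = benefits - program_cost
    bc_ratio = benefits / program_cost if program_cost else float("inf")
    return npv, bc_ratio

# Hypothetical measure: 2,000 kWh and 100 therms saved, $100 program cost.
uc_npv, uc_bc = uc_test(kwh_saved=2000.0, therms_saved=100.0, program_cost=100.0)
```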

The NEBs applied in this test are only those benefits that accrue to the utility. Therefore, NEBs such as "water and sewer savings" and other NEBs that were determined to accrue to the participant are not included in the UC benefits estimates.

4.2 Cost Effectiveness Test for LIEE Program Measures

4.2.1 Allocation of Non Energy Benefits to Measures

Moving from whole program assessments to measure level assessments significantly increases the complexity of the analysis. The original LIPPT report created utility-specific NEB values per household and multiplied that value by the number of households served to obtain an annual monetary value for a non-energy benefit. As such, the OP 9 provision requiring the calculation of measure-specific benefits that include NEBs meant that decisions had to be made regarding allocation of the NEBs to a different unit of measurement (i.e., per measure as opposed to per household).

While the Joint RRM/ST Cost Effectiveness Subcommittee recognizes that NEBs need to be allocated to individual measures in order to permit their inclusion in measure assessment, it also feels the necessity to document the inherent weaknesses in conducting such a task.

Given those caveats, three methods of allocating the NEBs across measures received serious consideration and analysis. These methods weighted the NEB based on the:

· simple association of a measure with that NEB,

· average installations per house in the program for that measure, and

· NPV of the energy savings over the life of the measure.

Each of these three methods is discussed in the order presented above.

Simple Association of a Measure - Under allocation by association, if a measure type is logically associated with an NEB (e.g., lower water costs are associated with faucet aerators), then the program-level savings for that NEB are divided equally across the associated measures, independent of how many units of each measure were installed. The problem with this approach is that it causes the B/C ratio to fluctuate greatly. As an example, only a few compact fluorescent lamp (CFL) porch lights were installed, so that measure has small benefits; when the NEB allocation is added to the benefit portion of the B/C, it has a huge effect on the B/C ratio. At the other end of the spectrum are regular compact fluorescent lamps, a measure that already has large energy benefits; adding a comparatively small amount to the benefits results in a tiny change in the B/C ratio. As a result, this method causes changes in the measure-level B/C ratios that seem out of proportion to any rationally expected effects (see Exhibit 3.2). Thus allocation based on the simple association of a measure with an NEB was rejected as an allocation method.

Exhibit 3.2
Example of Weighting Method - Simple Association

The rejection of the Simple Association of a Measure method left two competing methods for allocating the NEBs: (1) the lifecycle monetary benefit of a measure (called the kWh weighting method for simplicity) and (2) the average installations per household method. The Joint RRM/ST Cost Effectiveness Subcommittee discussed at great length how to decide between these two methods and whether the chosen method of allocation was appropriate and defensible.

Both methods use the same mathematical mechanics to allocate the NEBs; only the weighting values differ. Exhibit 3.3 below graphically shows how the NEB is allocated for the average installations per household method. As shown there, NEB dollars are allocated only to measures that have been determined to have a relationship to the NEB. (Appendix A documents which measures are included in each NEB.) In Exhibit 3.3 the dollar values are weighted based on the average number of measures installed per home. A measure with a higher average number of installations per home receives a larger proportion of the NEB dollars than a measure with a lower average. After allocating the dollars for each NEB, the values are summed for each measure to determine the measure-specific NEB benefit.

If the kWh weighting method were to be used, the lifecycle monetary savings of the measure would replace the values in the second column (Average Measures Installed per Home). Those measures with a higher lifecycle savings would receive a higher proportion of the NEB dollars. Higher lifecycle savings would be due to measures with high initial energy savings and/or long effective useful lives.

Exhibit 3.3
Illustration of Allocation Method
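The shared allocation mechanics can be sketched as a proportional split. The same function serves either weighting method by swapping in the appropriate weight values; the measure names, weights, and NEB dollar amount below are hypothetical.

```python
# Sketch of the allocation mechanics common to both methods: a program-level
# NEB dollar amount is split across the associated measures in proportion to
# a per-measure weight (either average installations per home or lifecycle
# monetary savings, the "kWh weighting" method).

def allocate_neb(neb_dollars, weights):
    """Split neb_dollars across measures in proportion to their weights."""
    total = sum(weights.values())
    if total == 0:
        # Measures with zero weight (e.g., no claimed energy savings under
        # the kWh method) receive no NEB allocation.
        return {m: 0.0 for m in weights}
    return {m: neb_dollars * w / total for m, w in weights.items()}

# kWh weighting example: hypothetical lifecycle monetary savings per measure.
lifecycle_savings = {"CFL": 900.0, "CFL porch light": 60.0, "faucet aerator": 240.0}
shares = allocate_neb(12000.0, lifecycle_savings)
```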

Given that the mechanics of allocation are identical, the main task that remained was choosing the criteria for selecting the "better," or more logical, method for distributing NEBs among the measures. Many discussions and email exchanges ensued among the Joint RRM/ST Cost Effectiveness Subcommittee.

As part of the struggle with these and other issues, the group developed the following arguments for and against each of the final two allocation methods.

Allocating based on the average measures installed per household.

Pros:

Cons:

Allocating based on the lifecycle monetary benefit of the installed measures.

Pros:

Cons:

In addition to discussing the advantages and disadvantages of these two methods, the subcommittee discussed potential combinations of the two. By about the mid-point in the deliberations, it was generally accepted that the kWh allocation method was particularly applicable to the UC test, since the NEBs that applied to the UC test were highly correlated with energy savings. Thus the majority of the later discussion centered on the best approach to use in allocating the NEBs for the PCm test. Consideration was given to using the kWh allocation method for the UC test and the average measures installed per household method for the PCm; however, the consensus was that the dramatic changes in the PCm could not be justified. Appendix C and Appendix D present the B/C ratios both with and without NEBs for these two allocation methods.

Throughout the Joint RRM/ST Cost Effectiveness Subcommittee discussions, continual attention was paid to the fact that the method employed had to be readily applicable on a mass basis across the utility databases and could not require detailed, minute adjustments.

Recommendation: The Joint RRM/ST Cost Effectiveness Subcommittee recommends that, for the present, both the UC and PCm tests allocate NEBs based on the lifecycle monetary benefit of the installed measures. Given the lack of documented, concrete information on how measure-level NEBs should be distributed, this method allocates the NEBs to the measures without causing significant changes to the rank ordering of the UC and PCm test results. In lieu of better information, this approach is considered reasonable.

In choosing to allocate participant-related NEBs by energy savings, the Joint RRM/ST Cost Effectiveness Subcommittee does not deny that NEBs in general are intended to capture those effects not reflected in the standard ways of valuing energy impacts. Rather, the Subcommittee seeks a systematic and consistent rule for allocating program-level NEBs to the measure level. Because, in many cases, these NEBs can be shown to be correlated with energy savings, the Subcommittee believes that allocating participant NEBs according to energy savings yields a more consistent and believable result than allocating them according to the average number of measures installed per household.

In addition, the Joint RRM/ST Cost Effectiveness Subcommittee wants to make clear that the choice of the kWh allocation method is based partly on a shortage of information that might allow other approaches. Its choice as a proxy now should not preclude a change to alternative, more appropriate methods when better information becomes available, or discarding NEBs at the measure level altogether.

It should be noted that measures with no claimed energy savings receive no NEB allocation.

4.2.2 Decision Making for Measures

The Joint RRM/ST Cost Effectiveness Subcommittee reviewed several different approaches to screening measures for the LIEE program. Early in the process, the following general three-stage approach to screening measures was agreed upon.

Given this three-stage approach, the main remaining issue was the selection of the threshold criteria for pass/fail. While many criteria were discussed, several received the majority of the attention. The following descriptions summarize these approaches and explain why they were rejected or accepted.

The Joint RRM/ST Cost Effectiveness Subcommittee selected the last of these options, the average program PCm and UC test values for each utility, as the threshold selection criteria for measure retention/exclusion. Once this selection was made, many specific details and situations were discussed. These are documented below in order to supply an expanded description of the measure selection process and to give guidance to the Standardization Team, whose responsibility it will be to apply this standard.

When applied using the three-stage assessment framework for LIEE measures discussed above, the measure-level benefit-cost (B/C) ratios, including NEBs, should be assessed as follows:

The applications of these criteria are presented in a tabular form in Exhibit 3.4.

Exhibit 3.4
Measure Assessment/Decision Rules for Retention/Addition

Pass/Fail    Assessment Test Type                  Decision Rule
Guideline    -----------------------------------   --------------------------------------------
Number       Modified Participant   Utility Cost   Existing Measure         Proposed New Measure
             Cost Test              Test
1            Pass                   Pass           Retain                   Add to Program
2            One Pass / One Fail                   Retain                   Do Not Add
3            Fail                   Fail           Retain ONLY if           Do Not Add
                                                   significant excluded
                                                   NEBs can be identified.

The more restrictive approach to adding new measures is believed to be justified because adding measures requires added support costs (e.g., development of standards, training, etc.) and measures already in the program have received some level of scrutiny. Additionally, some non-energy related measures are already in the program for policy reasons (e.g., furnace repair/replacement, some minor home repairs). These measures will need to be assessed on a case-by-case basis.
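The decision rules of Exhibit 3.4 can be sketched as a small function. This is an illustrative sketch; in particular, the "Remove" outcome for a fail/fail existing measure without significant excluded NEBs is an inference from the exhibit (which says such measures are retained ONLY if such NEBs can be identified), not wording from the report.

```python
# Sketch of the Exhibit 3.4 decision rules: a measure passes or fails the
# PCm and UC tests, and the outcome depends on whether it is an existing
# measure or a proposed new measure.

def measure_decision(pcm_pass, uc_pass, existing, significant_excluded_nebs=False):
    passes = int(pcm_pass) + int(uc_pass)
    if passes == 2:                       # Rule 1: pass/pass
        return "Retain" if existing else "Add to Program"
    if existing:
        if passes == 1:                   # Rule 2: one pass / one fail
            return "Retain"
        # Rule 3: fail/fail; "Remove" is inferred, not stated in the exhibit
        return "Retain" if significant_excluded_nebs else "Remove"
    return "Do Not Add"                   # new measures must pass both tests

# Example: an existing measure that passes PCm but fails UC is retained.
decision = measure_decision(pcm_pass=True, uc_pass=False, existing=True)
```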

The reasoning behind retaining measures that pass one test and fail the other is that either marginal adjustments in the measure offering or changes in economic conditions can swing such measures back into a pass/pass situation. The Joint RRM/ST Cost Effectiveness Subcommittee does not want to see measures of marginal cost effectiveness precipitously rejected from the program.

Under this approach, the elimination of low cost-effectiveness measures will slowly raise the average program PCm and UC test values. As the average program PCm and UC values rise, the pass/fail criteria should not exceed a maximum of 1.0 for either test. This is the point at which benefits equal costs, and it is not reasonable to eliminate measures whose benefits exceed their costs. In addition, it is recognized that for the electric utilities (where the benefits are high), some added measures may actually reduce the overall utility B/C ratio. This would still be considered appropriate, since the new measure still has benefits greater than its costs.

The Joint RRM/ST Cost Effectiveness Subcommittee recommends that the program-level criteria be held constant for two-year periods and then updated to the average program value of the second year. The primary recognized exception to this rule would be when a utility institutes a large structural change in the LIEE program, in which case the criteria ought to be updated in the year the program is changed.

The assessment of measure inclusion or exclusion should occur biennially for existing measures to coincide with the biennial program impact evaluation, with new measures being evaluated in the program year in which they are proposed.

The Joint RRM/ST Cost Effectiveness Subcommittee reviewed the following possible issues that could arise from the proposed methodology:

The Joint RRM/ST Cost Effectiveness Subcommittee realizes that it is likely that the Standardization Team will need to review and make decisions on many cases such as those presented above.

The current utility-by-utility retention/addition criteria are documented in Section 4, Results and Recommendations, which follows.

1 Page 57, R.01-08-027, D.01-12-020, December 11, 2001, Section V.
2 California Standard Practice Manual: Economic Analysis of Demand-Side Programs and Projects. October 2001. Chapters 2 and 5.
3 In actuality, it is the net present value of the participant benefit minus the net present value of the participant cost; with the participant cost term equal to zero, it reduces to only the first term.
4 ALJ Bytof ruling, dated October 25, 2000, in Application (A.) 99-09-049, et al.
5 Section 6.1.2.2, Consideration of Total Resource and Utility Costs.
6 D.99-08-021, further adopted in D.00-07-017.
7 While the electric use is expressed only in kWh, the avoided cost value was developed using a hybrid demand profile. Thus extracting the relative demand contribution is virtually impossible. It is believed that the demand component represents between 1% and 6% of the overall avoided cost using 2001 kW values.
