In his September 16, 1999, ruling, the Assigned Commissioner clarified that: 41
"Nothing in my scoping memo or the assigned ALJ's prehearing conference ruling was intended to exclude evidence on what the `broadest extent possible' and `appropriate' degree of competitive bidding should be."
Accordingly, parties presented evidence on this issue and took positions on whether the evidence indicated that competitive bidding should be required at this time. This was by far the most heavily contested issue in the proceeding.
SoCal, SDG&E, and RHA contend that PG&E's experience demonstrates that competitive bidding would produce unacceptable results in southern California that would not comport with AB 1393. These include allegations of serious quality control problems and a dramatic reduction in CBO participation. Moreover, these parties argue that the Commission has never established a firm policy requirement that competitive bidding shall take place. Southern California Agencies argues that bidding has not produced better results than the status quo, and that there is no evidence that the southern California utility programs currently have a problem that bidding will fix.
In their view, SDG&E's and SoCal's programs currently provide high quality services, and are well run at reasonable cost. These parties recommend that the Commission permit the utilities to go forward with their programs and to continue to manage them, as they have done effectively for many years under the Commission's guidance. Competitive bidding should, in their opinion, remain within the discretion of the utility administrators.
Contractors' Coalition and ORA, on the other hand, strongly urge the Commission to order all the utilities to competitively bid their PY2000 programs. Contractors' Coalition argues that this would be consistent with the Commission's articulated policies and necessary to ensure the most efficient delivery of quality services. In ORA's view, a competitive bid process can effectively ensure that the best quality services are provided to low-income ratepayers for the lowest costs. LIAB supports moving forward with competitive bidding for all utilities.42
PG&E recommends the continuation of competitive bidding for its weatherization program and requests immediate Commission authorization to initiate a competitive bid for PY2000. PG&E states that it is indifferent as to whether the southern California utilities go out to bid. However, PG&E argues that putting its program out to bid is consistent with the Commission's policies, as articulated in D.99-03-056 and Res. E-3586, and with AB 1393.
Most of the 76 exhibits in this proceeding address the issue of whether competitive bidding should be required at this time for all utilities. Exhibits were submitted to compare experiences with competitive bidding to other outsourcing approaches with regard to cost efficiencies, bill savings to low-income customers, quality and safety of installations, and AB 1393 criteria regarding program accessibility. We discuss these issues below.
10.1 Cost Comparisons
Before summarizing the evidence in this case regarding cost comparisons between competitively bid and non-bid programs, we note that only PG&E and its contractors were forthcoming with cost information in this proceeding on a timely basis. It took several meetings and discussions well into the hearing phase to obtain information from the Southern California utilities that would enable us to compare costs in this proceeding. We uphold the assigned ALJ's rulings with regard to the release of cost information in exhibits and with regard to the utilities' proposed protective orders. As discussed further below, the cost comparisons presented by parties were limited in scope and comprehensiveness in part because of the discovery delays.
We put all the utilities and their low-income energy efficiency contractors on notice that program costs, including costs per measure or home, must be made available to parties in any future Commission proceeding where the cost-efficiency of these programs is being litigated, subject to Commission-approved confidentiality agreements. The Commission, its staff, or low-income energy efficiency program consultants (subject to confidentiality agreements) may obtain this information upon request at any time. To be useful, this cost information must be presented in a format that allows the Commission and other parties to compare costs in a normalized fashion, e.g., normalized over the types and number or frequency of the measures installed in each home.
10.1.1 SoCal's Competitive Bid Experience
The results of SoCal's 1995 and 1996 bidding pilots are summarized in SoCal's April 1997 DSM Report. (Exh. 28, Attachment.) As discussed in the report, SoCal was directed to determine from the results of this pilot bid program "if competitively bidding out [low-income weatherization] services could provide equal services at lower cost than negotiated contracts with community-based organizations." (Ibid., p. II-13.) SoCal concludes that there were no substantial differences in total average costs between profit and non-profit contractors during the two-year pilot program. (Ibid., p. II-15.) The report goes on to state that for-profit contractors had higher average costs for weatherstripping and exhaust fan dampers, but non-profit contractors had higher average prices for all other measures. (Id.)
However, the average cost comparison presented in the report does not appear to tell the whole story. First, the report does not state (and SoCal does not recall) whether or not the average cost calculations were normalized over the quantity of measures installed in each home. (RT at 609.) If these costs were not normalized over the number and type of measures installed per home, then we are looking at an apples-to-oranges comparison. As the report does acknowledge, the increase in average unit cost in 1996 over 1995 can be attributed to the installation of more measures per unit. (Id.) Unfortunately, there is no way to compare costs relative to the comprehensiveness of treatment between the non-profit and for-profit contractors based on this report.
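To make the apples-to-oranges problem concrete, the following sketch uses hypothetical figures of our own (none of these numbers are drawn from the record) to show how a contractor installing more measures per home can appear more expensive on a per-home average while actually charging less per measure:

```python
# Hypothetical figures for illustration only; not from the record.
contractor_a = {"measures_per_home": 4, "cost_per_measure": 50}  # less comprehensive
contractor_b = {"measures_per_home": 6, "cost_per_measure": 45}  # more comprehensive, cheaper per measure

def avg_cost_per_home(contractor):
    """Un-normalized average: total installation cost per home treated."""
    return contractor["measures_per_home"] * contractor["cost_per_measure"]

# B charges less per measure yet shows the higher per-home average,
# so a raw per-home comparison would wrongly rank B as more costly.
assert avg_cost_per_home(contractor_a) == 200
assert avg_cost_per_home(contractor_b) == 270
```

Normalizing over the number and type of measures installed per home removes this distortion.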
Moreover, the report appears to omit from its analysis the amounts paid to CBOs for visits to homes where no measures were actually installed. SoCal paid the CBOs a fee for these "no measure" visits, but did not allow the private contractors to include a fee for no-measure visits in their bids. Instead, the private contractors rolled the expected cost to them of the no-measure visits into their prices for weatherstripping because weatherstripping is the measure most often installed. Hence, it appears that the report overstates the private contractors' prices for weatherstripping, which leaves the CBOs with lower prices only for exhaust fans. (Exh. 17, pp. 30-31; RT at 378.)
In addition, the average cost comparisons do not reflect SoCal's observation that "the pilot bid program was a contributing factor in preventing non-bid contractors from demanding higher prices." (Exh. 28, Attachment, p. II-16.) Negotiations between SoCal and contractors while the pilot bid was in the planning stage actually reduced prices to the non-bid CBOs. (Ibid., p. 5.) Moreover, SoCal reports that the pilot bid gave it competitive price information for negotiating 1996 and 1997 contracts. (Ibid. Attachment, p. II-16). SoCal summarizes the cost-reduction benefits attributable to the competitive bid pilot, as follows:
"...SoCalGas has found quantifiable benefits resulting from the pilot bid program. The average unit price for 1995 was 17% lower than for 1994. SoCalGas believes that the competitive bid process put downward price pressure on the negotiated contracts for 1995. Also the competitive price information gained from the pilot bid was used during negotiations for 1996 and 1997 contracts; and, the 1996 and 1997 overall average unit cost remained 8% and 7% lower than 1994." (Exh. 50, p. 5.)
10.1.2 Savings Under PG&E's PY1998 Competitive Bid
Table 2 in Exh. 17 presents all categories of measures offered for installation in the 1998 PG&E Energy Partners Program, along with the prices actually paid to RHA for 1997 installations and the prices actually paid to SESCO for 1998 installations. The result of allowing open bidding was that the average price per measure declined 11.7%, even before accounting for inflation between 1997 and 1998. These savings would enable the treatment of an additional 4,432 homes, each receiving the standard package of measures envisioned by PG&E in its RFP for the 1998 program.
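The relationship between a price decline and the number of additional homes a fixed budget can treat is simple arithmetic. The following sketch is our own illustration, assuming a fixed program budget and a constant per-home mix of measures (assumptions not established on the record):

```python
def additional_homes_fraction(price_decline):
    """Fraction of extra homes a fixed budget can treat after the
    average price per measure falls by `price_decline` (assumes the
    budget and the per-home measure mix stay constant)."""
    return 1.0 / (1.0 - price_decline) - 1.0

# Under these assumptions, an 11.7% average price decline stretches a
# fixed budget to roughly 13% more homes than the pre-bid structure.
assert round(additional_homes_fraction(0.117), 3) == 0.133
```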
No parties present evidence to refute this cost comparison. However, we note that the record does not supply us with PG&E's cost of administering the bid, with which to compare this reduction in average per measure costs. This was the type of analysis anticipated by our discussion in Res. E-3586, where we stated:
"The Commission understands that there is a trade-off in putting programs out for competitive bid-while unit costs may go down, an additional one-time administrative costs is incurred by each bidding process. Among other things, these administrative costs must be weighed against the potential reduction in unit costs. PG&E's competitive bid programs for 1997, 1998 and 1999
should provide us with useful information for evaluating competitive bid programs for the other utilities." (Res. E-3586, p. 31.)
Nonetheless, the evidence does indicate that the recent competitive bid reduced PG&E's average prices per measure by an appreciable amount, enabling more homes to be weatherized than would have been possible under the pre-bid price structure.
10.1.3 PG&E's Competitive Bid Prices Applied to SDG&E's Program
Exh. 66 shows that, if the measures installed in the SDG&E program in 1997 had been priced at the prices that PG&E paid SESCO for the same measures in 1998, the cost of the SDG&E program would have been 15.89% less. That savings would have enabled the program to treat an additional 548 homes with the same mix of measures installed.
SDG&E/SoCal argues that the Exh. 66 cost comparison is not valid because SDG&E's program has higher performance requirements and requires its prime contractors and subcontractors to do more than PG&E's service providers. In particular, SDG&E/SoCal asserts that SDG&E requires its prime contractor to perform employee background checks, maintain a 95% customer satisfaction rate, as well as train its own installers, and those of its subcontractors. (SDG&E/SoCal Opening Brief, p. 30.)
We do not find any evidence on the record to support the assertions that SDG&E's prime contractor performs more extensive work than PG&E's prime contractor, or that its weatherization subcontractors have more requirements imposed on them. We note, for example, that SDG&E does not require CAS testing prior to the installation of infiltration measures, while PG&E does, a requirement that results in more complicated scheduling for the weatherization contractors. (RT at 972.) Moreover, PG&E's prime contractor must manage a program that treats over four times the number of units per year, with more measures per unit, than SDG&E's program. (Exhs. 36, 66; RT at 973.) With regard to training, we note that the prices per measure used in Exh. 66 do not include the contract prices paid to RHA for training, so that differences in which entity performs that function under the two programs do not skew the analysis.
In addition, SDG&E/SoCal argues that this exhibit inappropriately compares only 16 measures, whereas SDG&E provided measure costs and frequencies for 18 measures and PG&E provided the same for 23 measures. (SDG&E/SoCal Opening Brief, p. 31.) However, a comparison between SDG&E's Exh. 35 and 66 shows that all 18 measures in Exh. 35 are compared in Exh. 66. The comparison presented in Exh. 66 effectively standardizes a comparison between competitive bid and negotiated prices using the measures that SDG&E actually installed under the program. It certainly could have been done the other way, i.e., taking the measures installed under the PG&E program and applying the SDG&E negotiated prices and the SESCO bid prices to those measures. The obvious problem with that approach is that more measures are installed under the PG&E program than under the SDG&E program. In sum, we find no merit to SDG&E/SoCal's objections to this exhibit.
RHA also questions the findings of Exh. 66, claiming that the minor home repair figures are not comparable. (RHA Opening Brief, p. 15.) We believe that Contractors' Coalition adequately addresses this issue in its reply brief. As Contractors' Coalition points out, the extent to which minor home repair may be less than comparable actually increases the cost savings documented in Exh. 66. Moreover, Exh. 66 also shows the cost savings if minor home repair is disregarded. That comparison indicates that competitive bid prices for the measures installed in SDG&E's program (not including minor home repairs) would have yielded total savings per home of 24.2%, or $440,920. (Contractors' Coalition Reply Brief, Appendix A.)
10.2 Bill Savings to Low-Income Customers
The benefits to low-income customers from energy efficiency programs should be directly measurable in terms of the level of bill savings they realize from having the work done to their homes. This is a function of the number and mix of measures installed in each home, the savings associated with that number and mix of measures, and, for the program as a whole, the number of homes weatherized. The relative cost-efficiency of the programs, which is of particular interest to non-participating ratepayers, should be measurable in terms of the total program (or per home) level of bill savings relative to program expenditures.
Therefore, an important area of discovery in this proceeding should have been the level of bill savings to participating customers, relative to program dollar expenditures, across utility programs. Only Contractors' Coalition attempted to address this issue by examining comparative costs and the ability to treat more homes under competitive bidding.
At the direction of the assigned ALJ, the utilities put together late-filed Exh. 76 attempting to document and compare program expenditures and lifecycle customer bill savings for program years 1997, 1998, and 1999. This document was submitted on December 16, 1999.
The workpapers to Exh. 76 indicate that this analysis is not responsive to the ALJ's direction. The Commission needs information regarding reasonable assumptions for bill savings per home, based on measures actually installed in the homes in each year. (RT at 1162 to 1167.) The numbers produced for the exhibit, at least in the case of PG&E, base the lifecycle savings on the number and mix of measures installed in homes in 1995.
As a result, the numbers for PG&E show a fixed amount of savings per home, and do not reflect any changes in the mix or number of measures per home from year to year. For the purpose of comparing the impact of competitive bidding on potential bill savings, these figures are essentially useless. Moreover, we are not assured that the figures presented in Exh. 76 are consistent with the assumptions and methodologies approved for measuring program costs and benefits in our Annual Earnings Assessment Proceedings. In fact, we observe that the cost-effectiveness ratios for the southern California utilities, calculated from the bill savings and expenditure levels in Exh. 76, are dramatically higher than the PY2000 cost-effectiveness tests presented in their testimony.43
For example, the figures SDG&E presents in Exh. 76 indicate a ratio of 1.03 for life-cycle bill savings relative to total program expenditures under SDG&E's PY1999 LIEE program. However, in calculating its projected PY2000 performance incentives, SDG&E projects the present value of bills avoided at $2,035,348 relative to $7,281,545 in measure costs or $5,015,204 in utility costs, which would indicate a bill savings per total cost ratio in the range of 0.28-0.40. This range is also more in line with the other ratios of cost-effectiveness that SDG&E presents in its testimony, i.e., 0.22 for the utility cost test and 0.21 for the total resource cost test.44
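The arithmetic behind the 0.28-0.40 range cited above can be verified directly. The sketch below (variable names are ours; dollar figures are those quoted from SDG&E's testimony) divides the present value of bills avoided by each cost figure:

```python
# Dollar figures quoted from SDG&E's PY2000 performance incentive
# calculations; variable names are ours.
pv_bills_avoided = 2_035_348
measure_costs = 7_281_545
utility_costs = 5_015_204

ratio_vs_measure_costs = pv_bills_avoided / measure_costs  # roughly 0.28
ratio_vs_utility_costs = pv_bills_avoided / utility_costs  # roughly 0.4

# Either way, the ratio is far below the 1.03 implied by Exh. 76.
assert round(ratio_vs_measure_costs, 2) == 0.28
assert round(ratio_vs_utility_costs, 2) == 0.41
```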
SCE's figures in Exh. 76 suggest a bill savings/cost ratio of 1.85, i.e., a highly cost-effective program from the perspective of benefits to low-income customers. However, this calculation does not appear to be "in line" with SCE's calculations of cost-effectiveness in Exh. 8, Table C. In those calculations, the program is not cost-effective from either the utility cost or total resource test of cost-effectiveness, with ratios of 0.66 and 0.657, respectively.
Although SoCal's LIEE program is far from cost-effective under any calculation presented in the record, the figures in Exh. 76 suggest a bill savings/cost ratio of 0.20 for the LIEE program in 1999, whereas the ratios between the present value of bills avoided and measure or utility costs in Exh. 47 (Attachment C, Table 2) yield a lower range of 0.12-0.14.
In sum, our inquiry is limited by the lack of consistent data on program bill savings, expenditures and cost-effectiveness calculations, with which to evaluate the relative performance of the utilities' LIEE programs. Competitive bidding aside, this is fundamental information that should be readily available to program evaluators, program implementors and the general public. In Section 19 we discuss steps to acquire this information in the future.
10.3 Installation Quality
Most of the testimony in this proceeding focused on whether or not competitive bidding would compromise the installation quality and performance of weatherization contractors. We examine the evidence below.
10.3.1 Per-Home Pass Rates
One measure of performance presented during evidentiary hearings was the per-home inspection pass rates. All of the utilities apparently record this measure, so it was readily available across program years, utilities, and contractors. A per-home pass rate indicates how many homes, out of the total inspected, pass an inspection of all of the measures installed in that home. The exhibits presented in this proceeding show that per-home inspection pass rates in PG&E's program are consistently less than those in SoCal's and SDG&E's programs. Table 1 below summarizes the comparisons presented in Exh. 23 (PG&E), 53 (SDG&E), 56 (SoCal) and 57 (SCE):
Table 1 Overall Utility Pass Rates on LIEE Services

| Utility | 1997 | 1998 | 1999 |
|---|---|---|---|
| PG&E | 84.3% | 74.4% | 79.8% |
| SCE-Weatherization | 96.3% | 92.7% | 93.1% |
| SCE-Evap Coolers | 95.9% | 96.9% | 93.8% |
| SCG | 87.4% | 88.1% | 88.2% |
| SDG&E | 99.0% | 99.5% | 99.1% |
In considering this evidence, we must evaluate whether it is reasonable to assume that the differences in these overall utility pass ratings are due to actual and proportionate differences in the quality of the work being done. As discussed below, we do not believe that this assumption is reasonable. For example, we believe it is unlikely that the same PG&E contractors dropped an average of 10 percentage points from 1997 to 1998. We also find it unlikely that, for at least each of the past three years, PG&E's contractors have been consistently 10-15 percentage points worse than the other utilities', including the period when its program was administered by the same prime contractor, RHA, that administers the SDG&E program.45
In fact, the record indicates that the same contractors and administrators doing the same work at the same time in different service territories had lower pass rates in PG&E's program than SoCal's and SDG&E's programs, comparable to the overall differences in pass rates.46 For individual contractors, Western Insulation has worked simultaneously in both PG&E and SDG&E's program, and Winegard Energy and San Luis Obispo EOC have worked simultaneously in both PG&E and SoCal's program. (RT, p. 903.) PG&E's exhibits identified each subcontractor by name, while SDG&E's and SoCal's pass rate exhibits only identify contractors by type. Nonetheless, Western Insulation has a significantly lower pass rate in PG&E's program than any contractor, public or private, in SDG&E's program, as shown in Table 2 below.
Table 2 Western Insulation Pass Rates for PG&E and SDG&E

| Contractor | 1997 | 1998 | 1999 | Average |
|---|---|---|---|---|
| Western-PG&E | 90.8% | 88.5% | 87.3% | 88.9% |
| SDG&E-Private1 | 99.5% | 99.6% | 99.0% | 99.4% |
| SDG&E-Private2 | 99.6% | 100.0% | 100.0% | 99.9% |
| SDG&E-Private3 | 99.0% | 99.8% | 99.5% | 99.4% |
| SDG&E-Privates | 99.4% | 99.8% | 99.5% | 99.6% |
Similar anomalies surface when one compares the pass rate data for PG&E and SoCal with respect to San Luis Obispo EOC, a CBO, and a private contractor, Winegard Energy. San Luis Obispo EOC works in both PG&E and SoCal's program, under the same supervisor and same crews and treats the same county, San Luis Obispo. Winegard Energy does the same in another set of counties, Kern and Tulare. (RT at 910.) Yet, the recent pass rates (1999) from PG&E's inspectors are much lower in each case than those from SoCal's inspectors:
Table 3 1999 Rates for San Luis Obispo EOC and Winegard

| Contractor | Utility | Pass Rate | Notes |
|---|---|---|---|
| Winegard | PG&E | 73.5% | Kern and Tulare counties only |
| Winegard | SoCal | 88.3% | identified as WMDVBE-3 (the only private WMDVBE) |
| San Luis Obispo | PG&E | 68.2% | San Luis Obispo County only |
| All CBOs | SoCal47 | 87.3% | Range: 76.1%-98.7% |
There appear to be other discrepancies in pass rate scoring, even within PG&E's program. For example, a comparison of pass rates for the same PG&E subcontractors between 1997 and 1998 shows major swings, with the preponderance of the shift in the downward direction. Winegard Energy dropped almost 24 percentage points, and San Luis Obispo dropped nearly 17 percentage points, while continuing to treat homes in the same county as before. (Exh. 23.) From the exhibits in this proceeding, one can also compare the pass rates for the same subcontractor operating in adjoining PG&E inspection districts. Pass rate differences range from 11 to 24.6 percentage points, even though the counties are treated by the same subcontractors. (Contractors' Coalition Opening Brief, Table 5, p. 40; Reply Brief p. 39.)
As a result of this issue, the ALJ asked the utilities to prepare a joint late-filed exhibit, Exh. 73, comparing and contrasting their inspection procedures. Unfortunately, only PG&E produced a written compilation of inspection policies and procedures for the joint exhibit, making it very difficult to compare official policies when none of the other utilities was willing to produce any. According to PG&E, the exhibit became more problematic when SDG&E's stated policy changed between the joint party meetings and the written compilation of the exhibit, because the person attending the meetings did not know SDG&E's current policy. (PG&E Opening Brief, p. 23.)
What we can tell from the exhibit is that if one measure in a home fails in PG&E's program, the whole home is counted as a fail (RT at 297); it is unclear how the southern California utilities count failures. That is a significant difference. In southern California, at least in some instances, if an inspector finds a failed measure, he can fix the measure himself and the home will count as a pass; PG&E does not provide such a service. Differences like these make the pass rates incomparable. The following examples highlight the fact that PG&E counts many items as per-home "fails" that are not counted as such by one or more of the other utilities:
1. PG&E automatically fails a house if a feasible measure is not installed, while SoCal and SCE allow the contractor to correct it without counting it as a fail. An estimated 25.4% of PG&E fails are due to this factor and it is a contributing factor in 45.2% of all fails. (Exh. 73, p. 9, 1.5; p. 37, 1.7.)
2. SoCal/SCE allow contractors to correct door weatherstripping if it is out of adjustment or light shows around the sides or top. PG&E has failed this in 60% of the instances and provided a correction opportunity in the other 40%. PG&E indicates that it will henceforth place all of these in the "correction fail" category and not count them as fails. Door weatherstripping is the most common failure, and differences here have a serious impact. (Exh. 73, p. 14, 2.3; p. 38, 2.3.)
3. There are several categories for which other utilities either allow the contractor to correct or have the inspectors themselves correct without issuing a "fail," while PG&E issues an automatic fail in the same situation. This includes, for example, minor caulking, weatherstripping and gasket mistakes. PG&E calculated that about 6% of the homes in some of the common counties (Kern, San Luis Obispo) had been failed for similar items. (Exh. 73, p. 10, 1.8; p. 14, 2.3, 2.4; p. 25, 3; p. 37, 1.8.)
4. SDG&E tries to inspect all units and, prior to January 1, 1999, counted any unit that was missed as a "pass," while all other utilities ignore such units in the pass rate calculations. (Exh. 73, p. 37, 1.8.)
5. SDG&E's pass rate covers only "weatherization" measures and not compact fluorescent lights, porch lights, or refrigerators, which PG&E inspects and counts in its pass rates. (Exh. 73, p. 22, background.)
In its comments to the ALJ's Ruling dated March 9, 2000, SDG&E/SoCal contends that comparisons across utility per-home pass rates are not meaningful because the inspection rates differ across utilities. We disagree. This objection is akin to saying that comparative analysis is only meaningful if the same sample size (and sampling technique) is used across all populations being observed. PG&E presented pass rates based on 100% inspections of all homes where attic insulation was installed and at least 20% of all other homes. There is no basis in fact to conclude that the pass rates resulting from these observations are unrepresentative of the pass rates that would have been obtained if every home was inspected. Throughout this proceeding, SDG&E/SoCal and others have used these pass rate figures in drawing conclusions about the performance of PG&E's contractors. (See, for example, SDG&E/SoCal Opening Brief, p. 29.)
Moreover, SDG&E itself does not inspect 100% of all its work, as SDG&E/SoCal contend. (SDG&E/SoCal Comments, p. 2, 3.) Instead, SDG&E only inspects jobs where the inspector can gain entry. (Exh. 73, p. 11.) In fact, as noted in Section 10.3.1, SDG&E counts any units it cannot enter as a "pass," whereas all other utilities ignore it in the pass rate calculations.
In their comments, SDG&E/SoCal attempts to extrapolate from PG&E's sampling of pass rates what the pass rates would have been if all homes were inspected, in order to demonstrate that PG&E's fail rates are much higher than presented in the exhibits. We agree with Contractors' Coalition that SDG&E/SoCal's analysis is mathematically flawed and has no validity.
SDG&E/SoCal take the average inspection rate of PG&E's program (42.75%) and the average failure rate (20.47%) across the 1997-1999 period.48 Then, SDG&E/SoCal multiply the average failure rate by the inverse of the inspection rate to calculate PG&E's failure rate at a 100% inspection rate. On the basis of this calculation, SDG&E/SoCal claim that if 100% of the jobs were inspected, the failure rate would be 47.88%. This claim forms the basis for SDG&E/SoCal's objection to the use of the comparisons presented in the tables above.
However, there is no basis for this claim. If the true failure rate were 47.88%, as SDG&E/SoCal claim in their March 16, 2000, comments, then the rate of failure in the uninspected 57.25% of jobs must have been 68.35%, a rate that is more than triple the actual failure rate found in the inspected homes.49 In fact, SDG&E/SoCal's analysis will always conclude that the fail rate for the uninspected homes is much, much higher than the inspected homes.
There is simply no support on the record for this assumption. In fact, the evidence suggests that the uninspected homes are likely to have fewer problems than the inspected homes. PG&E maintains a policy of focusing its inspections on the homes with the more complicated set of measures or potential infiltration problems. For example, PG&E inspects all homes receiving attic insulation and homes receiving infiltration measures that have no record of a passed CAS test. (Exh. 73, p. 7.) In addition, PG&E increases inspections where it believes more quality control is required. (RT at 338.) Therefore, if any assumption could reasonably be made from the record, it would be that PG&E's fail rates based on a 100% inspection rate would be lower than those indicated in the exhibits.
Moreover, applying SDG&E/SoCal's calculation to extrapolate "true" fail rates leads to obviously nonsensical results. This can be shown with a simple, reasonable hypothetical in which PG&E inspected 20% of the jobs and the fail rate was 26%. The SDG&E/SoCal analysis would conclude that the true fail rate (under a 100% inspection regime) was actually 130%. In other words, SDG&E/SoCal's analysis would conclude that the raw number of homes failing would be 20% more than the raw number of homes treated, which is obviously impossible. SDG&E/SoCal's analysis would produce the same nonsensical result in any circumstance in which the fail rate exceeds the inspection rate.
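The flaw can be demonstrated with a few lines of arithmetic. The sketch below (function and variable names are ours) reproduces the extrapolation method described above and its impossible implications, using figures from the record:

```python
def extrapolated_fail_rate(observed_fail_rate, inspection_rate):
    """SDG&E/SoCal's method: multiply the observed fail rate by the
    inverse of the inspection rate."""
    return observed_fail_rate / inspection_rate

# PG&E's actual 1997-1999 averages: a 20.47% fail rate at a 42.75%
# inspection rate "extrapolates" to 47.88%.
assert round(extrapolated_fail_rate(0.2047, 0.4275), 4) == 0.4788

# For 47.88% to be the true overall rate, the uninspected 57.25% of
# jobs would need a fail rate of roughly 68.35%, more than triple the
# observed rate, with no support in the record.
implied_uninspected = (0.4788 - 0.4275 * 0.2047) / (1.0 - 0.4275)
assert round(implied_uninspected, 4) == 0.6835

# The hypothetical above: a 26% fail rate at a 20% inspection rate
# "extrapolates" to an impossible 130%.
assert round(extrapolated_fail_rate(0.26, 0.20), 2) == 1.30
```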
In sum, we find that SDG&E/SoCal's objections to the comparative use of the pass rates presented on the record in this proceeding are baseless.
Finally, we note that SDG&E's/SoCal's March 16, 2000, comments were clearly beyond the scope of the ALJ's March 9, 2000, ruling. That ruling gave parties the opportunity for very limited comment on the accuracy of Contractors' Coalition's crosswalk between the exhibits and the calculations presented in Tables 1-5 of Contractors' Coalition's Opening Brief. Instead, SDG&E/SoCal improperly used this comment period to augment its arguments concerning pass rate information and present new theories and calculations. Contractors' Coalition did not seek to strike these comments, but rather responded to them substantively in its March 23, 2000, response. We also respond to them in today's decision. However, we put SDG&E/SoCal on notice for the future that any document tendered for any improper purpose in our proceedings may invoke the actions described in Section (f) of Rule 2.2 of the Commission's Rules of Practice and Procedure, including disciplinary action.
10.3.2 SESCO's Pass Rate Performance
SESCO, as the primary contractor for PG&E's LIEE program beginning in the second quarter of 1998, originally planned to do relatively little installation work itself. (RT at 977.) However, in geographic areas where SESCO could not find other weatherization contractors (private or non-profit) to do the work, it became an installation subcontractor under the program. In that role, SESCO began installations at the end of September 1998, and currently installs approximately 15% of the dollar value of the work done under PG&E's program. (RT at 724.) We note that as an installation contractor, SESCO received the same price structure for its work as the other installation contractors, i.e., there were no separate deals negotiated on a contractor by contractor basis. (RT at 730-731.)
SESCO's performance as an installation contractor was heavily criticized during this proceeding, in particular with respect to low pass rates. For the last three months of 1998, SESCO's overall average per-home pass rate was 66.2%. Notwithstanding the difficulties in comparing and evaluating pass rate data, discussed above, Mr. Esteves, the Vice-President of SESCO, acknowledged that this performance needed improvement to meet PG&E's pass rate goals. He also described the numerous steps SESCO took in cooperation with PG&E to correct this situation. He explained that, for the first two quarters of 1999, SESCO improved its pass rates by 13 percentage points in those counties where the pass rate had been below 80% during the last quarter of 1998. (RT at 1002-1003, 1011-1012.)
We do not believe that SESCO's pass rate performance as an installation subcontractor is indicative of the effects of competitive bidding per se, or that such performance is limited to PG&E's recent bid experience. A review of Exh. 23 shows that a number of contractors working in the program just prior to the recent competitive bid experienced comparable or lower pass rates in Northern California counties. For example, American Synergy in Alameda and CHDC in San Joaquin each had a 66% pass rate for 1997, and Glen County HRA/CAD in Colusa had a 42.11% pass rate. We do not have 1996 or earlier pass rates for weatherization contractors under RHA's management with which to further compare pre-competitive bid performance. All we can state with certainty is that the parties involved recognized a problem with an individual installation subcontractor (SESCO) and took steps to improve performance.
10.3.3 Per-Measure Pass Rates
In addition to tracking per-home pass rates, PG&E compiles information on per-measure pass rates. The utility looks at an individual measure, such as weatherstripping, and calculates by contractor what the pass rate would be for that measure. (RT at 296.) None of the other utilities compiles this information. (RT at 335.) In its proposed RFP for PY2000, PG&E is switching from evaluating performance based on per-home pass rates to using per-measure pass rates, and is establishing a financial incentive for contractors to keep those rates very high.50
An examination of the per-measure pass rates compiled by PG&E indicates very high rates both before and after the 1998 competitive bid, averaging above 95% during the 1997-1999 period.
10.3.4 Hazard Fails
A hazard fail is a situation where a measure has been installed that would create a hazard either to the occupants of the home or the structure, e.g., cause a fire or other hazardous situation in the home. (RT at 528.) One example of a hazard fail would be the installation of weatherization over a heat-producing device.
For 1997-1999, SDG&E had 0 to 2 hazard fails per year. (RT at 527-529.) RHA and Southern California Agencies compare this figure to the number of hazard fails recorded in PG&E's program, and conclude that bidding will lead to an unacceptable number of life-threatening situations for low-income customers.
We do not reach this conclusion. First, we note that the definition of hazard fails in PG&E's program, beginning in 1998, now includes infiltration measures that are installed prior to a CAS test, installed after a dwelling fails a CAS test, or installed when a CAS inspection is not performed. This was not the case in 1997, prior to PG&E's competitive bid, nor is it the case for the other utilities at present. Therefore, there are hazard fails being counted under PG&E's program that are simply not being inspected for or reported as hazard fails under the other utilities' programs. (Exh. 73, p. 9, 1.3; p. 10, 1.6; p. 36.)
Moreover, we do not believe that the increase in hazard fails between 1997 and 1998/1999 when SESCO took the program over (and PG&E's CAS testing requirement began) is alarming in terms of the percentage of homes inspected. In 1997, on a per-home basis, hazard fails for PG&E were recorded at 0.4% of homes inspected. From April through December 1998, when SESCO took over, this figure was 1.1%. For 1999, the figure is 1.7% on a per-home basis.51 When calculated on a per measure basis, hazard fails were maintained well below 1% throughout the 1997-1999 period, except during the 1997 roll-over period under RHA's program management, when they increased to approximately 2%.52
In terms of the ultimate safety to the low-income customer, we note that for any hazard fail, PG&E, SDG&E, and SCE (all electric) contractors must either reinstall or correct any measures that failed because of a hazardous condition within 24 hours of being notified by the inspector. SoCal inspectors will mitigate hazardous conditions and require the contractor to make permanent corrections within three days. (Exh. 73, p. 10.)
10.4 Accessibility to Non-Utility Programs That Serve Low-Income Communities
By adding Section 381.5 to the Public Utilities Code, the Legislature directs the Commission to evaluate the effectiveness of low-income energy efficiency programs by considering factors other than cost. In particular, the Legislature directs us to consider the degree to which the program provides maximum access to quality programs offered by "entities that have demonstrated performance in effectively delivering services to the communities." Contractors' Coalition argues that nothing in this section defines those entities as CBOs to the exclusion of private firms. (Contractors' Coalition Opening Brief, p. 17.) However, this interpretation ignores the clear intent of the Legislature, in directing this consideration, to "protect and strengthen the current network of community service providers" (§ 381.5, first line, emphasis added). The plain meaning of this section requires us to examine the degree to which participants in the low-income energy efficiency programs have access to the programs and services that CBOs make available in their communities.
As discussed in this proceeding, in addition to doing weatherization work, CBOs can also offer job training and provide access to other social services that meet the needs of low-income families, such as food vouchers or medical assistance. In addition, some CBOs have access to federal funding for low-income weatherization services, i.e., the Low-Income Home Energy Assistance Program (LIHEAP), which is administered by the state.53 In this way, utility funding can be augmented to expand the types of measures installed or to reach homes that are not eligible under the utility program. Several parties also testified that CBOs perform very effectively in the outreach for these various programs and services, because they have gained the trust of families in the low-income communities that they serve. (RT at 311, 315-316, 531-533, 804; Exh. 51, pp. 8-9; Exh. 26, pp. 4-6.)
10.4.1 Direct CBO Involvement in the Program
One way to provide access to community-based programs is to involve CBOs directly in the low-income energy efficiency programs as weatherization contractors. Therefore, we examine the history of CBO involvement in these programs.
In SDG&E's program, since 1991, RHA has contracted with two CBOs and three private contractors. During the 1997-1999 period, the breakdown in terms of units treated was 65% private contractors and 35% CBOs. SDG&E testified that it used both CBOs and private contractors prior to 1991 for its predecessor weatherization program, but does not know the breakdown. (RT at 538-539, 623.)
For SCE's weatherization program, which was competitively bid out in 1991, the mix of CBOs and private contractors is approximately 50/50. SCE could not provide the breakdown for its evaporative cooler program, which is also bid out. The relamping program, which is not competitively bid, has always been delivered by CBOs. (RT at 474-475.)
The history of SoCal's weatherization program, in terms of CBO participation, is as follows: In 1994, all the work was performed by CBOs. In 1995, 84% was performed by CBOs and 16% by private contractors. In 1996, the breakdown was 89% and 11% for CBOs and private contractors, respectively. For 1997 through 1999, the breakdown has been consistent at approximately 94% CBO participation and 6% private contractors. Currently, there are 15 CBOs and 1 private contractor working in the program. (RT at 598.)
For PG&E, in 1995/1996, there were 18 CBOs participating in the program, and by the time RHA's tenure ended in 1997 and early 1998, this figure dropped to eight CBOs performing approximately 30% of the treatments. In 1999, under SESCO's contract, CBO participation dropped further to two CBOs participating in the program, treating approximately 9% of the homes. (RT at 1070; Exh. 4, p. 7, Exh. 20.) At the end of 1999, SESCO added a new CBO, North Coast Energy Services, which brings the total to three. (RT at 1300.)
10.4.2 Referrals and Leveraging
Another way in which program participants can have access to the services provided by CBOs is through a referral system, where either program participants are directed to the local CBO, or that CBO is notified that a utility customer could benefit from other services and programs. In this way, various sources of low-income assistance funding (utility, state and federal) can be effectively "leveraged" to provide comprehensive services to the low-income utility customer.
Parties to this proceeding acknowledge that access to CBO programs and leveraging of funds from non-utility programs could be accomplished through a referral system if private contractors do the weatherization work instead of CBOs. However, it appears that none of the programs has set up a system that would identify the needs of participants in low-income energy efficiency programs and refer them to the CBOs and other low-income agencies so they can maximize the benefits that are available to them. (RT at 66, 316-317, 804-805.) Nor do the utilities generate information about the degree to which their contractors have worked with CBOs to leverage non-utility weatherization program funding. (RT at 213, 393, 477, 601.)
With regard to increasing the total amount of federal dollars for California's LIHEAP program, only PG&E effectively provides this financial leveraging. As indicated in the discussion of Exh. 74, the Department of Community Services and Development (CSD) receives federal "leveraging" dollars under a formula based on the non-federal dollars spent on low-income energy services in the state. This leveraging requirement is primarily met today by PG&E's CARE program, which is the only utility CARE program that CSD uses when securing federal leveraging funds. This is because only PG&E satisfies certain prerequisites, namely a written and verbal referral system between the CARE and LIHEAP programs. As CSD reports, utility funds represent the largest group of resources used for leveraging, with most of this coming from CARE rate discounts. (Exh. 74, p. 4; CSD Table 4.)
10.4.3 Conclusions
Based on the evidence discussed above, we conclude that experience with competitive bidding for LIEE programs to date supports a finding that bidding can reduce unit costs appreciably, resulting in more homes being weatherized under the LIEE program. However, we have no data with which to compare these reductions in unit costs with the utility's costs of administering each bidding process. Nor do we have comparable data on savings-per-measure installed that would allow us to translate these unit cost reductions into measurable bill savings to the low-income customer, or to compare the bill savings per dollar of expenditure across utility programs.
With regard to the performance of weatherization contractors under a competitive bidding program or other outsourcing approach, we find that the evidence raises more questions than it answers. We cannot conclude, as some parties urge us to, that PG&E's per-home inspection pass rates reflect a lower quality program. Nor does the evidence lead to any definitive conclusions about whether bidding in general reduces the quality of work. As discussed above, the discrepancy between per-home pass rates in PG&E's program and the southern California utility programs existed even prior to PG&E's PY1998 competitive bid. There are simply too many variables at work here that contribute to the per-home pass rate determination, including potential differences in inspection standards and procedures, differences in the definition of pass rate "fails," and differences in the number and type of measures installed per home.
In our opinion, the most glaring shortcoming of using a per-home pass rate as an indicator of relative performance quality is that it reveals nothing about the nature of the problem in the installation of measures or minor home repairs, or about its impact on home energy savings. For example, a home can "fail" because a single strip of weatherstripping around a door is not secured. And yet, the contractor may have properly installed 20 other measures (including weatherstripping around all the other doors in the home), resulting in a substantial savings in energy use for the home. In contrast, a contractor could install a small number of measures in the home perfectly, yet those measures may produce only a fraction of the home energy savings achieved by the contractor with the fail described above. Which is the higher quality installation? Which is the higher quality program? Per-home pass rates do not provide this information.
We believe that PG&E is moving in the right direction by compiling and examining pass rates that relate to the types of individual measures installed in the home, rather than relying exclusively on per-home pass rates. We note that PG&E's experience with competitive bidding in 1998 indicates no apparent drop in quality of installations when evaluated on a per-measure pass rate basis.
However, we are not convinced that an evaluation of performance based on per-measure pass rates is without its drawbacks. As in the case of per-home pass rates, this measure of performance does not indicate to what extent the expected savings per home (based on the type and number of measures being installed correctly) is being achieved by the contractor. Therefore, as discussed further in Section 19 below, we direct the utilities to develop improved methods for tracking and reporting performance quality, ones that can recognize true differences in the quality of the work provided to low-income customers.
With regard to hazard fails, we do not find any appreciable difference in hazardous conditions arising from PG&E's competitively bid program. We believe that parties placed too much emphasis on the numeric counting of hazard "fails" rather than on a careful examination of the underlying conditions that are (or are not) reflected in those statistics. The important issue for the safety of low-income customers receiving weatherization services is to ensure that the utility's inspection and response procedures effectively protect all LIEE program participants from potentially hazardous situations in the home. By today's decision, we affirm the Assigned Commissioner's ruling that directs the utilities to achieve greater consistency in these procedures, including CAS testing. (See Section 17.)
In terms of access to programs provided by community service providers, we observe that PG&E's program has experienced a precipitous drop in direct CBO participation, and currently has the lowest level of CBO participation in terms of the percentage of units treated by CBOs. However, we cannot conclude, as some parties do, that this decline is attributable to the competitive bidding process that took place for PY1998. We note that the trend of declining CBO participation began well before SESCO assumed the role of PG&E's prime contractor under that bid. Unfortunately, RHA did not put forth a witness to discuss its experience with declining CBO participation during the period in which it was PG&E's prime management contractor. As a result, we cannot know all the factors that initiated this decline, or that have kept CBO participation dropping throughout the period. With regard to SESCO's tenure as prime contractor, the record indicates that several factors may have contributed to the further decline in CBO participation over the last two years. These include PG&E's initiation of the CAS testing and inspection process that resulted in delayed payments to contractors, fixed per-measure prices that were lower than in previous years, and a bonus payment system based on pass rate performance and other performance criteria. (RT at 972-973.)
Irrespective of the specific causes for the decline in CBO participation in PG&E's program, we believe that this decline has adversely affected PG&E's program with respect to the type of access intended by Pub. Util. Code § 381.5. This is not to imply that access to programs made available by community service providers can only be achieved through the direct participation of CBOs as weatherization contractors in the program. However, the type of referral and leveraging system that could create this access, in the absence of direct CBO participation, is currently not in place. And while PG&E's effective referral system between the CARE and LIHEAP programs has increased the amount of funds available to CBOs for the LIHEAP program, without a referral system between PG&E's private contractors and the CBOs, the low-income customer's access to that additional funding cannot be effectively maximized. Accordingly, we direct the utilities to report on their progress to improve this access. (See Section 19 below.)
The evidence in this proceeding also indicates that financial leveraging for California's LIHEAP could be increased if all the regulated investor-owned energy utilities were to satisfy certain prerequisites (as does PG&E) that would allow CSD to secure more Federal leveraging funding. We note that SoCal and SDG&E have contacted CSD regarding this issue and are exploring how they can maximize this resource, and direct SCE to do the same. (SoCal/SDG&E Reply Brief, p. 27.)
In view of the above, we do not have sufficient basis in fact to endorse competitive bidding as the best outsourcing approach for all utilities at this time. On the one hand, competitive bidding appears to have served low-income ratepayers well by reducing the unit costs of the program, thereby increasing the numbers of homes that can be weatherized. However, experience to date indicates that competitive bidding has not served low-income customers in the way envisioned by the Legislature when it enacted Pub. Util. Code § 381.5, namely, by facilitating access to other community-based programs designed to serve the needs of these customers.
Moreover, due to the lack of consistency in inspection procedures and reporting, we cannot determine the relative impact of competitive bidding, or any other outsourcing approach for that matter, on the quality of work performed by weatherization subcontractors. Nor can we determine the extent to which competitive bidding offers efficiency savings that can keep the costs of the programs reasonable for nonparticipating ratepayers. To do so, we would have to know the one-time administrative costs associated with the bidding process, as well as have consistent data on bill savings and expenditures across utilities. We do not have that information at this time. We initiate a process today that will provide that information for our consideration no later than the PY2002 program planning cycle. (See Section 19 below.)
In the meantime, we believe that the most practical course of action is to continue to allow utility administrators the flexibility to choose how they will outsource LIEE program functions, i.e., via competitive bidding, contract renegotiations, or a combination of both, subject to the policy guidance presented in the following sections.
41 Assigned Commissioner's Ruling Clarifying Scoping Memo, dated September 16, 1999, p. 4.

42 August 20, 1999, Prehearing Conference Statement of LIAB.

43 See Exh. 14, p. 75.

44 Exh. 40, Attachment D.3.

45 As discussed in Section 2.3 above, RHA competitively bid to procure its weatherization subcontractors, whereas SESCO renegotiated contracts with existing subcontractors when it became the project management firm for PG&E in 1998. If there is a lesson to be drawn from the records of pass rate changes from year to year at PG&E, it is that competitive bidding for subcontractors may produce a higher pass rate than negotiated subcontracts. However, it is only the prime contractor, not ratepayers, that benefits financially from competitive bidding of the subcontractors. (RT at 977-979.)

46 See Exh. 18 (p. 7), Exh. 23, Exh. 53, Exh. 43 (p. 7), and the Declaration of George Sanchez attached to RHA's Opening Brief regarding the same administrators doing the same work in PG&E's and SDG&E's service territories, but receiving different pass rates. See also RT at 903, 905, 906, and 910.

47 SoCal has not identified the pass rates of individual CBOs, so it is not feasible to know which of them is San Luis Obispo EOC. However, all of the scores are well above the 68% pass rate given by PG&E's inspectors.

48 As Contractors' Coalition points out, even this calculation is mathematically flawed because SDG&E/SoCal have failed to weight the averages by the number of jobs completed and inspections conducted during the periods listed in the table. For example, the simple averaging of three numbers fails to account for the fact that the 1999 line in the table represents half of a year and a different number of inspections.

49 For simplicity, assume the total number of homes treated is 10,000. PG&E inspected 42.75% of the total (4275 jobs) and found 20.47% to be fails. That means PG&E found 875 fails out of the 4275 jobs inspected. If the true fail rate were 47.88%, as SDG&E/SoCal claim, then the true number of fails must have been 4788 out of the 10,000 homes treated. The number of jobs not inspected by PG&E was 5725 (10,000 - 4275 = 5725). Thus, among those 5725 jobs there must have been 3913 fails (4788 - 875 = 3913). This means that the failure rate for the uninspected homes must have been 68.35% (3913/5725 = 68.35%).

50 SDG&E/SoCal misrepresent PG&E's proposal as allowing payment for houses in which a failure has occurred. (SDG&E/SoCal Reply Brief, pp. 31-32.) There is nothing in PG&E's proposal or the testimony that changes PG&E's current procedure, which forbids invoicing for measures installed until all measures in that residence pass the inspection process.

51 These percentages were calculated from Exh. 23 by taking the total number of hazard fails (private and CBO) and dividing by the total number of inspections (private and CBO), per period.

52 Exh. 58, using the same calculations as described in the footnote above.

53 Under state statute, CBOs (e.g., community action agencies, local governments, and certain non-profit organizations) are the only authorized agencies that can apply the federally supplied funds that the State of California oversees under its LIHEAP program. The number of such agencies is limited by county, and only a specific subset of CBOs is qualified to implement the program. If an agency wants or needs to subcontract its work to a private organization, it can do so. (RT at 235, 290; Exh. 13, p. 4; Exh. 74, Budget Table 1.)
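The arithmetic in footnote 49 can be reproduced as a short calculation. The sketch below simply restates the footnote's own figures; the 10,000-home total is the footnote's hypothetical round number, not a program statistic:

```python
# Check of footnote 49's implied fail rate for uninspected homes.
# All inputs are the footnote's assumed figures.
total_homes = 10_000                               # hypothetical total treated
inspected = round(total_homes * 0.4275)            # 42.75% of jobs inspected
fails_found = round(inspected * 0.2047)            # 20.47% of inspections failed
claimed_true_fails = round(total_homes * 0.4788)   # SDG&E/SoCal's claimed 47.88% rate
uninspected = total_homes - inspected              # jobs PG&E never inspected
implied_fails_uninspected = claimed_true_fails - fails_found
implied_rate = implied_fails_uninspected / uninspected

print(inspected, fails_found, uninspected,
      implied_fails_uninspected, round(implied_rate * 100, 2))
# The claimed overall rate would require roughly 68.35% of the
# uninspected homes to be fails -- the implausibility noted above.
```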