At the Commission's direction, ORA's witness Dr. Marek Kanter readministered a survey of Pacific's customers based on one ORA conducted in 1995. Based on the responses given to 36 survey questions concerning service quality, ORA concluded that "Pacific's quality of service has declined in the period between 1995 and 2001."186
ORA's comparison showed problems in residential and small business customers' perceptions of Pacific's service quality. Of 36 questions in the survey germane to service quality, the responses to 23 questions showed a difference between customer perceptions in 1995 and 2001. For each of the following 19 questions, more customers chose a less favorable response in 2001 than they had in 1995:
· Q8. How often have you noticed static or noise on the line?
· Q9. How often have you noticed voices fading in or out?
· Q10. How often have you heard voices echoing?
· Q15. How often was the line dead upon picking up the phone?
(Questions 21-27 relate to "long distance calls carried by your local telephone company.")187
· Q21. How often have you noticed static or noise in the last 30 days?
· Q22. How often have you noticed voices fading in and out in the last 30 days?
· Q23. How often have you heard voices echoing in the last 30 days?
· Q24. How often have you not heard the other party in the last 30 days?
· Q27. How often have you been disconnected while talking in the last 30 days?
(Questions 31-32 relate to contacts with the local company's business office.)
· Q31. Were the office personnel assisting you courteous?
· Q32. Were you satisfied with the help you received from the office personnel?
· Q34. Regarding contacts with the local company's telephone operators, were you satisfied with the help you received from the operators?
(Questions 37-38 relate to telephone installation and repair.)
· Q37. Was the work completed on time?
· Q38. Were you satisfied with the work?
· Q40. Was your most recent local telephone bill correct?
· Q42. How would you rate your local phone service for the last 30 days?
· Q43. Compared with the last 6 months, rate your service in the last 30 days.
· Q44. What is your overall satisfaction with your local telephone service?
· Q46. Rate the service of [the] present provider, compared with previous providers you have had in the last three years.
For each of the following 4 questions, more customers chose a more favorable response in 2001 than they had in 1995:
· Q13. How often have you heard other voices on the line?
· Q16. How often have you reached a number not dialed?
· Q26. How often have you heard other voices on the line in the last 30 days?
· Q41. If your most recent bill was incorrect, has the problem been resolved?
Pacific's witness Dr. Hauser states that ORA's survey is "biased, noisy, and non-representative."188 Dr. Hauser's main objection is that the survey sample is not representative of all of Pacific's customers, due to nonresponse bias. He claims ORA lacked procedures to minimize nonresponse bias, and that ORA's sample is highly likely to be biased towards customers who are more dissatisfied than the typical Pacific customer.189 Hauser also pointed out flaws in ORA's statistical analysis: use of inappropriate and biased hypothesis tests, incorrect calculations of the joint significance tests, inappropriate comparisons over time, and typographical mistakes.190 According to his analysis, if corrected, "37% of the statistically significant declines in service between 2001 and 1995 that ORA found are incorrectly labeled."191 Due to all these flaws and mistakes, Hauser claims that ORA's survey results are not a valid measure of Pacific's service quality.
ORA did not change the survey questions - again at the Commission's direction - because it wanted the results to be comparable over time. While Pacific criticizes the poor quality of the survey, we find that ORA did precisely what it was directed to do: use the same survey as it used in 1995 so as to have a basis to compare Pacific's results. In this regard, the Commission stated in the OIR that, "Parties that conduct surveys are encouraged to adhere to the following principles. First, in developing the survey, the party should use as a starting point the surveys of Pacific and Verizon customers conducted by Commission staff in previous proceedings."192
On the sample size, it is true that ORA did not follow up with customers in an attempt to increase the size of the sample of customers taking the survey. However, ORA did not follow up in 1995 either. As ORA points out, "had ORA attempted follow-up procedures that were different than the procedures in place in 1995, it would have lost the ability to do a fair comparison of the 1995 with the 2001 results." Dr. Kanter also explained that, "had I done follow-up phone calls, I would have changed the cast of characters, so to speak. The people responding would not have been as directly comparable to the people responding in 1995."193 The drop in the response rate in the 2001 survey from that of 1995 may limit our ability to draw conclusions from the survey with statistical confidence.194 However, while the ORA 2001 response rate is less than 50% of its 1995 rate, the total number of responses in 2001 is 881, which does lend some credibility to the findings.195 Further, since a disparity in the response rate also occurred with the ORA survey of Verizon customers, we would expect such purported bias to yield a similar survey response for Verizon. Yet, as discussed below, the findings of the ORA survey of Verizon are the opposite of its findings for Pacific. Such an extreme difference suggests that the purported bias claimed by Pacific's witness is not powerful enough to discredit the ORA survey findings.
The methodological discussions brought up by Pacific in this proceeding regarding how to conduct a proper survey and analyze its results caution us against drawing conclusions based solely on ORA's survey instrument. As with almost all the other data presented in this proceeding, ORA's survey suffers from flaws. However, the survey still suggests that the consumer perception of Pacific's service quality fell between 1995 and 2001.
Pacific also submitted its own surveys. One of them was conducted by J.D. Power, a global marketing information firm. Even though Pacific submitted little information about what the survey asked customers, Pacific's witness, Dr. Hauser, explained that these surveys did not "measure satisfaction with recent service events with Pacific (e.g., installations or repairs), but rather provided a general measure of satisfaction with overall customer service and its aspects."196 That is, overall customer satisfaction is determined "by surveying over 12,000 households on the areas of customer service, cost of service, corporate image, call quality, promotions, billing, calling cards, and operator service."197
Pacific received a score of 110 in 2001 from J.D. Power, where 104 is the industry average score.198 Furthermore, Pacific is ranked in the top six of the sixteen local service providers surveyed.199 Pacific's witness, Dr. Hauser, also stated that Pacific has consistently exceeded the industry average for every year from 1996 to 2001, and that it has consistently ranked in the top six of local service providers.200
However, the information Pacific submitted indicated that the survey also included several factors that we consider peripheral to a true assessment of service quality, such as "corporate image" (which respondents ranked as one of the top three factors relevant to customer satisfaction, with 21% finding it important), "cost of service/value" (with 24%), and "calling card," which appear to relate to Pacific's prices and calling card services. These are not elements of service quality as examined in this decision. Thus, the J.D. Power survey covers broader aspects of service quality than are the focus of our study. It provides some evidence that consumers are satisfied with Pacific's service quality. However, because the survey focuses in part on aspects - cost of service, corporate image, promotions, billing, and calling cards - with which we are not concerned in this proceeding, it is of limited evidentiary value.
Pacific's expert Dr. Hauser also summarized the results of a 2000 survey of various local exchange carriers by IDC, entitled "Telecommunications Consumer Brands Survey." According to Dr. Hauser, IDC is "a leading provider of technology forecasts, insights and advice."201 Dr. Hauser reported that the IDC survey found that Pacific's customers are more satisfied than the average local telephone customer for all attributes studied except one; Pacific's customers are the second most overall satisfied for customer service; Pacific's customers are the third most satisfied for voice quality; and Pacific is one of the top three providers in over 85% of the areas measured. According to Dr. Hauser, the IDC survey polled 805 households nationally, and measured local telephone service customers' satisfaction with "customer service, fees, marketing, reputation, pricing structure and voice/service quality."202
Attachment 31 to Dr. Hauser's testimony summarizes the results of the IDC study. Two indicia of service quality contained in the survey are "customer service" and "voice or service quality."203 For "customer service," 73.8% of respondents ranked Pacific as a 4 or 5 (with 1 = not very satisfied, and 5 = very satisfied). This places Pacific in the middle of the range for comparable carriers. Of the non-SBC companies, GTE/Verizon's comparable result was 83.1%, Bell Atlantic's was 80.7%, Bell South's was 72.6%, and US West's was 63.1%.
Similarly, on "voice or service quality," 85.7% of customers ranked Pacific a 4 or 5. Of the non-SBC companies, Bell South scored 86.3%, GTE scored 85.9%, US West scored 83.8%, and Bell Atlantic scored 83.5%. Thus, when analyzing the tail of the distribution - those most satisfied - Pacific's results for these two measures are comparable to the other non-SBC carriers' results.
However, Dr. Hauser acknowledged that the sample of Pacific customers surveyed in this study was small, just 42 Pacific customers. Dr. Hauser also conceded that Pacific's results "may not be statistically significant from other LECs' results."204 Consequently, similar to the ORA survey, we have questions about the statistical reliability of the IDC survey.
Pacific's witness, Dr. Hauser, explained that Pacific has a centralized organization that collects data on an ongoing basis by surveying customers with recent service interactions with Pacific. A sample of the customers is surveyed by an independent marketing firm, Market Insights, every month, 7-10 days after the service event, and customers are asked about their interaction with the business office and network operations.205 These surveys are the source of data provided to the FCC in the ARMIS 43-06 reports.206 The survey results are also reported to the CPUC under the P.A. 02-04 reporting requirement. The tables below summarize Pacific's performance from 1990 through 2001. The December measures reported include the results for the previous 11 months as well, and therefore offer a tabulation of the entire year.
While Pacific reports to the CPUC the percentage of the customers satisfied or very satisfied with Pacific's service, it also reports to the FCC the percentage of the customers dissatisfied or very dissatisfied with service, as shown in the table below.
Unlike the surveys discussed before, in these surveys "customers rate their overall satisfaction with their service interaction. In addition to their satisfaction with the service event, customers are asked about the ease of getting through to the office as well as several questions that measure the skill of the Pacific representative answering the call, e.g., was the representative courteous, and the performance of the Pacific service technician who performed any necessary repairs, e.g., doing quality work and completing the work in a timely manner."207
Pacific has modified the surveys over the years by changing its rating scale in 1992 and 1998. The wording was also changed in 1998, with further changes following in 1999.208 Consequently, as Pacific's witness, Dr. Hauser, stated, "In the Pacific CSQ survey, it is extremely difficult to compare responses prior to January 1998 with responses after the change in wording."209 Therefore, he compared the data for the years 1994-1997 and 1998-2001 and presented the results in Attachments 32 and 33 of his testimony. His analysis included a comparison to survey results reported by the top ten LECs to the FCC.
According to Dr. Hauser, "Pacific's customers who are surveyed about repair work are three to six percentage points less dissatisfied than the average of the top ten LECs. Furthermore, Pacific's customers are less dissatisfied about the business office and installation work for each customer group surveyed. This analysis suggests that Pacific's service is good relative to its peers in 2001."210 Dr. Hauser also examined the percentage change between 1998 and 2001 and reported the results in Attachment 32 of his testimony. According to his findings, Pacific's customers' dissatisfaction rose only for installation services and business office services, in each case for residential and large business customers. The dissatisfaction declined for all other services and categories. In comparison, over the same period, the dissatisfaction for the services of the reference group rose for all categories except repair services for large business customers. Pacific's witness Mr. Flynn identified dissatisfied ratings as relatively stable from 1994 through 2001.211
However, Dr. Hauser explained that comparisons of Pacific's survey results with those of other ILECs are "potentially biased." He noted that each carrier has the discretion to conduct its surveys as it sees fit, with no prescribed uniform methodology, questions, or response scales. Carriers also have the discretion to change surveys over time without posting notice of those changes.212
For the reasons explained by Dr. Hauser, we find that, as with the other survey data for Pacific, the comparisons of survey data among carriers may not be statistically reliable.
During the audit phase of this proceeding, the Commission's consultant, Overland Consulting (Overland), alleged that Pacific used a third-party research firm to conduct customer satisfaction surveys during the NRF period, and that Pacific did not file the surveys with the Commission as required by the NRF monitoring program.213 According to Overland, the surveys were conducted under Pacific's Customer Service Quality (CSQ) process, and surveyed customers who had recent experience with Pacific in the areas of sales, billing, maintenance, installation, and operator services.
Overland reported that Pacific should have filed the surveys under NRF monitoring report P.A. 02-03, and that Pacific refused Overland's requests for copies of the surveys. In response to Overland's assertion that Pacific failed to file the surveys as required, Pacific states, "It is possible Overland has confused two monitoring reports, P.A-02-03 and P.A-02-04. Pacific understands that P.A-02-03, Customer Survey Report, refers to surveys initiated by the Commission . . . ."214 Pacific argues that it should not be obliged to produce its customer surveys because the requirement "has not been raised by the Commission or its staff in the last 11 years. . . ."215
We have reviewed the origins and purposes of reports P.A. 02-03 and P.A. 02-04, and find the following. After completing a series of workshops in 1990, the Commission adopted a comprehensive monitoring program for Pacific and Verizon "as described and envisioned in the Commission's Advisory and Compliance Divisions (CACD) three workshop reports... [including]...the reporting requirements recommended in CACD's Workshop II Report..."216
The Commission in D.91-07-056 also directed the staff to produce "a written assessment explaining who prepares each monitoring report that the utilities provide to our staff, and what purpose each of these reports serves for the utility and for the staff."217 The staff's Monitoring Report Assessment, filed on May 1, 1992, contained the following description of "Customer Surveys" Pacific is required to file under report P.A. 02-03:
"6. Customer Surveys: These surveys are given to customers who have direct contact with Pacific Bell and are used to measure customer satisfaction levels and perceptions of the company. These surveys are conducted through the Corporate Research organization at Pacific Bell, and historically have been provided to the DRA Telecommunications Rate Design Branch, and is [sic] used in DRA's ongoing service quality evaluation. The surveys are provided as initiated. It is recommended that these surveys continue."218
The Monitoring Report Assessment also describes a separate set of ongoing survey results that Pacific is required to file monthly under Report P.A. 02-04, as follows:
"7. Quality of Service Performance - Customer Opinion Surveys: These surveys are conducted by the Company Measures and Statistics organization at Pacific Bell. A monthly report identifying the percentage of customers that are satisfied with Pacific Bell's service quality is provided to the DRA Telecommunications Rate Design Branch. The information in these reports is used in it's [sic] service quality monitoring efforts. It is recommended that these surveys continue."219
Thus, the Monitoring Report Assessment describes two separate and distinct monitoring reports addressing different kinds of customer surveys: P.A. 02-03 contains surveys conducted from time-to-time through Pacific's Corporate Research organization measuring customer satisfaction levels and perceptions of the company, while P.A. 02-04 contains a monthly report prepared by Pacific's Measures and Statistics organization on an ongoing basis identifying the percentage of satisfied customers.
Pacific asserts that the P.A. 02-03 report refers only to surveys initiated by the Commission. We find nothing in D.91-07-056, in the staff's workshop report, or in the staff's Monitoring Report Assessment supporting Pacific's assertion that only Commission-initiated customer surveys are to be filed with the Commission under report P.A. 02-03.
We will assume for purposes of this decision, however, that Pacific was confused about the difference between reports P.A. 02-03 and P.A. 02-04. Pacific should now produce any customer satisfaction surveys prepared at its direction, including surveys prepared by outside firms as part of its CSQ process, as part of its testimony in Phase 3B of this proceeding. We will resolve how to treat these reports in that phase.
For the reasons stated above, we find that each of the surveys related to Pacific's service quality is of limited evidentiary value. To the limited extent that they offer useful data, they present a mixed view of Pacific's customer satisfaction. The ORA survey finds the most problematic service quality, whereas the J.D. Power and IDC surveys - which focused in some cases on irrelevant issues - find that Pacific performs as well as its peers. Pacific's ARMIS 43-06 data show that Pacific's customers' dissatisfaction rose for four measures - installation services for residential and large business customers and business office services for residential and large business customers - but that dissatisfaction fell with respect to five measures - repairs (all customers), installations for small business customers, and business office services for small business customers.
ORA's customer service quality survey for Verizon showed that, in the minds of the customers surveyed, Verizon's service quality has improved since 1991. Undoubtedly because Verizon did not dispute the survey findings, no party challenged ORA's survey methodology with respect to Verizon.
Verizon claims it surveys its California customers by conducting over 1,000 interviews per month covering Directory Assistance, Consumer and Business Provisioning (which covers installation of new service), Consumer and Business Repair (which covers diagnosis, repair, and restoration of existing service), and Consumer and Business Request and Inquiry (which covers requests and inquiries directed to the Business Office regarding customer bills, products and services, prices, and company policies).220 The results of these surveys show that Verizon offers good service quality. Neither ORA nor TURN challenged the results of these surveys.