VIII. Customer Satisfaction and Service Quality Surveys - Pacific

At the Commission's direction, ORA's witness Dr. Marek Kanter performed a survey of Pacific's customers based on one that ORA carried out in 1995. ORA did not change the survey questions - again at the Commission's direction - because it wanted the results to be comparable over time. While Pacific criticizes the survey, we find that ORA did precisely what it was supposed to do: use the same survey it used in 1995 so as to have a basis for comparing Pacific's results over time. In this regard, the Commission stated in the OIR that, "Parties that conduct surveys are encouraged to adhere to the following principles. First, in developing the survey, the party should use as a starting point the surveys of Pacific and Verizon customers conducted by Commission staff in previous proceedings."138

Moreover, even assuming for the sake of argument that Pacific is correct in claiming the survey contains flaws, the same flaws were present in the 1995 survey, and so Pacific's criticisms do not undermine the survey's usefulness as a means of comparing Pacific's performance over time. Even ORA's own witness conceded some of the flaws, but repeatedly pointed out that his main purpose in using the survey was to gain comparative data.

We find that the survey has merit when used for comparative purposes despite the flaws Pacific points out. Indeed, had ORA changed the 2001 survey from the one used in 1995, ORA notes that Pacific would have criticized the survey on that basis. As its witness Dr. Harris stated, "[i]t is difficult to evaluate the meaning of changes in survey responses over time if there has been a change in the survey itself."139

Pacific's key alleged flaws related to "nonresponse error" and to the sample size, including ORA's failure to make efforts to increase it. Nonresponse error refers to the theory that customers who respond to a survey are more likely to have complaints. According to Pacific's expert, Dr. Hauser, if one does not control for this bias, the results of the survey will contain more negative responses than reality would dictate. However, Dr. Hauser himself could not identify a study backing up his claim:


Q. So is it your testimony that dissatisfied customers are statistically significantly more likely to respond to a survey than satisfied customers?


A. I really wish I could hold up at this moment in time a particular academic study that . . . shows that statistical significance and it's been done to high methodology (sic). I just can't do that at this point in time.140

Moreover, ORA points out that in this case, "[t]o demonstrate bias Dr. Hauser would have to show not only that 'dissatisfied customers are more likely to be more concerned with making their voices heard,' but also that such a tendency has increased in the period from 1995 to 2001."141 ORA used the survey to compare Pacific's performance over time, and Pacific fails to explain why the results changed so markedly from 1995 to 2001.

On the sample size, it is true that ORA did not follow up with customers in an attempt to increase the size of the sample of customers taking the survey. However, ORA did not follow up in 1995 either. As ORA points out, "had ORA attempted follow-up procedures that were different than the procedures in place in 1995, it would have lost the ability to do a fair comparison of the 1995 with the 2001 results." Dr. Kanter also explained that, "had I done follow-up phone calls, I would have changed the cast of characters, so to speak. The people responding would not have been as directly comparable to the people responding in 1995."142 Thus, we find that for the purpose for which Dr. Kanter intended the survey - comparison between Pacific's performance in 1995 and its performance in 2001 - Pacific's criticisms lack merit.

The comparison showed serious problems in residential and small business customers' perceptions of Pacific's service quality. The survey found that, according to its customers, Pacific's quality of service declined between 1995 and 2001. Of the 36 questions in the survey germane to service quality, the responses to 23 showed a difference between customer perceptions in 1995 and 2001. Of those 23 questions, only 4 drew a more favorable response in 2001 than in 1995. For each of the following 19 questions, the results were worse in 2001 than in 1995 to a statistically significant degree:

      · Q8. How often have you noticed static or noise on the line?

      · Q9. How often have you noticed voices fading in or out?

      · Q10. How often have you heard voices echoing?

      · Q15. How often was the line dead upon picking up the phone?

(Questions 21-27 relate to "long distance calls carried by your local telephone company.")143

      · Q21. How often have you noticed static or noise in the last 30 days?

      · Q22. How often have you noticed voices fading in and out in the last 30 days?

      · Q23. How often have you heard voices echoing in the last 30 days?

      · Q24. How often have you not heard the other party in the last 30 days?

      · Q27. How often have you been disconnected while talking in the last 30 days?

(Questions 31-32 relate to contacts with the local company's business office.)

      · Q31. Were the office personnel assisting you courteous?

      · Q32. Were you satisfied with the help you received from the office personnel?

      · Q34. Regarding contacts with the local company's telephone operators, were you satisfied with the help you received from the operators?

(Questions 37-38 relate to telephone installation and repair.)

      · Q37. Was the work completed on time?

      · Q38. Were you satisfied with the work?

      · Q40. Was your most recent local telephone bill correct?

      · Q42. How would you rate your local phone service for the last 30 days?

      · Q43. Compared with the last 6 months, rate your service in the last 30 days.

      · Q44. What is your overall satisfaction with your local telephone service?

      · Q46. Rate the service of [the] present provider, compared with previous providers you have had in the last three years.

Regardless of the individual criticisms Pacific leveled at the questions and the survey format, these were the same questions and formats Pacific's customers saw in 1995, and the same questions ORA used in its survey of Verizon's customers, which produced positive results for that company. We find that the ORA survey provides very strong evidence of a decline in Pacific's customer satisfaction between 1995 and 2001.

138 R.00-09-001, mimeo., at A-3.

139 ORA Opening/Service Quality at 19, citing Exh. 2B:254 at 34:6-7.

140 18 RT 2176:16-22 (Hauser).

141 ORA Opening/Service Quality at 21, citing Exh. 2B:354 at 45:14-16.

142 ORA Opening/Service Quality at 20, citing 18 RT 2147:2-12.

143 Pacific correctly pointed out, in our view, that this question might have confused customers, and more so in 2001 than in 1995, given the differentiation between local toll and long distance calling and the proliferation of long distance providers.
