The Commission has a statutory duty to ensure that telephone corporations provide customer service that includes reasonable statewide service quality standards, including, but not limited to, standards regarding network technical quality, customer service, installation, repair, and billing.28 (See, e.g., Pub. Util. Code §§ 709, 2896 and 2897.) The current GO 133-B implements this requirement through reporting of failure to meet standards associated with service quality measures (exception reporting).
The Commission initiated this rulemaking because it concluded the existing measures needed revision in light of the competitive marketplace for URF ILECs and changes in federal and state telecommunications law. The OIR noted existing measures deserve review because they are both technologically outdated and inconsistently reported by carriers. Further, as stated in the 2007 ACR, state policy promotes service quality regulation that aims to: (1) rely on competition, wherever possible, to promote broad consumer interests; and (2) promote development of a wide variety of new technologies and services in a competitive and technologically neutral manner.29
This decision examines whether customer satisfaction surveys, revised measures, and/or service quality monitoring best fulfill these policies, as well as our obligation to ensure carriers provide reasonable statewide service quality standards. In assessing which metrics will best address service quality goals, we were mindful to weigh arguments regarding the impact of competition in certain telecommunications markets in the service quality context. While we have relied on competition to ensure that rates are "just and reasonable,"30 reliance on competition in the service quality context must be tempered with an acknowledgment of our statutory duty to ensure telephone corporations provide reasonable service quality standards. We do not believe competitive environments completely obviate the need for any service quality measures. However, GO 133-C does eliminate a number of reporting measures for competitive carriers.
We were also mindful of the OIR's goal to achieve technologically neutral outcomes. While that might suggest the exact same requirements should apply to all carriers, our requirements must recognize certain jurisdictional limitations in the areas of wireless and Internet Protocol-enabled services. Even certain commenters recognized it may be impossible to fashion service quality standards that are exactly the same for all carriers.31 This decision also examines whether MSI reporting should mirror the FCC's reporting guidelines or continue under the Commission's reporting requirements.
4.1. Customer Satisfaction Surveys
We first address whether the Commission should conduct annual customer satisfaction surveys for all wireline and wireless carrier services. Customer satisfaction surveys would review performance of a broader range of carriers than the ILEC (wireline) carriers that currently report under GO 133-B. As noted in the 2007 ACR, publishing customer survey results is not intended to trigger investigations or penalties. However, surveys may be a tool to promote customer education regarding indicators such as installations, repairs and answer time. They may also assist customers in choosing or changing carriers.32 The 2007 ACR also raised issues regarding the content and format of surveys and who would be responsible to conduct and pay for them.
The commenting parties have generally established that numerous customer satisfaction surveys already exist for the wireless industry, raising a threshold issue of whether a Commission-required survey would be unnecessary and redundant. Wireless surveys include J.D. Power and Associates,33 Consumer Reports,34 PC Magazine's Readers' Choice,35 Consumers' Checkbook,36 mindWireless,37 Mountain Wireless,38 the FCC,39 and the Better Business Bureau.40
There are fewer surveys applicable to wireline carriers. Wireline surveys include J.D. Power and Associates (business), Consumer Reports, and the American Customer Satisfaction Index. The FCC also requires customer satisfaction surveys per ARMIS Report 43-06.41 However, not all carriers are required to file ARMIS data, and the FCC recently sought comment on whether service quality and customer satisfaction reporting should continue, what specific information should be collected, and whether industry-wide reporting should be required.42
Finally, some carriers also conduct internal surveys that, in their experience, focus on customers' concerns. For example, Verizon uses an outside market research firm to survey customers on an almost daily basis. Verizon obtains detailed information about provisioning (including installation of new service), repair (diagnosis, repair, and restoration of existing service), and request and inquiry (contacts to the business office regarding customer bills, products and services, prices, and company policies). These surveys show what is important to Verizon's customers and the priority placed on key attributes.43
Notably, these priorities do not necessarily correspond with the service attributes the Commission historically has measured in its current rules. Verizon has found that customers value a quick response to their requests, a job done right the first time, and maintaining close communications with them. Verizon reports it continuously reminds its employees of these priorities. This type of higher level survey is quite different from traditional Commission surveys that focus on service quality standards using more dated metrics such as installation, repair, and answer time. We do see merit in the argument that the type of higher level survey information referenced by Verizon may more accurately reflect issues that are of importance to modern day customers.
Overall, the parties are somewhat split on whether the Commission should require customer satisfaction surveys. Clearly in support of such a requirement are DRA and TURN. DRA argues surveys are useful because competitive markets thrive with increased information to customers. In DRA's view, existing surveys are insufficient, because only AT&T, Verizon and Contel report ARMIS customer satisfaction surveys.44 Further, DRA argues that the need for customers to have sufficient information is consistent with the Commission's URF Phase I Decision and Pub. Util. Code § 709.45
TURN generally supports surveys, but argues that surveys alone provide insufficient information to allow consumers to make optimal choices. Further, TURN recognizes that surveys have certain disadvantages, in that external events may influence customer opinion and satisfaction, and results may vary absent uniformity of questions and format for all surveys.46
DOD/FEA supports continued Commission monitoring of performance of all California LECs, including reporting of California-specific ARMIS customer satisfaction surveys. DOD/FEA asserts that the FCC's ARMIS 2006 customer satisfaction surveys for AT&T and Verizon's California customers show a significant level of dissatisfaction with installation, repair, or business office contacts.47
DisabRA supports customer satisfaction surveys as one service quality requirement and recommends that surveys include some questions specific to the provision of services to the disability community. DisabRA notes that surveys may need to specifically target populations such as customers with disabilities since such groups are not likely to otherwise be included.48
Other parties support customer satisfaction surveys, but only for monitoring purposes and if any other GO 133-B or service quality reporting is eliminated.49 For example, AT&T states that existing third-party customer satisfaction surveys are adequate to provide the Commission with information on customers' experience and would facilitate competition by addressing the need to have information regarding all providers that is comparable, accurate and reliable. Unlike service quality measures and standards, surveys can adapt quickly to changes occurring in the telecommunications market.50 However, AT&T suggests that should the Commission determine to use surveys to assess service quality, it should conduct workshops to develop the exact nature and format of information that would be included.51
The remaining parties do not support customer satisfaction surveys. The Small LECs assert their service quality is excellent, and is already examined for most GRC ILECs through the rate case process. Further, because the Small LECs already are subject to GO 133-B reporting, customer satisfaction surveys would be an unnecessary additional expense that is unlikely to yield any benefit.52
The Joint Parties argue additional surveys are neither necessary nor advisable, since numerous surveys currently exist and the competitive market dictates that they provide high service quality.53 Additionally, the Joint Parties point out that indicators such as installation or repair time are not meaningful for wireless services since wireless capability is activated rather than installed, and wireless carriers do not dispatch technicians to repair wireless service.54
Cbeyond argues that customer satisfaction surveys are not necessary for business customers, because there is sufficient competition in the business market and CLECs lack sufficient resources to conduct surveys and deploy new services and facilities.55
CTIA and Verizon Wireless echo the Joint Parties' position that customer satisfaction surveys are not meaningful for wireless carrier services, particularly given the range of existing surveys in the wireless industry.56 CTIA also argues that Commission-sponsored surveys could distort the competitive market by giving the appearance that the Commission is endorsing the services of a specific carrier.57 Finally, T-Mobile asserts that nothing suggests a Commission-sponsored survey would provide any additional material benefit to consumers.58
We generally agree that Commission-required surveys could have the advantage of being a tool that applies to all aspects of intermodal voice competition. Unlike standards that cannot be applied to all types of carriers either due to differences in services (wireline versus wireless), or jurisdictional concerns (telephone corporations vs. wireless carriers vs. VoIP services), customer satisfaction surveys could reach both wireless and wireline customers served by any technology. We agree that customers and the market benefit from the availability of such information.
However, two factors lead us to conclude it is premature to adopt an independent Commission customer satisfaction survey as a component of service quality regulation under GO 133-C. One, the record reflects there are already many existing surveys which cover a range of issues and questions. An independent Commission survey would only be a valuable tool if it provides customers with new information that does not merely mirror other existing surveys. We do not believe the current record contains any specific proposal regarding what set of customer satisfaction attributes, and format, would be uniformly meaningful as an indicator of customer priorities across all carrier types (e.g., wireline, wireless, small carriers and large carriers).
Two, we believe we can benefit from information and evaluation that will come out of the FCC's pending rulemaking on customer satisfaction survey issues. The FCC Service Quality Opinion noted that service quality and customer satisfaction data could help consumers make informed choices in a competitive market but only if available from the entire relevant industry.59 The Commission's goals are consistent with this viewpoint. To avoid redundancy, if the Commission ultimately undertakes to adopt its own customer satisfaction survey, the results of the FCC's inquiry, including its determination regarding what information and attributes most accurately reflect customer priorities across all service platforms, would be an appropriate starting point.
Pending the FCC's decision on this issue, we require carriers that currently file ARMIS Report 43-06 with the FCC (AT&T and Verizon) to also furnish the California-specific data to this Commission's Director of the Communications Division at the same time. It is our understanding that customer satisfaction data will continue to be reported to the FCC at least until September 6, 2010.60 If the FCC determines to continue Report 43-06 or modifies the required customer satisfaction data and/or the classes of carriers required to report, carriers should report California-specific data to this Commission accordingly. Should the FCC cease requiring customer satisfaction data, carriers should continue reporting California-specific Report 43-06 data to this Commission through December 31, 2011. If parties believe California-specific reporting should continue beyond that date, they should file a petition for rulemaking with this Commission under Rule 6.3 of the Commission's Rules of Practice and Procedure to seek consideration of whether an independent Commission survey should be required or whether some or all California-specific ARMIS reporting should continue.
4.2. Service Quality Measures
As previously noted, the Commission's current service quality measures are embodied in GO 133-B. The GO requires all telephone utilities providing service in California to report on nine (9) measures.61 Realizing that at least some of these traditional measures were becoming increasingly irrelevant and out of date due to changes in the competitive telecommunications market, the Commission opened this rulemaking to revise GO 133-B in a manner that would reflect current technological and business conditions. In particular, the 2007 ACR acknowledged that current service quality requirements are neither technologically neutral nor responsive to the competitive intermodal market.
In view of the fact that the existing service quality measures were adopted in the era of a monopoly landline phone system, all parties generally agree that some changes to the existing measures are warranted. The recommendations, in comments and reply comments filed in both 2003 and 2007, ranged from eliminating GO 133-B in its entirety, to revising it to reflect a smaller and more contemporary set of measures.62 There was also general agreement that a one-size-fits-all approach does not make sense in view of the effect that different services, competitive conditions, and technologies may have on a consumer's view of service quality priorities.
It is undisputed that service quality measures and standards should apply to GRC ILECs and the GRC ILECs themselves recommend no changes to the current GO 133-B reporting requirements. URF ILECs and CLECs oppose being subject to service quality reporting. Consumer groups support revised standards for GRC ILECs, URF ILECs and CLECs.
TURN and DRA support revised service quality measures, as both legally required and necessary to monitor service quality for health and safety purposes. TURN and DRA propose measures for wireline carriers that largely are based on ARMIS reporting requirements per ARMIS Report 43-05 rather than the current GO 133-B measures. They propose positive reporting of service quality measures at regular intervals rather than the current practice of exception reporting, under which carriers report only when they have failed to meet existing standards. Other consumer groups and businesses also support streamlined measures.
Consistent with our stated statutory obligations, the record before us, and the intent of this OIR, we adopt GO 133-C, which revises and replaces GO 133-B's nine service quality measures with a minimum set of five service quality measures for carriers that provide local exchange service. These five measures are considerably narrowed from the 30 measures proposed in the OIR and reflect our acknowledgment of parties' comments and proposals for minimum service quality measures. The five measures will apply to GRC ILECs. In light of the competitive intermodal market, we will apply a reduced set of three measures to URF ILECs and CLECs that have 5,000 or more customers. These measures reflect our established policy of supporting reduced reporting requirements for competitive carriers.
In view of our current deference to the FCC's pending rulemaking regarding rules applicable to VoIP and IP-enabled services, we decline to impose service quality measures and standards on IP-enabled and VoIP providers (including cable). As discussed below, we also exempt resellers, wireless carriers, and small URF ILECs and CLECs with fewer than 5,000 customers.
Our goal is a uniform and consistent reporting format. A template for reporting the adopted service quality data is attached to the GO. Reporting of data for the new GO 133-C measures will begin on January 1, 2010.
4.2.1. Consumer Groups and Businesses Support Minimum Service Quality Measures
Both DRA and TURN propose a minimum set of service quality measures for wireline carriers.63 The measures DRA proposes would apply to carriers with over 5,000 customers,64 and would be reported on a positive basis each quarter.
DRA's specific proposed measures are: operator service (consolidates the GO 133-B answer time measures into one); time to reach a live operator (new);65 trouble reports per 100 lines (existing GO 133-B);66 installation commitments met (ARMIS);67 installation intervals (ARMIS);68 initial out-of-service (OOS) repair intervals (ARMIS);69 and repeat OOS reports as a percentage of initial OOS reports (ARMIS).70 DRA states reporting requirements for business services should be limited to small business customers, i.e., those that purchase five or fewer lines.71
DRA asserts these minimum measures should be adopted as essential for consumer protection and public health and safety.72 DRA contends the proposed installation and repair measures are necessary to ensure California's telecommunications infrastructure is consistent with the national standards found in ARMIS.73 DRA argues a sound infrastructure is necessary for California's economy, and California service providers should, at a minimum, perform as well as the telecommunications industry nationwide.74 Further, DRA argues that repair standards are critical, because a customer who needs repair service does not have a competitive option. Nonetheless, DRA agrees measures should be streamlined from the 24 repair measures found in ARMIS.75 DRA's proposed standards are based on a proxy for industry standards using historical data from 1996-2006. DRA averaged the performance of URF ILECs, GRC ILECs, and the reference group of large ILECs that the Commission used to compare the performance of AT&T and Verizon in D.03-10-088, supra. These averages were the basis of DRA's proposed standards for installation, maintenance and answer time.76
TURN proposes four indicators for wireline carriers:77 average installation interval (per ARMIS standard);78 average out of service repair interval (per ARMIS standard);79 average wait time to speak with a live agent;80 and Commission complaints per million customers.81 In addition, TURN recommends the Commission monitor percent of calls receiving busy signal and percent of calls abandoned.82 TURN recommends these measures be applied to all wireline carriers, including VoIP.83
In support of its proposed measures, TURN states that minimum service quality measures and information allowing comparisons between how various providers have fared in meeting such measures is a critical element in promoting consumer choice.84 TURN notes that AT&T's own expert Harris stated in 2003 that minimum service quality measures ensure that customers will have a baseline level of quality, reducing the information needed to make buying decisions.85
A number of other parties also endorse minimum measures and point out that many states already have adopted minimum service quality measures applicable to incumbent and competitive carriers. For example, AARP noted that Ohio, Vermont, and Michigan have adopted minimum measures consistent with the Commission's OIR proposal and that Washington, Oregon, Colorado, Illinois, Pennsylvania, Texas, and Florida have adopted generic service quality measures that focus on local exchange carriers.86
Allegiance provided more detail regarding those states' adopted minimum measures and also noted that Georgia and New York have adopted minimum measures.87 Ohio's, Vermont's, Oregon's, Illinois' and New York's rules apply to ILECs and CLECs. Florida's and Georgia's rules exclude CLECs. The other states' rules apply to telecommunications carriers, generally.
DisabRA supports adoption of either the DRA or TURN proposals.88 DOD/FEA recommends ARMIS reports be filed by carriers that currently provide that information to the FCC, that all ILECs continue to report under GO 133-B, and that CLECs report under GO 133-B or provide in the alternative, customer satisfaction and service quality data consistent with ARMIS Reports 43-05 and 43-06.89
NCLC supports minimum service quality measures covering installation, trouble reports, and answer time in order to assist consumers in obtaining the most valuable information.90 The California Small Business Roundtable and California Small Business Association (CSBR/CSBA) stated that the issues most important to small business were how quickly carriers met service orders, responded to trouble reports, cleared outages and answered calls with a live person.91
AT&T and Verizon (i.e., the URF ILECs) oppose the DRA and TURN proposals, arguing that no evidence indicates the suggested measures are necessary for public health and safety, or are of particular concern to customers. For example, AT&T notes that 19 states do not regulate answer times. Further, AT&T argues there is no evidence or cost/benefit analysis to support the specific metrics TURN and DRA propose. AT&T estimates it would incur substantial costs to comply with the proposed answer time measure.92
AT&T and Verizon contend that all service quality measures and reporting requirements should be eliminated. They assert that in view of the development of competitive markets and the Commission's policy direction in URF, continued reporting to the Commission is unnecessary because competition is sufficient to protect consumers' interests.93 Verizon adds that service quality measures are outdated, are not competitively and technologically neutral, and in its view distort the incentives competition already provides for achieving adequate service quality. Verizon suggests the Commission should rely on major service outage reporting and ARMIS data.94
AT&T mirrors these arguments, commenting specifically that service quality measurements are outmoded, do not provide information for consumers to select among carriers, and impose costs on the affected carriers, which are not borne by other providers. AT&T notes that both GO 133-B and the FCC's MCOT reporting are outdated and are neither competitively nor technically neutral.95 In AT&T's view, the Commission should rely solely on customer satisfaction surveys.96
SureWest argues that imposing service quality obligations on regulated carriers distorts the competitive intermodal market. In SureWest's view, the costs of imposing reporting requirements outweigh the benefits.97
Frontier states GO 133-B requirements are duplicative, unnecessary and should be eliminated. Frontier would replace GO 133-B with federal and state MSI reports and third-party customer satisfaction surveys.98
The CLECs oppose continued GO 133-B reporting on the ground that the competitive nature of their services obviates the need for GO 133-B reports.99 They argue that the cost of compliance with GO 133-B or the DRA and TURN proposals would be prohibitive. CALTEL argues that CLECs predominantly serve medium to large business customers and must provide high quality service.100 CALTEL argues that reporting requirements would increase operational costs for competitive carriers without justification, even with the small carrier exemption proposed by DRA.101 Cbeyond elaborates on these concerns, stating that service quality measures are unnecessary for CLECs serving business customers because those customers have more competitive options, have access to greater resources, possess more technical expertise, and have greater bargaining power to resolve service quality disputes.102
VON argues that the Commission lacks jurisdiction over VoIP and should continue to defer to resolution of this issue in the pending FCC rulemaking to consider the regulatory treatment for VoIP and IP-enabled services.103
The Small LECs (i.e., GRC ILECs) are willing to continue reporting under the current GO 133-B.104 They assert the data submitted in 2003 illustrated their excellent service to their customers and that nothing has changed since that time.105 They argue that additional reporting would be expensive and unjustified, since GRC ILECs consistently have not had service quality problems, and continue to be subject to rate base regulation which affords the Commission opportunity to review their service.106 Accordingly, the Small LECs oppose the DRA and TURN proposals and request an exemption from any new reporting requirements.107 They assert DRA's rationale for exempting small carriers that are not COLRs from new service quality standards applies to all small carriers, and they contend that cost and efficiency should influence the amount of service quality measurement and reporting required of smaller carriers.108
As we have previously stated, the Commission has a statutory duty to ensure customers receive adequate service quality pursuant to Pub. Util. Code §§ 709, 2896 and 2897. We agree with the general consensus of the parties that certain aspects of GO 133-B are outdated and no longer reflect today's competitive markets and the Commission's regulatory policies consistent with URF. We also agree that ARMIS reporting could in some instances be a sufficient replacement for at least some aspects of our current reporting requirements. However, ARMIS data alone may not be enough, and the status of continued ARMIS reporting remains uncertain. If we were to rely solely on ARMIS data and the FCC were to eliminate ARMIS service quality reporting per ARMIS Report 43-05, it could compromise our ability to meet our statutory obligations to California customers.
We concur with DRA and TURN that minimum service quality measures and corresponding standards should be adopted to replace the existing GO 133-B measures. Although we do not adopt either proposal in its entirety, we will eliminate outdated components of GO 133-B, modify others, and rely on ARMIS measures and standards, where possible. We do not agree with the Small LECs' argument that GO 133-B measures should remain unchanged because the Commission has not found their particular service quality to be inadequate. Adopting requirements based on the performance of any one group of carriers is not a practical or reasonable solution. As the parties have demonstrated, our existing service quality measures and standards lag behind current market realities as well as recently adopted minimum measures in force in other states. Our measures need to be revised. At the same time, we agree with the parties that while our requirements should strive to be competitively and technologically neutral, it is not practical to fashion identical service quality measures for all classes of carriers.
Today, we adopt GO 133-C to replace GO 133-B. GO 133-C eliminates the outdated and inadequate service quality indicators that parties recommended we discard. The eliminated measures are: held primary service orders; installation-line energizing commitments; dial tone speed; and dial service. Answer time measures have been combined, and reporting for directory assistance and operator assistance answer times has been eliminated.
The revised minimum measures encompass metrics related to installation, repair, maintenance and answer time in fewer measures than found in GO 133-B. Based on the record before us, these are the indicators that are most relevant in today's more competitive telecommunications market to reflect actual customer priorities and satisfaction.
The minimum measures we adopt are: (1) telephone service installation intervals (five business days); (2) installation commitments (95%); (3) customer trouble reports (six reports per 100 lines for reporting units with 3,000 or more working lines and lower standards for smaller reporting units); (4) OOS repair intervals (90% within 24 hours excluding Sundays and federal holidays, catastrophic events and widespread outages); and (5) answer time (80% within 60 seconds related to trouble reports and billing and non-billing issues with the option to speak to a live agent).109 These five reporting measures will apply to GRC ILECs, since they are fully regulated as the monopoly providers in their service territories and are designated COLRs in their service territories.
Fewer measures will apply to URF ILECs and CLECs since the competitive markets these entities operate in provide greater external pressure to ensure service quality and customer satisfaction. It is consistent with our policies in URF to minimize regulatory and reporting oversight in such competitive markets. The three measures we adopt for URF ILECs and CLECs are: (1) customer trouble reports (six reports per 100 lines for reporting units with 3,000 or more working lines and lower standards for smaller reporting units); (2) OOS repair intervals (90% within 24 hours excluding Sundays and federal holidays, catastrophic events and widespread outages); and (3) answer time (80% within 60 seconds related to trouble reports and billing and non-billing issues with the option to speak with a live agent).110 Consistent with the recommendation of DRA, these measures will apply only to carriers with 5,000 or more customers, unless the carrier is also a COLR.
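For illustration only, the applicability rules described above can be summarized in a short sketch. The carrier attributes used below (carrier class, customer count, COLR status) are hypothetical labels for exposition; the text of the adopted GO controls.

```python
# Illustrative sketch (not part of the adopted GO) of how GO 133-C reporting
# obligations attach to each carrier class, per the discussion above.

FIVE_MEASURES = [
    "installation intervals",    # average of five business days
    "installation commitments",  # 95% of commitments met
    "customer trouble reports",  # 6/8/10 reports per 100 lines, by unit size
    "OOS repair intervals",      # 90% cleared within 24 hours
    "answer time",               # 80% answered within 60 seconds
]
THREE_MEASURES = FIVE_MEASURES[2:]  # trouble reports, OOS intervals, answer time


def reporting_measures(carrier_class: str, customers: int, is_colr: bool) -> list:
    """Return the GO 133-C measures a carrier must report, per this decision."""
    if carrier_class == "GRC ILEC":
        return FIVE_MEASURES
    if carrier_class in ("URF ILEC", "CLEC"):
        # Carriers with fewer than 5,000 customers are exempt unless COLRs.
        if customers >= 5000 or is_colr:
            return THREE_MEASURES
        return []
    # Resellers, wireless carriers, and VoIP/IP-enabled providers are exempt.
    return []
```

Under this summary, for example, a URF CLEC with 4,000 customers that is not a COLR would report no measures, while the same carrier designated a COLR would report all three.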
We also narrow reporting for certain measures to residential and small business customers as explained below. We grant specific exemptions from GO 133-C reporting requirements as explained below.
We are aware that Pub. Util. Code § 321.1 states that it is the intent of the Legislature for the Commission to generally assess the economic effects or consequences of its decisions. Consistent with that intent, the assigned Commissioner and ALJ requested comments in 2003 on the costs and benefits of the proposed measures. Few carriers provided specific or conclusive cost information in either their 2003 or 2007 comments. We do not believe a lack of definitive cost information bars us from revising GO 133-B here.
As we have previously noted, § 321.1 does not require the Commission to perform a cost-benefit analysis or consider the economic effect of a decision on specific customer groups or competitors.111
Nor does it require the Commission to conduct analyses beyond those which can be accomplished with existing resources or structures. Lacking evidence to the contrary here, we believe it is reasonable to conclude that the overall reduction in reporting measures in the new GO 133-C should result in long-term cost savings for most carriers that currently report under the nine GO 133-B exception reporting categories, even though positive reporting is now required. Carriers should also realize some economic savings from our replacing the current Commission standard for MSI reporting with the FCC's NORS reporting, as discussed below.
GO 133-B contains service quality measures for held primary service orders and installation-line energizing commitments. Held primary service orders measure installation delays over 30 days due to lack of plant. Installation-line energizing commitments measure the percentage of commitments met for non-key telephone service.
DRA states the held primary service order measure is not necessary since this is no longer a problem in California given the reduced demand for second lines.112 TURN similarly contends the measure is no longer useful in that reporting trends suggest it may only reflect extremely poor installation performance rather than current customer expectations.113 Cox adds that held service orders are inconceivable in competitive markets, since carriers have every incentive to provide service quickly.114
With respect to line energizing commitments, TURN states that the goal of meeting 95% of the commitments is too low to be meaningful, and carriers have exceeded the goal for many years. Thus, including this as a current measure would distort reporting results since it is so easily met.115
We agree that these measures are outdated and ineffective, and should be eliminated and replaced with more effective installation measures. The proposed measures which better indicate current service quality expectations are installation interval and installation commitments. These are discussed below.
The standard we adopt for reporting installation intervals is based on ARMIS data, as recommended by both DRA and TURN. The installation interval measures the amount of time to install basic telephone service. If an additional feature is included in a basic service installation, the installation interval should reflect the basic service installation. Measurement is done in business days and an average is calculated. Although TURN proposed three business days, we prefer the five business day standard proposed by DRA, consistent with the nationwide industry average.116 This average is based on data compiled separately for small, mid-sized and large ILECs and is the lowest performance of a representative sample of carriers.117 Small ILECs' average is consistent with the adopted standard, while mid-sized and large ILECs exceed the average. We believe TURN's proposed three business days is too far outside the industry average.
We next consider a proposed exemption from reporting for business customers. Cbeyond recommends such an exemption.118 As previously noted, Cbeyond maintains the level of competition in the market for business services is greater than residential, and business customers have greater resources and technical expertise, as well as bargaining power to resolve service quality concerns.119 CALTEL asserts medium and large business customers are sophisticated customers that insist on a wide variety of voice and data solutions that deliver on both cost and quality. Most of these customers receive multiple bids from service providers and negotiate service guarantees and penalties as part of individual-case-basis contracts. Service quality for these carriers is good because it has to be. CALTEL has not been informed by Commission staff, either the Consumer Affairs Branch or the Public Advisor's Office, of any documented or anecdotal evidence of systemic problems involving either individual carriers or the competitive industry as a whole.120 DRA agrees somewhat, recommending that reporting for business customers be limited to small business customers, those that purchase five or fewer lines.121
DOD/FEA opposes an exemption, pointing to ARMIS data illustrating that California business customers are dissatisfied with maintenance and business office contacts at levels comparable to those among residential customers.122
We recognize that competition is generally greater for business local exchange services than it is for residential services. The competitive landscape requires some accommodation for reporting on business services. Although we decline to exempt all reporting for business customers, we generally support DRA's proposal that it makes sense to limit reporting to smaller businesses. However, any exemption from reporting for larger business customers should rest on a definition that is consistent with what is reported under ARMIS. ARMIS makes no distinction between small and large business customers for reporting data per ARMIS Report 43-05. (See http://www.fcc.gov/wcb/armis/instructions/2008/definitions06.htm#T1C.)123 In addition, the current GO 133-B definition for small business is business accounts that are not designated by the utility for special handling. This definition is imprecise and subject to carrier interpretation. It does not meet our goal of a uniform and consistent reporting format. Instead, we find DRA's proposal to limit business services subject to reporting to small businesses purchasing five or fewer lines the most precise. This proposal also is consistent with other states' definitions of small business in terms of lines purchased.124 We will limit installation interval reporting to services provided to residential and small business customers, consistent with DRA's proposal and requirements in other states.125
We will require data for this measure to be compiled monthly and reported quarterly. Quarterly reports will be due within 45 days of the end of the quarter. Carriers' performance shall be evaluated at least annually.
In adopting this measure, we recognize that the cost for carriers to change from the existing ARMIS requirement is not fully known.126 In 2003, AT&T estimated that its labor costs to report under a new requirement would be low.127 Some parties have suggested that costs should not be limited to monetary costs, but that the Commission also should focus on the generalized economic costs of establishing uniform service standards.128 Others argue there is no mandate to consider a cost-benefit analysis in the adoption of service quality measures, since Pub. Util. Code § 2896 requires the adoption of reasonable statewide service quality standards without a cost-benefit analysis.129
We also recognize it is difficult to compare tangible, out of pocket implementation costs with benefits that may not easily translate to dollar amounts. Service quality rules were not designed to provide direct financial benefits to consumers. Benefits are largely intangible, although poor service quality will certainly increase customer frustration and dissatisfaction. We note NCLC's suggestion that a regulated industry almost always over-estimates the costs of proposed regulations.130
In view of these considerations, and because the parties offered no evidence to find otherwise, we believe it would not be prohibitively costly to provide California-specific reporting of installation interval data. The URF ILECs already report under ARMIS. There is no disagreement that customer satisfaction with their carriers' service is likely to be higher with prompt basic service installation. Thus, it is probable the benefit of adopting this measure would exceed the cost.
This installation measure should apply to GRC ILECs, because they are the sole provider of basic local exchange service in their service territories. There is little or no competitive market. In contrast, minimum service quality measures for URF ILECs and CLECs should reflect the competitive landscape in which they operate. Competitive carriers have a strong incentive to install service promptly. That incentive is illustrated by the industry averages compiled by DRA. Mid-sized and large ILECs exceed the installation average of small ILECs. Thus, there is no need to require installation interval reporting for URF ILECs and CLECs. URF ILECs and CLECs are exempt from reporting installation intervals.
The standard we adopt for installation commitments is based on GO 133-B and ARMIS, as proposed by DRA. Installation commitments for basic service will be expressed as a percentage. The adopted standard is 95% of commitments met and excludes commitments that are not met due to customer actions. We believe DRA's proposal is reasonable since it is based on nationwide industry averages.131 Small ILECs meet this average, while mid-sized and large ILECs exceed this average.132 Consistent with DRA's proposal, this measure, like the installation interval measure, is limited to residential and small business customers. We will require installation commitments met to be compiled monthly and reported quarterly. Quarterly reports will be due within 45 days of the end of the quarter. Carriers' performance shall be evaluated at least annually.
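Expressed as a formula (on the reading, which we note is an assumption for this illustration, that customer-caused misses are removed from the denominator rather than counted as misses), the reported percentage is:

\[
\text{Commitments met (\%)} \;=\; \frac{N_{\text{met}}}{N_{\text{total}} - N_{\text{customer-caused}}} \times 100 \;\geq\; 95\%,
\]

where \(N_{\text{met}}\) is the number of installation commitments met during the period, \(N_{\text{total}}\) is the total number of commitments due, and \(N_{\text{customer-caused}}\) is the number of commitments missed due to customer actions.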
There is no evidence establishing the cost for carriers to change from the existing reporting measure to this new measure. In 2003, AT&T estimated that labor costs to report under a new requirement would be low.133 Consistent with our reasoning above, customer satisfaction with their carriers' service is likely to be higher if installation commitments are met and thus, it is probable the benefit of adopting this measure would exceed the cost.
This reporting measure will apply to GRC ILECs because they are the sole provider of basic local exchange service in their service territories. Thus, this standard is adopted for GRC ILECs. Minimum service quality measures for URF ILECs and CLECs should reflect the competitive landscape in which they operate. Competitive carriers have a strong incentive to meet installation commitments and install service promptly. That incentive is illustrated by the industry averages compiled by DRA. Mid-sized and large ILECs exceed the installation average of small ILECs. Thus, there is no need for installation commitment standards for URF ILECs and CLECs. URF ILECs and CLECs are exempt from reporting installation commitments.
The existing GO 133-B customer trouble report standard measures initial trouble in relation to lines or equipment. It is expressed as the number of reports per 100 lines. DRA supports retaining the existing standard, which is six reports per 100 working lines for reporting units with 3,000 or more lines, eight reports per 100 working lines for reporting units with 1,001-2,999 working lines, and 10 reports per 100 working lines for reporting units with 1,000 or fewer working lines.134 The smaller the reporting unit, the more lenient the standard.135 A significant benefit to retaining this measure is its illustration of network reliability.136
TURN recommends we eliminate this particular measure, reasoning that the threshold of six trouble reports per 100 lines (and up to ten trouble reports for smaller central offices) is far too high to represent good service and that carriers routinely perform well above this standard.137 TURN prefers we require reporting of the number of complaints per million customers. TURN argues that complaint data represents the real issues that customers face.138
We decline to adopt a standard associated with the number of complaints received by the Commission. Although complaints are one indicator of customer dissatisfaction, they normally span a range of issues which may or may not be tied to the actual indicators of service quality adopted under GO 133-C. We believe that, on the whole, customer trouble reports will provide more useful and relevant information. Although TURN argues that six reports per 100 lines is a weak standard, no other party supports that position. The Small LECs support continuation of the existing standard.139 Accordingly, we will retain the minimum standard of no more than six trouble reports per 100 working lines with more lenient standards for smaller central office sizes: eight reports per 100 working lines for units with 1,001-2,999 working lines and ten reports for units with 1,000 or fewer lines. This standard for customer trouble reports is based on GO 133-B and ARMIS. This measure will apply to local exchange service provided to residential and business customers, consistent with ARMIS and requirements in other states.140 Customer trouble reports will be compiled monthly and reported quarterly. Quarterly reports are due within 45 days of the end of the quarter. Carriers' performance shall be evaluated at least annually.
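To illustrate how the tiered standard operates (the reporting-unit sizes and report counts below are hypothetical examples, not data from the record):

```python
# Minimal sketch of the tiered customer trouble report standard adopted above.
# Unit sizes and report counts are hypothetical, for illustration only.

def trouble_report_standard(working_lines: int) -> float:
    """Maximum trouble reports per 100 working lines for a reporting unit."""
    if working_lines >= 3000:
        return 6.0
    if working_lines >= 1001:
        return 8.0
    return 10.0  # units with 1,000 or fewer working lines


def meets_standard(trouble_reports: int, working_lines: int) -> bool:
    rate_per_100 = trouble_reports / working_lines * 100
    return rate_per_100 <= trouble_report_standard(working_lines)


# Example: a 2,500-line unit logging 180 reports in a month has a rate of
# 7.2 reports per 100 lines, within the 8.0 standard for its size tier.
assert meets_standard(180, 2500)
```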
We next address the DRA and TURN recommendation that trouble reports must be defined consistently. We agree. DRA recommends that all calls to the repair center should count as true troubles, without exclusion.141 We believe that may be too broad. For purposes of reporting this measure, customer trouble reports are defined as all reports affecting service as well as those regarding service that is not working.
As with the preceding measures, there was no evidence quantifying the precise costs for carriers to comply with this measure. In 2003, AT&T estimated that labor costs to report under a new requirement would be low.142 Inasmuch as we are largely retaining the existing standard, we do not expect the cost to be burdensome. Customer satisfaction with their carriers' service is likely to be higher if service is reliable, and the incidence of trouble reports is one measure of reliability. Thus, it is probable the benefit of adopting this measure would exceed the cost. This service quality measure shall apply to GRC ILECs, because they are the sole provider of basic local exchange service in their service territories. We believe URF ILECs and CLECs should also be responsive to customers and prompt in addressing service difficulties. In this respect, the reporting of maintenance standards represented by the incidence of customer trouble reports would be beneficial. Maintenance standards such as this address critical health and safety concerns, and the industry averages compiled by DRA illustrate that larger ILECs tend to have lower performance on maintenance standards than do smaller ILECs. Further, not all customers in service territories of URF ILECs have competitive choices. Thus, we will require URF ILECs and CLECs to report this measure. However, consistent with DRA's overall recommendation, we will only require this reporting for URF ILECs and CLECs with 5,000 or more customers, unless the carrier is a COLR.
GO 133-B does not currently require the reporting of OOS repair intervals. This indicator reflects how long a customer may have to wait to have service repaired. Both TURN and DRA recommend we adopt such a service quality measure. TURN suggests we use the ARMIS definition and set a maximum goal of 36 hours.143 DRA recommends 25 hours.144
We note that Texas requires a carrier to clear 90% of OOS trouble reports within eight working hours (measured on a monthly basis);145 and Illinois requires 95% of OOS troubles on basic service to be cleared within 24 hours.146 Illinois excludes customer caused delays, emergency situations and OOS troubles occurring on holidays and weekends.147
We agree that restoring service is critical given customers' reliance on their phones for summoning help in an emergency. Given the various proposals and standards of other states, we adopt a standard of 90% of OOS trouble reports cleared within 24 hours, which is generally consistent with the standard in place in Illinois. We decline to exclude full weekends and all holidays, but will exclude Sundays and federal holidays. The minimum standard is 90% of reports cleared within 24 hours, measured from the time the carrier receives the OOS trouble report to the time service is restored, consistent with the nationwide industry average calculated by DRA.148 This measure will apply to local exchange service provided to residential and small business customers. While carriers should collect and provide data at the exchange or wire center level, the reporting unit for purposes of evaluation will be based on a state-wide average. ILECs and CLECs that do not have exchanges or wire centers should report at their operating level and should concurrently submit the raw data supporting their report.
The adopted OOS repair interval measure should measure in hours and minutes the time from receipt of the trouble report to the time service is restored for outages that are within the reporting carrier's control. Sundays and federal holidays are excluded. Maintenance delays due to circumstances beyond the carrier's control, including catastrophic events or widespread service outages, that occur in one or more months of the year are excluded and should be reported separately.149 Delays due to a customer's requested appointment are excluded and reported separately by identifying the number of such appointments and the time, in hours and minutes, associated with these appointments. OOS repair intervals shall be calculated by adding the total time in hours and minutes for restoring service for each trouble report for the period, less the time associated with causes beyond the carrier's control, as defined above, and dividing by the total number of trouble tickets for outages within the carrier's control.
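Stated as a formula (a restatement of the calculation just described, using our own notation):

\[
\text{Average OOS repair interval} \;=\; \frac{\sum_{i} t_i \;-\; T_{\text{excluded}}}{N_{\text{within control}}},
\]

where \(t_i\) is the time, in hours and minutes, to restore service for trouble report \(i\) during the period; \(T_{\text{excluded}}\) is the total time attributable to causes beyond the carrier's control (including Sundays, federal holidays, catastrophic events, widespread outages, and customer-requested appointments, each reported separately); and \(N_{\text{within control}}\) is the total number of trouble tickets for outages within the carrier's control.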
The large ILECs meet this nationwide industry average, while mid-sized and small ILECs' performance is better than the average.150 Repair intervals will be compiled monthly and reported quarterly. Quarterly reports will be due within 45 days of the end of a quarter. Carriers' performance shall be evaluated at least annually.
Because carriers do not currently report this measure, the associated cost of reporting is unknown and no evidence was presented to estimate the anticipated costs. However, in 2003, AT&T estimated that labor costs to report under a new requirement would be low.151 Customer satisfaction with their carriers' service is likely to be higher if service problems are addressed promptly. Thus, it is probable the benefit of adopting this measure would exceed the cost.
This measure shall apply to GRC ILECs because they are the sole provider of basic local exchange service in their service territories. Thus, this standard is adopted for GRC ILECs. URF ILECs and CLECs should also be responsive to customers and prompt in addressing service difficulties, and measurement and reporting of maintenance standards would likewise be beneficial for them. Not all customers in service territories of URF ILECs have competitive alternatives. Maintenance standards address critical health and safety concerns, and the industry averages compiled by DRA illustrate that larger ILECs have lower performance on maintenance standards than do smaller ILECs. Thus, this standard is adopted for URF ILECs and CLECs. However, consistent with DRA's overall recommendation, we will only require this reporting for URF ILECs and CLECs with 5,000 or more customers, unless the carrier is a COLR.
The answer time measure reflects how quickly a customer can expect to speak with a live agent when calling a carrier's business office regarding an issue. Existing answer time standards separately measure toll operator, directory assistance operator, trouble report, and business office answer times. Toll operator answer time measures calls answered within 10 seconds. Directory assistance answer time measures calls answered within 12 seconds. Trouble report and business office answer times measure calls answered within 20 seconds. DRA supports the current standard with a combined measure of 80% within 20 seconds.152 The Small LECs support the current standard without modification and note that the existing standard for trouble report and business office answer time only applies to four of the 12 Small LECs that joined in the comments, since reporting only applies to traffic offices with 10,000 or more lines.153 The Small LECs assert that many have conveniently-located and highly visible local offices and small customer bases.154
AT&T opposes answer time reporting, noting that 19 states do not regulate answer times.155 AT&T also avers that operator assistance is a competitive service.156 The Coalition suggests adopting a single answer time measure, which it believes would better reflect the operational structure of competitive carriers.157 The Coalition notes that CLECs often resell the ILECs' directory assistance and operator assistance, so they do not control and cannot measure the level of performance.158 The Coalition also questions whether data on directory assistance or operator assistance are facts material to a consumer's purchasing decision. In SureWest's view, the offering of free directory assistance renders measurement of directory assistance answer time obsolete.159 Finally, many states do not measure directory assistance or operator assistance answer time.160 Verizon asserts that market forces should drive a provider to design its automated response unit to give its customers answer time service attributes, along with other service choices, which maximize the attractiveness of its options.161
TURN notes that many issues can now be resolved by a customer's choice of automatic menu options; however, more complex issues may still require a live representative to be resolved.162 TURN recommends a maximum goal of 60 seconds after the automated response unit, and asserts that an answer time measure should specifically include calls related to billing, repairs, trouble reports, as well as any other calls to the business center.163 TURN also recommends that there be an option to speak with a live operator after no more than 45 seconds of menu choices.
AARP recommends Ohio's approach.164 Ohio requires an option to transfer to a live operator within the initial automated message as well as an operational feature that will transfer a customer to a live attendant if the customer fails to interact with the automated system within 10 seconds following the prompt.165 Otherwise, answer time must be measured from the point of the first ring at the business or repair office or from the time the customer enters the queue after the automated response. AARP asserts it is the long wait time after attempting to reach a live operator that bothers customers, not the simple fact that they reach an automated system, so there is no need to measure the answer time on automated calls.166 Verizon's initial messages offer customers the option to transfer immediately to a live operator.167 Ohio requires reporting for large ILECs (those with 50,000 or more access lines) and verifies prompt contact as an average monthly answer speed of ninety seconds for calls placed to business and repair offices.168
We believe a standard which simplifies the reporting of answer times is preferable. We also agree with AT&T and SureWest that directory assistance now is a competitive offering. For example, free directory assistance offerings are available by phone and on the Internet. For this reason, we believe that measuring answer time to speak with directory assistance is no longer necessary or useful as a component of minimum reporting standards. AT&T also asserts that operator assistance is a competitive service; prepaid and debit cards offer operator assistance calling.169 Similarly, operator assistance answer time does not furnish information customers are likely to find useful, and many states do not measure it. For these reasons, we believe measuring answer time to speak with operator assistance is no longer necessary or useful as a component of minimum reporting standards. We will limit the reporting of answer time to calls related to trouble reports and billing and non-billing issues.
In adopting a measure for answer time we recognize that carriers have invested substantial resources to develop automated voice response systems which are capable of resolving many types of customer calls without the need to speak to a live agent. We encourage these efforts and believe this capability provides an overall benefit to customers by enabling a convenient and expedient resolution of many routine matters.
The standard we adopt for answer time related to trouble reports, and billing and non-billing issues excludes those calls resolved by the automated voice response system. The measure we adopt is 80% of calls to be answered within 60 seconds from the time the customer is transferred from the automated response system, consistent with TURN's proposal. We are persuaded by TURN and AARP that customers' frustration results from the time spent waiting for a live operator once the customer has selected that option in an automated response system. The standard applies to the time it takes to speak to a live agent after completing the interactive voice response (IVR) or automatic response unit system.170 As recommended by AARP and TURN, the carrier must offer the customer the option on the IVR or automatic response unit to speak with a live agent, preferably in the first set of options. The live agent option will ensure that customers who are confused by the menu choices or whose issue is not among those choices have their concerns addressed. In agreement with Verizon's position that market forces should permit carriers to design their automated response unit options to maximize the attractiveness of their service offerings, we decline to prescribe a specific interval at which that option must occur. The adopted standard for answer time is based on GO 133-B, with the exception of billing inquiries. Excluded from answer time measurement is any group of specialized business account representatives established to address the needs of a single large business customer or a small group of such customers, consistent with the requirements in New York.171
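A minimal sketch of how this measurement could be compiled follows (field names are hypothetical; per the discussion above, calls resolved entirely within the automated system are excluded and the clock starts at transfer out of the IVR):

```python
# Minimal sketch of the adopted answer time measurement; field names are
# hypothetical. The clock runs from transfer out of the IVR to a live agent.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Call:
    resolved_in_ivr: bool                   # handled entirely by the automated system
    seconds_to_live_agent: Optional[float]  # measured from transfer out of the IVR


def answer_time_compliance(calls: List[Call]) -> float:
    """Percent of measured calls answered by a live agent within 60 seconds."""
    measured = [c for c in calls if not c.resolved_in_ivr]
    if not measured:
        return 100.0  # no measured calls this period; vacuously compliant
    answered = sum(
        1 for c in measured
        if c.seconds_to_live_agent is not None and c.seconds_to_live_agent <= 60
    )
    return 100.0 * answered / len(measured)


# The adopted standard is met when the result is at least 80.0.
```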
Answer time will be compiled quarterly and reported annually. The annual report is due on February 15 of the following year.
AT&T argues that the cost of reporting answer time is high. Of its estimated labor costs in 2003, those related to answer time reporting were the highest. Toll operator answer time was the least costly to report, while billing-related and non-billing-related answer time each cost over 16 times more to report.172 AT&T asserts that meeting the DRA and TURN answer time proposals would require hundreds of full-time equivalent employees and would cost $30 million and $20 million a year, respectively.173
Answer time reporting is already required under GO 133-B for ILECs. Modification of the answer time measure updates it to recognize that carriers provide IVR or automated response systems. Answer time reporting for directory and operator assistance is eliminated, resulting in cost savings to carriers currently subject to reporting those measures. Accordingly, we are not convinced that the modification of this measure that we adopt today will result in an incremental cost that is unduly burdensome. We must also weigh cost issues against the fact that a customer's satisfaction with service quality will certainly be higher if the customer's contacts with the carrier are answered within a reasonable timeframe. Conversely, frustration will increase if a prompt response does not occur. No evidence establishes that the cost of reporting this information in a modified format outweighs the Commission's duty to ensure customers have functioning telecommunications equipment. Further, customers have no alternative to the carrier in resolving maintenance issues. We also note that GRC ILECs have the means to recover the costs of implementing this measure in their GRCs. However, many GRC ILECs have local offices that are convenient for their customers, reducing the need to rely on telephonic access regarding trouble report and business office issues. Although this measure will apply to GRC ILECs, we will maintain the existing reporting unit and limit answer time reporting units to traffic offices with 10,000 or more lines.
We already have determined that URF ILECs and CLECs should report on trouble report and OOS measures because maintenance issues present health and safety concerns that are vital to customers. The same rationale applies to reporting answer time for trouble reports. Prompt handling of customers' contacts with the business office, particularly on billing issues, remains desirable. Although other business office contacts are less critical, some competitive carriers have only aggregated data on answer time contacts.174 Accordingly, answer time standards for billing and non-billing issues, trouble reports, and OOS repair intervals shall apply to URF ILECs and CLECs with 5,000 or more customers. URF ILECs and CLECs with fewer than 5,000 customers are exempt from these answer time measures unless the provider is a COLR.175 Reporting units are limited to traffic offices with 10,000 or more lines. URF CLECs with nationwide operations that do not track answer times on a state-specific basis will report their overall average.
4.2.3.7. Miscellaneous Issues Regarding Installation and Maintenance Measure Reporting
Consistent with our stated intent in this proceeding, the measures adopted in this Decision will apply to local exchange service. Installation and repair measures do not apply to interexchange carrier services. The adopted installation and maintenance standards will also apply only to facilities-based carriers,176 because only facilities-based carriers have access to the underlying network. And the adopted installation standards will not apply to additional features, such as call waiting and call forwarding.
The OIR proposed that parties consider whether it is necessary to distinguish between primary and additional lines and report that data separately.177 It was suggested that some measures only apply to primary lines (installation measures) and others to primary and other telephone lines (e.g., customer trouble reports).178
CPSD asserted the only definition of a primary line arises in the context of administering the California High Cost Fund B, not in measuring service quality.179 In addition, we note ARMIS makes no such distinction. Since the measures adopted in this Decision conform to ARMIS, no distinction between primary and additional lines is necessary. GO 133-C is consistent with this approach and defines a line as an access line which provides dial tone and which runs from the local central office to the subscriber's premises.
4.3. Reporting Exemptions for Wireless Carriers, Resellers and IP-Enabled Services
In this section we discuss whether an exemption from reporting service quality measures should be granted for wireless carriers, resellers, and IP-enabled services (including VoIP and cable services).
Verizon Wireless and CTIA argue that wireless carriers should be exempt from any service quality reporting. Verizon Wireless contends that reporting of service quality measures makes no practical sense and is outside the scope of the Commission's jurisdiction.180 Moreover, Verizon Wireless argues that because the wireless industry is competitive, the market has already responded to the need for information on customer satisfaction.181 CTIA mirrors this view, pointing to the variety of information sources already available in the marketplace that allow customers to assess the service quality of various wireless providers.182
DOD/FEA does not subscribe to the same rationale as Verizon Wireless and CTIA for allowing an exemption, stating instead that, in its view, it is unclear whether the Commission has the requisite statutory authority to require such carriers to report service quality information.183
TURN and DRA agree that the installation, repair, and maintenance indicators that apply to wireline carriers are not relevant to wireless carriers.184 Accordingly, they say, there would be no point to requiring the reporting of related service quality measures. However, TURN does propose three indicators that it believes would be useful for wireless carriers to report, specifically, call-success rates, service coverage information (street level), and call drop-out rates.185 TURN also suggests that the Commission monitor the average wait time to speak with a live agent, Commission complaints per million customers, the percent of calls receiving busy signals, and the percent of calls abandoned.186 CTIA states these requirements do not factor in availability of information through an IVR, a carrier's website or the cell phone itself.187
DRA does not make any specific recommendations, but agrees that the information TURN identifies would contribute to the efficiency of the market and would be informative to customers.188 DisabRA specifically notes the usefulness of call-success and drop rates for wireless customers.189
We believe it is premature to address whether this Commission has jurisdiction to require service quality reporting for wireless, VoIP, and IP-enabled carriers. As we noted previously, the Commission has deferred any final decision on such issues in light of the FCC's pending rulemaking regarding the appropriate regulatory treatment of such carriers.190 For this reason, we are disinclined to adopt a reporting requirement for these groups of carriers at this time. We also believe that should we decide to adopt service quality measures for these carriers in the future, we would prefer a more fully developed record concerning the types of measures that would be meaningful, and not duplicative of already available information, for wireless, VoIP, and IP-enabled customers.
Accordingly, we decline to adopt TURN's recommendation. Wireless carriers and VoIP and IP-enabled carriers (including cable) are exempt from service quality standards. For somewhat different reasons, we will also exempt resellers from reporting service quality measures.191 Although AARP opposes such an exemption,192 we believe that some degree of control over the underlying network facilities is a critical component of a carrier's ability to affect service quality. Resellers, as non-facilities-based carriers, cannot control the underlying network, at least with respect to installation, maintenance, and repair measures. With respect to answer time, the record does not contain sufficient evidence to determine whether reporting of this information for resellers would in fact be relevant or beneficial for reseller customers.
4.4. Commission Publishing of Carrier Data
In 2003, the parties commented on whether the Commission should publish carriers' reported service quality data as an alternative or interim step to establishing measures and measure-specific quality assurance mechanisms for some measures.193 Many parties supported publishing carriers' data, especially if the Commission adopts specific minimum service quality measures.194 Fewer parties were opposed to publishing such information, most notably the wireless carriers.195 Some parties supported workshops to address publishing data.
We believe publishing carriers' reported service quality information is reasonable, since such information provided to the Commission is part of the public record. We also believe this information could be helpful to consumers. However, publishing such data is helpful only if carriers report information in a uniform and consistent format. In that way, customers can use the information as another data point in deciding whether a particular carrier provides the level of service they require in areas that are important to them.
A template for reporting GO 133-C service quality data is attached to the GO. Our goal is a uniform and consistent reporting format so that the data to be published will be reliable, will be consistently gathered, and will be posted in a format that is consumer friendly and provides meaningful comparisons, such that apples are being compared to apples and oranges to oranges. Any publishing of service quality data will be consistent with the Commission's commitment to accessibility.196
4.5. Major Service Interruption Reporting
A service interruption is major under current standards if there is a complete loss of inward and/or outward calling capability from a central office for periods in excess of 30 minutes (carriers with fewer than 10,000 primary stations) or 10 minutes (carriers with 10,000 or more primary stations).
The Commission currently requires MSI reporting. To date, however, the Commission's requirements have not been formalized in a general order or decision. Rather, the Commission's requirements and guidelines are the result of a 1977 Communications Division memo, and they do not apply to all carriers.197 The FCC has a more formalized reporting scheme, as adopted in FCC 05-46 Report and Order (February 25, 2005). The FCC requires all voice providers, including wireless, to report to the FCC under NORS all outages that last at least 30 minutes and potentially affect at least 900,000 user minutes. These reports must be filed within two hours of discovering the outage. A more detailed initial report must be filed within 72 hours, and a final report must be filed within 30 days.
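The FCC's user-minute trigger is the product of the number of users potentially affected and the outage duration in minutes; for example, an outage affecting 30,000 users for 30 minutes equals exactly 900,000 user minutes. A minimal sketch of that two-part test follows (the function name and figures are illustrative assumptions, not FCC-prescribed):

```python
# Illustrative test of the FCC NORS reporting trigger: an outage is
# reportable if it lasts at least 30 minutes AND potentially affects
# at least 900,000 user minutes (users affected x duration in minutes).
def nors_reportable(users_affected: int, duration_minutes: float) -> bool:
    user_minutes = users_affected * duration_minutes
    return duration_minutes >= 30 and user_minutes >= 900_000

print(nors_reportable(30_000, 30))   # True: exactly 900,000 user minutes
print(nors_reportable(5_000, 45))    # False: 225,000 user minutes, below trigger
print(nors_reportable(100_000, 20))  # False: 2,000,000 user minutes, but under 30 minutes
```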
DRA recommends the Commission require both types of outage reporting, because the FCC outage reports do not include all MSIs in California, only the most severe outages.198 DRA also suggests requiring both wireless and wireline carriers to prepare a report similar to the annual report all eligible telecommunications carriers (ETCs) submit to the FCC on outages affecting 10% or more of customers.199 DRA asserts this report would permit year-to-year comparisons and would make it easier for carriers to qualify as ETCs for federal high cost funding. TURN supports inclusion of all carriers, wireline and wireless, in reporting major service interruptions.200 Verizon recommends maintaining the existing Commission and FCC reporting requirements.201
The GRC ILECs, SureWest, and Frontier recommend conforming the Commission's reporting requirements to the FCC's.202 SureWest and Frontier would continue to report pursuant to the Commission's MSI requirement, but prefer the FCC's reporting scheme.203 AT&T believes it would be consistent with the Commission's approach in URF to simply require submittal of the FCC report.204
CTIA reports that the Department of Homeland Security (DHS) supported sharing outage information reported to the FCC with state public utility commissions rather than requiring separate filings at the state level. Since much of the outage information is homeland security information, DHS noted that sharing the information with state authorities would more effectively safeguard sensitive information.205 Similarly, CTIA asserts that providing ETC information would afford no benefit to wireless carriers since only one is an ETC in California and would present confidentiality concerns.206
In determining whether to continue requiring MSI reporting and, if so, in what form, we are guided by the same reasons that led the FCC to extend its mandatory reporting of outage information in In the Matter of New Part 4 of the Commission's Rules Concerning Disruptions to Communications ("Rules Concerning Disruptions Order"), ET Docket No. 04-35, FCC 04-188, 19 FCC Rcd 16830; 2004 FCC LEXIS 4658. Specifically, the FCC recognized the "critical need for rapid, complete, and accurate information on service disruptions that could affect homeland security, public health or safety, and the economic well-being of our Nation, especially in view of the increasing importance of non-wireline communications in the Nation's communications networks and critical infrastructure."207
The receipt of MSI information is no less critical to state regulatory commissions than it is to the FCC. Information concerning functioning telecommunications systems is imperative in emergencies and in connection with homeland security functions. Nevertheless, we generally favor streamlined reporting consistent with the policies we adopted in URF and recognize that the outage information currently provided to this Commission may be more extensive or difficult for carriers to provide than the information provided to the FCC.
Balancing the Commission's need for robust service outage reporting against a policy favoring streamlined reporting requirements wherever possible, we determine that we can achieve both objectives by conforming our reporting requirements to the FCC's. Our preferred method of obtaining the FCC NORS data would be password-protected access to the FCC's NORS database for California-only outage/disruption data. In CC Docket No. 99-200, the Commission gained similar access to another FCC database containing confidential carrier-specific numbering data maintained by the North American Numbering Plan Administrator (NANPA).208
To date, however, we have been unable to obtain access to the FCC's NORS database. Consistent with our preference for obtaining NORS data directly from the FCC, we direct staff to submit a formal request to the FCC for password-protected access to all California-specific NORS data.209 Until such access is granted, we will require all facilities-based certificated and registered carriers to furnish to the Communications Division and DRA, in a written report, the information electronically submitted to the FCC under NORS when California service is affected, regardless of whether the California outage independently would meet the FCC's significant disruption and outage reporting threshold.210 Concurrent reporting under NORS will begin with the issuance of this decision.
In requiring NORS information, we recognize that MSI data has critical utility infrastructure implications. Consistent with the FCC's treatment of NORS data, we will afford the information confidential treatment pursuant to the Commission's well-established protections under Pub. Util. Code § 583 and GO 66-C. Carriers should designate "Section 583" on the reported data.211 Adopting the FCC's reporting requirements should have the additional benefit of reducing carriers' costs of complying with MSI reporting.
Written reports normally are satisfactory, but where large numbers of customers are impacted or the impact is of great severity, carriers shall report promptly by telephone.
DRA proposes a new annual report comparable to the federal ETC report, but DRA has not established that it is necessary for the Commission to develop such a report. The FCC requires ETCs to submit an annual report providing detailed information on any outage lasting at least 30 minutes and potentially affecting 10% or more of their customers in a designated service area.212 However, with adoption of the FCC's NORS reporting requirements, requiring ETCs to concurrently file their FCC ETC report with this Commission is consistent with conforming our MSI reporting requirements to the FCC's and would provide broader outage information, particularly for smaller carriers. Smaller carriers' outages might not reach the NORS outage/disruption reporting threshold but probably would meet the 10% ETC report threshold. Access to the ETC report would significantly increase our knowledge of MSIs for those carriers that might not otherwise meet the NORS reporting threshold. Thus, we require ETCs to concurrently submit their annual FCC ETC report to the Communications Division and DRA. We acknowledge that the information contained in the ETC report will include confidential information provided under NORS for some carriers. Thus, the confidentiality provisions under § 583 and GO 66-C for concurrent NORS reporting also apply to the annual FCC ETC report.
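The small-carrier point can be illustrated with hypothetical figures (ours, not the record's): a complete 40-minute outage at a carrier with 20,000 customers produces only 800,000 user minutes, below the NORS trigger, yet affects 100% of customers, far above the ETC report's 10% threshold. A sketch:

```python
# Illustrative comparison of the NORS and ETC reporting triggers for a
# small carrier. Figures are hypothetical, chosen to show how an outage
# can fall below the NORS user-minute trigger yet exceed the ETC 10% test.
customers_total = 20_000
customers_affected = 20_000      # complete outage
duration_minutes = 40

user_minutes = customers_affected * duration_minutes        # 800,000
nors_triggered = duration_minutes >= 30 and user_minutes >= 900_000
etc_triggered = duration_minutes >= 30 and customers_affected / customers_total >= 0.10

print(f"user minutes = {user_minutes:,}; NORS report required: {nors_triggered}")
print(f"share affected = {customers_affected / customers_total:.0%}; "
      f"ETC report required: {etc_triggered}")
# NORS: False (800,000 < 900,000); ETC: True (100% >= 10%)
```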
4.6. Service Quality Monitoring
At issue here is whether the Commission should monitor service quality by requiring carriers to file federal ARMIS and MCOT reports with this Commission, and whether we should continue or eliminate certain existing Commission state-specific reporting. This issue arose in the URF proceeding and was referred to this proceeding by the modified URF Phase I Decision.213 Pursuant to the assigned Commissioner's rulings of November 9 and 16 in that proceeding, URF carriers filed, on November 21, 2006, lists of all reports filed at the Commission and the FCC. The discussion that follows is based on the lists included in these filings.214
ARMIS was created by the FCC in 1987 and now consists of ten public reports covering information regarding finances, operations, service quality, customer satisfaction, switch downtime, infrastructure, and usage.215 URF ILECs and some small LECs file ARMIS service quality data.216 However, ARMIS reporting is not required of non-URF ILECs, CLECs, and non-wireline carriers (wireless, cable). In this rulemaking, the Commission is focusing on ARMIS service quality reporting in Report 43-05.
MCOT reports were imposed as merger conditions by the FCC in the 2000 Bell Atlantic/GTE and 1999 SBC/Ameritech mergers.217 The MCOT reports include information regarding installation for basic service, access line for basic service, repair for basic service, customer complaints, and answer time performance.218 In California, MCOT reporting applies only to Verizon and AT&T affiliated ILECs. The FCC MCOT reports were only required for a limited period of time and ceased in approximately 2002.
TURN supports continued reporting of MCOT data by AT&T and Verizon if TURN's metrics are not adopted.219 TURN argues that MCOT reports contain carefully considered indicators. If TURN's indicators are adopted, TURN sees no requirement for continued MCOT reporting. TURN recommends the Commission continue to monitor the California-specific indicators from ARMIS for URF ILECs, and that other carriers should report TURN's metrics.220
Verizon argues MCOT reporting should be eliminated as it is outdated, ILEC-centric, and not competitively or technologically neutral. In Verizon's view, such reporting cannot replicate the dynamic price/quality preferences individuals have in a competitive marketplace, thus it risks distorting competition, to the detriment of consumers.221 While Verizon believes ARMIS reporting suffers from the same problems, it would favor Commission monitoring of ARMIS if the Commission remains interested in monitoring ILEC legacy service quality metrics.222 AT&T recommends eliminating MCOT reporting for similar reasons, stating the measures imposed by MCOT have little value to consumers in choosing among competitive alternatives, and have inherent costs.223
In determining whether to continue or eliminate MCOT reporting, we look to whether its purpose is still relevant. Specifically, its underlying rationale was to monitor service quality post-merger. It has been over eight years since the respective mergers, and we note that the FCC discontinued MCOT reporting in 2002. Accordingly, it is reasonable to conclude that the immediate concerns which triggered the MCOT requirements no longer apply. We are aware that in 2003 this Commission directed Verizon and AT&T to continue providing MCOT data pending further notice.224 However, that determination predated our considerations regarding competition in URF.
We agree with Verizon and AT&T that the MCOT reports are outdated and are inconsistent with the Commission's goal of more uniform and neutral reporting requirements. And although some parties encourage us to leave MCOT reporting in place, no evidence was offered which would compel such a result. For these reasons, we will discontinue MCOT reporting. Discontinuing MCOT reporting will result in cost savings to Verizon and AT&T.
With respect to ARMIS service quality reports, we previously noted the FCC's pending rulemaking to consider issues related to continuation and scope of those reports.225 As recently determined by the FCC, such reporting shall continue for 24 months while the FCC evaluates whether ARMIS-like reporting should be developed for different classes of carriers. Pending the FCC's consideration of this issue, carriers currently required to file ARMIS service quality data with the FCC in Report 43-05 will continue to furnish California-specific service quality data to this Commission until September 6, 2010. Carriers should submit this data at the same time it is filed with the FCC.
If the FCC determines that service quality data should be furnished by different classes of carriers in Report 43-05 or a successor report, those carriers shall compile and furnish California-specific service quality data to the Commission at the same time, consistent with the practice for ARMIS reporting. The Director of the Communications Division may provide instructions to the carriers on how to furnish that data, as necessary. If the FCC reverses its tentative determination that such data should be reported by all classes of carriers, we shall require the currently reporting URF ILECs to continue to file California-specific ARMIS service quality data in Report 43-05 with the Commission through December 31, 2011. If parties believe the Commission should continue to require such reporting beyond that date, they should file a petition for rulemaking with this Commission requesting consideration of continued reporting requirements.226
Carriers currently file with the Commission certain service quality monitoring reports not otherwise discussed in this decision.227 These are: (1) GO 152 service measures for private line alarm service; (2) the subscriber complaint report;228 and (3) complaint response for general/disability telephone-related issues. Not all carriers are required to file all of the reports, although AT&T and Verizon file most of them.229
No party proposes elimination of these monitoring reports, and parties did not comment on these reports in this proceeding. The GO 152 private line alarm service measures, revised in D.88-11-018, include alarm held orders, installation due date, service trouble report, and repair responses.230 The subscriber complaint report, adopted in D.00-03-020 and modified in D.00-11-015 pursuant to Pub. Util. Code § 2889.9(d), requires billing telephone companies to track and report billing disputes concerning cramming by third parties.231 The complaint response for general and disability telephone-related issues is required by the FCC and is reported to the Commission by AT&T. It addresses how these complaints are resolved.
By this Decision, we have fulfilled the directive of the URF Phase 1 Decision, [D.06-08-030, as modified by D.06-12-044], supra, to consider service quality monitoring reports in this proceeding. We affirm that this Decision does not alter the existing reporting requirements of the three aforementioned reports. Carriers that currently submit these reports should continue to do so.
4.7. Wireless Coverage Maps
We next address the issue of whether the Commission should require wireless carriers to provide coverage maps. Currently, there is no such requirement, although we are aware that many carriers already provide such information on their websites and at their retail locations, consistent with voluntary compliance agreements reached with the Attorneys General of several states.232
DRA recommends that wireless carriers provide detailed street coverage and service maps as compiled by wireless carriers.233 DRA suggests such maps should be provided at the point of sale, and should show areas of weak and strong reception.234 TURN also supports street level service coverage maps.235
Wireless carriers generally oppose such a requirement. CTIA comments that detailed coverage maps are used to tune and retune the cellular radios that comprise the carriers' networks; they are not intended to assure customers that a particular call will go through, given the many factors that affect call completion, such as network congestion, geographic features, and weather.236 SprintNextel states that detailed wireless service coverage information is available on carriers' public websites and that customers have the opportunity to terminate service within 30 days of signing a service contract without incurring an early termination fee.237 SprintNextel illustrates how detailed coverage information is available on its own website.238 AT&T notes that the Commission's CalPhoneInfo initiative suggests customers test their phone and its features during the carrier's trial period.239
DRA also notes that many wireless carriers have entered into an agreement of voluntary compliance with Attorneys General from several states.240 The agreement provides that carriers implement procedures during a sales transaction at a retail location and on their websites to provide maps depicting approximate wireless service coverage. These maps would depict approximate outdoor coverage based on signal strength and signal strength confidence levels under normal operating conditions. California has not entered into the agreement and DRA states the agreement does not achieve the information disclosure needed by California consumers due to the lack of a common metric.241 The agreement specifically provides that:
Carrier shall implement procedures to provide during a Sales Transaction at its retail locations, and provide on its website, maps depicting approximate Wireless Service coverage applicable to the Wireless Service rate plan(s) being sold. The maps will be at Carrier's retail locations in printed materials that Consumers may take with them and on Carrier's website as electronic documents that Consumers may print out. The maps will be generated using predictive modeling and mapping techniques commonly used by radio frequency engineers in the wireless service industry to depict approximate outdoor coverage, based on then-appropriate signal strength for the applicable wireless technology and signal strength confidence levels under normal operating conditions on Carrier's network, factoring in topographical conditions, and subject to variables that impact radio service generally. All such maps will include a clear and conspicuous disclosure of material limitations in Wireless Service coverage depiction and Wireless Service availability. To assist Consumers in making comparisons among carriers, Carrier will make available to Consumers separate [sic] such maps depicting approximate Wireless Service coverage on a nationwide and regionwide basis as applicable to its Wireless Service rate plans that are currently offered to Consumers.242
Two variables influence a customer's ability to determine wireless coverage before obtaining service: the availability of adequate coverage maps, and the opportunity to view those maps online or at retail outlets. DRA's informal survey measured the availability of wireless coverage information as requested by sophisticated "customers" at retail outlets. DRA noted that no wireless carrier provided coverage maps at its store with the realism of available engineering maps, although one carrier provided that realism in its online map.243 Instead, wireless carriers provided printed maps with little detail on local coverage, although some were able to provide online maps with more specificity. A retail location reselling carriers' services was less helpful.244
We agree that whether wireless coverage is satisfactory in the area where a customer needs service is a primary component of the customer's satisfaction with that service.245 The ability to obtain expected coverage information, either online or at a retail location, assists a customer in purchasing wireless service, especially if that customer is an informed purchaser. And if customers do not know that detailed coverage information exists, they are dependent on salespeople to provide that information. Although the standard 30-day service trial period may serve as a backstop and may help a customer assess whether coverage is satisfactory, its usefulness depends on the customer's willingness and ability to travel to all areas of interest during the trial period. It does not, and should not, act as a substitute for carriers providing relevant coverage information so that customers can make an informed decision regarding their selection of wireless service at the time of purchase.
Although DRA is unimpressed with the type of voluntary agreements described above, we see such a commitment as a necessary starting point for customer access to coverage information. Our preference is to make California's requirements for coverage map disclosure consistent with the agreements adopted in other states, as that level of information most closely approximates a national standard. Consistent with the voluntary compliance agreements, we do not specifically require carriers to provide street level maps at their retail locations. We shall require wireless carriers to provide coverage maps depicting approximate wireless service coverage applicable to the wireless rate plans offered. These maps should be provided in a printable format on carriers' websites and in a printable or pre-printed format, which customers can take with them, at carriers' retail locations. We expect that coverage maps will show where wireless phone users may generally expect to receive signal strength adequate to place and receive calls when outdoors under normal operating conditions. All maps should include a clear and conspicuous disclosure of material limitations in wireless service coverage depiction and wireless service availability. We decline to specify that the detail provided conform to specific engineering standards.246 However, retail locations that are capable of accessing a carrier's website during a sales transaction should also communicate any additional relevant coverage information to a customer. Depending on the particular capability of an individual retail location, that may be done verbally, by allowing the customer to view the information, or by printing such information as practicable. Also consistent with the voluntary agreements, carrier representatives at retail locations shall implement procedures to make available during a sales transaction maps depicting approximate wireless service coverage applicable to the wireless service rate plan(s) being sold. When customers are able to obtain this coverage information, they will be able to make a more informed selection of a specific wireless carrier and wireless plan; the trial period will allow customers to test the accuracy of the information provided. These provisions governing wireless coverage maps will take effect 90 days after issuance of this decision.
4.8. AT&T's Out of Service Repair Interval Reporting
In 2001, the Commission resolved a complaint filed against AT&T regarding its residential repair interval performance.247 In that decision the Commission found, among other things, that the increase in AT&T's average number of hours to restore dial tone to residential customers between 1996-2000 violated the requirement under Pub. Util. Code § 451 to provide "adequate, efficient, just, and reasonable" service.248
DRA recommends continuing the AT&T OOS repair interval ARMIS reporting requirement since it was adopted in D.01-12-021 as a penalty for violating merger requirements and regulations.249 DRA reports that AT&T has had ongoing performance issues on OOS intervals and it is rare for any California ILEC to have worse OOS intervals for residential customers. AT&T counters by arguing these requirements are inconsistent with competitive parity since no other ILEC has the same requirement.250 AT&T argues that these requirements also are inconsistent with URF competitive parity requirements,251 and have outlived their usefulness because the merger commitment to maintain or improve service quality expired in 2002.252
The OOS repair interval reporting adopted in this Decision as part of the new GO 133-C is consistent with other states' requirements and is comparable to the requirement we set under D.01-12-021 (a new standard of 90% restored within a 24-hour repair interval, excluding Sundays and federal holidays, as opposed to the prior 29.3-hour requirement). As noted above, we will also require carriers that report ARMIS OOS repair interval data (including AT&T) to continue reporting such information until at least September 2010, and potentially through 2011.
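The exclusion of Sundays and federal holidays can be sketched with hypothetical timestamps. The helper below reflects one plausible reading of the adopted interval measure; the holiday set, dates, and function are our illustrative assumptions, not language from the GO or D.01-12-021:

```python
from datetime import datetime, timedelta

# Hypothetical illustration of the adopted OOS repair interval measure:
# 90% of out-of-service troubles restored within 24 hours, with hours
# falling on Sundays or federal holidays excluded from the measurement.
FEDERAL_HOLIDAYS = {datetime(2010, 7, 5).date()}  # illustrative subset

def measured_hours(reported: datetime, restored: datetime) -> float:
    """Count elapsed hours, skipping any hour on a Sunday or holiday."""
    hours, t = 0.0, reported
    while t < restored:
        step = min(timedelta(hours=1), restored - t)
        if t.weekday() != 6 and t.date() not in FEDERAL_HOLIDAYS:  # 6 = Sunday
            hours += step.total_seconds() / 3600
        t += step
    return hours

# Reported Saturday 18:00, restored Monday 06:00: 36 clock hours elapse,
# but only 12 measured hours remain once the intervening Sunday is
# excluded, so this ticket counts as restored within the 24-hour interval.
print(measured_hours(datetime(2010, 7, 10, 18), datetime(2010, 7, 12, 6)))  # 12.0
```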
The reporting required under GO 133-C in combination with ARMIS reporting should enable us to determine whether AT&T's repair service interval is adequate. Further, GO 133-C permits a staff investigation as the means to address any failure to achieve OOS repair interval service levels which may occur for six or more consecutive months. With the adoption of GO 133-C, AT&T will be held to the same standard as other URF carriers and its obligation to report under D.01-12-021 shall cease.
4.9. Parties' Additional Proposals
Parties presented separate proposals, as discussed below. We generally decline to adopt these additional proposals.
DRA recommends the Commission website display a service provider report card to show the performance results of each carrier on the adopted measures, arguing that this information would assist customers in choosing a provider. As discussed above, we support the publishing of reported data once a uniform and consistent reporting format has been developed for that purpose.
DRA recommends the Commission require remedial actions for carriers with two or more reported measures below the adopted standards in one year, or with results below the reported industry average on any one measure for two years in a row. As a first remedial action, the carrier would be required to meet with the Communications Division to present proposals on improving performance. If poor performance continues during the following three months, as a second remedial action the Communications Division may impose monthly reporting requirements.
We agree that in order to be effective and meaningful, there should be certain ramifications for failure to meet the service quality standards we adopt today. Authorizing staff to undertake the above actions improves the efficiency of the Commission's processes and helps ensure compliance with our orders and requirements. Staff may also recommend the Commission institute a formal investigation into a carrier's performance and alleged failure to meet the reporting service level for six or more consecutive months. These remedies are not intended to apply to the provisions governing wireless coverage maps.
TURN recommends the Commission require service guarantees so that carriers would be required to compensate customers when commitments are not met for appointments, installation of primary lines, and restoring service.253 We decline to impose service guarantees at this time as a remedy for carriers' failure to meet service quality standards. Service guarantees are not currently required and this record does not establish that carriers generally fail to comply with existing GO 133-B standards, necessitating the imposition of service guarantees.254
28 A telephone corporation, as defined under Pub. Util. Code § 234(a), includes every corporation or person owning, controlling, operating, or managing any telephone line for compensation within this state. A telephone corporation, for purposes of applying service quality standards under the Commission's general orders, includes every certificated or registered carrier. In deference to the FCC's pending rulemaking regarding VoIP and other IP-enabled services, this Commission has not adopted any final decision regarding the regulatory treatment of these services. (See Order Instituting Investigation on the Commission's Own Motion to Determine the Extent to Which the Public Utility Telephone Service Known as Voice over Internet Protocol Should be Exempted from Regulatory Requirements [D.06-06-010] (2006) __Cal.P.U.C.3d __, at p. 3 (slip op.).) Should the FCC define the role of state commissions over VoIP, the Commission will determine the applicability of its service quality standards at that time.
29 2007 ACR, at p. 3.
30 URF Phase 1 Decision, supra, [D.06-08-030], at p. 33 (slip op.).
31 See e.g., TURN 2007 Comments, at p. 15.
32 2007 ACR, at p. 4.
33 See http://www.jdpower.com/telecom/ratings/wireless/index.asp.
34 See http://www.consumerreports.org. A subscription is required to access this information.
35 See http://www.pcmag.com.
36 See http://www.checkbook.org/. A subscription is required to access this information.
37 See http://www.mindwireless.com/index.
38 See http://www.mountainwireless.com/.
39 See http://www.hraunfoss.fcc.gov/edocs.
40 See http://www.bbb.org/about/stat2006/us06indusort.pdf.
41 The FCC's ARMIS customer satisfaction surveys are conducted by reporting carriers and not by FCC staff or independent third parties.
42 See Memorandum Opinion and Order and Notice of Proposed Rulemaking, In the Matter of Service Quality, Customer Satisfaction, Infrastructure and Operating Data Gathering, et al., WC Docket No. 08-190 et al., ¶¶ 12, 35, released September 6, 2008 (FCC's Service Quality Opinion). The FCC tentatively concluded that ARMIS customer satisfaction reporting should continue for at least 24 months from the effective date of its order.
43 Verizon 2007 Comments, Hernandez Declaration, Exhibit A, at p. 4.
44 DRA 2007 Comments, at p. 6 n.5.
45 DRA 2007 Reply Comments, at p. 6 citing to D.06-08-030, supra, at pp. 32, 179 (slip op.).
46 TURN 2007 Comments, at pp. 16-17.
47 DOD/FEA's 2007 Reply Comments, at pp. 5-9, 12.
48 DisabRA's 2007 Comments, at pp. 4, 6.
49 AT&T 2007 Comments, at pp. 7-9; Frontier 2007 Comments, at pp. 3-4.
50 AT&T 2007 Comments, at pp. 8-9.
51 Id. at p. 8.
52 Small LECs 2007 Comments, at p. 3.
53 Joint Parties 2007 Comments, at pp. 4-5.
54 Id. at p. 6.
55 Cbeyond 2007 Comments, at pp. 1-4.
56 CTIA 2007 Comments, at pp. 2-7; Verizon 2007 Comments, at pp. 3-4.
57 CTIA 2007 Comments, at p. 2.
58 T-Mobile 2007 Reply Comments, at p. 6.
59 FCC Service Quality Opinion, supra at ¶ 35.
60 Id.
61 The service measures under GO 133-B are: held primary service orders; installation-line energizing commitments; customer trouble reports; dial tone speed; dial service; toll operator answering time; directory assistance operator answering time; trouble report service answering time; and business office answering time.
62 Some parties' positions on the need for service quality measures changed from 2003 to 2007.
63 In 2003, both AT&T and Verizon endorsed minimum standards comparable to the standards adopted in this decision. AT&T and Verizon no longer support minimum standards for URF carriers.
64 DRA 2007 Comments, at p. 21; DRA 2007 Reply Comments, at p. 11.
65 The standard would be 80% in 20 seconds.
66 The standard would be six per 100 lines with no differentiation between initial and repeat.
67 The proposed standard is 95%.
68 The proposed standard is five days for basic service orders only.
69 The proposed standard is 25 hours.
70 The proposed standard is 17%.
71 DRA 2007 Reply Comments, at p. 11.
72 DRA 2007 Comments, at pp. 2-3.
73 Id. at pp. 18-19.
74 See D.03-10-088.
75 DRA 2007 Comments, at p. 13.
76 DRA Reply Comments, at p. 10.
77 TURN 2007 Comments, at p. 11.
78 The proposed standard is maximum three days for basic service orders only.
79 The proposed standard is maximum 36 hours with no differentiation between initial or repeat.
80 The proposed standard is 60 seconds. The measure must be combined with the option on the company's answering menu to speak with a live agent after no more than 45 seconds of menu choices. TURN acknowledges that many issues can now be resolved by a customer's choice of menu options. However, more complex problems require a representative. (TURN 2007 Comments, at p. 9.)
81 TURN argues that while the level of actual complaints does not represent the true level of problems, this data presents real issues that customers face. (TURN 2007 Comments, at p. 10.)
82 TURN 2007 Comments, at p. 11.
83 TURN 2007 Comments, at pp. 7-11.
84 Id. at p. 5.
85 TURN Reply Comments at pp. 5-6, citing AT&T 2003 Comments, Appendix 3, at p. 20.
86 AARP 2003 Comments, at p. 6.
87 Allegiance 2003 Reply Comments, at pp. 8-13.
88 DisabRA 2007 Reply Comments, at pp. 1-2.
89 DOD/FEA 2007 Comments, at pp. 12-13.
90 NCLC 2003 Comments, at p. 18.
91 CSBR/CSBA 2003 Comments, at p. 3. In addition, CSBR/CSBA asserted small businesses value carriers' prompt correction of billing problems and keeping promises. Id.
92 AT&T 2007 Reply Comments, at pp. 13 n.60, 15, 16, 17.
93 Verizon 2007 Comments, at pp 1-3; AT&T 2007 Comments, at pp. 1-4.
94 Verizon 2007 Comments, at p. 2.
95 AT&T 2007 Comments, at pp. 2, 11-15.
96 AT&T 2007 Comments, at p. 2.
97 SureWest 2007 Comments, at pp. 1-4.
98 Frontier 2007 Comments, at pp. 1-6.
99 Joint Parties 2007 Comments, at p. 9.
100 CALTEL 2007 Reply Comments, at pp. 4-5.
101 CALTEL 2007 Reply Comments, at pp. 5-6.
102 Cbeyond 2007 Comments, at pp. 1-3.
103 VON 2007 Reply Comments, at p. 4.
104 Small LECs 2007 Comments, at pp. 1-3.
105 Id.
106 Id.
107 Id. and Small LEC 2007 Reply Comments, at p. 3.
108 Id. at pp. 3-4.
109 Answer time reporting shall be limited to traffic offices with 10,000 or more lines.
110 Answer time reporting shall be limited to traffic offices with 10,000 or more lines.
111 Order Instituting Rulemaking on the Commission's own Motion to Establish Consumer Protection Rules Applicable to All Telecommunications Utilities [D.06-12-042] (2006) __ Cal.P.U.C.3d __, at pp. 17-18 (slip op.).
112 DRA 2003 Comments, at p. 10.
113 TURN 2003 Comments, at pp. 16-17.
114 Cox 2003 Comments, at p. 15.
115 TURN 2003 Comments, at pp. 16-17.
116 DRA 2007 Reply Comments, at p. 10.
117 Id.
118 Cbeyond 2007 Comments, at pp. 1-2.
119 Id.
120 CALTEL 2007 Reply comments, at pp. 4-5.
121 DRA 2007 Reply Comments, at p. 11.
122 DOD/FEA 2007 Reply Comments, at p. 11. AT&T does not report disaggregated data for large business customers whereas Verizon does.
123 However, ARMIS permits carriers to define small and large business customers for reporting customer satisfaction survey data per ARMIS Report 43-06. ( http://www.fcc.gov/wcb/armis/instructions/2008/definitions05.htm#T2C.)
124 See New York Public Service Commission Notice of Issuance of Uniform Measurement Guidelines, p. 16, http://www3.dps.state.ny.us/pscweb/WebFileRoom.nsf/ArticlesByCategory/F1F99E0C9A229CC685256DF1004CC36D/$File/doc8602.pdf?OpenElement. See also Michigan Telecommunications Service Quality Rules, R 484.520(1)(w) (small business defined as having three or fewer access lines), In the Matter, on the Commission's own motion, to revise the service quality rules applicable to telecommunications providers, 2007 Mich. PSC LEXIS 276, Exhibit A *51; Ohio Furnishing of Intrastate Telecommunications Service by Local Exchange Companies, OAC 4901:1-05-01 (FF) (small business defined as having three local exchange service access lines or less).
125 See Michigan R 484.558(1), supra; New York Public Service Commission Notice of Uniform Measurement Guidelines, supra, at p. 16.
126 Carriers that do not currently report this measure under ARMIS could incur additional costs to establish reporting.
127 AT&T 2003 Comments, Attachment 2, at p. 10. AT&T's labor costs were filed under seal. Although AT&T's estimate does not necessarily have general applicability to other carriers, it is useful to assess a range of costs from low to high, even for measures that AT&T is exempt from reporting.
128 Coalition 2003 Comments, at p. 29.
129 NCLC 2003 Comments, at pp. 7-8.
130 Id. at pp. 9-11.
131 DRA 2007 Reply Comments, at p. 10.
132 Id.
133 AT&T 2003 Comments, Attachment 2, at p. 7.
134 DRA 2007 Comments, p. 9.
135 DRA 2007 Reply Comments, at pp. 9-10.
136 DRA 2007 Comments, at p. 9.
137 TURN 2003 Comments, at p. 17.
138 TURN 2007 Comments, at p. 10.
139 Small LECs Comments, at p. 3.
140 See DRA 2007 Comments, p. 14; New York Public Service Commission Notice of Issuance of Uniform Measurement Guidelines, supra, at pp. 4-5.
141 DRA 2003 Comments, at p. 15.
142 AT&T 2003 Comments, Attachment 2, at p. 21.
143 TURN 2007 Comments, at p. 9 (also referencing ARMIS 43-05, rows 144, 145, 148, and 149).
144 DRA 2007 Reply Comments, at p. 10.
145 Chapter 26 of the Texas Administrative Code, Title 16, Part II, § 26.54(c). (See http://www.puc.state.tx.us/rules/surules/telecom/26.54/26.54.doc.)
146 83 Ill. Adm. Code 730.535.
147 Id.
148 DRA 2007 Reply Comments, at p. 10. The adopted reporting measure may result in some carriers needing to make certain adjustments in systems and/or procedures. It is the intent that the GO 133-C effective date of January 1, 2010 will afford carriers time to make any necessary adjustments. Should reporting data suggest problems, particularly during the initial 6 to 12 months of reporting, Commission staff and a given carrier should meet and confer as contemplated under Section 4.9.2 of this Decision.
149 A catastrophic event is any event in the reporting carrier's service area for which there is a declaration of a state of emergency by a federal or state authority. A widespread service outage is an outage affecting at least 3% of the carrier's customers in the state. The reporting carrier shall provide supporting information on why the month should be excluded and work papers that show the date(s) of the catastrophic event or widespread outage and how the adjusted figure was calculated. These definitions and reporting requirements are based on D.01-12-021. See D.01-12-021, at p. 40 n.38 and n.39, Ordering Paragraph 9 (slip op.).
150 Id.
151 AT&T 2003 Comments, Attachment 2, at p. 26.
152 DRA 2007 Comments, at p. 2.
153 Small LECs 2007 Reply Comments, at p. 4.
154 Small LECs 2007 Comments, at p. 3.
155 AT&T 2003 Reply Comments, at p. 13, n.60.
156 AT&T 2003 Comments, Attachment 2, at p. 41.
157 Coalition 2003 Comments, at p. 26.
158 Id. at p. 24.
159 SureWest 2007 Comments, at pp. 1-4.
160 See, e.g., Michigan Public Service Commission, Rule 61(c) and (d); OAC Ann. 4901:1-5-03 (A)(2) (Ohio).
161 Verizon 2007 Reply Comments, Fernandez Declaration, at p. 8.
162 TURN 2007 Comments, at p. 9.
163 Id. at pp. 9-10.
164 AARP 2003 Comments, at p. 10.
165 See Ohio Administrative Code Ann. 4901:1-5-03.
166 AARP 2003 Comments, at p. 10.
167 Verizon 2003 Reply Comments, at pp. 14-15.
168 OAC Ann. 4901:1-5-03(A)(2).
169 AT&T 2003 Comments, Attachment 2, at p. 40.
170 TURN notes that a Southern California Edison Company survey showed that customer dissatisfaction increases with a 60-second average response time and significantly increases with a three-minute average response time. (TURN 2007 Comments, at p. 10, n.6.)
171 See New York Public Service Commission Notice of Issuance of Uniform Measurement Guidelines, supra, at pp. 21-22.
172 AT&T 2003 Comments, Attachment 2, at pp. 40, 41, 43, 44, 45.
173 AT&T 2003 Reply Comments, at p. 18, n.81 (Koester Declaration, ¶¶ 2-3).
174 Coalition 2003 Comments, at p. 26.
175 We decline to extend this size-based exemption to the GRC ILECs. Trouble report and business office answer time measures currently apply only to centralized service groups which support 10,000 or more lines. We continue to limit reporting units.
176 A facilities-based carrier is a local exchange carrier that uses facilities it owns, operates, manages, or controls to provide service, including partially or totally owning, operating, managing or controlling such facilities. A local exchange carrier providing service solely by resale of the ILEC's local exchange services is not a facilities-based carrier.
177 OIR, at pp. 24, 25, 26.
178 Id. at p. 26.
179 CPSD 2003 Comments, at pp. 4-5.
180 Verizon Wireless 2007 Comments, at p. 1.
181 Id. at p. 2. See also AT&T Wireless 2003 Comments, at p. 18.
182 CTIA 2007 Comments, at pp. 2-5.
183 DOD/FEA 2007 Reply Comments, at p. 14.
184 TURN 2007 Comments, at pp. 11-14; DRA 2007 Reply Comments, at pp. 12-13.
185 TURN 2007 Comments, at pp. 11-14. Call-success rate would measure the number of successful calls established over the total number of call attempts. Service coverage would measure the network's ability in achieving a signal strength of -100 dBm or better during the mobile call holding period. Call drop-out would measure the unintended disconnection of mobile calls by the network during a 100-second call holding period for each call.
186 Id. at p. 14.
187 CTIA 2007 Reply Comments, p. 9.
188 DRA 2007 Reply Comments, at p. 13.
189 DisabRA 2007 Comments, at p. 3.
190 See ante, fn. 24.
191 Newton's Telecom Dictionary defines reseller as "A company that does not own its own transmission lines. It buys lines from other carriers and then resells them to subscribers." We decline to adopt a different definition of reseller for purposes of an exemption from reporting service quality measures.
192 AARP 2003 Comments, at p. 3.
193 See issue raised in the Assigned Commissioner and Administrative Law Judge's Ruling Denying in Part and Granting in Part Motion to Suspend, dated March 7, 2003, at p. 1.
194 AT&T supported publication as an alternative for all but a few measures related to consumer health and safety. (AT&T 2003 Comments, at p. 16.) DRA supported publishing as an adjunct to adopting minimum service quality standards. (DRA 2003 Comments, at p. 6.) DOD/FEA stated that customers need access to such comparative service quality data. (DOD/FEA 2003 Comments, at p. 2.) Working Assets supported standards for reporting adopted measures. (Working Assets 2003 Comments, at p. 9.) TURN supported comparisons of carrier performance. (TURN 2003 Comments, at p. 25.) CSBRT/CSBA supported access to service quality information in a customer-friendly format to inform purchasing decisions. (CSBRT/CSBA 2003 Comments, at p. 6.) Cox supported publishing service quality information for all carriers. (Cox 2003 Comments, at p. 20.) Verizon supported publishing a narrow range of measures. (Verizon 2003 Comments, at p. 21.) Frontier and the GRC LECs supported publishing for carriers that face competition. (Frontier 2003 Comments, at p. 12; Small LECs' 2003 Comments, at p. 11.) SureWest supported publishing only for carriers that have documented service quality problems. (SureWest 2003 Comments, at p. 12.)
195 CPSD questioned the value of posting data. (CPSD 2003 Comments, at p. 27.) See also AT&T/ASI 2003 Comments, at p. 9; CCAC 2003 Comments, at pp. 6-7; T-Mobile 2003 Comments, at pp. 14-15; and Cingular Wireless 2003 Comments, at pp. 2-3.
196 As noted on the Commission's website, "[t]his State of California website has been developed in compliance with California Government Code 11135, located in Section D of the California Government Code. Code 11135 requires that all electronic and information technology developed or purchased by the State of California Government is accessible to people with disabilities. There are various types of physical disabilities that impact user interaction on the web. Vision loss, hearing loss, limited manual dexterity, and cognitive disabilities are examples, with each having different means by which to access electronic information effectively. Our goal is to provide a good web experience for all visitors."
197 October 5, 1977 memo and attached MSI Report Form from Ermet Macario, Acting Chief - Surveillance Branch, Communications Division to all telephone utilities.
198 DRA 2007 Reply Comments, at p. 11.
199 DRA 2007 Comments, at p. 18.
200 TURN 2007 Comments, at pp. 19-20.
201 Verizon 2007 Comments, at p. 12.
202 Small LECs 2007 Comments, at p. 3.
203 SureWest 2007 Comments, at p. 6; Frontier 2007 Comments, at p. 4.
204 AT&T 2007 Reply Comments, at p. 21.
205 CTIA's 2007 Reply Comments, pp. 11-12. CTIA notes adoption of FCC reporting requirements is consistent with the reliance on FCC ARMIS reporting under the URF Phase I Decision, [D.06-08-030], supra, at p. 217. The Network Outage Reporting System (NORS) is the Public Safety and Homeland Security Bureau's Internet-based system for filing reports of telecommunication service disruptions pursuant to Part 4 of the FCC's rules. The system facilitates the filing of required Notifications, Initial Reports and Final Reports. The information on service disruptions is essential to maintain and improve the reliability and security of the telecommunications infrastructure. http://www.fcc.gov/pshs/services/cip/nors.html
206 Id. at 12.
207 Rules Concerning Disruptions Order, supra, 19 FCC Rcd 16830 ¶¶ 1, 10-13.
208 In the Matter of Numbering Resource Optimization, Second Report and Order, CC Docket No. 99-200, Release Number: FCC 00-429 ¶¶ 116-123; see also In the Matter of Number Resource Optimization, Third Report and Order, CC Docket No. 99-200, Release Number: FCC 01-362 ¶¶ 133-138.
209 It is our hope that access to the NORS database will be obtainable within nine months. However, should our request be denied or still pending at that time, we will reopen the proceeding for the limited purpose of seeking comment on whether interconnected VoIP providers (including cable) should submit California-specific NORS data to this Commission.
210 If in the future the Commission is granted access to the FCC's NORS database, carriers may petition the Commission to modify our decision and GO 133-C to eliminate the requirement for separate California reporting of NORS data.
211 We also note relevant confidentiality protection provided for critical infrastructure information under the California Public Records Act. (See e.g., Cal. Govt. Code, § 6254 (k), (aa), (ab).)
212 In the Matter of Federal-State Joint Board on Universal Service, Report and Order, CC Docket No. 96-45, Release Number: FCC 05-46 ¶ 69 n. 194. The report must include: (1) the date and time of onset of the outage; (2) a brief description of the outage and its resolution; (3) the particular services affected; (4) the geographic areas affected by the outage; (5) steps taken to prevent a similar situation in the future; and (6) the number of customers affected. The FCC rejected the NORS reporting threshold as insufficient for purposes of determining ETC functionality during emergency situations because populations can vary.
213 See ante, fn. 17.
214 See Attachment 4.
215 See http://www.fcc.gov/wcb/armis/. Specific measures contained in the ARMIS tables can be found in Exhibit A of Verizon's 2007 Comments.
216 The small LECs in California that are required to file ARMIS service quality information in Report 43-05 are Verizon West Coast, Citizens-Golden State, and Citizens Tuolumne. See Attachment 4.
217 In the Application of GTE Corporation, Transferor, and Bell Atlantic Corporation, Transferee, For Consent to Transfer Control of Domestic and International Sections 214 and 310 Authorizations and Applications to Transfer Control of a Submarine Cable Landing License, CC Docket No. 98-184, Condition 51 (MEMORANDUM OPINION AND ORDER) FCC 00-211 (Adopted June 16, 2000); In re Applications of Ameritech Corp., Transferor, and SBC Communications Inc., Transferee, For Consent to Transfer Control of Corporations Holding Commission Licenses and Lines Pursuant to Sections 214 and 310(d) of the Communications Act and Parts 5, 22, 24, 25, 63, 90, 95, and 101 of the Commission's Rules, CC Docket No. 98-141, Appendix C, Condition XXIV, ¶ 62 (MEMORANDUM OPINION AND ORDER) FCC 99-279 (Adopted October 6, 1999).
218 Installation data includes installation orders and installation performance. Repair data includes trouble report volume, type, location, and performance. Answer time data includes calls attempted and completed for automated systems, calls abandoned, calls receiving a busy signal, average answer time, and the percentages of calls abandoned and calls receiving busy signals.
219 TURN's Comments, p. 19. TURN's proposed metrics on answer time, abandoned calls and calls receiving busy signals are based on MCOT.
220 Id. DRA generally supports monitoring but offers no specific recommendations.
221 Verizon 2007 Comments, at pp. 17-18; see also Aron Declaration, at p. 37.
222 Verizon 2007 Comments, at p. 19. MCOT reporting should be eliminated for all carriers subject to that reporting, including Verizon West Coast Inc.
223 AT&T 2007 Comments, at p. 15.
224 Order Instituting Investigation on the Commission's Own Motion to Assess and Revise the New Regulatory Framework for Pacific Bell and Verizon California Inc. ("Interim Opinion re Phase 2B Issues Service Quality of Pacific Bell and Verizon California") [D.03-10-008] (2003) __ Cal.P.U.C.3d __, at pp. 117-124 (slip op.).
225 See ante, fn. 36.
226 See Rule 6.3 of the Commission's Rules of Practice and Procedure and Pub. Util. Code § 1708.5. Parties may request continued reporting of some or all of California-specific ARMIS reporting.
227 The service quality monitoring reports discussed above are the GO 133-B service measures, the MSI report, and MCOT. Consistent with the adoption of minimum service quality measures, GO 133-B reporting is replaced with GO 133-C reporting. GO 133-C includes both service measures and the MSI report.
228 AT&T also files the business office referral cramming report, which also covers dial tone slamming.
229 The carriers that currently file service quality reports with the Commission and the FCC are listed in Attachment 4. Consistent with D.03-10-088, the AT&T and Verizon merger condition reports are referred to as MCOT reports. Verizon refers to its MCOT report as a NARUC report; thus, by eliminating MCOT reports, the Commission also eliminates Verizon's NARUC report.
230 AT&T, Verizon, SureWest, and Frontier submit this report.
231 AT&T, Verizon, SureWest, Frontier, the GRC ILECs, and CLECs submit this report.
232 AT&T 2007 Reply Comments, at p. 22.
233 DRA 2007 Reply Comments, at p. 3.
234 Id.
235 TURN 2007 Reply Comments, at p. 19.
236 CTIA 2007 Reply Comments, at p. 4.
237 SprintNextel's Reply Comments, at p. 3.
238 Id. at pp. 4-8.
239 AT&T 2007 Reply Comments, at p. 22.
240 DRA 2007 Wireless Coverage Comments, at p. 11.
241 Id.
242 http://www.nasuca.org/CINGULAR%20AVC%20FINAL%20VERSION.pdf
243 DRA 2007 Comments, Witteman Declaration ¶ 9.
244 See DRA 2007 Comments, Appendix A.
245 At least one 2008 study suggests there has been a drop in wireless retail sales customer satisfaction from 2006 levels. J.D. Power & Associates 2008 Wireless Retail Sales Satisfaction Study Volume 2, October 23, 2008. http://www.jdpower.com/telecom/articles/2008-Wireless-Retail-Sales-Satisfaction-Study
246 Any voluntary compliance agreement reached between the California Attorney General and any wireless carrier shall supersede any and all requirements concerning access to coverage maps contained in this decision.
247 The Office of Ratepayer Advocates v. Pacific Bell Telephone Company ("Opinion Granting Complaint, In Part") [D.01-12-021] (2001) __ Cal.P.U.C.3d __, at pp. 1, 56-57 (slip op.).
248 Id. at p. 1. As a result, AT&T was required to file ARMIS monthly reports for initial and repeat OOS repair intervals. In any year in which AT&T exceeds the initial repair interval of 29.3 hours or the repeat OOS repair interval of 39.4 hours, AT&T must pay a penalty of $300,000 for each month of that year in which it exceeded the standard. The standards were based on AT&T's performance in 1996. If there is a catastrophic event or widespread service outage in a particular month, AT&T may request exclusion of results for that month.
249 DRA 2007 Comments, at p. 22; DRA 2007 Reply Comments, at pp. 13-14.
250 AT&T 2007 Reply Comments, at p. 13.
251 AT&T 2003 Comments, at p. 15.
252 AT&T 2007 Reply Comments, at pp. 9-10.
253 TURN 2007 Comments, at pp. 14-15. TURN recommends a $30 credit if the four-hour appointment standard is not met, a $30 credit if a primary line is not installed within five days of receipt of the request, and a $10 credit for each day out of service beyond the first 24 hours.
254 The Commission currently does not have an OOS repair interval, so AT&T has not failed to comply with a GO 133-B standard.