Before the
FEDERAL COMMUNICATIONS COMMISSION
Washington, D.C. 20554
EVALUATION OF THE
UNITED STATES DEPARTMENT OF JUSTICE _______________________________________________________
Introduction and Summary
The United States Department of Justice ("the Department"), pursuant to Section 271(d)(2)(A) of the Telecommunications Act of 1996(1) ("the 1996 Act"), submits this evaluation of the joint application filed by BellSouth Corporation, BellSouth Communications, Inc., and BellSouth Long Distance, Inc. on October 2, 2001, to provide in-region, interLATA services in Georgia and Louisiana.
This joint application to the Federal Communications Commission ("FCC" or "Commission") is BellSouth's third application for the state of Louisiana and its first for the state of Georgia.(2) In the three years since the FCC's denial of BellSouth's Louisiana applications, BellSouth has made significant progress toward opening its local markets in Georgia and Louisiana to competition under the active guidance of the Georgia and Louisiana Public Service Commissions ("PSCs"). CLECs in Georgia, in particular, have made significant gains in serving business customers in urban areas using some or all of their own facilities. To a lesser extent, the same is true of CLECs operating in Louisiana. The Department has taken these signs of competitive progress into account in evaluating this application.
CLECs have been less successful in other parts of the local markets in Georgia and Louisiana. Commercial activity by means of the unbundled network element platform ("UNE-platform") and digital subscriber lines ("DSL") has been more limited. An array of CLECs have lodged credible complaints about the sufficiency of BellSouth's operations support systems ("OSS") and neither the reported performance data nor the results of the third-party OSS testing relied on in this application are sufficient to determine that these complaints are unfounded. In addition, BellSouth's reported performance measures appear to be unreliable in several significant respects.
In support of its application, BellSouth relies on undertakings to make future improvements in some of these areas. In this Evaluation, the Department notes several other areas that merit careful scrutiny by the Commission. Therefore, the Department is not in a position to support the application on the present record. The Department does not, however, foreclose the possibility that the Commission may be in a position to find that concerns in these areas have been adequately met prior to the conclusion of its review of BellSouth's application.
The Georgia and Louisiana PSCs have submitted extensive evaluations of BellSouth's performance based on the state filings, and both conclude that BellSouth has satisfied the competitive checklist requirements of section 271. The Louisiana PSC determined that it could address areas where BellSouth's performance fell below the stated benchmark through future proceedings.(3) Both Commissions also ordered BellSouth to implement a number of OSS upgrades within the next several months.(4) The Department also is concerned about the capabilities of BellSouth's OSS. Requiring BellSouth to prove nondiscriminatory access to its OSS before this application is granted is particularly important because its first successful filing may well serve as the benchmark for evaluation of its OSS in states regionwide.
Over the past several years, the Georgia PSC has conducted extensive proceedings concerning BellSouth's section 271 compliance. Throughout the process, the Georgia PSC has shown a genuine commitment to implementing market opening measures. It opened numerous dockets, conducted arbitrations, technical conferences, formal hearings, and collaborative workshops, and actively worked with BellSouth and numerous competing carriers to define the terms, conditions, and operational details necessary for the development of local competition in Georgia.(5) It also adopted permanent service quality performance measurements with retail analogues and benchmarks, and instituted a self-executing penalty plan.(6) The Georgia PSC also held technical conferences and informal workshops to address CLEC concerns about access to BellSouth's OSS,(7) which traditionally have been weak.(8) As a result of these proceedings, and the efforts of the PSC, BellSouth made a number of enhancements to its OSS.(9)
Recognizing that commercial experience had not fully demonstrated the operational readiness of BellSouth's OSS, the Georgia PSC also required BellSouth to engage in third-party testing of its OSS as well as its performance measures.(10) Although the Georgia KPMG test provides some evidence of the functionality and operability of BellSouth's OSS, the test has significant limitations. First, the Georgia test was limited in scope.(11) Although the Georgia PSC ultimately required some additional testing and other improvements, a number of key areas remained outside the parameters of the test.(12) Second, unlike in New York, in Georgia KPMG did not draft the Master Test Plan.(13) Third, a number of Georgia test "exceptions" appear to have been closed without adequate verification that the problems had been resolved.(14) Finally, KPMG has not completed the metrics testing ordered by the Georgia PSC.(15)
Since the second denial of BellSouth's section 271 application for the state of Louisiana, the Louisiana PSC has conducted a number of local competition proceedings. The Louisiana PSC, assisted by a third-party consultant, conducted technical discussions and workshops that resulted in, inter alia, the development of service quality performance measurements with retail analogues and benchmarks, and a self-executing enforcement plan (under which BellSouth began reporting its wholesale performance for July). The Louisiana PSC also held hearings concerning the establishment of rates, including geographically deaveraged rates for UNEs and UNE combinations, and held workshops to encourage resolution of operational issues.(16) To the extent that commercial experience in Louisiana was insufficient to fully demonstrate that BellSouth's OSS is nondiscriminatory, the Louisiana PSC relied on the Georgia OSS test.(17)
BellSouth asserts that its OSS are regional. Although as a general matter concerns relating to states outside those at issue in the application may not be relevant to section 271 review, here the regional nature of the OSS indicates that OSS testing conducted in BellSouth states other than Georgia and Louisiana may be relevant to evaluating this application. The Florida PSC is in the process of determining whether BellSouth has complied with section 271. It considered BellSouth's request that it rely on the Georgia OSS test.(18) Based on the evidence presented by competitors, however, the Florida PSC decided that a separate test was required in order to demonstrate compliance with section 271 requirements.(19) The Florida test is broader in scope and promises to provide a more robust assessment of BellSouth's OSS than did the Georgia OSS test. Indeed, KPMG's Florida OSS test is identifying problems that were not detected during the Georgia OSS test -- problems that BellSouth is working to fix.(20) The Commission should be attentive to information generated by the Florida test as well as information about BellSouth's ability or willingness to fix any problems identified in Florida.
The Department looks first to the actual entry in a market as the best indicator of openness.(21) But the Department does not broadly presume that all three entry tracks -- facilities-based, unbundled network elements and resale -- may be open on the basis of aggregate levels of entry standing alone.(22) Although the Department presumes that fully facilities-based competition is not hindered in a competitively significant manner based on the entry recorded in Georgia, the amount of entry does not justify extending such a presumption to other modes of entry in Georgia. In Louisiana, the amount of entry does not entitle BellSouth to a presumption of openness as to any mode of entry.
BellSouth has approximately 4 million retail access lines in its Georgia service area, divided roughly into 1.5 million business and 2.5 million residential.(23) BellSouth estimates that as of September CLECs in Georgia serve approximately 860,000 lines (17.7 percent of the total),(24) divided roughly into 570,000 business (28.6 percent of business lines) and 290,000 residential (10.2 percent of residential lines). These numbers are categorized by mode of entry in Table 1, which also presents the net line growth or loss for each category between July and September.

Table 1: CLEC Access Lines in BellSouth's Georgia Service Area(25)
One of the most striking facts about these numbers is that facilities-based entry is such a large proportion of CLEC services. Of the 17.7 percent of total lines served, facilities-based service accounts for 11.9 percent. Most of the 28.6 percent of business lines served are facilities-based: 23.4 percent. More than one-third of the 10.2 percent of residential lines served, 3.9 percent, are facilities-based. In the absence of a showing of significant competitive effect, it seems reasonable to conclude that fully facilities-based entry has not been unduly hindered by problems associated with obtaining necessary inputs from BellSouth in Georgia.
In reporting facilities-based entry, BellSouth combines service that is fully facilities-based (i.e., provided entirely over CLECs' own facilities) with service via UNE loops. According to BellSouth, however, only 15 percent of the total facilities-based entry (representing 1.8 percent of the total lines in BellSouth's service area) is delivered by means of UNE loops.(26) The numbers of UNE loops for business and residential customers, respectively, are not available. However, because UNE loops are such a small proportion of the lines BellSouth reports as facilities-based, it is reasonable to analyze facilities-based numbers as referring almost exclusively to fully facilities-based service. Likewise, the Department finds the amount of entry using UNE loops too small to serve as evidence that the costs of acquiring such loops from BellSouth are acceptably low.
CLECs have used the UNE-platform to serve 3.9 percent of the lines in BellSouth's service area and resale to serve 1.9 percent. Entry via the UNE-platform accounts for 4 percent of business lines and resale accounts for 1.2 percent of such lines. Entrants using the UNE-platform serve 3.8 percent of all residential lines, and those using resale serve 2.4 percent of such lines. Discounting the total number of lines by the number served through CLEC facilities does not make a material difference in these percentages.(27)
Nondiscriminatory CLEC access to UNEs in Georgia therefore cannot be presumed on the basis of the entry numbers. Such access is important if a broad range of residential consumers are to have a competitive choice for local service now.(28) Under these circumstances a showing that BellSouth, in its commercial provision of inputs to CLECs, is meeting the performance benchmarks remains an important aspect of assuring that a broad base of customers in Georgia will have access to CLEC competitors.
The Department continues to believe that resale should be accessible to those competitors that rely on it.(29) CLECs reselling BellSouth's voice services in Georgia have not articulated complaints in this proceeding. Although entry via resale for both business and residential customers is declining in Georgia, one reason may be that some CLECs are converting these customers to UNE-platform service.(30) Based on the lack of complaints, the Department concludes that the Georgia market is open to competitors seeking to resell BellSouth's voice services.(31)
BellSouth has approximately 2.3 million retail access lines in its Louisiana service area, divided roughly into 0.7 million business and 1.6 million residential.(32) BellSouth estimates that, as of September, CLECs in Louisiana serve approximately 215,000 lines (8.4 percent of the total), divided roughly into 145,000 business (16.8 percent of business lines) and 70,000 residential (4.1 percent of residential lines). These numbers are categorized by mode of entry in Table 2, which also presents the net line growth or loss for each category between July and September.

Table 2: CLEC Access Lines in BellSouth's Louisiana Service Area(33)
Overall CLEC penetration into BellSouth's Louisiana service area is much less than in Georgia (8.4 percent compared to 17.7 percent). CLEC penetration by facilities-based entry in Louisiana is 4.3 percent, whereas in Georgia it is 11.9 percent. CLEC penetration by resale in Louisiana is 3.3 percent compared to 1.9 percent in Georgia; penetration by the UNE-platform is 0.7 percent in Louisiana compared to 3.9 percent in Georgia. Only resale penetration is higher in Louisiana than in Georgia, and even that rate has been falling.
Of the 8.4 percent of total lines served by CLECs in Louisiana, facilities-based service accounts for half: 4.3 percent. Most of the 16.8 percent of business lines served are facilities-based: 12.6 percent of the total. Of the 4.1 percent of residential lines served, however, facilities-based service is 0.1 percent.(34) Entry via the UNE-platform accounts for 1.8 percent of business lines and resale accounts for 2.5 percent of such lines. Entrants using the UNE-platform serve 0.2 percent of all residential lines, and those using resale serve 3.8 percent of such lines.
In the absence of complaints from CLECs providing service entirely over their own facilities in Louisiana, the Department concludes that the Louisiana market is open to fully facilities-based competition. With regard to UNEs, however, a showing that BellSouth, in its commercial provision of inputs to CLECs in Louisiana, is meeting the performance benchmarks remains an important aspect of assuring that a broad base of customers will have access to CLEC competitors.
The number of business customers served via resale is declining in Louisiana, as in Georgia. However, CLECs reselling BellSouth's voice services have not filed complaints in this proceeding and the decline may be due to CLEC conversions of these customers to service via the UNE-platform service. Based on the lack of complaints, the Department concludes that the market for resold voice services in Louisiana is open.
Access to fully functional OSS is essential for CLECs to provide their services to all types of customers throughout Georgia and Louisiana using all the entry strategies established by the 1996 Act. Despite BellSouth's improvement of its systems for processing CLEC orders during the past three years, the Department is concerned that the remaining deficiencies may negatively affect CLECs ordering UNEs, both individually and in combination, in a number of ways. Orders that are manually processed are more likely to be provisioned incorrectly, and manual processing prevents CLECs from relying on their own automated systems and slows CLECs' response to customer inquiries. When CLECs cannot place UNE-platform orders by simply referencing the customer's telephone number, more orders are rejected by BellSouth, CLECs are more likely to have to resubmit orders, and the provisioning process is delayed. When electronic interfaces are unavailable to CLECs, they cannot submit orders for new customers or initiate changes to existing services via those interfaces or use them to access information needed to respond to customer inquiries. Finally, CLEC efforts to create robust electronic connections to BellSouth are hindered by an inadequate test environment and a process for implementing changes to BellSouth's OSS that appears overwhelmed by the demands placed on it.
Some of these problems should be resolved by OSS changes that BellSouth intends to implement within the next six months. Some may be resolved as OSS testing is completed in Florida.(35) The Department is concerned, however, that the combined effects of contending with these problems -- many of which most affect CLECs relying on the UNE-platform and DSL-capable loops -- may raise costs for CLECs operating in Georgia and Louisiana, degrade the quality of service CLECs offer to their customers, erode CLEC reputations and customer relationships, and constrain CLECs from aggressively marketing their services.
The ability of CLECs to compete with BellSouth -- particularly in the residential market, where volumes are high and margins are thin -- will depend largely on efficient electronic processing of orders and provisioning notices.(36)
Several CLECs attempting to compete with BellSouth in Georgia have complained in state proceedings, and now to this Commission, that BellSouth is processing a large number of their orders manually.(37) To manually process an order, BellSouth's service representatives re-type some or all of the information on the CLEC order form into an internal electronic service order. This manual processing increases the expense of CLEC ordering, lengthens the time required to place customers in service, and creates errors that cause service requests to be improperly rejected or to be provisioned incorrectly.(38)
The Department has not been able to determine with confidence how many CLEC orders BellSouth processes manually. BellSouth asserts that less than 10 percent of all UNEs ordered by CLECs cannot be ordered electronically.(39) Others can be ordered electronically, but manual processing occurs when electronically submitted orders fall out of the electronic systems, either because the systems were not designed to fully automate the ordering and provisioning process or because a problem within the systems requires human intervention.(40) BellSouth has repeatedly revised its reported flow-through performance measures for electronically submitted orders and recently informed the Department that a whole category of orders, DSL orders, has not been included in these calculations.(41) BellSouth's most recent iteration of its achieved flow-through rates indicates that its service representatives process about a third of electronically submitted UNE orders manually.(42)
The magnitude of manually processed orders for some CLECs, however, is even greater. Covad Communications, currently the major competitor using DSL technology in Georgia,(43) asserts that significantly more of its orders have to be faxed to BellSouth -- and then completely re-typed by BellSouth service representatives -- because BellSouth has not designed its electronic ordering interfaces to accept the types of orders Covad submits.(44) For Covad, the required faxing makes ordering more expensive in Georgia than it is in California, where Covad can place orders electronically.(45) Manual submission of orders prevents Covad from having real-time access to the electronic functions necessary to maintain good customer relations.(46) The FCC anticipated such problems when it established that, to achieve checklist compliance, an RBOC must demonstrate development of sufficient electronic and manual interfaces to allow competing carriers to access all necessary OSS functions and, in particular, equivalent electronic access to functions that the RBOC itself accesses electronically.(47)
Other CLECs also claim that BellSouth processes more orders manually than BellSouth's reported performance would suggest. Birch Telecom, a new entrant in Georgia that is offering UNE-platform service to small business and residential consumers, is able to submit most of its orders electronically.(48) However, Birch asserts that roughly 35 to 45 percent of its electronically submitted orders are manually processed either because BellSouth's OSS has not been sufficiently developed to process the order on an automated basis or because a glitch in the software causes them to fall out for manual intervention.(49)
BellSouth contends that its manual processing is adequate because it returns order confirmations and rejects invalid orders in a timely fashion.(50) Timeliness of BellSouth's ordering-process communications with CLECs is a relevant factor in determining whether manual processing is negatively affecting the ability of CLECs to compete in Georgia and Louisiana. The competitive effects of timely but inaccurate order processing, however, are also relevant, as is the extent to which manual processing may have other negative consequences for CLECs. BellSouth's service representatives find it difficult to accurately reproduce the orders CLECs submit.(51) If the manually created service order is not the same as the original order submitted by the CLEC, the customer may not get the features ordered and may blame the CLEC, whose reputation for providing quality service will suffer.(52) To avoid this outcome, some CLECs impose a lesser harm on themselves by extending the time in which they ask BellSouth to provision the service so that they can catch BellSouth errors and make sure they get corrected before the line is incorrectly provisioned.(53) This option, however, limits CLECs' ability to provide service as quickly as BellSouth, raises their costs as they spend time and resources to back-stop BellSouth's work, and precludes them from using these resources to expand their marketing efforts and other aspects of their businesses.(54)
BellSouth acknowledges that its service order accuracy rates are low, but contends that the errors should be discounted because other performance measures suggest that these errors are not affecting customers.(55) BellSouth asserts that if inaccurate service orders were having a significant negative impact on customers, that impact would be reflected in the number of provisioning complaints or the invoice accuracy reported.(56) However, when BellSouth does not provision a feature because a service representative leaves the item off the manually processed service order, the CLEC must re-order the feature, not submit a request to fix a trouble.(57) Therefore, the usefulness of the provisioning complaints measure as a diagnostic tool is limited, because it does not reflect many ordering errors. Likewise, BellSouth's billing accuracy measures provide no comfort that these problems do not really affect CLEC customers, because features that are not provisioned are not billed (and incorrect long distance selections are not likely to result in a BellSouth billing dispute).
In addition to introducing errors when manually creating internal service orders, BellSouth apparently rejects a significant number of CLEC orders that it should accept for processing.(58) Processing of these orders is necessarily delayed as the rejected order must be re-submitted. Moreover, the extent of the delay depends on how long it takes the CLEC to determine that the order it originally submitted did not contain errors and should be re-submitted in its original form.(59)
Even if done properly, manual processing may hurt a CLEC's ability to compete by increasing its costs and by degrading the quality of the service it can offer its customers.(60) Orders that must be transmitted manually or that are processed manually upon receipt take longer to process than orders that are electronically submitted and processed, because BellSouth is permitted to take longer to return order confirmations and rejects.(61) This delay may mean that CLECs are unable to advise their customers of the installation date with the same certainty that BellSouth advises its customers.(62)
Manual processing of orders negatively affects CLECs in another way. To deal with errors and delays caused by BellSouth's manual processing, CLECs must perform more manual work themselves, and their ability to rely on the mechanized ordering systems they have installed decreases.(63) As a result, CLECs cannot provision service to their customers as quickly and accurately as BellSouth, and CLECs' potential to provide higher quality service than BellSouth evaporates.
CLECs also complain that their subscribers are increasingly suffering from loss of dial tone upon conversion to the UNE-platform.(64) Such outages damage CLECs' reputations as customers' initial impressions are ones of inferior service.(65) These problems may be occurring, in part, when BellSouth fails to complete contemporaneously the two internal service orders it creates to provision each UNE-platform order,(66) particularly on manually processed orders.(67) BellSouth acknowledges that its two service order process can cause conversion outages, but claims that by July 18 it modified the system to prevent certain errors and that the outage rate since has declined to 0.56 percent, within the tolerance level established by the FCC.(68) WorldCom contends, however, that nearly 3 percent of its residential customers in Georgia had reported loss of dial tone or, in some cases, an inability to receive inbound calls, and that the magnitude of the problem is growing as daily sales volumes increase.(69)
The Department is unable to resolve this factual dispute, but is concerned that if the volume of CLEC UNE-platform orders continues to increase there will be more opportunities for the two internal service orders to disassociate and cause service problems. The Georgia PSC also shares this concern, recognizing that "the process should be improved to minimize the potential of future problems as UNE-platform becomes a more viable solution to provide service to residential customers in Georgia."(70) The Georgia and Louisiana PSCs have both ordered BellSouth to implement a single-order process for UNE-platform provisioning,(71) but BellSouth informed the Georgia PSC that it will not be able to implement this process improvement until April 2002, despite a $10,000 per day fine that the Georgia PSC will impose if BellSouth misses its January 5, 2002, deadline.(72) The Commission should assure itself that this process improvement will be completed in an adequate manner.
When a CLEC places a UNE-platform order to transfer (or "migrate") a customer's existing service to the CLEC, allowing the CLEC to identify accounts simply by telephone number ("TN migration") helps to minimize the number of such orders that the RBOC rejects for processing due to incorrect or mismatched address information. The unnecessary rejection of these orders dampens the prospects for mass-market competitive entry by delaying the completion of the orders, thus damaging CLECs' customer relations, and by raising CLECs' cost of competing as they expend additional resources to correct and retransmit the orders.(73) As order volumes climb, the problem may prevent effective mass-marketing of CLEC services. TN migration is therefore an important precondition for competitive entry to occur on a mass-market basis.(74)
The FCC has recognized that TN migration is "particularly beneficial to carriers such as WorldCom who plan to increase volumes considerably in offering service to the mass market" because it enables CLECs to reduce reject rates and avoid the potentially time-consuming task of resolving address-related rejects.(75) BellSouth, however, has not permitted CLECs to order UNE-platform service using telephone numbers alone. BellSouth requires CLECs to provide a correct service address for each customer, even when the only provisioning activity is the change in ownership of the account from BellSouth to the CLEC, which does not depend on address information.(76) Requiring correct service addresses appears to result in a large number of rejects,(77) which would not occur with TN migration. Such rejects increase the burden on CLECs seeking to compete in the BellSouth region compared to those that compete in other RBOC regions.(78) Recognizing its usefulness, the Georgia PSC ordered BellSouth to implement TN migration by November 3, 2001.(79) Although BellSouth told the Georgia PSC that it was unlikely to meet this deadline,(80) the Department believes that TN migration may have been implemented this past weekend. However, the current record does not indicate whether this implementation has been successful or has resolved the problems associated with address-related rejects.(81) The Commission should assure itself that the process improvement is implemented in an adequate manner.
The Department also is troubled by CLEC allegations of interface unavailability. Reliable electronic connections between trading partners are a prerequisite for CLECs to compete, particularly as they submit increasing numbers of orders to the RBOC. When BellSouth's OSS pre-ordering and ordering interfaces are partially or totally out of service, the CLECs' ability to access customer information for prospective customers, order services to serve new customers, or make feature changes is severely diminished.(82) CLECs operating in the BellSouth region complain of significant service outages, including slow or degraded service.(83) By contrast, BellSouth reports virtually no downtime for any of its interfaces for June, July, and August,(84) despite the fact that one interface was so severely degraded for several days in August that at least one CLEC could place only a fraction of the orders it usually submits.(85) This discrepancy may be due to the fact that BellSouth only reports full interface outages. Excluding service degradation from OSS availability appears to mask the competitive burden placed on CLECs.(86)
BellSouth also has not demonstrated that it supports CLECs' need to build and maintain the interfaces they use to submit orders to BellSouth. In particular, BellSouth's quality assurance testing environment for its interfaces appears inadequate, and its "change management" process for resolving problems affecting BellSouth's interfaces and updates to its systems appears unresponsive to CLEC concerns.
A robust test environment is critical to opening local markets.(87) CLECs need to be able to ensure that their software interfaces interact correctly with the RBOC's interfaces before using them to submit orders for actual customers. A test environment should mirror the RBOC's production environment yet be sufficiently independent of it that testing does not interfere with ongoing actual transactions.(88)
BellSouth offers CLECs two testing environments: the "original" one, in which CLECs test their initial attempts to establish interface connections with BellSouth, and the CLEC Application Verification Environment ("CAVE"), in which CLECs test upgrades to these interfaces.(89) BellSouth created CAVE "to modify the existing testing environment from one that operated in production to [one] that would mirror production[,] a wholly separate non-production testing environment."(90) Despite BellSouth's expressed intention, however, both the original and CAVE test environments remain closely linked to BellSouth's production environment. In particular, BellSouth employs a common service order processor for its production and CAVE test systems.(91) This lack of separation appears to have caused test and production transactions to become mixed up -- BellSouth recently sent more than 1,500 messages related to production orders into WorldCom's test environment.(92) Such incidents highlight the need for BellSouth to implement separate testing and production systems. Links between such systems limit the CLECs' ability to perform robust interface testing and raise their costs of implementing and maintaining these interface connections with the RBOCs.(93)
In addition, CAVE is not currently equipped to permit testing of DSL orders, which means that DSL providers cannot test upgrades to their systems. The Department is aware that BellSouth is upgrading CAVE to permit testing of DSL ordering, but the upgrades will not be available to CLECs until December.(94)
Finally, BellSouth's stated goal in creating CAVE -- that of creating an independent testing environment -- suggests that the prior test environment for upgrades required working in BellSouth's production environment. BellSouth attempted to remedy this defect in the test environment for upgrades, CAVE, but the record contains no indication that the "original" test environment was ever similarly modified. CLECs seeking to turn up interfaces for the first time are the least experienced with BellSouth systems; therefore, their need for an independent testing environment is particularly great.(95) The Commission should assure itself that BellSouth's testing environment supports local competition in Georgia and Louisiana.
Change management is the process by which CLECs and BellSouth determine which OSS changes are needed and implement those changes in an appropriately prompt and responsive manner, so that the changes do not have a significant negative effect on CLECs' ability to maintain their electronic connections to BellSouth. CLEC complaints about this process abound, and many appear to be corroborated by questions raised by the KPMG test in Florida relating to BellSouth's prioritization and implementation of change requests.(96) The Department is concerned that the process in place does not appear to prompt efficient implementation of either system fixes for known defects in BellSouth's OSS(97) or system enhancements desired by CLECs.
Proper analysis of BellSouth performance is important, not only to determine whether local markets are open to competition, but to ensure that once opened they remain open. In reviewing BellSouth's previous section 271 applications, both the Department and the FCC found significant deficiencies in its performance measures.(98) Since that time, the Georgia and Louisiana PSCs have labored to develop more robust sets of measures. The fruits of this labor are still relatively new in both states. The Georgia PSC ordered BellSouth to undertake a broad expansion of the Georgia measures in January 2001.(99) In early April 2001, BellSouth implemented the Georgia PSC's order by revising its service quality measurement ("SQM") plan to detail the business rules to be used to define the more robust measures,(100) and it began collecting data pursuant to the new rules as of May.(101) The Louisiana PSC ordered BellSouth to undertake a similar broadening of its measures in May 2001.(102) BellSouth filed the business rules designed to implement the Louisiana PSC's order in June,(103) and began collecting performance data pursuant to the Louisiana measures this past July.(104)
The newness of these performance measures has had two interrelated consequences. First, BellSouth has asked both the Louisiana PSC and the FCC to evaluate its performance for Louisiana using data that are reported pursuant to the Georgia measures.(105) Second, although BellSouth has been working hard to implement the changes to the Georgia measures, a number of problems have arisen, which is not surprising given the magnitude of the endeavor and the brief period within which BellSouth has sought to complete it. These problems cause concern about the reliability of the performance that BellSouth reports in this application for both Georgia and Louisiana.
Performance measures are reliable if the measures are "meaningful, accurate and reproducible."(106) The Department and the FCC place great weight on performance data in evaluating the actual commercial experience of BellSouth's competitors. Moreover, the establishment of reliable performance benchmarks (i.e., performance consistently achieved and therefore presumably achievable in the future) before the FCC approves an application increases the probability that regulators will be able to ensure that the RBOC continues to provide services at levels such that CLECs will have a meaningful opportunity to compete.(107)
Problems with BellSouth's performance data have been identified by CLECs,(108) KPMG, the Department, and BellSouth itself.(109) The Department does not know the genesis of all BellSouth's metric problems, which relate to data collection, data handling, and metric calculation, but a large number appear to relate to the significant amount of software code BellSouth has had to write (or rewrite) in order to capture and report the data as required by the new Georgia performance measures. In the course of making these software changes, BellSouth in some instances neglected to include categories of data that should be included.(110) BellSouth also seems to have made coding errors that have caused the data to be processed incorrectly.(111) Sometimes BellSouth's attempts to correct such errors have introduced new coding errors into the software.(112) For many of the measures that relate to the timeliness of BellSouth's performance, BellSouth has had problems in determining when and where to measure appropriate start and stop times.(113) Although KPMG has not yet completed auditing BellSouth's current measures, information gleaned from its earlier Georgia audits and the ongoing testing in Florida indicates that at times BellSouth has problems systematically collecting and processing the data underlying its measures.(114)
BellSouth is to be commended for revising the software code and restating the affected performance each time a problem is discovered, although sometimes these restatements are issued months after the original performance reports. The Department recognizes that the relative effects of each of these deficiencies may vary and that in some cases the problems may have caused BellSouth to understate rather than overstate its performance.(115) However, this pattern of restatements -- particularly those that result in changes to the performance on which BellSouth relies in this application(116) -- makes it difficult to conclude that these data accurately depict BellSouth's performance and can be relied upon to establish benchmarks for future performance.
Recent experience with BellSouth's flow-through metrics demonstrates how all of these problems can affect a single measure. In June, problems with implementing software code to calculate due dates caused many orders that otherwise would have flowed through to fall out for manual processing and to be mischaracterized as "planned manual fallout."(117) Consequently, affected orders were erroneously excluded from BellSouth's Percent Flow-Through metrics. BellSouth recalculated its June and July metrics, and included them in this application.(118) BellSouth subsequently determined that the restated data inadvertently included KPMG test orders and, in turn, this resulted in BellSouth recalculating its flow-through metrics for June, July, and August.(119) BellSouth also discovered that it had introduced a coding error in July when adjusting a software script associated with counting electronic notices to CLECs confirming the cancellation of orders (so-called "dummy FOCs").(120) BellSouth then found it had counted certain orders as manually handled that had, in fact, flowed through ("TSIGNOUT"), and attempted to correct the problem in August by modifying another software script.(121) This script modification was inaccurate in that it excluded too many orders, effectively introducing a problem far greater than the one it was intended to correct, so BellSouth removed the script, leaving the smaller error in place until it could determine how to correct it.(122) As a result of these recalculations, BellSouth re-filed its flow-through metrics on October 9, 15, and 30.(123) Finally, BellSouth realized that it had failed to build the necessary links from its OSS platform that processes xDSL orders to its system that processes metrics data, and thus had erroneously excluded xDSL orders from its flow-through measures.(124) BellSouth has not yet created an electronic means to include these orders in its flow-through metrics, and has stated that it would manually include such orders in its September flow-through results.(125)
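The interaction between order fallout and the Percent Flow-Through metric described above can be sketched in simplified form. The following Python snippet is purely illustrative; the field names, exclusion rule, and order counts are hypothetical and do not reproduce BellSouth's actual SQM formulas. It shows why mischaracterizing error-driven fallout as "planned manual fallout" matters: such orders drop out of the denominator, overstating the reported rate.

```python
# Illustrative sketch only -- not BellSouth's actual SQM logic.
# Field names and figures are hypothetical.

def percent_flow_through(orders):
    """Share of mechanized orders completing without manual handling.

    Orders tagged as planned manual fallout are excluded from the
    denominator, so misclassifying error-driven fallout as "planned"
    overstates the reported rate.
    """
    eligible = [o for o in orders if not o["planned_manual_fallout"]]
    if not eligible:
        return 0.0
    flowed = sum(1 for o in eligible if o["flowed_through"])
    return 100.0 * flowed / len(eligible)

# Hypothetical month: 80 orders flow through; 20 fall out because of a
# due-date coding defect (error-driven, not planned).
orders = (
    [{"flowed_through": True, "planned_manual_fallout": False} for _ in range(80)]
    + [{"flowed_through": False, "planned_manual_fallout": False} for _ in range(20)]
)
print(percent_flow_through(orders))  # 80.0 -- fallout counts against the metric

# If those same 20 orders are mischaracterized as planned fallout, they
# leave the denominator and the reported rate jumps to 100.0.
for o in orders[80:]:
    o["planned_manual_fallout"] = True
print(percent_flow_through(orders))  # 100.0
```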
In addition to these types of problems, which affect the accuracy of BellSouth's performance reports, the Department is concerned about the validity of a number of measures that should be revised to provide regulators and competitors with meaningful performance data. These measures include those pertaining to OSS availability,(126) rejected orders,(127) flow-through rate,(128) jeopardy notices,(129) hot cut timeliness,(130) order completion interval,(131) and trunk group performance.(132)
The Department is optimistic that as BellSouth gains experience with these new measures, problems will continue to be addressed as they are identified and fewer problems will arise. The Department commends the Georgia and Louisiana PSCs for their demonstrated commitment to creating useful performance data and for their continued involvement in examining BellSouth's performance reporting. With their efforts, the Department expects these metrics issues will be resolved in the near future. In the interim, the Commission should assure itself of the reliability of any performance data that it believes are material to its review.
Opening local markets to competition has proved to be a long and difficult process, requiring great efforts on the part of the RBOCs, CLECs, and state commissions, among others. Under the guidance of the state commissions in Georgia and Louisiana, BellSouth has made significant strides over the past three years in doing the work required to fully and irreversibly open the markets in these states. Although the Department believes that the record adequately demonstrates that local markets in Georgia and Louisiana are fully and irreversibly open to competition for resale and fully facilities-based competitors, serious questions remain regarding the extent to which BellSouth's OSS are adequate to support entry by UNE competitors -- those providing service using the UNE-platform or UNE-loops, particularly DSL loops. These questions preclude the Department from supporting this joint application on the basis of the current record. The Department does not, however, foreclose the possibility that the Commission may be able to determine that these concerns have been adequately addressed prior to the conclusion of its review. The Department urges the Commission to give careful attention to the issues raised in this Evaluation.
November 6, 2001
Certificate of Service
I hereby certify that I have caused a true and accurate copy of the foregoing Evaluation of the United States Department of Justice to be served on the persons indicated on the attached service list by first class mail, overnight mail, hand delivery or electronic mail on November 6, 2001.
1. Pub. L. No. 104-104, 110 Stat. 56 (1996) (codified as amended in scattered sections of 47 U.S.C.).
2. In November 1997, BellSouth filed with the FCC its first application pursuant to section 271 for authorization to provide interLATA service in Louisiana. The Department's Evaluation concluded that BellSouth failed to demonstrate (1) that it was offering access to the unbundled network element platform ("UNE-platform"), (2) the ability to provide nondiscriminatory access to its operations support systems ("OSS"), and (3) that all of its prices (including geographic deaveraging, and rates for collocation and vertical features) were cost-based. In addition, the Department found that BellSouth lacked adequate performance measures. DOJ Louisiana I Evaluation at iii-iv, 4, 10-11, 27-28. The Department's Evaluation of BellSouth's second Louisiana application, filed in July 1998, concluded that BellSouth had improved access to its OSS and that competitive entry by facilities-based carriers and resellers had increased. DOJ Louisiana II Evaluation at 3-4. The Department noted, however, that CLEC market penetration remained modest and that there was still virtually no UNE competition. Id. at 8. The Department also noted that most of the barriers to UNE competition identified in the DOJ Louisiana I Evaluation were still in place. Id. at 3-4, 26. The FCC denied both applications. FCC Louisiana I Order ¶ 1; FCC Louisiana II Order ¶ 1.
3. LA PSC Evaluation at 40-41. The Louisiana PSC stated that it would monitor all performance results during the six-month review process, and if necessary, take action prior to the conclusion of that review. Id. at 41. The Louisiana PSC also stated that several of the issues raised by CLECs would be addressed through various continuing proceedings (i.e., six-month interim review, third-party audit and/or the Self-Executing Enforcement Mechanism ("SEEM")). Id. at 30, 40-42, 45. In particular, the Commission stated that BellSouth's performance needed improvement on several measures, including Order Completion Interval (resale and UNE), Reject Interval (mechanized resale and UNE), FOC & Reject Response Completeness (mechanized and partially mechanized resale and UNE), Percent Flow-Through Service Requests, Percent Provisioning Troubles within 30 Days (UNE loop/port combination), Average Completion Notice Interval (UNE loop/port combination), and Percent Repeat Troubles within 30 Days (xDSL). Id. at 40. Moreover, the Louisiana PSC "believes that such improvement will occur as the result of implementation of the SEEMs plan." Id. Likewise, the Georgia PSC indicated that several issues would continue to be addressed through six-month reviews, third-party audits, and monthly performance reports. GA PSC Comments at 126, 129, 131 (addressing service order accuracy, change control, and the provisioning troubles within 30 days). The Georgia PSC also stated it would continue to monitor the average response interval for customer service records ("CSRs"), and other "problem areas" KPMG identified. Id. at 90, 126, 129.
4. See BellSouth GA PSC Tr. Ex Parte at 3-4, 13 (requiring BellSouth to implement fully fielded and parsed CSRs, a single-order process for ordering the UNE-platform, migration of customers by telephone number, a 30-day time limit to respond to rejected orders, and electronic ordering of line splitting); LA PSC Evaluation at 12 & Attach. 1 at 4-5 (requiring BellSouth to implement parsed CSRs and a single-order process for UNE-loop combinations).
5. GA PSC Comments at 1-23; BellSouth GA Varner Aff. ¶¶ 9-18.
6. GA PSC Comments at 1-2; BellSouth GA Varner Aff. ¶¶ 16-18.
7. GA PSC Comments at 12-14.
8. See FCC Louisiana II Order ¶¶ 92-93; FCC Louisiana I Order ¶ 22; DOJ Louisiana II Evaluation at 26-27; DOJ Louisiana I Evaluation at 18-19.
9. GA PSC Comments at 12-14.
10. Id. at 113; GA PSC Third-Party Testing Petition Order at 1-2.
11. GA PSC Third-Party Testing Petition Order at 2.
12. Key areas omitted include testing of the systems for electronic ordering of xDSL-related loops and line sharing; the LENS interface, which is used to place the majority of CLEC orders; the most recent ordering system, OSS99 (an older version was used instead); documentation and support related to the design and development of CLEC interfaces; maintenance and repair and billing work centers; and general support processes, such as for establishing accounts, collocation processes, or training account team personnel. BellSouth Stacy Aff. ¶¶ 602, 606-07; AT&T GA PSC Post-Hr'g Comments at 29-31; Covad Comments at 4-5, 8-9.
13. KPMG GA MTP Final Report at II-2 - II-3; see also FL PSC OSS Testing Order at 6-7 (raising concerns about the independence of the Georgia OSS test). The Department recognizes that the Supplemental Test Plan ("STP") was drafted by KPMG. KPMG GA STP Final Report at II-3 n.3.
14. AT&T GA PSC Comments at 8-9. Similarly, the Department notes that a number of performance-related criteria were deemed satisfied even where performance did not meet established Georgia PSC standards. See, e.g., KPMG GA MTP Final Report at IV-A-18 - 19 & n.25 (PRE-1-3-8), IV-D-10 (PRE-4-3-1), IV-E-10 (PRE-5-3-2), and V-J-17 (O&P-10-3-8). The Department is also gravely concerned by BellSouth's admission that it did not process test orders as it would have during the normal course of business. Rather, these orders were identified as test orders and processed with special management supervision. BellSouth Stacy Aff. ¶¶ 452-54. Such actions should not be condoned as they undermine the integrity of the Georgia test results as a whole. The Georgia PSC noted that BellSouth submitted performance data from all nine states in its region which indicated that whatever priority was given to manual orders in Georgia and Florida was short-lived and caused very little disparity in BellSouth's actual performance among states. GA PSC Comments at 122-123 n.35.
15. BellSouth Stacy Aff. ¶ 564.
16. LA PSC Evaluation at 1-10.
17. Id. at 25-30.
18. FL PSC OSS Testing Order at 3-6.
19. Id. at 6-7.
20. See BellSouth Stacy Aff. ¶¶ 595-652 & Attach. 81.
21. See DOJ Pennsylvania Evaluation at 3-4 ("The Department first looks to actual competitive entry, because the experience of competitors seeking to enter a market can provide highly probative evidence about the presence or absence of artificial barriers to entry. Of course, entry barriers can differ by types of customers or geographic areas within a state, so the Department looks for evidence relevant to each market in a state." (Footnote omitted.)).
22. See, e.g., DOJ Missouri I Evaluation at 6-7 ("The Department presumes that opportunities to serve business customers by fully facilities-based carriers and resellers are available in Missouri, based on the entry efforts reflected in SBC's application. There is significantly less competition to serve residential customers. There also is less competition by firms seeking to use UNEs, including the UNE-platform, and there are some indications that a failure by SBC to satisfy all of its obligations may have constrained this type of competition." (Footnotes omitted.)).
23. Unless otherwise noted, all line counts and shares stated in this section are based on the BellSouth September Line Counts Ex Parte.
24. Estimated market share will vary depending on the methodology used to estimate facilities-based lines. BellSouth offers two sets of calculations, see BellSouth Wakeling Aff. ¶¶ 11-18, and the Department relies on the one based on E-911 database entries, because (1) the database is constructed for an independent purpose, and (2) it includes only CLEC switched lines, which is consistent with BellSouth's inclusion of only switched lines in its total line counts (i.e., both the numerator and denominator include the same types of lines). In any event, the effective difference between the two methods is small, resulting in estimates that CLECs serve between 17.2 (Method 1) and 17.7 (Method 2) percent of Georgia lines, and between 8.4 (Method 2) and 9.5 (Method 1) percent of Louisiana lines.
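The share arithmetic in this footnote is straightforward and can be sketched as follows. This Python snippet is illustrative only: the line counts are hypothetical round numbers, not the figures from the BellSouth September Line Counts Ex Parte, and the two "methods" differ only in which estimate of CLEC switched lines is supplied.

```python
# Illustrative sketch: CLEC market share from switched-line counts.
# All line counts below are hypothetical, chosen only to show how two
# estimates of CLEC facilities-based lines yield a range of shares.

def clec_share(clec_lines, ilec_lines):
    """CLEC share of total switched lines, in percent."""
    total = clec_lines + ilec_lines
    return 100.0 * clec_lines / total

# Two hypothetical estimates of CLEC switched lines in a 5-million-line
# state (e.g., one E-911-based, one from an alternative methodology).
method_1 = clec_share(clec_lines=860_000, ilec_lines=4_140_000)
method_2 = clec_share(clec_lines=885_000, ilec_lines=4_115_000)
print(round(method_1, 1), round(method_2, 1))  # 17.2 17.7
```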
25. BellSouth September Line Counts Ex Parte; BellSouth Wakeling Aff.
26. Some facilities-based CLEC lines that are not being provided over UNE-loops are, in fact, being provided over special access lines from BellSouth. The record does not contain data necessary to quantify the extent to which CLECs are providing facilities-based service via these special access lines.
27. One might have worried that the high number of facilities-based CLEC lines tilts the scale against BellSouth. Subtracting CLEC facilities-based lines from the total lines in the BellSouth service area for each category only makes a noticeable difference for business lines, where the penetration of UNE-platform and resale would then be 5.2 percent and 1.5 percent, respectively.
28. Although AT&T's cable offering in Atlanta is a promising development, the degree to which residential customers throughout Georgia can benefit from such a service is questionable. There is no indication in the record that higher cost rural customers will receive such service.
BellSouth's application presents no data on the extent to which wireless telephony may be a substitute for fixed wireline residential service in Georgia. Measures of demand elasticities for wireline service and cross elasticities between wireline and mobile wireless services may shed light on this issue. Mobile phones, which were introduced as a business tool, have become mass market consumer devices nationwide and, for younger consumers, the preferred method of communication. FCC CMRS Report at 32. The FCC reports that only 3 percent of wireless customers use mobile phones to the exclusion of wireline service. Id. The ubiquity of mobile phones appears to be having the greatest effect on payphone service, which is expected to shrink as wireless penetration increases. Id. at 32 n.211.
29. DOJ New York Evaluation at 11-12.
30. BellSouth Wakeling Aff. ¶ 27.
31. Questions have been raised about whether BellSouth must resell DSL services in order to comply with section 251(c)(4) of the 1996 Act. See ASCENT Comments at 6. The Department defers resolution of this issue to the Commission.
32. Unless otherwise noted, all line counts and shares stated in this section are based on the BellSouth September Line Counts Ex Parte.
33. BellSouth September Line Counts Ex Parte; BellSouth Wakeling Aff.; BellSouth Line Counts Errata Ex Parte.
34. Moreover, the Department is aware that a non-trivial number of residential facilities-based lines in Louisiana are serving test rather than paying customers.
35. KPMG's Florida OSS test has identified concerns similar to those that CLECs have raised about BellSouth's billing in comments filed on this application. For instance, KPMG Florida has opened an exception to address incorrect wholesale billing rates, an issue also raised in CLEC comments on this application. KPMG FL OSS Test, Exception 62 at 1-2; Mpower/Network Plus/Madison River Comments at 17-18 (alleging incorrect billing for DSL and collocation); El Paso/Pacwest/US LEC Comments at 35-36 (claiming incorrect billing for interconnection).
36. FCC Michigan Order ¶¶ 129-32 (citing FCC Local Competition Order).
37. See, e.g., AT&T Comments at 22-24; AT&T Bradbury Decl. ¶¶ 57-58; Birch Comments at 16-17; Covad Comments at 11; WorldCom Comments at 15; WorldCom Lichtenberg Decl. ¶¶ 51-52.
38. See, e.g., AT&T Bradbury Decl. ¶¶ 72, 73, 88, 89; Birch Comments at 16-17, 20-21; Covad Comments at 15-17; WorldCom Comments at 15-21.
39. BellSouth Stacy Aff. ¶ 294.
40. Id. ¶¶ 291, 314. For UNE-platform competitors, a troubling cause of designed fall out is that BellSouth's systems are designed not to complete processing of orders where the customers are subscribers to its voice-mail system. WorldCom Comments at 19-20.
41. BellSouth Flow-Through I Ex Parte at 7; see also infra notes 117-25 and accompanying text.
42. BellSouth September GA PMs Ex Parte at 42 (PM O-3: Percent Flow-Through Service Requests--Achieved) (UNE flow through of 68.8 percent).
BellSouth excludes from the flow-through calculations orders that fall out but are rejected for CLEC error. See BellSouth GA Varner Aff. Attach. 1 at 2-5 - 2-6; see also WorldCom Comments at 18 n.13 (when an order falls out and a BellSouth service representative then finds an error in the address, the order is rejected and not counted against BellSouth's flow-through performance even if the address error alone would not have caused the order to fall out). A significant number of rejected UNE orders are manually processed. See BellSouth August GA PMs Ex Parte (PMs O-7: Percent Rejected Service Requests (all UNE disaggregations), O-13: LNP-Percent Rejected Service Requests (all UNE disaggregations)).
BellSouth asserts that its flow-through numbers are roughly comparable to the rates Verizon reported on its successful section 271 applications pertaining to Massachusetts and Pennsylvania. BellSouth Flow-Through III Ex Parte at 2. This comparison, even if true, does not address the extent to which BellSouth's manual processing negatively affects CLECs.
43. Other CLECs have participated in collaboratives addressing "line splitting" which would allow them to offer DSL Internet access service to UNE-platform customers when the system development necessary to support such service becomes available. See BellSouth Williams Aff. ¶¶ 14-16.
44. BellSouth requires that orders of UCL-ND and UDC/IDSL be placed manually using a facsimile machine. Covad Comments at 12; Covad Attach. E at 11; see also NuVox/Broadslate Comments at 6-7. But see BellSouth Stacy Aff. ¶ 277 (asserting that Covad has not requested UCL-ND electronic ordering in change management). In addition, when orders for DSL include a request for conditioning, or when the CLEC has not been able to get a reservation identification number ("RESID") from the preordering data base for a new xDSL loop, or when line sharing is done with a CLEC-owned splitter, BellSouth requires the orders to be submitted manually. Covad Comments at 12, 24-25; Covad Attach. E at 11; cf. BellSouth Stacy Aff. ¶¶ 329-30 (list of retail products BellSouth handles manually for itself does not appear to include DSL). This primitive faxing of orders also appears to be necessary for some common order types used by UNE-platform providers, the very competitors trying to provide residential service to increasing volumes of customers. See AT&T Seigler Decl. ¶ 59 n.13 (LENS cannot process UNE-platform orders for moves or changes of address); see also KPMG FL OSS Test, BellSouth Response to Observation 87 at 1.
45. Covad Comments at 15-17.
46. Id. at 17-18 (functions include loop order status, on-line error correction abilities and on-line, real-time jeopardy notifications).
47. FCC Michigan Order ¶ 137 & n.332 ("'[A]n incumbent that provisions network resources electronically does not discharge its obligation under section 251(c)(3) by offering competing providers access that involves human intervention, such as facsimile-based ordering.'" (citing FCC Local Competition Order ¶ 523)).
48. Birch Comments at 1, 7-8.
49. Id. at 12-13 (challenging the integrity of BellSouth's reported flow-through data). Birch finds this especially troubling since the vast majority of its orders are for simple POTS (plain old telephone service). Id. at 17.
50. BellSouth Br. at 76. CLECs challenge the integrity of BellSouth's timeliness metrics. See Birch Sauder Decl. ¶¶ 19-22 (documenting receipt of multiple FOCs and asserting that "timeliness data reported by BellSouth also requires scrutiny"); NuVox/Broadslate Comments at 5 (claiming that over 20 percent of its LNP orders are not captured in the FOC and Reject timeliness metrics); WorldCom Comments at 8-13 (documenting missing notifiers and delays in re-transmitting missing notifiers).
51. BellSouth missed by a wide margin almost all of the order accuracy performance standards for UNEs in June and July in both Georgia and Louisiana. For example, order accuracy for UNEs in Georgia ranged from a low of 38 percent to a high of 71 percent in July; in August, although BellSouth satisfied two of the submetrics, the others ranged from a low of 64 percent to a high of 89 percent. BellSouth August GA PMs Ex Parte at 17 (PM P-11: Service Order Accuracy (Design (Specials) and Loops (Non-Design))). Similarly, in Louisiana, the rate in July ranged from 33.3 percent to 75 percent. BellSouth LA Varner Aff. Attach. 16 at 35-36 (PM P-11: Service Order Accuracy (Design (Specials) and Loops (Non-Design))). In addition, in the cover letter accompanying its final report to the Georgia PSC, KPMG noted three areas in the ordering and provisioning category that had not been satisfied, one of which was the "accuracy of translation from external (CLEC) to internal (BellSouth) service orders resulting in switch translation and directory listing errors." KPMG GA Order Accuracy Letter at 2. KPMG concluded that BellSouth's unsatisfactory results in these areas could adversely affect a CLEC's ability to compete. Id. Furthermore, in its Florida test, KPMG issued an exception in which it concluded that "BellSouth's systems or representatives have not consistently provisioned service and features as specified in orders submitted by KPMG Consulting." KPMG FL OSS Test, Exception 112 at 1 (of the 190 CSRs that KPMG analyzed in Florida, BellSouth updated only 54 percent accurately).
52. Birch Comments at 16-17. Examples of errors on BellSouth's internal service orders include "omitting vertical features, incorrectly arranging hunt groups, assigning incorrect PIC codes, and in some cases omitting one or more of the telephone lines addressed on the Birch LSR." Birch Sauder Decl. ¶ 23.
53. Birch Comments at 19-20.
54. Id. at 16-17.
55. BellSouth Br. at 81. BellSouth is taking steps to improve its service order accuracy performance, including retraining service representatives. Id. at 82; BellSouth GA Varner Aff. ¶ 153. In September, BellSouth's performance improved, but it still missed all of the UNE service order accuracy measures. BellSouth September GA PMs Ex Parte at 34 (PM P-11: Service Order Accuracy (Design (Specials) and Loops (Non-Design))). But see NewSouth Comments at 3-5 (commenting favorably on significant improvements in the training of BellSouth personnel as well as on recent improvements in the process for UNE-platform ordering and provisioning).
56. BellSouth Br. at 81; BellSouth GA Varner Aff. ¶ 153; BellSouth Stacy Aff. ¶ 445.
57. Birch Comments at 20-21.
58. WorldCom Comments at 16-17; WorldCom Lichtenberg Decl. ¶ 29 (5 percent of manually processed rejects in September were invalid rejects, and another 11 percent were rejected for reasons WorldCom could not determine.); see also AT&T Seigler Decl. ¶ 19 (more than 14 percent of improper LSR rejects received by AT&T caused by BellSouth service representatives' mistakes). In its Georgia test, KPMG also identified problems with erroneous or inaccurate rejects which BellSouth ultimately attributed to handling errors by service representatives. BellSouth Stacy Aff. ¶¶ 497-501. BellSouth claims that efforts to retrain these representatives should prevent such errors from causing any materially adverse impact on local competition. Id. ¶ 501.
59. See WorldCom Lichtenberg Decl. ¶¶ 29-31, 54. The ability of CLECs to resubmit mistakenly rejected orders (or those whose rejection codes are difficult to decipher) has been hindered by BellSouth's practice of canceling the order after 10 days. WorldCom Comments at 29. Recognizing the difficulties created by this 10-day time frame, the Georgia PSC ordered BellSouth to extend the time CLECs may take to respond to rejected orders to 30 days when it approved BellSouth's Georgia section 271 application. BellSouth GA PSC Tr. Ex Parte at 3, 13. BellSouth was to implement this change by November 3, 2001. Id.
60. If BellSouth's ability to process electronic orders is more limited than it asserts, increasing order volumes are likely to exacerbate the extent to which processing is dependent on BellSouth's manual processes. KPMG Georgia's capacity test provides little evidence about BellSouth's ability to process high volumes of orders electronically. In Georgia, KPMG conducted the majority of volume testing in a separate test environment. BellSouth Stacy Aff. ¶ 584. At the Georgia hearing on the test, KPMG admitted that results obtained in the test environment do not assure that the production systems will perform to Georgia PSC standards. GA PSC OSS Hr'g Tr. at 226-27. BellSouth has since augmented the capacity of its production systems because the Florida test requires that capacity testing be done in production systems. See BellSouth Stacy Aff. ¶ 594; BellSouth Volume Test Ex Parte at 4. However, KPMG has suspended Florida capacity testing because issues it has identified apparently prevent the test from proceeding at this time. BellSouth Volume Test Ex Parte at 2-3.
61. Pursuant to the BellSouth performance plan, even orders that are supposed to flow through are given a substantially longer interval to return the order confirmation if, for any reason, BellSouth processes the order manually. The reject interval benchmark for mechanized orders is 97 percent within 1 hour, whereas for partially mechanized orders it is (as of August 1) 85 percent within 10 business hours and for non-mechanized orders it is 85 percent within 24 business hours. BellSouth GA Varner Aff. Attach. 1 at 2-24. Similarly, the FOC timeliness benchmark for mechanized orders is 95 percent within 3 hours, for partially mechanized orders (as of August 1) 85 percent within 10 business hours, and for non-mechanized orders 85 percent within 36 business hours. Id. at 2-27.
62. See, e.g., AT&T Comments at 22; AT&T Bradbury Decl. ¶ 139. Although the FCC has found that a longer interval to return manually processed orders where there is no retail analog is not discriminatory, FCC New York Order ¶ 160, the competitive effect of a longer manual processing interval is particularly acute in the context of residential and non-complex business orders for which customers expect service providers to be able to quickly and accurately track the status of their orders. Moreover, granting the RBOC a longer period to process electronically submitted orders that fall out for manual handling (whether by design or due to system errors) does not encourage the RBOC to increase flow-through. By contrast, mandating a standard response interval for all electronically submitted orders creates such an incentive. Cf. FCC Texas Order ¶¶ 171-73 (discussing standard 5-hour interval for all electronically submitted FOCs). BellSouth should be able to meet such a standard. According to a PricewaterhouseCoopers study commissioned by BellSouth, it takes BellSouth's service representatives approximately 15 minutes to input orders into BellSouth's ordering OSS. BellSouth Stacy Aff. Attach. 86 ¶ 18.
63. See, e.g., Birch Comments at 22 ("These errors have caused Birch to adjust its internal processes to perform in a more manual mode than necessary, rendering Birch's provisioning group needlessly inefficient and Birch's mechanized ordering systems almost useless. The end result is that Birch cannot provision service to its end users as quickly and sometimes accurately as BellSouth due to BellSouth's system errors and/or personnel errors, thus creating a lack of parity between Birch and BellSouth retail.").
64. AT&T Seigler Decl. ¶ 10; WorldCom Comments at 3-8.
65. WorldCom Comments at 3-4 ("For a new local competitor, nothing is more critical to maintaining dial tone for its customers."); WorldCom Lichtenberg Decl. ¶ 42 (at least 8 percent of WorldCom customers who have lost dial tone have switched to another carrier, many shortly after experiencing that disruption of service); AT&T Seigler Decl. ¶¶ 38-41 ("The result is a business customer who lost dial tone after purchasing AT&T's service with the customer (and other potential customers with whom that customer communicates) holding AT&T responsible for the outage even though BellSouth caused the problem.").
66. See BellSouth Ainsworth Aff. ¶ 60 (BellSouth processes two internal service orders to provision a single UNE-platform order.). For the service orders to be completed simultaneously a Reuse Related Service Order ("RRSO") code must be placed on both and a Sequence FID code must be placed on one of the orders. WorldCom Lichtenberg Decl. Attach. 9 at 0937-39.
67. AT&T Seigler Decl. ¶¶ 17, 39-40; WorldCom Comments at 5. WorldCom asserts that when an order falls out for manual handling, the possibility that the necessary codes to associate the two orders will not be entered is particularly high. WorldCom Comments at 5 (citing BellSouth Pate Testimony in AL PSC Proceedings at 36, 46-47).
68. BellSouth Ainsworth Aff. ¶¶ 59-61; see, e.g., FCC Texas Order ¶¶ 198-99 (determining that documented lost dial tone problems due to SBC's three-order process disassociation were "very rare," less than 1 percent). The Georgia PSC concluded that the outage rate was less than 1 percent based on data from early summer. GA PSC Comments at 135 (although "any loss of dial tone is regrettable, two instances of lost dial tone out of 3,400 UNE-platform conversions (or .0006 percent) does not indicate a systemic problem" (referring to WorldCom's experience with its launch of residential service in Georgia)).
69. WorldCom Lichtenberg Decl. ¶¶ 41-42; see also AT&T Comments at 13 ("Even at today's low volumes of CLEC orders, nearly 8 percent of AT&T's Georgia UNE-P business customers are losing service and experiencing other troubles during and after migration as a result of BellSouth's UNE-P provisioning deficiencies."); AT&T Seigler Decl. ¶ 17. WorldCom asserts that 536 of its customers lost dial tone within 10 days of migration and 1,214 lost dial tone within 30 days. WorldCom Comments at 3-4; WorldCom Lichtenberg Decl. ¶ 42 (reporting 419, 639, and 771 outages in July through September, respectively -- including customers who had called in by Sept. 23 -- as daily order volumes increased during the same period).
70. GA PSC Comments at 135.
71. Id. at 135 (ordering implementation by January 5, 2002); LA PSC Evaluation at 53-54 (ordering implementation no later than April 1, 2002).
72. BellSouth GA PSC Letter at 1.
73. WorldCom Lichtenberg Decl. ¶ 25.
74. BellSouth provides CLECs with parsed address information in its Regional Street Address Guide ("RSAG") database. BellSouth Stacy Aff. ¶ 198. BellSouth claims that CLECs can populate this information directly in the service order. Id. ¶ 199. This feature, however, has not enabled WorldCom to reduce its address related rejects. See WorldCom Comments at 26 ("WorldCom should not be receiving address rejects if BellSouth is properly parsing the information in RSAG and then editing the orders WorldCom transmits against RSAG."). BellSouth also claims that it provides CLECs with the ability to parse information from customer service records ("CSRs"), BellSouth Stacy Aff. ¶¶ 220-26, but two CLECs challenge this assertion, AT&T Bradbury Decl. ¶¶ 27-40; WorldCom Comments at 22-23. (Parsed CSRs automatically "populate" information supplied by the RBOC's pre-ordering systems on an order form, enabling CLECs to integrate pre-order and ordering.)
75. FCC Texas Order ¶ 178.
76. BellSouth Stacy Aff. ¶ 264.
77. For example, in Georgia, in June, July, and August 2001, BellSouth rejected 44, 48, and 56 percent of all non-mechanized UNE-platform orders, respectively; 49, 34, and 35 percent of all partially mechanized UNE-platform orders during the same periods, respectively; and 17, 13, and 16 percent of all mechanized UNE-platform orders during the same periods, respectively. BellSouth August GA PMs Ex Parte at 6 (PM O-7: Percent Rejected Service Requests). The reject rates for mechanized xDSL were in the low to mid-teens for all three months (although partially mechanized orders had low reject rates). Id. Reject rates were about the same in Louisiana. BellSouth August LA PMs Ex Parte at 5 (PM O-7: Percent Rejected Service Requests). WorldCom Lichtenberg Decl. ¶¶ 20, 22 (discussing rejects caused by mistakes in name and address information that would be irrelevant if TN migration were available; 21 percent of rejected migration orders have been for incorrect name or address), ¶ 24 (WorldCom's reject rates in Georgia are roughly double its reject rate in Verizon and SBC states where TN migration is available).
It would be inappropriate to attribute to BellSouth all errors resulting in order rejection. Moreover, BellSouth notes that it has achieved reasonable reject rates for CLECs in Georgia and Louisiana other than WorldCom, and that when past applications presented similar variations, the FCC dismissed claims that the RBOC was at fault. BellSouth GA Varner Aff. ¶ 110; BellSouth LA Varner Aff. ¶ 125; see FCC Massachusetts Order ¶ 75 (refusing to blame Verizon for relatively high reject rates (43 to 49 percent resale, 21 to 25 percent UNE) because rates "vary widely by individual competing carrier" (from 5 to 83 percent)); see also FCC Kansas/Oklahoma Order ¶ 143 ("[A] wide variation in the individual reject rates suggests that the disparate reject rate may be a function of a competing carrier's experience using the system, rather than the system itself.").
However, not all rejects should be considered outside the control of BellSouth, particularly those caused when addresses on orders submitted by CLECs do not match those in the RBOC address database(s). Such rejects can be avoided for orders that do not depend on sending a technician to the customer's premises to complete provisioning by permitting CLECs to identify orders by telephone number in lieu of address.
78. WorldCom Comments at 23-24, 27-28; WorldCom Lichtenberg Decl. ¶¶ 16-24.
79. BellSouth GA PSC Tr. Ex Parte at 3, 13.
80. BellSouth GA PSC Letter at 2 (BellSouth may not be able to implement TN migration by the Georgia PSC's November 3, 2001 deadline); BellSouth GA PSC Tr. Ex Parte at 4 (assessment of $10,000/day penalty until upgrade implemented).
81. See WorldCom Comments at 24 (claiming BellSouth's documentation is unclear, making it difficult to revise CLEC coding of interfaces for TN migration).
82. Comptel Comments at 9; Mpower/Network Plus/Madison River Comments at 4-5; see also Birch Comments at 30; Birch Wagner Decl. ¶ 8.
83. Id.; see also Mpower/Network Plus/Madison River Comments at 4-5 (as result of outages many orders Mpower submits electronically are processed manually, requiring longer provisioning intervals, revised delivery dates, and disrupted customer schedules). CLECs are also affected by lack of notice of the outage. One exception issued by KPMG in Florida addresses BellSouth's failures to provide notification of all system outages, and to provide them in a timely fashion. KPMG FL OSS Test, Amended Exception 12 at 1-8; see also AT&T Comments at 31.
84. See, e.g., BellSouth August GA PMs Ex Parte at 19 (PM OSS-2: Interface Availability (Pre-Ordering/Ordering)/EDI/Region, LENS/Region, TAG/Region).
85. Birch Comments at 30; Birch Wagner Decl. ¶ 8 (ability to provision orders mechanically depends upon proper operation of TAG; during prolonged TAG failure from August 2-6 Birch was unable to provision 75 percent of normal daily order volume despite working through the weekend). Due to the repeated TAG failures, Birch recently decided to recruit an information technology analyst to manage BellSouth's OSS systems and release management initiatives. Birch Wagner Decl. ¶ 9.
86. Although BellSouth performance reports indicate virtually no downtime, in its application BellSouth states that during July, the LENS system was out of service or providing only degraded service during a total of about 20 hours, or almost 4 percent of total LENS scheduled system availability. BellSouth Stacy Aff. ¶ 353. BellSouth's analysis may understate the extent of the problem because, according to one CLEC, it does not include outages of less than 20 minutes. See Birch Wagner Decl. ¶ 6. Birch claims that in June 2001, it experienced more than 30 TAG failures that did not show up in BellSouth's Change Control Outage Report (which only lists failures longer than 20 minutes in duration). Birch Comments at 30; Birch Wagner Decl. ¶ 6.
87. See DOJ New York Evaluation at 35-36.
88. See FCC New York Order ¶ 121 (establishment of testing environment physically separate from production environment remedied major problem identified by KPMG and competing carriers); cf. FCC Texas Order ¶ 133 & n.355 (commenting favorably on current testing environment that is physically separate from production environment, in contrast with that which existed prior to November 1999).
89. BellSouth Stacy Aff. ¶ 152.
90. Id. ¶ 167.
91. See id. ¶ 168. In contrast, Verizon duplicates its system "up to and including the service order processor." Verizon CLEC Testing Homepage, http://www22.verizon.com/wholesale/clecsupport/content/1,16835,east-wholesale-cte-cte,00.html.
92. WorldCom Comments at 42.
93. WorldCom Lichtenberg Decl. ¶ 174 (more than 40 percent of WorldCom's IT resources for local exchange service are spent on BellSouth even though less than 10 percent of its monthly transaction volume is in BellSouth region); AT&T Bradbury Decl. ¶ 215 & Attachs. 48-50 (BellSouth coding requirements force CLECs to reprogram their systems or manually enter codes on LSR); see also Birch Comments at 30-33 (BellSouth's refusal to allow Birch to conduct pre-implementation testing of LENS forced Birch to replace its standard mechanical provisioning process with a manual one).
KPMG opened an exception in the Florida test stating that "BellSouth lacks an appropriate process, methodology, and robust test environment for the testing of the electronic data (EDI) interface." KPMG FL OSS Test, Amended Exception 6 at 1. In describing the impact of this deficiency, KPMG explains that deficiencies in environment make it difficult for CLECs to develop defect-free interfaces, and therefore affect their ability to deliver uninterrupted service to customers. Id. at 3. Thus, KPMG asserts that "[t]o facilitate market entry by a CLEC, BellSouth should make available a robust test environment for the EDI interface." Id. at 1.
94. See BellSouth Test Environment Ex Parte at 2-3; see also BellSouth Stacy Aff. ¶¶ 179-80 (CAVE moratorium through December to allow for integration and deployment of new non-LNP systems).
95. AT&T Bradbury Decl. ¶ 222; WorldCom Comments at 41.
96. AT&T Comments at 26-28; Birch Comments at 32-35; CompTel Comments at 5-8; Covad Comments at 30-34; WorldCom Comments at 34-35.
In the Florida test KPMG opened an exception because the "BellSouth Change Control Prioritization Process does not allow CLECs to be involved in prioritizing of all CLEC impacting change requests." KPMG FL OSS Test, Exception 88 at 1. BellSouth responded that the change requests in whose prioritization CLECs are not involved address internal changes that are transparent to the CLECs. KPMG FL OSS Test, BellSouth Response to Exception 88 at 2-3. However, the CLECs have no warning of such changes, which may turn out to be CLEC-affecting ones. As KPMG explained, "[t]his policy inhibits one of the primary objectives of the CCP [Change Control Process, which is] 'to allow for mutual impact assessment and resource planning to manage and schedule changes.'" KPMG FL OSS Test, Exception 88 at 2.
In the Florida test KPMG also opened an exception to address the fact that "[t]he BellSouth IT Team does not have criteria to develop the scope of a Release Package." KPMG FL OSS Test, Exception 106 at 1. KPMG added that "[t]he lack of established and documented development criteria may result in the BellSouth IT team overlooking and/or ignoring important change requests. Important change requests that remain unimplemented prevent CLECs from receiving requested order and pre-order functionality that may allow CLECs to compete more effectively in the local exchange carrier market." Id.
97. Birch Comments at 34 (wait of more than four months for BellSouth to fix problems inadvertently caused by its system software updates preventing CLECs using the LENS interface from accessing important information electronically). The Department notes that such system defects could have been identified before implementation had BellSouth's test environments allowed for CLEC testing of LENS.
98. DOJ Louisiana I Evaluation at 31-33; DOJ Louisiana II Evaluation at 38-40; FCC Louisiana I Order ¶¶ 28, 36, 40-46; FCC Louisiana II Order ¶¶ 91-93.
99. GA PSC PMs Order at 3.
100. BellSouth GA Varner Aff. Attach. 1.
101. Id. ¶ 7.
102. BellSouth LA Varner Aff. ¶ 23; LA PSC PMs Order at 5.
103. BellSouth LA Varner Aff. Attach. 13.
104. BellSouth LA Varner Aff. ¶¶ 30-32.
105. When the Louisiana PSC evaluated BellSouth's performance, BellSouth had not yet started reporting performance data using Louisiana measures. For this reason, the Louisiana PSC relied on Louisiana data formatted using the standards and definitions for performance reporting in Georgia. LA PSC Evaluation at 11 n.8; see also BellSouth Br. at 22; BellSouth LA Varner Aff. ¶¶ 24-34. BellSouth began reporting Louisiana data in the Louisiana format in July 2001. BellSouth LA Varner Aff. ¶¶ 32-33. The Department is concerned that the types of problems now affecting Georgia metrics could also have an impact on the Louisiana-specific measures, particularly in those instances where the Louisiana measures differ from the Georgia measures. Id. ¶¶ 26-29.
106. DOJ Texas I Evaluation at 5. "Meaningful metrics require clear definitions that will allow measurement of activities or processes in a way that has real-world, practical significance. Accurate metrics are faithful to established definitions in that they are correctly calculated from the proper subset of raw data using processes that ensure the data are accurately handled and transferred. Reproducible metrics can be reproduced at future dates for verification purposes because the raw data have been archived for an appropriate period in a secure, auditable form and because changes to the systems and processes used for gathering and reporting metrics are carefully controlled and fully documented." Id. at 5-6.
107. DOJ Schwartz Suppl. Aff. ¶¶ 36-40. See infra notes 126-32 and accompanying text.
108. BellSouth states that no CLEC complaints have been lodged pursuant to the metrics oversight procedures established by the Georgia and Louisiana commissions. See, e.g., BellSouth GA Varner Aff. ¶ 40. It is clear, however, that before BellSouth filed this application with the FCC, CLECs repeatedly raised issues about the metrics' accuracy with BellSouth, with KPMG during OSS and metric tests, and with representatives of the Georgia and Louisiana PSCs. See, e.g., AT&T GA PSC Norris Aff. ¶¶ 14-20, 27-37, 45 & Attachs. SEN-3, SEN-4, SEN-5, SEN-6, SEN-8, SEN-9, SEN-10 (correspondence between AT&T and BellSouth relating to performance data inconsistencies (beginning Feb. 2001)); AT&T LA PSC Norris Aff. ¶¶ 11-20, 25-35 & Attachs. SEN-3, SEN-4, SEN-5, SEN-6, SEN-8 (same); Covad GA PSC Davis Aff. ¶¶ 17-22 & Attach. 5 (correspondence between Covad and BellSouth regarding Covad-specific data (May 2001)); Covad LA PSC Davis Aff. ¶¶ 25-27 & Attach. 8 (same); NuVox/Broadslate Campbell Aff. ¶¶ 5-19 (describing discussions with BellSouth regarding NuVox-specific performance data (May 2001)).
109. These problems notwithstanding, BellSouth states that a series of three performance metrics audits in Georgia demonstrate that its data are accurate. BellSouth Br. at 24; BellSouth GA Varner Aff. ¶¶ 388, 418. The Department is not persuaded. First, these audits are not truly cumulative, as Phase III will include the first audit of a significant number of new product disaggregations and newly implemented measures. BellSouth GA Varner Aff. ¶ 422. Second, BellSouth's myriad changes to the inputs of the flow-through metric and the resulting restatements of the performance data suggest that any prior audits of this measure, at least, failed to uncover significant problems.
110. For example, BellSouth erroneously excludes xDSL orders from reported flow-through measures. BellSouth asserts that it will manually include xDSL orders in September flow-through results. BellSouth Flow-Through I Ex Parte at 7. Similarly, BellSouth's calculation for customer trouble report rates for xDSL and line-sharing, which measure network quality in terms of the frequency with which troubles are reported, understated the retail analog by not including all comparable retail data, suggesting that metrics did not provide accurate comparisons of wholesale and retail performance. BellSouth GA Varner Aff. ¶¶ 203-04.
111. BellSouth acknowledges, for instance, that coding errors for BellSouth's FOC and Reject Response Completeness measures make them unreliable: "For mechanized LSRs, this measure understates BellSouth's performance and cannot be relied upon to assess BellSouth's performance. For partially mechanized LSRs, the coding is incorrect and produces inaccurate results." BellSouth GA Varner Aff. ¶ 42. BellSouth asserts that changes are being made to August and September data. Id.; see also id. ¶¶ 51-53 ("[M]inor implementation issues" affect BellSouth's data on Reject Interval and FOC Timeliness and BellSouth "temporarily lost its ability to properly identify and account for multiple submissions of the same LSR CC/PON/VER combination."). Similarly, BellSouth introduced a coding error into its flow-through calculation when it wrote a new software script to include so-called "dummy FOCs" in its flow-through calculation. See BellSouth Flow-Through I Ex Parte at 2, 5. BellSouth also notes that time-stamp data for the LNP Standalone metrics is being counted multiple times, an issue it is still addressing. BellSouth GA Varner Aff. ¶ 273.
112. For example, when moving the time stamp for the start and stop times of the Reject and FOC Timeliness Intervals, see infra note 113, BellSouth reports that it created another as yet unresolved counting problem, BellSouth GA Varner Aff. ¶¶ 51-53, 115, which should result in data that understates its performance.
113. For instance, BellSouth has not been accurately measuring the Reject and FOC Timeliness Intervals; in particular, it did not start the clock when the order was received at the gateway, but rather at a later internal stage of processing, the effect of which is to understate BellSouth's performance. BellSouth GA Varner Aff. ¶¶ 51-53, 115. In addition, BellSouth admits that incorrect start dates were used for measuring CLEC orders that had a critical data field populated, affecting Completion Notice Interval metrics at least through June. Id. ¶ 133. BellSouth represents that the problem, which overstated the interval, was fixed with July data and that it continues to monitor the implementation of this metric. Id. BellSouth also relied on the wrong time stamps to calculate the CLEC interval for the Pre-Ordering Average Response Time measure, an error it corrected as of the July data. Id. ¶¶ 56-57.
114. See id. ¶¶ 407-16 (identifying open exceptions from the on-going Georgia metrics evaluation review by KPMG); KPMG FL OSS Test, Exception 109 at 1-2 (After KPMG determined it could not replicate the data in BellSouth's May 2001 reports for Acknowledgment Message Timeliness, BellSouth discovered that its coding was incorrect and implemented a system fix to resolve the discrepancies.); see also KPMG FL OSS Test, Exception 101 at 1, 3 (After KPMG was unable to replicate BellSouth's reported data in its January 2001 report on Total Service Order Cycle Time, BellSouth attributed the discrepancy to inclusion of pending orders and subsequently scheduled coding changes for implementation as of August data.); KPMG FL OSS Test, Exception 113 at 1 (KPMG found that BellSouth fails to capture xDSL transactions processed through the Corporate Order Gateway in its flow-through measures); KPMG FL OSS Test, Exception 114 at 1-2 (KPMG found that BellSouth incorrectly excludes data from calculation of FOC Timeliness measures for fully and partially mechanized orders).
115. In the Department's view, BellSouth cannot ignore errors that result in reported performance being worse than actual performance. In order to establish effective benchmarks that readily can be used to hold an incumbent to an appropriate level of wholesale performance, metrics must neither understate nor overstate actual performance. An incumbent should not be able to argue that it need not maintain a particular level of performance because, due to a problem with the relevant metric, past reports created an inappropriate benchmark.
116. BellSouth's application, filed on October 4, is based on performance data for the months of May through July 2001. BellSouth GA Varner Aff. ¶ 7; BellSouth LA Varner Aff. ¶ 7. BellSouth has supplemented the record with August performance data for Georgia and Louisiana, which it filed in October, and with September performance data for Georgia, which it filed in early November. BellSouth August GA PMs Ex Parte; BellSouth August LA PMs Ex Parte; BellSouth September GA PMs Ex Parte.
117. BellSouth Flow-Through I Ex Parte at 2, 4.
118. Id. at 4.
119. Id. (re-filed data also categorized separately for residential, business, and UNE, whereas data filed with this application was for combined residential, business, and UNE).
120. Id. at 5.
121. Id. at 6.
122. See id.
123. BellSouth Flow-Through I Ex Parte at 8; BellSouth Flow-Through II Ex Parte at 3.
124. See BellSouth Flow-Through I Ex Parte at 7; KPMG FL OSS Test, BellSouth Response to Exception 113 at 1-2.
125. BellSouth Flow-Through I Ex Parte at 7.
126. The Percent Interface Availability measures should indicate the degree to which BellSouth OSS are available to receive and process CLEC transactions. BellSouth GA Varner Aff. Attach. 1 at 1-6 (PM OSS-2: Percent Interface Availability). The measure, however, does not reliably depict CLEC experience since it tracks only full outages, id., and excludes instances in which an interface is not totally inoperative but is providing service so degraded as to be practically unusable, Birch Comments at 30; see also supra notes 82-86 and accompanying text.
127. BellSouth's reject measures understate the number of CLEC orders rejected because the business rules permit it to count "auto clarifications" but exclude "fatal rejects." BellSouth GA Varner Aff. Attach. 1 at 2-19. Whether BellSouth labels a response an "auto clarification" or a "fatal reject" depends on which edit-check the order fails, but from a CLEC's viewpoint it is a distinction without a difference: both are rejects and, in each case, the LSR is to be corrected before re-submission. BellSouth's performance data indicate that BellSouth issued 17,062 fatal rejects and 43,852 auto clarifications in August, suggesting that its reject rate metrics for that month fail to account for roughly one fourth of all mechanized rejects. BellSouth SQM Flow-Through Ex Parte, Percent Flow-Through Service Requests Report (August 2001) at 13, 38.
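The undercount described above follows from the reported August 2001 figures; a minimal arithmetic sketch (variable names are the editor's own, not BellSouth's):

```python
# Illustration of the undercount described in note 127, using the August 2001
# figures reported in the text.
auto_clarifications = 43_852  # counted by BellSouth's reject metrics
fatal_rejects = 17_062        # excluded under the business rules

total_mechanized_rejects = auto_clarifications + fatal_rejects
share_excluded = fatal_rejects / total_mechanized_rejects
print(f"Share of mechanized rejects omitted: {share_excluded:.1%}")
# prints "Share of mechanized rejects omitted: 28.0%" -- roughly one fourth
```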
128. BellSouth's flow-through measure may be artificially inflated by its recent decision to include electronic notices that confirm the cancellation of orders but do not result in the creation of an internal service order (so-called "dummy FOCs"). BellSouth Flow-Through I Ex Parte at 2-5.
129. BellSouth acknowledges that it has not been measuring Average Jeopardy Notice Interval correctly and that its "calculation does not provide a meaningful measure" because it stops the clock on the final order completion date, not when the order was originally due. BellSouth GA Varner Aff. ¶ 44. CLECs need to know before the due date when an order is in jeopardy. Id.
130. BellSouth's UNE-loop hot cut duration measurement excludes the time it takes to notify the CLEC that the loop has been cut. See BellSouth GA Varner Aff. Attach. 1 at 3-18 (PM P-7: Coordinated Customer Conversions Interval); BellSouth LA Varner Aff. Attach. 13 at 3-16 (PM P-6: Coordinated Customer Conversions Interval). Because the CLEC is not in contact with BellSouth at the time the hot cut takes place, this notification is necessary before the CLEC can send the command that allows the customer to keep the same telephone number. Id. For purposes of this application, BellSouth has provided supplemental data, which includes the notice interval in the duration metric, and these data show that BellSouth provides timely hot cuts. BellSouth Hot Cut Ex Parte at 5-6 (reporting the time it takes BellSouth to notify CLECs of completion of hot cuts). Without continued reporting of such data, however, neither the Georgia PSC, nor the Louisiana PSC, nor the FCC will be able to ascertain whether BellSouth is performing timely hot cuts in the future. Thus, the current duration measure should be revised to include the time it takes to notify CLECs that a cut has been completed.
BellSouth measures the quality of hot cuts by reporting provisioning troubles that occur within seven days of a cut's completion and average recovery time, the period necessary to resolve service outages that occur after a cut's completion. See, e.g., BellSouth GA Varner Aff. Attach. 1 at 3-24 - 3-25 (PM P-7C: Hot Cut Conversions -- Percent Provisioning Troubles Received within 7 Days of a Service Order), 3-22 - 3-23 (PM P-7B: Coordinated Customer Conversions -- Average Recovery Time). BellSouth, however, does not measure how many outages occur each month or the proportion of hot cuts that result in these outages. Although the Department believes that such information may be derived from BellSouth's reported data by dividing the number of outages (the denominator of the average recovery time metric) by the total number of hot cuts (the denominator of the hot cut interval metrics), the business rules for the performance data do not so state and any regulatory entity seeking to hold BellSouth to its current levels of performance in this area will have to perform these calculations itself.
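The derivation described above can be sketched as follows; the two counts are hypothetical placeholders for illustration only, since BellSouth's business rules do not report this ratio directly:

```python
# Hypothetical sketch of the outage-rate derivation described in note 130:
# dividing the denominator of the average recovery time metric (the number
# of outages) by the denominator of the hot cut interval metric (the total
# number of hot cuts). The counts below are placeholders, not reported data.
outages = 15            # denominator of PM P-7B (Average Recovery Time)
total_hot_cuts = 1_000  # denominator of PM P-7 (interval metric)

outage_rate = outages / total_hot_cuts
print(f"Share of hot cuts resulting in an outage: {outage_rate:.1%}")
# prints "Share of hot cuts resulting in an outage: 1.5%"
```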
131. The Order Completion Interval measures the average time it takes BellSouth to complete an order, from the time BellSouth issues a FOC to the CLEC to BellSouth's completion of the order. BellSouth GA Varner Aff. Attach. 1 at 3-10. Thus, this measure does not capture the service provisioning interval from when a CLEC sends its order to BellSouth to when an order is actually provisioned. Total Service Order Cycle Time does include the FOC interval, but it is not helpful to determine parity completion intervals because it includes the notice to the CLEC that provisioning is complete, a step that is non-existent for BellSouth retail, and because BellSouth does not report corresponding retail data. Id. at 3-31 - 3-33 (PM P-10: Total Service Order Cycle Time).
132. The Trunk Group Performance measures are intended to indicate the quality of service on interconnection trunks that carry traffic from BellSouth's network to a CLEC's network and whether BellSouth manages those trunk groups to keep call blocking at an adequately low level. BellSouth GA Varner Aff. ¶ 100 & Attach. 1 at 9-1 - 9-3; see also BellSouth LA Varner Aff. ¶ 115. The current metric may fail to depict a meaningful, "apples-to-apples" comparison of wholesale and retail call blockage because not all relevant trunk groups are included in the retail calculation. See BellSouth GA Varner Aff. ¶ 105. Recognizing this problem, BellSouth has asked the Georgia commission to add several categories of trunk groups to the "BellSouth affecting" category. See id. ¶ 105. It is critical that the categories be modified so that they correspond properly and provide meaningful comparisons.