Higher Standards: Regulation in the Network Age
TABLE OF CONTENTS

I. INTRODUCTION
II. CHALLENGES OF NETWORK REGULATION
   A. The Network Age
   B. Two Stories About Regulating Complex Networks
   C. Limits of Current Approaches
III. STANDARDS AS REGULATORS
   A. Functions of Standards
   B. Government and the Standards Process
   C. Carterfone and Part 68
IV. THE AGENCY AS STANDARDS CATALYST
   A. Standards Facilitation
   B. Case Study 1: Network Management
   C. Case Study 2: White Spaces
V. CONCLUSION
Standardization is regulation. As digital networks proliferate, standardized interfaces will define the economic and normative dynamics of markets. Open standards, which are created through participatory processes and available to anyone who chooses to use them, are becoming increasingly important. (1) These developments have significant implications for administrative law. Any model of networked markets that ignores the influence of standards will be incomplete. Legal scholars have begun to examine the relationship between standards and regulation, but only in scenarios where government either defers to or subsumes private efforts. (2) In so doing, they miss the opportunity to use standards as a regulatory tool.
Standards can define the substantive relationships among competitors and partners and even shape the structure of industries. (3) Theorists of regulation recognize that technical standards can serve a regulatory function. (4) Compliance with a standard, even if not enforced with the threat of governmental sanction, is a restraint on private economic activity. Standards therefore exert the same kind of pull on unfettered private action as do competitive forces, which may also lead companies to act in a desirable manner from the standpoint of public policy. Yet standardization is a process of cooperation rather than competition. (5)
Regulators should see themselves as participants in the standards marketplace. Administrative agencies should evolve to emulate the best aspects of the private standards-setting process, where adoption is the most valuable currency. By leveraging the power of open standards, regulators can become more responsive and efficient, while promoting important public interest goals of accessibility, investment, and innovation. (6) In particular, administrative agencies can certify standards as safe harbors, avoiding the problems of both command-and-control regulation and private negotiations. (7)
Two current proceedings at the Federal Communications Commission ("FCC") illustrate the value of a standards-based approach. The first concerns the network management practices of broadband access providers; the second involves unlicensed wireless devices that operate on frequencies adjacent to those used by broadcast television. (8) In each case, the FCC has chosen to adopt rules rather than facilitate standards. Instead of viewing standardization as peripheral to its core mission, the FCC should catalyze adoption of open standards that promote its regulatory objectives.
In leveraging standards to achieve its regulatory goals, the FCC would be in line with a broad trend in administrative law. Agencies increasingly rely on privately developed standards. (9) From accounting to workplace safety, industry standards now define substantive obligations that regulators enforce. (10) While usually seen as a form of deregulation or devolution, regulation by standardization can be an effective means for the FCC to implement a positive agenda. The critical elements for the FCC to consider are the procedural context in which standards are developed and the openness of the standards themselves. (11) By certifying open coordination standards, the FCC would give market participants flexibility to implement the best solutions, while promoting competition and open networks.
This Article explains why standardization can address fundamental regulatory tensions in complex network industries and describes how an administrative agency, such as the FCC, can incorporate standardization into its policymaking. Part II articulates the limitations of traditional regulatory tools and the failure of prior efforts to reform those tools. Part III describes the role of standards in complex network industries and explains how the standards can be incorporated into the regulatory process. Part IV uses the FCC's broadband network management and white spaces proceedings as case studies for the standards-based approach.
II. CHALLENGES OF NETWORK REGULATION
In complex network industries, operators must interconnect with one another to provide seamless service. The same networks also serve as platforms for other providers, such as application and content companies in the case of the Internet. Participants in such industries thus can be both competitors and partners of one another. A company such as YouTube may be dependent on network owners such as Comcast that simultaneously compete with it. Regulation in complex network industries is not a zero-sum game. The difficulty of interdependence is accentuated when, as with the FCC, the various networks and services involved are subject to an inconsistent and overlapping patchwork of legacy regulation.
A. The Network Age
The age of networks is upon us. Many industries exhibit a network structure. (12) An airline offers flights (links) connecting airports (nodes). A telephone company routes calls between phones (nodes) across its local and long-distance wires (links). A long-haul trucking firm carries freight between cities (nodes) along roads (links). (13) What distinguishes these industries is that the shape or "topology" of the network has a strong impact on the costs and opportunities of the business. (14) Networks are non-linear, in that there are typically multiple potential paths between two nodes, making the behavior of the networked system surprisingly complicated. (15)

Network industries are subject to network externalities, called network effects. (16) A bigger network is more valuable because it provides connections to more users. (17) New customers will rationally choose the network that lets them reach more existing customers, and those existing customers will benefit from reaching the new customers. (18) A self-reinforcing dynamic tends to occur, with the biggest network expanding to the point where it overwhelms all competitors. (19) This is one reason the major network industries--telecommunications, electricity, natural gas, airlines, and trucking--were subjected to extensive regulation. (20) Market forces alone did not produce a competitive environment; they tended to reinforce a trend toward monopolization. Network effects are only one important characteristic of networks. (21) A new interdisciplinary field, network science, studies new-found patterns in networks of all types. (22)
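The self-reinforcing dynamic of network effects can be made concrete with a toy calculation. The sketch below is illustrative only, not drawn from the article; it assumes, Metcalfe-style, that a network's value tracks the number of possible connections among its users.

```python
# Toy illustration of network effects (an assumption for exposition,
# not a claim from the article): value tracks the number of distinct
# user-to-user connections, n * (n - 1) / 2.

def network_value(n_users: int) -> int:
    """Number of distinct pairwise connections among n_users."""
    return n_users * (n_users - 1) // 2

# Doubling the user base roughly quadruples the connection count,
# which is why new users rationally join the biggest network.
print(network_value(100))  # 4950
print(network_value(200))  # 19900
```

Under this assumption, a network twice the size offers roughly four times the connectivity, so rivals face an ever-widening gap rather than a level playing field.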
The sociologist Manuel Castells, in his magisterial three-part work, The Information Age: Economy, Society and Culture, explains how global social, political, and economic factors have come together at the dawn of the twenty-first century to produce a "Network society." (23) Individuals, firms, and governments are increasingly interconnected in a global web of relationships. This networked structure becomes the defining paradigm for economic interactions: "[U]nder the new historical conditions, productivity is generated through and competition is played out in a global network of interaction" between business networks. (24)
As Castells explains, the Network Society extends well beyond the industries that have traditionally involved network infrastructure. However, information technology and, in particular, the Internet, are critical enablers of this shift. (25) The communications and media sector, always based on distribution networks, has turned into a heavily interconnected network of networks. (26) Traditionally, each type of communications network was separate from others. (27) Radio networks did not carry video programming, let alone interactive telephone calls, while television networks offered only one-way video distribution. Telephone networks sometimes interconnected with one another, but for most of the twentieth century, AT&T held a monopoly position throughout the bulk of the country. (28)
Today, those boundaries are breaking down. Wired cable television systems compete with over-the-air terrestrial and satellite broadcasters to offer the same television programming, and all of them increasingly compete against video programming delivered over the Internet. The Internet is not tied to any one kind of network. It can run over any wired or wireless infrastructure capable of delivering data packets in the TCP/IP format.
What I will call "complex network industries" have an additional property: services delivered to end-users typically involve multiple horizontal and vertical relationships among independent firms. An airline provides a single integrated travel service across its network of airports. In the more complex network industry of telecommunications, however, all network operators are geographically constrained. Delivering service to a terminating user (such as a recipient of a phone call) typically means handing off to that user's chosen service provider, rather than substituting for that provider. Moreover, what users pay for is increasingly not just a single integrated service offering. The basic connectivity service is a gateway to content, services, and applications, some of which are offered through the network operator and some of which are not.
The Internet accentuates this multi-dimensional environment. Internet transmission infrastructure, such as routers and servers, sits on top of physical communications networks. (29) The Internet itself then serves as a platform for applications such as the World Wide Web and e-mail, services such as search engines and electronic commerce, and content such as news and music. (30)
Consider what happens when a user requests a Web page from a site such as CNN.com. The content is generated by CNN and stored on a server computer located at a hosting facility operated by a dedicated hosting provider. The hosting provider purchases high-capacity connections to Internet backbones. A content delivery network such as Akamai replicates the page on local caches throughout the global Internet. The user connects through an ISP such as Verizon, which may host a local cache that redirects the request and delivers the content. Rich media and two-way services may be even more complex.
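The delivery chain just described can be sketched as a sequence of independent firms that a single page request traverses. This is a hypothetical model for exposition: the firm names come from the article's example, but the hop structure is illustrative, not an actual protocol trace.

```python
# Hypothetical sketch of the Web page delivery chain described above.
# Each hop is an independent firm; none of them alone delivers the page.

def delivery_chain(has_local_cache: bool) -> list:
    """Return the parties a page request traverses, in order."""
    if has_local_cache:
        # An ISP-hosted CDN cache (e.g., Akamai within Verizon's
        # network) answers without crossing an Internet backbone.
        return ["user", "ISP (Verizon)", "CDN cache (Akamai)"]
    # Otherwise the request travels to the origin hosting facility.
    return ["user", "ISP (Verizon)", "Internet backbone",
            "hosting provider", "origin server (CNN)"]

print(" -> ".join(delivery_chain(has_local_cache=True)))
```

Even this simplified model shows why no single firm "provides" the service end to end, and why the interfaces between the firms carry so much regulatory weight.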
There have, perhaps surprisingly, been few efforts to analyze the telecommunications and Internet industries using network models. (31) In a recent book, Daniel Spulber and Christopher Yoo develop a model for telecommunications using the tools of network science. (32) Spulber and Yoo describe interconnection decisions in terms of Coasian tradeoffs between internal and external relationships. (33) Ronald Coase, in his seminal article, The Nature of the Firm, explained that firms organize hierarchically to the extent that the transaction costs of arms-length market relationships exceed the organization costs of internal management. (34)
Following this logic, Spulber and Yoo argue that telecommunications network operators can decide to provide network links themselves or interconnect with other networks. (35) Regulation influences the incentives to expand networks instead of expanding interconnection relationships. The trouble with this model is that it is exceedingly flat. Interconnection is an either/or proposition. And there is only one layer of functionality: the physical infrastructure. In the real world, networks involve not only horizontal connections among functionally equivalent network operators, but also vertical connections among different kinds of providers. (36)
Network industries historically have been subject to significant regulation. (37) one reason is the tendency toward monopolization arising from network effects. (38) Another is that these industries often involve massive fixed costs of infrastructure build-out, giving them attributes of natural monopolies. (39) Historically, railroads, utilities, and telecommunications have been of critical economic significance. Finally, the historical development of these industries often involved explicit government grants of competitive exclusivity or access to government-controlled resources, such as wireless spectrum or local rights-of-way to run cables. (40)
Beyond these historical reasons is a basic justification for regulation of complex network industries. A networked environment implies some underlying infrastructure or platform that forms the basis for the network. That infrastructure may be government-owned, as with the interstate highways that are the basis for the trucking industry; it may be privately held and unregulated, as with Microsoft's Windows operating system; or it may be regulated, as with stock exchanges and the telephone network. Government control of industrial infrastructure is the exception rather than the rule in the United States. For privately owned infrastructure, there is always the possibility of conflicts of interest between the platform owner and platform users.
The goal of public policy in platform industries is to create optimum incentives for both the platform owner and its users. Too many restrictions on the platform would result in insufficient investment in creating an environment that would produce significant value. On the other hand, an unregulated platform monopolist may over-extract rents from the platform ecosystem in a way that reduces the platform's overall value. (41) That was the issue in the Department of Justice's antitrust case against Microsoft. (42) The personal computer industry benefited enormously from the common standard of Windows, but Microsoft allegedly leveraged its control of that standard to limit the competition and innovation that would otherwise have developed. (43) The literature on platform economics recognizes that a platform owner has incentives to encourage activity and investment by others on top of its platform. (44) However, there are many situations where that platform owner fails to act on those incentives. (45)
When there are platform providers supporting higher-level applications and services, three kinds of platform tensions are possible. The first involves the vertical relationship between owners of a platform and those who use it to communicate or provide other functionality. The second involves horizontal interconnection between different platforms, so that users and application providers on each of them can communicate. The third involves horizontal relationships between different users of the same platform. These relationships may or may not be mediated by the platform owner.
Platform regulation issues are particularly acute in connection with the Internet. (46) The Internet is a universal interconnection framework for all digital information networks. (47) As a system for moving interchangeable digital bits, it is inherently malleable. The same infrastructure that carries telephone calls also delivers business e-mails, movies, and any other form of information. The Internet is therefore not one platform, but many. And it is a layered environment, in which every level of functionality depends on the platform layers below it. (48)
A well-developed literature traces the importance of the Internet's "end-to-end" design. (49) The Internet architecture pushes application-specific functionality whenever possible to the edges of the network. The core infrastructure is kept as simple and generic as possible. The job of the Internet's routers is merely to forward data packets toward their destination, subject to some basic, well-publicized algorithms. Because particular functions are not baked into the network, they cannot become limitations when new innovations develop. (50) Application providers can deploy anything they wish across the network, subject only to the minimal requirements of the TCP/IP standard. A central concern of current policy debates is whether broadband network operators are breaking the Internet's end-to-end design and whether the FCC should respond. (51)
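The end-to-end design described above can be captured in a few lines: a router consults only a packet's destination and never inspects the application payload. This is a deliberately minimal sketch; the field names are illustrative, not the actual IP header layout.

```python
# Minimal sketch of the end-to-end principle (illustrative field
# names, not the real IP header): the router forwards based solely
# on the destination, ignoring whatever application data the packet
# carries.

def forward(packet: dict, routing_table: dict) -> str:
    """Pick the next hop using only the destination address."""
    # packet["payload"] -- the application's data -- is never read.
    return routing_table.get(packet["dst"], "default-gateway")

table = {"10.0.0.0/8": "hop-A"}
pkt = {"dst": "10.0.0.0/8", "payload": "any application's data"}
print(forward(pkt, table))  # hop-A
```

Because nothing in the forwarding step depends on what the payload is, a new application requires no change to the core network, which is precisely the property the end-to-end literature credits for the Internet's generativity.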
The end-to-end principle can be seen as a technical response to platform tensions. The infrastructure platform is told simply to ignore the applications. End-to-end is effectively a separation of the market for moving bits from the markets for doing things with them. The end-to-end approach was extremely successful in the development stages of the Internet. (52) It left room for application and content providers to innovate, while allowing the network operators to focus on gaining subscribers for their access businesses. This open environment, characterized by interconnection, produces extraordinary innovation. (53)
The limitation of the end-to-end perspective is that it treats the network as a black box. The Internet must be "stupid" to allow data to pass freely between endpoints. The pathways in between are seen as unimportant. In reality, those connection points are critical.
B. Two Stories About Regulating Complex Networks
Consider the following example. An Internet user notices that she is having difficulty accessing online videos. Connections are slower, more erratic, and sometimes interrupted entirely. Her provider says that a third party is disrupting the traffic flow, degrading performance for some users. Press reports are filed, complaints are lodged, and lobbyists are deployed. The regulators get involved. Internet network designers express concern about the process. They frown upon the practices that started the controversy, but they worry about inflexible legal mandates replacing good engineering. Perhaps, some of them venture, there are technical solutions to what are, after all, fundamentally technical challenges. All sides declare that the future of the Internet is at stake.
The preceding paragraph describes the fight over Comcast's "throttling" of traffic using the BitTorrent peer-to-peer ("P2P") file-sharing protocol. (54) In August 2008, the FCC, acting on a complaint from several public interest groups, sanctioned Comcast for interfering with the rights of users to access the open Internet. (55) Yet the same description covers the reverse scenario: one in which Comcast blames BitTorrent users for causing the disruption, because they tie up an excessive share of network capacity. In this scenario, other Comcast customers are the ones complaining about service degradation. Or, the problem could be somewhere in between. Delivery of services across the Internet, especially novel applications like P2P file-sharing, involves many participants in a complex dance. Each naturally seeks to optimize its own performance, so as to maximize benefits for its customers.
Now consider a different scenario. Vendors wish to sell a new wireless communications service. They are convinced it will quickly become popular. The problem is that a different kind of wireless device is set to begin operating on nearby frequencies. They worry that complaints about interference will harm the market for their offerings. Regulatory pleadings are filed, press releases are issued, and lobbyists are mobilized, in order to emphasize the greater importance of each use. Again, the engineers shake their heads. They encourage discussions about technical solutions for cooperation, but their voices are difficult to hear during the wrangling before the regulator.
This second story describes the fight over unlicensed use of the "white spaces" in between broadcast television channels. (56) With advances in wireless technology and the transition to digital television, it is now possible to build devices that can sense and avoid existing uses, and can use the most appropriate frequency for a specific local area. (57) Broadcasters and wireless microphone vendors who historically operated in these bands assert that new unlicensed devices in the white spaces would interfere with and degrade their services. (58) Potential vendors and users of unlicensed white-spaces devices claim that broadcasters are the ones standing in the way of innovation. (59) They see the incumbents' hypersensitivity to interference as effectively cutting off their ability to serve their own customers. Once again, the complexity lies in the technical details. Wireless devices that are designed to cooperate can function more efficiently in a shared environment, but, absent a reason to cooperate, each side will optimize for its own use.
As should be obvious, the network management and white spaces issues share a common structure. Both involve new forms of communication across data networks that potentially impact other users, as well as the integrity of the networks themselves. The major difficulty lies in the complexity and uncertainty of network uses. Neither case is a simple horizontal interconnection between two equivalent providers.
Network management involves the links between physical networks, logical routing systems, P2P applications, and content subject to intellectual property rights. White spaces involve interaction of many wireless devices under the control of different users, serving different functions.
These two case studies will be examined at greater length below. Some significant elements, however, bear noting at the outset. First, the issues are reversible; the "cause" of the problem depends on your perspective. This is because the key issues concern the interface between two industry segments. In the network management case, the connection is between broadband Internet access providers and application providers of P2P file-sharing. In white spaces, the link is between two different kinds of wireless system--unlicensed wireless data devices and broadcast television--which potentially interfere with one another. There is no a priori basis to privilege one use and declare the other the "cause" of trouble. (60) A regulatory system needs some independent basis to decide which actors to protect and which to constrain.
Second, the issues involve regulatory resolution of disputes that might be better addressed through technical mechanisms. The FCC decisions, whatever they are, will influence or sometimes mandate the technical approaches that the parties take to manage their networks, software, and devices. The more the private players can work together to develop joint solutions before going to the regulator, the better the outcomes are likely to be.
The problem is that regulation and private technical solutions are typically not connected. Either the industry can work out issues voluntarily or regulators get involved. In cases where a single dominant entity controls all the key elements of the network environment, such an approach is workable. That, however, no longer describes the communications and media environment today. There are still incumbents that dominate market segments and important platforms, but even they must interface with many other companies. The increasing decentralization and complexity of the network environment poses a challenge to the existing regulatory paradigm.
C. Limits of Current Approaches
The FCC has taken many positive steps over the years that paved the way for the Internet economy. (61) Today, however, the FCC finds itself out of step with the basic challenges it faces. The FCC is failing to address the key areas of network regulation for three reasons: (1) it was structurally designed for a different era; (2) its current environment creates a new set of technical challenges; and (3) it has chosen to divorce itself from proactive technical analysis in which it could be engaged.
Centralized regulators are trying to oversee a decentralized, interconnected environment. For a century, telecommunications and mass media have been subject to centralized control. (62) Dominant networks such as AT&T and the major television broadcasters exerted private control over user activity, and government regulators exerted public control over them. Today, the power of centralized networks is breaking down. (63) The Internet and intelligent devices are empowering users to organize and contribute to their communications and media experiences. (64) The value those users experience increasingly comes from applications and services independent of the networks themselves. Yet crucial policy decisions continue to be made by a central bureaucratic regulatory agency established in 1934.
With the convergence of information technology, communications, and digital media, FCC decisions are increasingly significant to virtually every company, and to the nature of public discourse, economic well-being, and communities. The FCC has made a number of beneficial decisions in recent years. It deserves significant credit for both actions and inaction that spurred the growth of the computer industry and the Internet. (65) Its core mission of ensuring that communications networks serve the public interest is more important than ever. The problem is that the FCC as currently constituted is ill-suited to address the major issues it faces today. It needs to be fundamentally restructured, not to destroy the agency, but to reinvigorate it.
Technological inventions take place independent of the desires of bureaucrats. However, regulatory decisions can significantly impact when and how those innovations unfold in the marketplace. The United States far surpassed other countries in initial deployment of the commercial Internet, thanks in part to policy decisions of the FCC and other parts of the government. (66) While inherent economic, cultural, and geographic factors played a role, they were not necessarily determinative. Today, the United States is falling behind in rankings of broadband Internet deployment, despite the same baseline advantages, because other countries have adopted different regulatory policies. (67)
The key challenge for the evolving Internet ecosystem is not competition, but cooperation. All participants in the market, from network operators to content providers, from search engines to wireless-device manufacturers, participate in the same interconnected network of networks. The fundamental technical and economic question is how they can act independently, pursuing their own private ends, while still contributing to the health and stability of the global mesh. The designers of the Internet brilliantly overcame this conundrum through both social and technical engineering. Their solutions, however, are increasingly failing to meet modern challenges.
The FCC is a direct descendant of the Interstate Commerce Commission ("ICC"), established by Congress in 1887 to oversee the railroad industry. (68) The ICC gave birth to the notion that some industries were so significant and posed such great public policy risks that they needed dedicated regulatory agencies to oversee their conduct. Railroads were the first continent-spanning industry. Their scale and scope forced the creation of modern corporate management principles. They quickly became central to commerce and transportation for the entire country. In so doing, they created enormous wealth. The railroads were seen as so powerful that ongoing supervision was needed to ensure they acted in the public interest. (69) The rationale for the ICC was paternalistic: market forces alone were insufficient to discipline the conduct of the railroads. (70) Only a government agency could ensure that the railroad industry served the interests of all Americans, not merely its own narrow self-interest.
If one pillar of the modern FCC is railroad regulation, the second is the technocratic instincts of the New Deal. When the extraordinary technology of communication over the airwaves became commercially viable, it was afforded the same treatment as the railroads, with the creation of a Federal Radio Commission ("FRC"). (71) By 1934, when the FRC was given jurisdiction over the telephone as well, and renamed the Federal Communications Commission, the New Deal was in full swing. In seeking to lift the United States out of the Great Depression, Roosevelt's followers erected the administrative state. (72) Cadres of experts would oversee large segments of the economy, mediating the excesses of the market.
The basic approach of the FCC was public utility regulation. The companies the FCC oversaw, such as AT&T and broadcasters, were private entities, but subject to far-reaching government oversight of their businesses. AT&T and its affiliates needed FCC approval for the prices they charged and the services they offered; broadcasters gained their most essential asset--access to the airwaves--on terms and conditions the FCC set. (73) The consolation for this intervention was often insulation from competition and the monopoly rents that followed.
After the 1960s, the FCC gradually moved away from the protection of regulated monopolies and toward a "deregulatory" approach of managed competition. (74) This approach is at the heart of the 1996 Telecommunications Act, the first comprehensive rewrite of the 1934 legislation. (75) Managed competition represents a major shift from the earlier regulatory approach, but it retains some key elements. (76) In particular, the FCC remains committed to achieving "public interest" objectives through direct oversight of dominant firms.
The FCC today is not only structurally ill-suited to tackle the challenges it faces, it has deliberately ignored the very skills it needs to cultivate. Administrative agencies are supposed to be subject-matter experts. That expertise is their fundamental analytical advantage over the generalist Congress in addressing technical issues. Yet the FCC today has limited technical capacity and is not using even the capabilities it has. The agency has only a small number of engineers on its staff. (77) With few exceptions, it relies almost entirely on the parties before it to define its agenda and provide it with data and analysis. (78) And even this limited technical function is not used to its fullest potential. After a brief resurgence under former Chairman Michael Powell, the FCC has taken technical analysis largely out of its policy development process. The position of Chief Technologist was not filled for an extended period, and the FCC's Technological Advisory Council did not meet for over three years. (79)
The situation became so bad that in June 2008, the Institute of Electrical and Electronics Engineers ("IEEE"), the primary professional association for engineers, sent a letter to the FCC complaining about its lack of technical expertise. (80) IEEE pointed out that some of the FCC's greatest recent successes came from its willingness to engage in dialogue with technical standards groups, industry, and academia. (81) The FCC had, for example, authorized unlicensed wireless devices in the 2.4 GHz band, making possible the flowering of WiFi and other technologies. (82)
The FCC did not always operate in such a technical vacuum. As IEEE pointed out in its letter, the FCC in the past regularly sought assistance from academia, federally funded R&D centers, and the National Academies on challenging technical matters. (83) Forty years ago, when the Commission began the Computer Inquiries, it faced a similar challenge of assessing the convergence of telecommunications and computing. (84) Its 1966 Notice of Inquiry and 1967 supplemental notice drew over 3000 pages of comments from over sixty parties, a huge number at the time. (85) The FCC recognized that it lacked the technical capacity to fully digest the issues, so it enlisted the Stanford Research Institute (now SRI International), a non-profit research and development organization, to review the comments and make recommendations. (86) SRI summarized the comments of the parties, and then offered its own recommendations to the Commission on how to proceed. (87) The Commission used the SRI report as the analytical basis for its initial Computer Inquiry decision, now referred to as Computer I. (88)
Around the same time, the Commission was implementing its historic Carterfone decision to allow interconnection of third-party devices with the telephone network. (89) Mindful of the technical complexity of the new interconnection regime it wished to implement, the FCC in 1969 asked the National Academy of Engineering to recommend a comprehensive interconnection policy. (90) This helped the FCC produce the Part 68 regime that successfully ushered in competition and innovation in network-attached devices. (91) Despite this successful experience, the FCC has not asked for studies from the National Academies since the 1970s. (92)
III. STANDARDS AS REGULATORS
While the FCC and other government actors struggle to make sense of the complex networked environment, a different kind of regulation exists within those industries. That regulation takes the form of standards.
Standards are more than just mechanical specifications. They have powerful impacts on market performance, innovation, and user empowerment. Firms in the telecommunications, digital media, and Internet markets simultaneously compete and cooperate. Network operators, such as Verizon and Comcast, interconnect with other networks, as well as with providers of applications and content, such as Disney and Google. At the same time, all these companies fight to capture customers and to control value chains. These relationships parallel the standards-based interfaces in the unregulated computer industry. In both cases, technical interfaces are bound together with business terms.
The Internet, the confluence of communications networks and formerly isolated computers, represents the furthest advance of standardization. An open standard, TCP/IP, defines what it means to be part of the Internet. (93) Many other standards, developed over time by various sources, set the terms of engagement for the multiplicity of participants in the Internet economy. This network of networks has produced an extraordinary outpouring of innovation and productive economic activity. (94) It has also created a new set of business and competitive challenges, which strategists and policymakers are still struggling to understand.
The dominant original insight of cyberlaw, developed in the work of Lawrence Lessig and others, is that the software of the Internet operates alongside formal law to shape conduct on the network. (95) That software often encodes technical standards, such as digital rights management technologies for controlling access to copyrighted content. (96) While mainstream cyberlaw scholarship thus recognizes the importance of standards, it fails to consider the crucial questions of how those standards develop and how they can be applied outside of traditional private mechanisms. (97)
A. Functions of Standards
A standard is a common specification or model for market participants. (98) Such technical standards should be distinguished from performance obligations set by regulatory agencies. (99) Regulations such as the Department of Transportation's fuel economy requirements for auto manufacturers or the Environmental Protection Agency's acceptable levels of particulate matter in drinking water are not really standards at all. They do not specify a common mechanism for companies to employ in order to reach those levels. Put another way, a performance standard defines outputs and a technical standard defines inputs. (100) Performance standards are a staple mechanism of the administrative state. Technical standards, the focus of this Article, are usually developed in the private sector. (101)
Technical standards are essential to the communications, computer, and Internet industries. (102) By their very nature, these industries involve connections between software, hardware, content, and services of different providers. No company, no matter how dominant, can totally avoid interfacing with someone else. Systems must therefore have interfaces. The more standardized those interfaces, the easier it is to connect with them. The shift from monopoly to competition in telecommunications, from integrated mainframes to independent hardware and software in computing, and from private data networks to the Internet all greatly enhanced the importance of standards.
Specifically, IT standards serve three basic needs: (1) allowing systems to interoperate, (2) allowing applications and content developed for one platform to be ported to others, and (3) allowing data exchange among otherwise distinct systems. (103) Taken together, these capabilities mean that systems can be built in a modular configuration. (104) Instead of one integrated whole, providers can construct smaller pieces of the system, connect them at a defined interface, and not worry about what happens on the other side of that interface. Kim Clark and Carliss Baldwin, in their landmark study, Design Rules, identified modularity as a key reason for the success of the computer industry. (105) Innovation thrives when new and established providers can compete to offer particular components of a complex system like a PC.
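The modular logic described here can be sketched in a few lines of code (a toy illustration with hypothetical names, not drawn from the sources cited in this Article): so long as both sides honor a defined interface, either side can be swapped out without the other noticing.

```python
from abc import ABC, abstractmethod

class NetworkInterface(ABC):
    """The defined interface: the only thing either side must agree on."""
    @abstractmethod
    def send(self, payload: str) -> str: ...

class VendorARadio(NetworkInterface):
    """One hypothetical vendor's implementation of the interface."""
    def send(self, payload: str) -> str:
        return f"A:{payload}"

class VendorBRadio(NetworkInterface):
    """A competing vendor's independently built implementation."""
    def send(self, payload: str) -> str:
        return f"B:{payload}"

def application(link: NetworkInterface) -> str:
    # The application neither knows nor cares which vendor's module
    # sits on the other side of the interface.
    return link.send("hello")

# Either module can be plugged in without changing the application.
assert application(VendorARadio()) == "A:hello"
assert application(VendorBRadio()) == "B:hello"
```

The interface plays the role of the standard: competitors innovate on their own side of it, and the system as a whole still interoperates.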
In communications systems, standards are essential whenever networks must interconnect. A television needs standards to pick up broadcast stations. A telephone handset needs standards to link to a telephone wire. A router on the Internet needs standards to know how to exchange traffic with routers on other data networks. The history of the communications industry can be seen as a slow march toward increasingly interconnected systems. Radio stations broadcast on their own channels. The development of the telephone raised the issue of how two networks could interconnect to hand off calls. (106) A few decades later, the FCC's Carterfone decision mandated interconnection standards between the telephone network and end-user devices. (107) The Computer Inquiries added interconnection standards between telecommunications and data processing equipment in the network. (108) Finally, the Internet put into practice a network defined primarily by its ability to interconnect. (109)
As all communications systems converge on digital transmission and packet switching, interconnection is increasingly becoming the defining process for whole industries. (110) Communications systems are increasingly adopting modular, layered architectures. (111) This makes standardization all the more essential. (112) The FCC today addresses an array of standardization issues in many different areas. (113)
Economists studying standards have analyzed the competitive impacts of standardization. (114) The decision to adopt a standard represents a choice on the part of a firm to cooperate rather than to compete. Once a standard gains momentum and sufficient adoption, other firms may feel compelled to support it in order to reach a large market. (115) In network industries, network effects can make a standard difficult to deviate from, even if it is not the best solution. (116) This "lock-in" effect counterbalances the benefits of standardized approaches.
There are many different kinds of standards, and several ways to categorize them. Standards serve three legally significant functions: interoperability, performance, and coordination. (117) An interoperability standard, such as the universal serial bus ("USB") technology built into virtually all current personal computers, allows different systems to work together as a single unit. A performance standard, such as the Department of Transportation's fuel economy requirements, obligates firms to meet some specified level of performance. A coordination standard, such as the FCC's Part 15 rules for unlicensed wireless devices, defines an acceptable arrangement within which two or more independent actors can interact. The devices need not interoperate; the standards allow them to coexist.
There are two basic mechanisms by which standards develop in the private sector: proprietary and consensus processes.
One or more companies propound proprietary or "de facto" standards, such as Microsoft's Windows operating system or Wal-Mart's purchase order specifications for its vendors. (118) Others choose to follow the proprietary standard, either because of its inherent merit or because of the economic value derived from participating in a platform market. Proprietary standards are extremely widespread in many markets. They are frequently necessary when a product involves independently produced complements or involves arm's-length business relationships. (119) A sufficiently powerful proprietary standard can become a platform, as discussed above, or simply a bottleneck to competition if use of the standard is too heavily restricted. (120) Both cases may call for a response using antitrust, intellectual property, or regulatory tools.
Consensus standards are developed through a voluntary cooperative process. (121) The Internet's TCP/IP protocol, overseen by the Internet Engineering Task Force, is a canonical example. (122) Consensus standards bodies raise several legal issues. Since they can involve discussions and agreements among potential competitors, they necessarily raise antitrust questions. (123) In high-technology industries, where standards bodies are an entrenched part of the environment, these issues have largely been resolved through careful structuring of the standards organizations. A second set of questions concerns strategic behavior by participants in standards organizations. (124) Controlling a standard is competitively valuable, so firms can be expected to engage with standards bodies in ways calculated to serve their own interests.
A third category of policy issues around standards bodies involves intellectual property and the importance of "open standards." (125) The exact definition of open standards is contentious, but is generally understood to include procedural protections for open and fair participation in the standards development process, and licensing of the standard and associated intellectual property on either a royalty-free basis or a reasonable and non-discriminatory basis. (126) Moreover, standards bodies have various policies regarding the appropriate intellectual property practices for contributions to standards bodies. A large number of bodies are involved in the promulgation of such open standards. (127) Sometimes these standards bodies have overlapping areas of consideration. Different standards organizations have their own definitions of what makes their standards open. (128)
Every standards development mechanism has positive and negative attributes. The more open and informal the process, the greater the range of views that can be taken into account. On the other hand, such openness can lead to delay and other inefficiencies. The IETF is rare in its ability to function so effectively despite its radically decentralized and open structure. (129) One author has even claimed that the IETF standards process was the only real-world example of valid "practical discourse" according to the ethics of philosopher Jürgen Habermas. (130)
There is value, in fact, in having multiple types of standards organizations. Organizations can decide which sort of standard is most appropriate for what they hope to achieve. The old saw that "the great thing about standards is that there are so many of them" holds some truth. Standards ultimately succeed or fail based on the response of the marketplace. Having too many standards, or too many standards organizations, covering the same ground may be significantly less costly than converging too soon on a single approach. Even so, the economic literature on standards is rife with examples of "lock-in," where inferior standards were used for far too long. (131)
The standards process itself creates an ancillary benefit in bringing stakeholders together. Standards organizations are neutral meeting places for a defined segment of the interested community. Fully open processes such as the IETF's allow participants to self-define their interests. (132) This allows contributions to the process to come from unexpected places. On the other hand, more limited standards groups may be more efficient, or they may make certain participants more comfortable engaging in private negotiations. In both scenarios, the standards process brings interested parties into a conversation. This can create a sense of community. It can also create an environment in which additional issues can be discussed, beyond the initial standards debate.
B. Government and the Standards Process
Standards can emerge not only from private mechanisms, but also through government involvement. (133) Where industry standards and associated norms are well-entrenched and operating effectively, regulators need not intervene. The Internet is a model of a well-functioning non-governmental standards regime. (134) The Internet Engineering Task Force has succeeded in gaining widespread adoption for its specifications, based on a strong set of social norms and an effective procedural regime for standards development. There is simply no reason for government to interfere with this system. In fact, to do so would risk destabilizing the Internet industry.
In the IETF's world of "rough consensus and running code," engineers have more legitimacy than governments to make basic technical decisions. (135) A more controversial example of deferral to private standards is the FCC's decision not to mandate a transmission standard for 2G mobile phones. (136) In that case, the TDMA and CDMA standards, as well as Motorola's incompatible iDEN technology, each won over some of the major networks in the U.S. and abroad. (137)
The opposite situation is when the private sector cannot develop necessary standards without government involvement. In markets where there is not a universally accepted standards arbiter, such as the IETF, or where government involvement is the rule rather than the exception, it may not be feasible for companies to come together and develop necessary standards on their own. The market may be sufficiently fragmented that no one approach gains a critical mass of support. Conversely, if a few large players dominate the market, other companies may not perceive any standard they support as being legitimate. In other cases, there may not be sufficient time for an open standards process such as the IETF to run its course. Finally, where standards impinge on critical public policy issues, such as public safety, government involvement may be essential.
The development of digital television standards illustrates most of these factors. (138) The FCC created an advisory committee that forced all the key companies to work together and create a standard that incorporated elements of all their technologies. (139) Had there been a private standards war, the implementation of digital TV might have been significantly delayed, if not derailed.
As a general matter, the United States is reluctant to engage in mandatory standards-setting. (140) This is in keeping with American faith in the marketplace and skepticism of government-defined solutions. In particular, the United States took the position early in the development of the commercial Internet economy that standards for electronic commerce and related activities should take place in the private sector, in contrast to the more top-down government involvement in other major industrialized nations. (141)
The U.S. government is heavily involved with information technology standardization in other ways. First, government agencies are significant consumers of standards-based products and systems. Government representatives participate directly in many private standards bodies, representing their own agencies' interests as potential users of standards. Under OMB Circular A-119 and the National Technology Transfer and Advancement Act of 1996, the federal government directs its agencies to utilize commercial open standards whenever possible. (142)
Beyond the user role, the government engages with the standards process in a wide variety of ways. The National Institute of Standards and Technology ("NIST") is actively engaged in promoting standards. Though NIST is an obscure agency, its annual budget, at more than $1.5 billion, is more than triple that of the FCC. (143) Other federal agencies such as the Office of Management and Budget ("OMB") and General Services Administration ("GSA"), which set purchasing policies for federal agencies, and the U.S. Trade Representative ("USTR"), which is active in international standards discussions, also play a significant role in the standards process. The U.S. government also serves as an official representative to some international standards bodies, such as the International Telecommunication Union ("ITU").
The federal government played an important indirect role in the success of Internet standards. The key Internet standards were the creation of the IETF, which sat outside the formal standards bodies. The International Organization for Standardization ("ISO"), made up of official standards bodies from each country, participated in developing an alternative model for data networking called Open Systems Interconnection ("OSI") in the late 1970s. (144) The ITU developed a set of standards, called X.400, as an alternative to Internet e-mail standards. (145) Both of these were multilateral efforts intended to be "open." Their outputs, however, were significantly more rigid than the Internet standards they sought to supplant. The U.S. government backed away from mandating internal adoption of particular standards, allowing the market to sort out the best solution. (146)
The FCC has a long history of involvement with standards. Standards-setting, primarily for broadcasting, was one of the Commission's functions from the beginning. (147) In allocating spectrum licenses to particular holders, the FCC defined the technical limits of transmissions in those frequencies. Those technical specifications for broadcasters effectively framed the technical standards for radio and television receivers. (148) The Commission went further to select application-level standards for the services delivered through these broadcast licenses, including AM and FM radio, as well as broadcast television. (149)
Section 256 of the Communications Act, recognizing prior practice, expressly authorizes the Commission to participate "in the development by appropriate industry standards-setting organizations of public telecommunications network interconnectivity standards." (150) The Commission also has jurisdiction over the North American Numbering Plan, which defines processes for allocating telephone numbers. (151) In wireless telecommunications, the FCC's standards-setting activities are even more direct. The Commission specifically develops and authorizes standards for many of its wireless allocations, including the AM and FM standards for radio, the NTSC (analog) and ATSC (digital) standards for television, the Part 15 specifications for unlicensed wireless devices, and the standards governing ultrawideband devices. Internationally, it participates in the World Radio Conference ("WRC") every four years, which harmonizes global standards for broadcasting and wireless systems.
The FCC is best known for situations in which it actually sets, or declines to set, standards for the industry, such as ATSC for digital television or AMPS for first-generation mobile phone service. (152) When the FCC does so, the standards involved are typically interoperability standards. All devices following the standard are part of a virtual meta-system. Most of the economic literature focuses on interoperability questions such as the famous clash between the Betamax and VHS standards for videocassette recorders or, more recently, Blu-ray versus HD-DVD for high-definition video discs. (153)
The regulatory questions for interoperability standards concern whether to mandate a standard and which standard to pick. The FCC chose to set a standard for FM radio and analog mobile phones, but declined to do so for AM stereo or "second-generation" digital mobile phones. (154) These decisions come down to a judgment about whether the market, left alone, will produce standards that facilitate beneficial network effects.
C. Carterfone and Part 68
Coordination standards regulate industry behavior in two ways: they can define the terms of competitive engagement and they can integrate public policy considerations into the technical "code" of the industry. When the FCC reached the Carterfone decision to remove AT&T's restriction on "foreign attachments," and adopted the associated Part 68 standards for attachments to the telephone network, it opened the door for an explosion of competition and innovation. (155) Carterfone represented the FCC's first foray into a complex network market. This change enabled new entrants to compete with the incumbent network operator. (156)
The FCC's 1968 decision invalidated the foreign attachment provisions in AT&T's tariffs, allowing connection of not only the Carterfone, but any other third-party device that did not harm the network. (157) This action alone, however, was not self-executing. AT&T could no longer prohibit all end-user attachments to its network, but it could still exclude those that might be harmful. Carterfone built on earlier antitrust jurisprudence which held that a platform owner could not preclude third-party add-ons, but could adopt standards for them. (158) The initial Carterfone decision left the definition of the conditions for interconnecting with the network to AT&T.
Carterfone merely adopted a principle. Specifically, Carterfone invalidated the foreign attachment provisions in AT&T's tariffs, finding them not to be "just and reasonable" as required under the Communications Act. (159) In the earlier Hush-a-Phone case, an appeals court had rejected AT&T's foreign attachment restrictions for precluding attachment of a rubber cup to improve privacy of conversations. (160) However, the FCC allowed AT&T to file revised tariffs, which precluded most other forms of terminal attachment, including any involving electrical connections to the network. (161) Carterfone went further, invalidating all prohibitions on foreign attachments. It thus adopted the important principle requiring AT&T, the regulated telephone network, to interconnect with third-party network devices. The Carterfone principle, in effect, separated the regulated network infrastructure from the competitive equipment market. (162)
The enduring significance of the Carterfone principle is threefold. Terminal equipment itself proved to be a substantial market opportunity, giving rise to innovative new telephones, fax machines, and other end-user devices. Businesses took advantage of Carterfone to connect their internal private branch exchanges ("PBXs") to the public telephone network, revolutionizing the market for business communications. Eventually, with the introduction of digital modems and networked personal computers, the Carterfone principle allowed for the creation of dial-up Internet service providers ("ISPs"), the basis for the early growth of the Internet. (163)
Carterfone itself, however, produced few of these benefits. The decision did nothing to remove AT&T's stranglehold over the architecture of the public switched telephone network. It merely denied AT&T the ability to issue a flat prohibition on device interconnection. Recognizing this, AT&T responded to Carterfone with a set of new tariffs which eviscerated the ruling. AT&T's post-Carterfone tariffs allowed foreign attachments, but only through a "protective connecting arrangement" ("PCA"). (164) The PCA involved an AT&T-manufactured device that sat between the third-party terminal equipment and the network, and a monthly service fee to AT&T. In practice, the charge for the PCA was sufficiently high to make independent terminal equipment uneconomical except for PBX systems involving at least ten phones. (165) Even then, AT&T sometimes delayed interconnection with terminal devices because the PCA equipment was unavailable. (166)
AT&T argued such PCAs were necessary to safeguard its network. These claims were as specious as its prior assertion that foreign attachments had to be completely prohibited. In some cases, AT&T's regional Bell Operating Companies ("BOCs") purchased terminal equipment from the same manufacturers who sold directly to end-users. (167) Identical equipment required a PCA when connected privately, but not when sold by AT&T. The FCC allowed AT&T's PCA tariffs to go into effect. (168) In 1969, however, the FCC asked the National Academy of Engineering to evaluate whether the PCA approach was technically required. (169) The National Academy report concluded that a regime of FCC-mandated technical standards was an acceptable alternative. (170) The report paved the way for the FCC decision that realized the promise of Carterfone.
In 1975, the FCC adopted a new regime of technical standards for interconnection of terminal equipment with the telephone network. (171) The specifications themselves are located in Part 68 of the FCC's rules. (172) Part 68 replaced the PCA regime with one of certification. Any device certified by the FCC as complying with the Part 68 standards could be connected to the telephone network. AT&T could not require a special connective device or charge a fee.
Part 68, not Carterfone, marked the true beginning of open interconnection to the public telephone network. AT&T, recognizing the significance of this decision, challenged the Part 68 rules in court. (173) The rules did not go into effect until the Supreme Court denied certiorari in 1977. (174) Even then, AT&T launched a final effort before the FCC to prevent the complete deregulation of terminal equipment. (175) Only when the FCC's Computer II decision implemented full detariffing did the era of true terminal equipment competition begin. (176)
Thus, Carterfone itself, while of great symbolic significance, had limited practical impact. It was only when the FCC later adopted technical standards for device interconnection that real competition could emerge. The FCC lost sight of this fact in subsequent decisions. It moved away from a standards-based approach, gradually weakening its ability to promote the robust innovation that Carterfone came to symbolize. The Commission should return to standardization in order to reinvigorate Carterfone for the present age of digital convergence.
IV. THE AGENCY AS STANDARDS CATALYST
The FCC can take advantage of standardization to overcome limitations of traditional regulatory techniques. The FCC should recast itself as a standardization organization in virtually everything it does. (177) In some cases, the Commission will need to establish standards itself. In others, it can look to the private sector to do so. (178) The FCC has a range of options: it can defer to private solutions (whether proprietary private standards or consensus industry efforts), establish self-regulatory organizations, certify externally developed standards, or define and impose standards directly. Each of these has a place, depending on the nature of the issue at hand.
Standardization represents a new direction for communications law. Communications regulation developed as a series of isolated silos covering broadcasting, telephone networks, cable television, wireless communications, and other services. Now those networks are converging. When everything can be reduced to an interchangeable digital bit, standards define how information flows across the interconnected network of networks. That puts standards squarely within the FCC's mandate. As a regulator, the FCC's function is to police the vertical and horizontal relationships among network operators, their users, and the providers of content, applications, and services on those networks. The FCC should use and encourage private standardization efforts as a means to achieve its policy objectives.
The existing literature on agency involvement with technical standardization focuses on situations in which a single standard defines an industry, such as digital video disc ("DVD") encoding or FM stereo broadcasting. In a complex network environment, however, standards play a more complicated role. Multiple standards may coexist. Alternatively, a relatively lightweight standard at the interface between key layers of the network may allow for great variety on either end, as with the TCP/IP standard which defines the Internet. When standards set boundaries between market segments that remain distinct, the FCC has many more options at its disposal than when it is called upon to set an industry-wide standard.
A. Standards Facilitation
As a starting point, the Commission should greatly expand its technical competency, as the IEEE recommended in its letter to Chairman Martin. (179) The FCC staff is dominated by lawyers and, to a lesser extent, economists. The agency has an Office of Engineering and Technology that concentrates on technical issues such as wireless-device testing. The FCC needs greater technical expertise, however, throughout its policy-making bureaus and on the staffs of the FCC Chairman and Commissioners. With more engineers in-house, the FCC will be better able to judge the technical implications of its decisions and to understand where industry standards could substitute for regulation.
The FCC should also reinvigorate the Technological Advisory Council ("TAC"), a federal advisory committee that helped the FCC understand key technical issues. The TAC, which was very active under Chairman Michael Powell, became moribund under his successor, Kevin Martin. (180) Having representatives of vendors, the academic community, and industry standards organizations directly involved in an FCC advisory mechanism would bring the agency into much closer contact with external developments.
When needed, the FCC should solicit input from technical organizations, such as the National Academy of Engineering, or from academic research organizations. It could also call upon standards organizations directly to offer their views on the standards landscape in particular areas. (181)
The FCC's primary mechanism for interfacing with the standards process should be to certify privately developed standards. Certification would mean FCC approval of a particular technique. FCC-certified standards could either be pre-existing standards from a recognized standards-development organization, new standards created through some collaborative process, or privately developed specifications that are documented and made available. (182) In essence, companies that proposed standards would be voluntarily committing themselves to accept a certain form of interconnection or shared access. FCC certification would give protection against both opposition from competitors and FCC imposition of different obligations.
The biggest problem with private development of coordination standards is that the standards will benefit only the companies involved. Network operators can always adopt private specifications or even undocumented standards. The FCC gets involved precisely because sometimes those private efforts fail to benefit users or the market as a whole. In certifying standards, therefore, the FCC must ensure that some check exists to promote the public interest.
One option would be to create new self-regulatory organizations ("SROs"), through which industry actors could work to resolve thorny problems. An FCC-chartered SRO could serve three primary functions: identify relevant norms, issue advisory opinions, and adjudicate disputes. (183) While an SRO might be useful for some situations, however, it would be limited to those problems that are sufficiently well-defined and subject to industry resolution.
The other mechanism the FCC could use to police standards is to require that they be developed through open processes and made freely available. (184) There are many definitions of open standards. (185) Generally speaking, an open standard is freely accessible and has been developed through an open process incorporating procedural protections. Freely accessible means that the standard itself is not subject to licensing terms, fees, or intellectual property protections that limit who can review its text and implement it. In effect, the standard must be open source.
The National Technology Transfer and Advancement Act specifies five requirements for standards adopted by federal agencies, derived from OMB Circular A-119: openness, balance of interest, due process, an appeals process, and consensus (defined as general agreement, not necessarily unanimity). (186) The American National Standards Institute ("ANSI") also has general procedural requirements for accredited industry standards bodies. (187) The FCC adheres to a practice of delegating only to ANSI-accredited standards bodies. (188) Thanks in part to requirements such as these, private standards bodies have become more open over time. (189)
Where existing standards address market needs, they could be grandfathered in, on either an individual or blanket basis. Most or all standards approved through recognized industry standards bodies, such as the Internet Engineering Task Force, which foster active participation and review by interested parties, could be grandfathered into such a certification regime. (190)
At a minimum, standards that the FCC certified would have a sort of "Good Housekeeping seal," which would encourage voluntary adoption. Such seals are used in many industries. (191) Programs such as the Department of Energy's EnergyStar certification, for electronic equipment that meets requirements for limited power consumption, show how the private sector will go along with government-defined voluntary certifications that have marketing benefits. (192) Online, seals such as the Better Business Bureau and TRUSTe are adopted, even though strictly voluntary. (193)
An FCC standards-certification process would be in line with the use of "private ordering" mechanisms by administrative agencies. (194) Private ordering can create results that are preferable to public processes. (195) Commentators examining this phenomenon have focused on the need to ensure legitimacy of the private decisions involved. The substantive and procedural protections of the administrative rulemaking process do not necessarily apply to the decision-making process of private actors. (196) Steven Schwarcz, in a survey of the private ordering literature, argues that delegation to private actors can be safeguarded by imposing constraints directly on the private actors that relate to the underlying goals the agencies seek to accomplish. (197) An FCC standards certification process with an open standards requirement would reflect this approach.
At first blush, the standardization approach seems to restrict market actors more than the alternative approaches. The regulatory and contractual models assume a framework of negative liberty, in which market participants generally have the right to remain free from interference with their actions. (198) Network providers are generally considered to have the right to engage in any action not expressly prohibited. Thus, the FCC's investigation of Comcast's alleged BitTorrent throttling focused on whether the particular conduct at issue should be defined as impermissible. (199) The standardization approach, in contrast, affirmatively defines certain practices that are permissible.
In reality, the greatest threat to market-driven innovation in convergence industries is not what is clearly prohibited, but what is uncertain. Companies deciding whether to put capital at risk are unlikely to do so if the playing field may change after their investment is already sunk. When the rules are clear, even if they impose limits, companies can make decisions about whether the benefits of the innovation or investment exceed the costs. For new and innovative kinds of investment, the greater the uncertainty about those costs, the less likely the investment will take place. Such "chilling effects" are well-understood in the intellectual property area. (200)
The standardization framework does not preclude deployments outside the standards. It essentially defines a set of "safe harbors" where market participants have certainty that their conduct will not run afoul of the regulatory process. For most forms of potential legal liability, the law provides a safe harbor mechanism that protects online intermediaries. For tort claims such as defamation, that safe harbor appears in Section 230 of the Telecommunications Act of 1996. (201) The Digital Millennium Copyright Act includes a similar provision, Section 512. (202) These provisions shield intermediaries from secondary liability for their users' conduct, so long as they follow basic requirements. (203)
The Section 230 safe harbor in particular has been important to the commercial development of the Internet. It states that:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.... No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section. (204)
Without this provision, the commercial Internet as we know it could not exist. If Google, for example, were strictly liable for all material in its search index, or Amazon.com for any comments posted in its book reviews, those companies would have to make radical changes to their business models. (205) The editorial filtering approach that works for a daily newspaper, which can review and judge what content to publish, simply does not work in a digital environment where the costs of information creation, aggregation, and distribution are so much lower. (206)
The point of safe harbors such as Section 230 is that good actors should be encouraged, not discouraged. Without legal protection, online intermediaries who affirmatively tried to remove infringing and illegal material would expose themselves to liability. The safe harbor helps such companies, while still allowing liability for those that are found to be bad actors, such as Napster and Grokster. In the cases described above, broadband access providers that seek to work with P2P companies and white-spaces-device manufacturers that seek to avoid interference are the equivalent of these responsible intermediaries.
By providing guidance and protection for such companies, the FCC would create incentives for participation in cooperative technical efforts. This would make it less likely that potential conflicts would play out either in a technological arms race or through the blunt instruments of regulation and the courts.
Two recent case studies illustrate the potential value of a standards-centric approach.
B. Case Study 1: Network Management
The battle over broadband network management practices illustrates the value of a standardization approach. In 2005, the FCC adopted an order classifying broadband Internet access service over telephone lines as an integrated "information service." (207) The Commission classified broadband over cable television networks as an information service around the same time. (208) These decisions meant that the underlying network transmission capabilities for broadband services were not discrete "telecommunications services" under Title II of the Communications Act. (209)
Telecommunications services are subject to several regulatory obligations. The most important here are the requirements of interconnection and unbundled access. (210) A provider of telecommunications service must make the underlying components of that service available to competitors at non-discriminatory and reasonable rates. As a practical matter, therefore, the FCC decision meant that independent ISPs were not entitled to interconnect with the broadband facilities of incumbent network operators, nor were those operators subject to the traditional non-discrimination obligations of common carriage. (211)
To leaven its decision, the FCC adopted an Internet policy statement alongside its wireline reclassification order. (212) The policy statement declared that:
[C]onsumers are entitled to access the lawful Internet content of their choice ... [;] consumers are entitled to run applications and use services of their choice, subject to the needs of law enforcement ... [;] consumers are entitled to connect their choice of legal devices that do not harm the network ... [;] consumers are entitled to competition among network providers, application and service providers, and content providers. (213)
The four principles are subject to a blanket caveat, included in a footnote, that "[t]he principles we adopt are subject to reasonable network management." (214)
The FCC stated that the policy statement itself did not constitute enforceable rules. It indicated that it would implement the statement as part of its ongoing policy-making activities. (215) This left a great deal of uncertainty about whether the document would actually support FCC action, and if so, how. Because the broadband networks are now classified as information services, their regulatory status is unclear. (216)
If the FCC fashions rules for broadband access, it must draw on the Commission's ancillary jurisdiction, similar to cable television rules before Congress adopted the cable-specific Title VI of the Communications Act. (217) In affirming the Commission's decision to reclassify cable modem service as an information service in National Cable and Telecommunications Association v. Brand X, the Supreme Court noted in dicta that the Commission could adopt such rules to deal with any ongoing competitive problems. (218) However, the limits of that jurisdiction remained to be tested.
In late 2007, press reports circulated that Comcast, the nation's largest cable broadband provider, was manipulating P2P file-sharing traffic on its network. (219) Comcast initially denied the reports. It eventually acknowledged that it had implemented traffic management systems that targeted P2P file-sharing services such as BitTorrent. The Comcast system generated artificial "reset" packets when users of these applications transmitted over a certain volume threshold, causing transfers to proceed more slowly or not at all. Comcast offered several explanations and descriptions of its traffic management techniques, ultimately committing to replace them with a new application-neutral system by the end of 2008. (220)
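The mechanism described above can be sketched as a toy model: a middlebox classifies each flow, counts its bytes, and forges a TCP "reset" once a P2P flow crosses a volume threshold. The threshold value and classification rule below are hypothetical illustrations, not Comcast's actual parameters.

```python
# Toy model of volume-triggered reset injection. The threshold and the
# protocol classification are hypothetical, chosen only to illustrate
# the decision logic described in the text.

THRESHOLD_BYTES = 10_000_000  # hypothetical per-flow upload trigger


class ResetInjector:
    def __init__(self):
        self.bytes_by_flow: dict[str, int] = {}

    def observe(self, flow_id: str, protocol: str, nbytes: int) -> bool:
        """Record traffic for a flow; return True if the middlebox would
        forge a RST packet (a P2P flow over the volume threshold)."""
        total = self.bytes_by_flow.get(flow_id, 0) + nbytes
        self.bytes_by_flow[flow_id] = total
        return protocol == "bittorrent" and total > THRESHOLD_BYTES
```

The key point for the legal analysis is that the forged reset is indistinguishable, to the endpoints, from a genuine termination of the connection by the other side.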
Comcast's actions made it a target for advocates of network neutrality, the view that Internet network operators should not discriminate in their treatment of applications and content. (221) Providers of telecommunications services have traditionally been considered "common carriers," required to treat all traffic on their network equally. (222) The FCC's decision to classify wireline broadband as an information service removed the telephone companies' digital subscriber line ("DSL") broadband services from this category, and affirmed that competing cable modem services would be similarly exempt from common carrier regulation. Academics and policy advocates began to express concern that, given the duopoly structure of the broadband access market, cable modem and DSL providers would discriminate against unaffiliated Internet services and content. (223) Network neutrality became a significant political battle, with several bills introduced in Congress and conditions imposed in telecommunications merger reviews. Until Comcast's P2P throttling came to light, however, there were few concrete examples of broadband network operators actually differentiating in their treatment of applications. (224)
A coalition of advocacy groups filed a complaint with the FCC alleging that, by degrading or blocking access to P2P file-sharing services, Comcast had violated the FCC's 2005 policy statement. (225) The FCC solicited public comment and held two public hearings on the matter. Comcast argued that the policy statement on its face declared that it was not a set of enforceable rules and that its conduct constituted permissible "reasonable network management" to address congestion from P2P traffic. (226) The FCC rejected these objections and issued an order granting the complaint. (227)
In its order, the FCC sanctioned Comcast for violating the policies it articulated in the policy statement. The Commission first determined that it had sufficient jurisdictional authority under Title I to adjudicate the complaint. (228) It expressed its intention to evaluate alleged violations of the Policy Statement on a case-by-case basis, rather than through comprehensive rulemaking. (229) It then concluded that Comcast had engaged in impermissible application blocking. (230) The Commission declined to impose any fines or other sanctions on Comcast, ordering the company only to disclose full details of its practices and to cease and desist its application-based traffic management. (231)
The Comcast P2P Order was a landmark decision, but a flawed one. In sanctioning Comcast, the FCC put teeth into the aspirational words of the Policy Statement. For the first time, the FCC put itself on record as promoting the openness of the Internet. After several years of declining to regulate Internet-based services, the FCC clearly signaled that it intended to exercise oversight on broadband networks, even though the networks were classified as information services. There are significant legal questions about the FCC's use of adjudication to implement the non-enforceable Policy Statement and about whether the FCC exceeded the scope of its ancillary jurisdiction. (232) Comcast is challenging the order in court. (233) Moreover, the FCC action occurs against the backdrop of efforts to pass network neutrality legislation.
Whether or not the Comcast P2P Order ultimately sticks as the basis for FCC action to promote Internet openness, the Commission has established an approach that differs from the traditional models. In rejecting Comcast's network management scheme, the FCC diverged sharply from its deregulatory path of recent years. The FCC's conclusion that Comcast was subject to special scrutiny is a rejection of the efficiency approach and its focus on generic market power tests. The FCC did not conclude that Comcast was necessarily disadvantaging P2P applications to protect its existing video distribution business, or was otherwise engaged in an example of market failure. (234) Instead, the FCC hearkened back to a policy statement based on communications exceptionalism, emphasizing difficult-to-quantify concepts such as innovation and user empowerment. (235)
At the same time, these public-interest-style obligations were lodged within an approach that varies greatly from the classical public utility model. The FCC did not tell Comcast how to price its services, or how to manage its network in a neutral manner. It explicitly refused to adopt a comprehensive regulatory scheme.
The Commission's decision in the Comcast case thus represents a new direction. Unfortunately, that new direction is flawed. While the FCC's use of adjudication may avoid some of the problems of traditional regulatory techniques in the new digital converged environment, it fails to account for the new realities of that environment. The FCC's decision only told one network operator what it could not do. It said little about what congestion management practices might constitute "reasonable network management," other than caps on bandwidth utilization. (236)
As a practical matter, network operators can be expected to move in that direction. Comcast has already announced a 250 GB monthly cap and several other broadband operators are exploring both caps and metered billing. (237) The bigger problem with the FCC approach is that it gives too little guidance to the industry. Comcast and other broadband providers know how they cannot manage traffic, but they have virtually no information about how they can. Moreover, the FCC offered only a cursory analysis of the critical term "reasonable network management." The Commission determined that "experts in the field generally disagree strongly with Comcast's assertion that its network management practices are reasonable." (238) However, the Commission did not itself evaluate the technical claims nor did it directly cite to any pronouncements of Internet standards bodies. Instead, when stating that the IETF "has promulgated universal definitions for how the TCP protocol is intended to work ... [and] Comcast's practices contravene those standards," (239) it cited to experts who testified at its field hearing and to comments filed in the proceeding. (240)
The FCC's interpretation of "reasonable network management" is thus a sort of "we'll know it when we see it" approach. This may be appropriate in cases where a given practice is far beyond the pale. However, network engineers are not uniform in their views about what techniques are "reasonable." Some very widespread practices--including the use of content delivery networks such as Akamai to redirect traffic to local caching servers, Network Address Translation to increase the effective number of Internet Protocol addresses, port 80 spoofing to tunnel through firewalls, and multiple parallel TCP sessions (as is common among P2P file-sharing applications)--are either in the grey areas beyond established standards or contrary to established best practices. (241) In the specific case of Comcast, an equipment vendor called Sandvine developed the network management technology that throttled P2P file transfers. (242) Comcast was not even the originator of the technique. The FCC approach makes it challenging for companies such as Comcast to judge, in advance, whether regulators will, after the fact, consider a technology appropriate.
The real problem with the FCC's approach to network management is that it ignores the potential for leveraging standards. The Commission framed its Comcast decision as a step to promote the openness of the Internet. (243) The advocacy groups leading the charge for action saw it as a step toward network neutrality. However, the specific contents of the decision were technical mechanisms. (244)
Comcast's broadband network management practices effectively define the interfaces between the network infrastructure and P2P applications. The focus of Comcast's activities was to limit P2P traffic. There are reasons, however, why Comcast and other broadband operators would want to encourage P2P deployment under the right circumstances. P2P technology provides major efficiencies in the distribution of rich media content. (245) Because they use the power of many computers throughout the network to deliver content, P2P distribution systems can be cheaper and more scalable than centralized systems. That is why services that have nothing to do with distribution of infringing rich-media files, such as the Skype VOIP service, use P2P architectures.
Prior to the FCC action, Comcast negotiated an agreement with BitTorrent for efficient use of the P2P service on Comcast's network. (246) It also participated in development of a standard, P4P, which is now being managed by an industry group, the Distributed Computing Industry Association. (247) Such an arrangement benefited BitTorrent and its users along with Comcast. P2P systems are more efficient when more content is delivered locally. If a user's BitTorrent client can fetch a greater share of content from other clients nearby, rather than traversing the Internet backbone, BitTorrent's performance improves. Such a technique also benefits the network operator, who can avoid unnecessary long-haul traffic. The more the physical network knows about the architecture and demands of a P2P service such as BitTorrent, the more efficient it can be, and vice versa. For these reasons, Comcast and BitTorrent developed a standard for P2P file transfers across the last-mile network infrastructure. (248)
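The locality idea behind P4P can be illustrated with a short sketch: a tracker ranks candidate peers so that peers on the requester's own network are offered first, keeping traffic off the long-haul backbone. The peer and network representation here is a hypothetical simplification, not the actual P4P interface.

```python
# Sketch of locality-preferring peer selection in the spirit of P4P.
# A real deployment would use ISP-provided topology hints; here each
# peer simply carries a hypothetical network identifier.

from dataclasses import dataclass


@dataclass(frozen=True)
class Peer:
    address: str
    network_id: str  # e.g., the peer's ISP or autonomous system


def select_peers(requester_net: str, candidates: list[Peer], k: int) -> list[Peer]:
    """Return up to k peers, with same-network (local) peers ranked first.

    Python's sort is stable, so peers within each group keep their
    original order; False (local) sorts before True (remote)."""
    ranked = sorted(candidates, key=lambda p: p.network_id != requester_net)
    return ranked[:k]
```

The efficiency gain the text describes falls out directly: the more requests that are satisfied by the local group, the less traffic crosses the backbone.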
The FCC's approach to the Comcast P2P throttling case could choke off such private arrangements. The FCC reserved the right to review any network management practice through adjudication. (249) Comcast must hesitate before adopting any practice that could conceivably run afoul of the FCC's interpretation of the policy statement, especially since the recent FCC order provides so little guidance about what might be permissible. The fact that Comcast had reached agreement with the company BitTorrent on a network management mechanism was not sufficient for the FCC. (250) Several other companies use the open-source BitTorrent protocol or other P2P protocols, and they were not satisfied with Comcast's private agreement. Any private deal between a network operator and a particular application provider or group of providers could be challenged as either benefiting those companies to the detriment of their competitors or as a solution forced upon the P2P providers by the network operator.
Under these circumstances, Comcast is likely to choose the network management options that pose the least risk of FCC sanctions, rather than the options with the best technical performance. The new "application-agnostic" techniques Comcast is now implementing reflect this caution. (251)
FCC Commissioner Adelstein, in his concurring statement, recognized that open standards represent the best solution for network management questions:
[I]ndustry standard setting bodies, such as the Internet Engineering Task Force, the Internet Architecture Board, and the Internet Society... offer the best forum for resolving network management issues. It is certainly preferable for facilities-based providers and applications providers to work collaboratively, in an open and transparent manner, without the need for governmental intervention. To the extent that engineers can work out these issues among themselves, it obviates the need for Commission action. (252)
Adelstein opined that the FCC decision "sends a strong signal" that resolving issues in such standards bodies is preferable to regulatory intervention. (253) In reality, the message is the opposite. By sanctioning Comcast without a thorough analysis of the technical possibilities and by disregarding Comcast's efforts to help develop the P4P standard, the FCC action is likely to take network management disputes out of the standards bodies and into the regulatory arena.
A better approach would be for the FCC to view network management explicitly as a standardization problem, and identify mechanisms to push towards open standards solutions. Under a standardization regime, the FCC would encourage the development of standards like P4P and the Comcast-BitTorrent arrangement. (254) These standards would have to be taken to a standards body that met threshold procedural criteria. If the standards gained approval through a consensus mechanism, they would have to be made available to others on a free and non-discriminatory basis. Those standards could then be submitted to the FCC for certification. If the requirements were met, the Commission could declare the practice to be a form of reasonable network management. Neither Comcast nor any P2P provider would necessarily have to use the certified standards, but any P2P company would be able to.
The basic difficulty in the Comcast scenario is that there is no universal definition of "reasonable network management." Comcast's broadband access network is a platform for a host of independent Internet-based services. The relationship between companies such as Comcast and network-based applications is necessarily fraught. Broadband providers both enable and potentially compete with network-based applications. Those providers make a host of decisions about how to architect, provision, manage, and price their networks. All of those decisions affect the users of the platform, potentially in negative ways.
If the FCC adopted a standards-certification regime, broadband access providers would have a new set of options in how they managed their networks. They could engage in industry discussions about the implications of new techniques, either privately or through consensus standards organizations. Affected application companies would have opportunities to work with the access providers to identify win-win solutions to legitimate congestion issues. The FCC would serve as a backstop for any standards that emerged from these discussions, making sure there was adequate opportunity for participation.
C. Case Study 2: White Spaces
Wireless communication has been tightly regulated by the FCC since the creation of its predecessor, the Federal Radio Commission, in 1927. (255) In contrast to the Internet, where the ethos has always been that anyone can speak, use of the airwaves is permitted only with the express authorization of the government. Only in recent years, with the advent of spectrum auctions and unlicensed allocations, has the FCC begun to move away from its centrally planned, command-and-control approach to spectrum. (256)
In particular, the FCC in the 1980s authorized the use of "unlicensed" devices that shared access to frequency bands on a "commons" basis. (257) This unlicensed approach allowed the creation of new industries using technologies such as WiFi and Bluetooth. Recognizing the success of the commons approach in promoting investment and innovation, the FCC subsequently allocated unlicensed capacity in additional bands, and also authorized low-power ultra-wideband technology that operates below the noise floor of other systems. (258)
Standards are a big part of the success of commons approaches to spectrum. When the FCC allocates unlicensed spectrum, it must ensure that users of those frequencies do not unreasonably impinge on each other, or on users of other frequencies. (259) With traditional command-and-control allocations, the Commission selects a licensee for the frequency, through a mechanism such as auctions. (260) It establishes parameters for how that licensee can use the spectrum, and writes those parameters into the license. Historically, the license specified the service that the licensee could provide, such as broadcast television, although more recent allocations provide flexibility to offer the service with the greatest market demand. (261) Because there is a single licensee, the Commission knows where to go to police interference. The licensee can determine the kinds of devices that can operate on its networks, using either proprietary or open standards. The licensee has an incentive to ensure that it can provide a viable service. (262)
With an unlicensed allocation, there is no licensee to oversee the device market. (263) Some other mechanism must therefore ensure that users can coexist. That mechanism is the use of standards. First, the FCC defines technical standards for the unlicensed band. Part 15 of the Commission's rules spells out power limits and other restrictions designed to prevent a "tragedy of the commons" in which no one can effectively communicate. (264) The FCC uses a private certification process for compliance with the Part 15 standard. On top of the FCC standards for all users of the band, there are also industry standards that apply. Most prominently, the IEEE's 802.11 standards for wireless local area networks are the basis for the various implementations of WiFi. (265) On top of the IEEE's consensus-based technical standards, the WiFi Alliance, a private organization of device manufacturers, certifies compliance with the WiFi standard for interoperability purposes. (266)
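The first layer of this scheme, the FCC-defined band rules, amounts to a compliance check of a device's parameters against published limits. The sketch below illustrates the idea; the band labels and power figures are placeholders, not the actual Part 15 values, which vary by band and emission type.

```python
# Illustrative check of a device's transmit power against per-band
# limits, in the spirit of the FCC's Part 15 rules for unlicensed
# devices. All numeric limits here are placeholders for illustration.

BAND_LIMITS_MILLIWATTS = {
    "902-928 MHz": 1000,       # placeholder limit
    "2400-2483.5 MHz": 1000,   # placeholder limit (the WiFi/Bluetooth band)
    "5725-5850 MHz": 1000,     # placeholder limit
}


def is_compliant(band: str, tx_power_mw: float) -> bool:
    """Return True if the device's transmit power falls within the
    stated limit for the band; operation outside the listed unlicensed
    bands fails the check."""
    limit = BAND_LIMITS_MILLIWATTS.get(band)
    if limit is None:
        return False
    return tx_power_mw <= limit
```

The second layer, industry standards such as 802.11, then governs how compliant devices interoperate with one another within those limits.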
Viewed another way, all spectrum allocation is really a process of allocating property rights for use of wireless devices. (267) Standards bound those property rights to enable coexistence of many independent actors. They therefore function similarly to the registration and deed-recording system in real property. This administrative system is essential to the smooth functioning of not only the market for land, but also the massive edifice of capitalism built on top of it. (268)
In 2004, the FCC proposed authorizing wireless devices to operate in the vacant white spaces around broadcast television frequency bands. (269) When the broadcast television allocations took place many years ago, channels were deliberately left vacant in each city to prevent interference from broadcasters in neighboring cities. Several other channels remain dark because no broadcaster is operating in a particular city. A study in 2005 by Free Press found that a large percentage of television channels are not in use. (270)
The transition to digital television ("DTV") means that even more broadcast frequencies could be available for other uses. A collection of technology companies and public interest groups rallied to support opening up the white spaces for unlicensed devices, arguing that this would radically increase available capacity for innovative new services. (271) Television broadcasters and other incumbent users of the frequencies, such as wireless microphone companies, launched a furious counter-attack, claiming that use of the white spaces would disrupt service and potentially derail the transition to digital television. (272) In 2006, the FCC issued an initial order concluding that there was sufficient evidence to suggest that white space devices could operate without producing excessive interference. (273) Before authorizing deployment of white space devices, however, the Commission initiated a testing process for prototypes. (274)
In October 2008, the FCC's Office of Engineering and Technology released its final testing report. (275) The report suggested that white space devices could detect television receivers and other devices operating nearby, although broadcasters disputed the FCC's interpretation of the data. In November 2008, the FCC voted to authorize white space devices. (276) It required that such devices include geolocation capability to identify their current location and check a database of broadcasters operating there. (277) Devices must register their location with the database and must be certified for use by the FCC prior to introduction in the market. (278) The FCC also encouraged use of spectrum sensing and other technologies to protect incumbent users of the bands. (279) Despite these protections, the National Association of Broadcasters and the Association for Maximum Service Television sued in March 2009 to overturn the FCC decision. (280)
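The geolocation-plus-database requirement can be sketched as a lookup: before transmitting, the device reports its coordinates and receives the channels free of protected incumbents at that location. The station data, the channel range, and the fixed protection radius below are hypothetical simplifications of the actual rules.

```python
# Sketch of a white-space-device database check: given the device's
# location, return the TV channels with no protected incumbent within
# a protection radius. Station entries and the radius are hypothetical.

import math

TV_CHANNELS = set(range(21, 52))  # illustrative range of UHF channels

# hypothetical protected stations: (channel, latitude, longitude, radius_km)
PROTECTED = [
    (33, 39.95, -75.16, 100.0),
    (41, 40.71, -74.01, 100.0),
]


def _distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def available_channels(lat, lon):
    """Channels a device at (lat, lon) may use: all TV channels minus
    those with a protected incumbent within its protection radius."""
    blocked = {
        ch for ch, slat, slon, radius in PROTECTED
        if _distance_km(lat, lon, slat, slon) <= radius
    }
    return sorted(TV_CHANNELS - blocked)
```

Spectrum sensing, which the order also encouraged, would supplement this lookup with real-time detection of transmitters the database does not list.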
Standards will be essential to the success of any unlicensed regime for the white spaces. There are actually two levels of standards involved: (1) general FCC standards governing power levels and other attributes of any device operating in the bands, and (2) privately developed standards analogous to WiFi for wireless systems that operate there. These two levels of unlicensed standards have traditionally operated independently. The FCC is concerned about preventing interference, while standards-setting groups focus on enabling new services. In the case of WiFi and the 2.4 GHz band, the IEEE developed its wireless local area network standards years after the FCC acted to establish the band. (281) In some situations, this lack of coordination may be unavoidable. Technology may evolve to allow for new standards that were not feasible when the FCC took its initial action, for example. In other cases, though, better harmonization between the efforts of the FCC and standards bodies would be beneficial.
The critical issue for white space devices, or any unlicensed wireless devices, is how to share the spectrum without excessive interference. (282) Fundamentally, this is a matter of defining and adhering to standards. Effective standards can enable the development of a market, as was the case with the IEEE's 802.11b standard that is the basis for WiFi. The standards process can also incorporate mechanisms to deal with potentially non-compliant devices. As Philip Weiser and Dale Hatfield observe, standards bodies could play an important role in policing a spectrum commons. (283) The FCC could even charter a self-regulatory organization for white-space-device manufacturers. (284)
So far, the FCC has provided little indication of how its white spaces rules will interact with private standards-development efforts. The rulemaking process is not an ideal forum for the kind of back-and-forth and experimentation necessary to develop workable technical solutions. Though the Commission engaged in a testing process for prototype devices, that process was focused on whether particular equipment could meet pre-defined requirements, not on developing the best approaches to the problem. One of the concerns that led broadcasters to sue to overturn the decision was dissatisfaction with the FCC's technical conclusions. (285) If the FCC sees its role with respect to white spaces not as allocating spectrum, but as facilitating the development of efficient sharing mechanisms, it could foster a more open and collaborative process through which to address the challenging technical questions. The current situation, in which the issue winds up before a court, is far from that ideal.
Standards certification combines the benefits of decentralized, private, cooperative decision-making with protective government oversight. Most important, a standardization approach would force the FCC to focus on the issues that really matter. The distributed network of networks that is today's digitally converged communications universe is built from the ground up with standards. Hard regulatory policy questions, such as those in the Comcast network management battle, are, in reality, hard technical questions about standards.
The FCC should stop thinking like a regulator, and start thinking like those in the industries it oversees. It should reach for higher standards.
(1.) See Margaret Jane Radin, Online Standardization and the Integration of Text and Machine, 70 Fordham L. Rev. 1125, 1133 (2002); Daniel Benoliel, Comment, Technological Standards, Inc.: Rethinking Cyberspace Regulatory Epistemology, 92 Cal. L. Rev. 1069, 1086 (2004).
(2.) See Stephen Breyer, Regulation and Its Reform 96 (1982); Lawrence Lessig, Code and Other Laws of Cyberspace 44-53 (1999); Daniel Benoliel, Cyberspace Technological Standardization: An Institutional Theory Retrospective, 18 Berkeley Tech. L.J. 1259, 1329-30 (2003) [hereinafter Benoliel, Cyberspace Standardization]; Radin, supra note 1, at 1146; Joel R. Reidenberg, Lex Informatica: The Formulation of Information Policy Rules Through Technology, 76 Tex. L. Rev. 553, 583-85 (1998); Benoliel, supra note 1, at 1073-74.
(3.) See Benoliel, Cyberspace Standardization, supra note 2, at 1271.
(4.) See Reidenberg, supra note 2, at 566-67; Benoliel, supra note 1, at 1115.
(5.) See Paul A. David & Mark Shurmer, Formal Standards-Setting for Global Telecommunications and Information Services, 20 Telecomm. Pol'y 789, 789 (1996) (describing how standards organizations are a response to the difficulty of achieving coordination through market processes); Joseph Farrell & Garth Saloner, Coordination Through Committees and Markets, 19 RAND J. Econ. 235, 236 (1988).
(6.) See infra Part III.
(7.) See infra Part IV.A.
(8.) See infra Part IV.B-C.
(9.) See Jody Freeman, The Private Role in Public Governance, 75 N.Y.U. L. Rev. 543, 551 (2000) (describing administrative governance as a set of negotiations between public and private parties); Robert W. Hamilton, Prospects for the Nongovernmental Development of Regulatory Standards, 32 Am. U. L. Rev. 455, 459-60 (1982); Mark A. Lemley & David McGowan, Could Java Change Everything? The Competitive Propriety of a Proprietary Standard, 43 Antitrust Bull. 715, 753-54 (1998) (describing privately developed standards incorporated into international standards through mechanisms such as the ISO); Steven L. Schwarcz, Private Ordering, 97 Nw. U. L. Rev. 319, 326-27 (2002); cf. Gillian K. Hadfield, Privatizing Commercial Law, Reg., Spring 2001, at 40, 40-41.
(10.) See Freeman, supra note 9, at 639 n.396; Schwarcz, supra note 9, at 346-48.
(11.) See infra text accompanying notes 125-30.
(12.) A network is a set of nodes connected by links. See Mark Buchanan, Nexus: Small Worlds and the Groundbreaking Science of Networks 27-29 (2002); Duncan J. Watts, Six Degrees: The Science of a Connected Age 27 (2003).
(13.) Many other industries include networked components. For example, retailing is not a network business, but operating a large national or global retailer such as Wal-Mart requires a networked supply chain and distribution infrastructure.
(14.) For example, national airlines gained huge efficiencies when they adopted a "hub-and-spoke" approach to routing flights, but they suffered when smaller carriers such as Southwest Airlines cherry-picked the most lucrative direct routes.
(15.) See Watts, supra note 12, at 29; M.E.J. Newman, The Structure and Function of Complex Networks, 45 SIAM Rev. 167, 180-96 (2003) (describing interesting properties of networks).
(16.) See Carl Shapiro & Hal R. Varian, Information Rules 183 (1998); Michael L. Katz & Carl Shapiro, Network Externalities, Competition, and Compatibility, 75 Am. Econ. Rev. 424, 424 (1985); Mark A. Lemley & David McGowan, Legal Implications of Network Economic Effects, 86 Cal. L. Rev. 479, 483 (1998).
(17.) See Lemley & McGowan, supra note 16, at 488-89.
(18.) See id.
(19.) The network theory literature on preferential attachment and scale-free dynamics offers an alternative explanation for this pattern. See, e.g., Albert-Laszlo Barabasi, Linked: How Everything Is Connected to Everything Else and What It Means 90-92, 216-17 (2002) (discussing the growth of networks and their relationship to preferential attachment and scale-free models).
(20.) Other regulated industries, such as banking, are also increasingly dependent on networks.
(21.) See Kevin Werbach, The Centripetal Network: How the Internet Holds Itself Together, and the Forces Tearing It Apart, 42 U.C. Davis L. Rev. 343, 402-05 (2008).
(22.) See id. at 393-95.
(23.) See Manuel Castells, The Rise of the Network Society 469-78 (1996).
(24.) See id. at 66.
(25.) See id. at 469-78.
(26.) See Jonathan E. Nuechterlein & Philip J. Weiser, Digital Crossroads: American Telecommunications Policy in the Internet Age 121 (2005).
(27.) See Kevin Werbach, A Layered Model for Internet Policy, 1 J. on Telecomm. & High Tech. L. 37, 40 (2002).
(28.) See Joseph Farrell & Philip J. Weiser, Modularity, Vertical Integration, and Open Access Policies: Towards a Convergence of Antitrust and Regulation in the Internet Age, 17 Harv. J.L. & Tech. 85, 95 n.41 (2003); Kevin Werbach, Only Connect, 22 Berkeley Tech. L.J. 1233, 1239 (2007).
(29.) See Werbach, supra note 27, at 60.
(30.) See id. at 63-64.
(31.) See Werbach, supra note 21, at 345, 385 n.224.
(32.) See generally Daniel Spulber & Christopher Yoo, Networks in Telecommunications: Economics and Law (Cambridge Univ. Press 2009) (applying network models to telecommunications policy).
(33.) See id.
(34.) Ronald H. Coase, The Nature of the Firm, 4 Economica 386, 394-95 (1937).
(35.) See Spulber & Yoo, supra note 32, at 36-38.
(36.) See Werbach, supra note 28, at 1262.
(37.) See id. at 1234-35.
(38.) See sources cited supra note 16.
(39.) See Nuechterlein & Weiser, supra note 26, at 10-13.
(40.) See id.
(41.) See Farrell & Weiser, supra note 28, at 133.
(42.) See United States v. Microsoft Corp., 87 F. Supp. 2d 30, 38 (D.D.C. 2000).
(43.) See Timothy F. Bresnahan, A Remedy that Falls Short of Restoring Competition, Antitrust, Fall 2001, at 67, 68.
(44.) See Farrell & Weiser, supra note 28, at 97-99.
(45.) See id. at 99-101.
(46.) See Philip J. Weiser, Internet Governance, Standard Setting, and Self-Regulation, 28 N. Ky. L. Rev. 822, 833-35 (2001).
(47.) See Werbach, supra note 28, at 1235.
(48.) See Werbach, supra note 27, at 59; Kevin Werbach, Breaking the Ice: Rethinking Telecommunications Law for the Digital Age, 4 J. on Telecomm. & High Tech. L. 59, 65-68 (2005).
(49.) See Jerome Saltzer et al., End-to-End Arguments in System Design, 2 ACM Transactions on Comp. Sys. 277, 277-78, 287 (1984); J. Kempf & R. Austein, The Rise of the Middle and the Future of End-to-End: Reflections on the Evolution of the Internet Architecture (Network Working Group, RFC 3724, 2004), available at http://tools.ietf.org/rfc/rfc3724.txt.
(50.) See Saltzer et al., supra note 49, at 286-87.
(51.) See generally Mark A. Lemley & Lawrence Lessig, The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era, 48 UCLA L. Rev. 925 (2001) (discussing the effect of broadband providers on the end-to-end design of the Internet and considering whether government action is warranted).
(52.) See id. at 930-31.
(53.) Vinton G. Cerf, The Disruptive Power of Networks, Forbes, May 7, 2007, at 58, 62 ("Communication protocols, programming languages and operating systems have created platforms for innovation unlike anything in human history.").
(54.) See infra Part IV.B.
(55.) See Saul Hansell, F.C.C. Vote Sets Precedent on Unfettered Web Usage, N.Y. Times, Aug. 2, 2008, at C1.
(56.) See infra Part IV.C.
(57.) See Sascha D. Meinrath & Michael Calabrese, "White Space Devices" & the Myths of Harmful Interference, 11 N.Y.U. J. Legis. & Pub. Pol'y 495, 495, 504 (2008).
(58.) See Tech Companies, Broadcasters Battle Over TV 'White Space', FoxNews.com, Apr. 8, 2008, http://www.foxnews.com/story/0,2933,348088,00.html.
(59.) See id.
(60.) See Kevin Werbach, Supercommons: Toward a Unified Theory of Wireless Communication, 82 Tex. L. Rev. 863, 904-05 (2004).
(61.) Steve Bickerstaff, Shackles on the Giant: How the Federal Government Created Microsoft, Personal Computers, and the Internet, 78 Tex. L. Rev. 1, 6 (1999); Kevin Werbach, The Federal Computer Commission, 84 N.C. L. Rev. 1, 2-8 (2005).
(62.) See Nuechterlein & Weiser, supra note 26, at 4-7, 231-32.
(63.) See Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom 30-34 (2006).
(64.) See id.
(65.) See Werbach, supra note 61, at 14-26.
(66.) See id.
(67.) See Robert D. Atkinson, Framing a National Broadband Policy, 16 CommLaw Conspectus 145, 145 (2007) ("It is difficult to pick up a business or technology magazine without reading that the United States is falling behind other nations in broadband telecommunications."); Richard Hoffman, When It Comes to Broadband, U.S. Plays Follow the Leader, InformationWeek, Feb. 15, 2007, http://www.informationweek.com/story/showArticle.jhtml?articleID=197006038; Daniel K. Correa, Info. Tech. & Innovation Found., Assessing Broadband in America: OECD and ITIF Broadband Rankings (2007), http://www.itif.org/files/BroadbandRankings.pdf.
(68.) See Interstate Commerce Act, ch. 104, 24 Stat. 379 (1887).
(69.) See id.
(70.) See Joseph D. Kearney & Thomas W. Merrill, The Great Transformation of Regulated Industries Law, 98 Colum. L. Rev. 1323, 1331-32 & n.20 (1998).
(71.) See Nuechterlein & Weiser, supra note 26, at 232.
(72.) See Reuel E. Schiller, The Era of Deference: Courts, Expertise, and the Emergence of New Deal Administrative Law, 106 Mich. L. Rev. 399, 413-14 (2007).
(73.) See Nuechterlein & Weiser, supra note 26, at 235-36.
(74.) See Kearney & Merrill, supra note 70, at 1402.
(75.) Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56 (codified as amended in scattered sections of 47 U.S.C.).
(76.) See Kearney & Merrill, supra note 70, at 1402; Daniel F. Spulber & Christopher S. Yoo, Access to Networks: Economic and Constitutional Connections, 88 Cornell L. Rev. 885, 919, 921, 926 (2003); Daniel F. Spulber & Christopher S. Yoo, Network Regulation: The Many Faces of Access, 1 J. Competition L. & Econ. 635, 641 (2005).
(77.) A recent FCC public document reports a total of 273 engineers out of 1795 staff. See FCC, FCC Strategic Human Capital Plan, 2007-2011 3 (2008), http://fjallfoss.fcc.gov/edocs_public/attachmatch/DOC-286801A1.pdf. This is approximately fifteen percent of the total workforce, low for an expert technical agency.
(78.) Philip J. Weiser, FCC Reform and the Future of Telecommunications Policy 4 (2009), http://fcc-reform.org/sites/fcc-reform.org/files/weiser-20090105.pdf.
(79.) See Letter from Russell J. Lefevre, President, IEEE-USA, to Kevin J. Martin, Chairman, FCC (June 5, 2008), http://www.ieeeusa.org/policy/POLICY/2008/060508.pdf.
(80.) See id.
(81.) See id.
(82.) Nuechterlein & Weiser, supra note 26, at 252-53, 585-86.
(83.) See Letter from Russell J. Lefevre to Kevin J. Martin, supra note 79.
(84.) See Reg. and Policy Problems Presented by the Interdependence of Computer and Comm. Servs., Tentative Decision (Computer 1), 28 F.C.C.2d 291, 295-96 (1970); Robert Cannon, The Legacy of the Federal Communications Commission's Computer Inquiries, 55 Fed. Comm. L.J. 167, 173, 180 (2003).
(85.) Donald A. Dunn, Policy Issues Presented by the Interdependence of Computer and Communications Services, 34 Law & Contemp. Probs. 369, 369 (1969) (summarizing the major findings of the report).
(86.) Computer 1, 28 F.C.C.2d at 291.
(87.) See id.
(88.) See id.
(89.) See In re Use of the Carterfone Device in Message Toll Telephone Service (Carterfone), 13 F.C.C.2d 420, 423-24 (1968); Lawrence Lessig, The Future of Ideas: The Fate of the Commons in a Connected World 148 (2001); Werbach, supra note 61, at 5.
(90.) Manley R. Irwin, Computers and Communications: The Economics of Interdependence, 34 Law & Contemp. Probs. 360, 363 (1969).
(91.) See Connection of Terminal Equipment to the Telephone Network, 47 C.F.R. § 68 (1998); Werbach, supra note 61, at 14-46.
(92.) Letter from Russell J. Lefevre to Kevin J. Martin, supra note 79.
(93.) See Lawrence Lessig, The Limits in Open Code: Regulatory Standards and the Future of the Net, 14 Berkeley Tech. L.J. 759, 760 (1999).
(94.) See Benkler, supra note 63, at 50-56; Jonathan L. Zittrain, The Generative Internet, 119 Harv. L. Rev. 1974, 2013 (2006).
(95.) See Lessig, supra note 2, at 6; Reidenberg, supra note 2, at 563.
(96.) See Radin, supra note 1, at 1128, 1141.
(97.) See, e.g., Julie E. Cohen, Cyberspace as/and Space, 107 Colum. L. Rev. 210, 252 (2007) ("The new regulatory fora are the expert processes by which technical standards are defined and revised.").
(98.) See Off. of Tech. Assessment, 96th U.S. Cong., Global Standards: Building Blocks for the Future 5 (1992); Carl F. Cargill, Information Technology Standardization: Theory, Process, and Organization 13 (1989); Carl F. Cargill, Open Systems Standardization: A Business Approach 65 (1997).
(99.) Breyer distinguishes between "performance" and "design" standards. Breyer, supra note 2, at 105.
(100.) The approaches are not in conflict. They are simply different kinds of requirements that share an ambiguous term.
(101.) For simplicity, the remainder of this article will use the term "standards" to describe technical, as opposed to regulatory, standards.
(102.) See Off. of Tech. Assessment, 96th U.S. Cong., supra note 98, at 8, 10; Martin C. Libicki, Standards: The Rough Road to the Common Byte, in Standards Policy for Information Infrastructure 35, 75 (Brian Kahin & Janet Abbate eds., 1995); Nuechterlein & Weiser, supra note 26, at 385 ("By their nature, telecommunications networks rely on technological standards.").
(103.) Libicki, supra note 102, at 37.
(104.) See Jay P. Kesan & Rajiv C. Shah, Deconstructing Code, 6 Yale J.L. & Tech. 277, 357-58 (2004).
(105.) Carliss Y. Baldwin & Kim B. Clark, Design Rules: The Power of Modularity 1 (2000).
(106.) See Werbach, supra note 28, at 1237-39.
(107.) See In re Use of the Carterfone Device in Message Toll Telephone Service, 13 F.C.C.2d 420, 423-24 (1968).
(108.) See Cannon, supra note 84 (discussing the legacy of the Computer Inquiries); Werbach, supra note 61, at 22-26 (discussing the effect of the Computer Inquiries on computers within the phone network).
(109.) See Werbach, supra note 28, at 1250-51.
(110.) See id. at 1235, 1237.
(111.) See Werbach, supra note 27, at 59 n.85.
(112.) See Linda Garcia, A New Role for Government in Standard Setting?, StandardView, Dec. 1993, at 2, 7 ("As businesses strive to become more networked, standards will become increasingly important to the functioning of the economy. It is unclear, however, whether the requisite standards will be available when they are needed.").
(113.) See, e.g., Kathleen M.H. Wallman, The Role of Government in Telecommunications Standard-Setting, 8 CommLaw Conspectus 235, 239-51 (2000) (discussing FCC involvement in the public safety spectrum, digital television, wireless telephony, cable set-top boxes, and cable modems).
(114.) E.g., Paul A. David & Shane Greenstein, The Economics of Compatibility Standards: An Introduction to Recent Research, 1 Econ. Innovation & New Tech. 1, 3-42 (1990); Katz & Shapiro, supra note 16, at 439; Michael L. Katz & Carl Shapiro, Product Compatibility Choice in a Market with Technological Progress, 38 Oxford Econ. Papers 146 (Supp. 1986).
(115.) See Katz & Shapiro, supra note 16, at 434.
(116.) See Martin Libicki et al., Scaffolding the New Web: Standards and Standards Policy for the Digital Economy (2000); Joseph Farrell & Garth Saloner, Standardization, Compatibility, and Innovation, 16 RAND J. Econ. 70, 71-72 (1985) (explaining how network externalities can produce "excess inertia" for standards); Marcus Maher, An Analysis of Internet Standardization, 3 Va. J.L. & Tech. 5 (1998). See generally sources cited supra note 16 (elaborating on network effects).
(117.) This discussion addresses standards in the technical sense of a common specification or approach. Jurisprudential standards, in the sense of general legal doctrines like the reasonableness standard in tort law, are something different.
(118.) See Janice M. Mueller, Patent Misuse Through the Capture of Industry Standards, 17 Berkeley Tech. L.J. 623, 633-34 (2002); Larry Seltzer, The Standards Industry: Corporate Consortia Are Supplanting Traditional Rule-Making Bodies, Internet World, Apr. 15, 2001, at 50.
(119.) See, e.g., Farrell & Weiser, supra note 28, at 124-25.
(120.) See id. at 124.
(121.) See A. Michael Froomkin, Habermas@discourse.net: Toward a Critical Theory of Cyberspace, 116 Harv. L. Rev. 749, 756, 788, 790, 819 (2003).
(122.) See Scott O. Bradner, The Internet Standards Process--Revision 3, RFC 2026 (Oct. 1996), http://www.ietf.org/rfc/rfc2026.txt. Standards can begin as proprietary and become consensus-based. For example, Sun Microsystems developed the Java programming language as a proprietary technology, but turned parts of it over to a standards body to foster greater industry adoption. See Howard A. Shelanski & J. Gregory Sidak, Antitrust Divestiture in Network Industries, 68 U. Chi. L. Rev. 1, 86 (2001).
(123.) James J. Anton & Dennis A. Yao, Standard-Setting Consortia, Antitrust, and High-Technology Industries, 64 Antitrust L.J. 247, 251 n.15 (1995); Sean P. Gates, Standards, Innovation, and Antitrust: Integrating Innovation Concerns into the Analysis of Collaborative Standard Setting, 47 Emory L.J. 583, 645 (1998); Mark A. Lemley, Antitrust and the Internet Standardization Problem, 28 Conn. L. Rev. 1041, 1065 (1996).
(124.) Stanley M. Besen & Joseph Farrell, Choosing How to Compete: Strategies and Tactics in Standardization, J. Econ. Persp., Spring 1994, at 117, 128.
(125.) See D. Crocker, Making Standards the IETF Way, StandardView, Sept. 1993, at 48, 48; Bradner, supra note 122, at 23-24, 28; A. Lyman Chapin, The Internet Standards Process, RFC 1310, 14-17 (Mar. 1992), http://tools.ietf.org/pdf/rfc1310.pdf.
(126.) See Mark A. Lemley, Intellectual Property Rights and Standard-Setting Organizations, 90 Cal. L. Rev. 1889, 1933, 1946, 1962-65 (2002).
(127.) See David & Shurmer, supra note 5, at 791.
(128.) See Lemley, supra note 126, at 1962-63.
(129.) See Froomkin, supra note 121, at 790-91.
(130.) See id. at 796-98.
(131.) See Farrell & Saloner, supra note 116, at 71.
(132.) See Froomkin, supra note 121, at 797.
(133.) Some standards processes are hybrids of these different processes. For example, the US standard for digital television, ATSC, came about through a combination of independent proposals and a government process. See Joel Brinkley, Defining Vision: How Broadcasters Lured the Government into Inciting a Revolution in Television 371-72 (1998); Nuechterlein & Weiser, supra note 26, at 385; Wallman, supra note 113, at 235.
(134.) See Froomkin, supra note 121, at 871-73.
(135.) See Joseph Reagle, Why the Internet Is Good: Community Governance that Works Well (Mar. 26, 1999) (unpublished manuscript), available at http://cyber.law.harvard.edu/people/reagle/regulation-19990326.html (quoting David Clark).
(136.) See Wallman, supra note 113, at 246-47.
(137.) See id. at 246.
(138.) Christopher T. Marsden, The Challenges of Standardization: Toward the Next Generation Internet, in Internet Television 113, 119 (Eli Noam et al. eds., 2004); Wallman, supra note 113, at 243-45.
(139.) See Wallman, supra note 113, at 244-45.
(140.) See Garcia, supra note 112, at 2-4.
(141.) The White House, A Framework for Global Electronic Commerce 2-3 (1997). But see Mark A. Lemley, Standardizing Government Standard-Setting Policy for Electronic Commerce, 14 Berkeley Tech. L.J. 745, 748-50 (1999) (arguing that the American government's pro-market rhetoric was, in fact, belied by interventionist policy decisions).
(142.) National Technology Transfer and Advancement Act of 1995, 15 U.S.C.A. § 272(b) (West 2006 & Supp. 2007); Office of Mgmt. & Budget, Executive Office of the President, OMB Circular A-119, Federal Participation in the Development and Use of Voluntary Consensus Standards and in Conformity Assessment Activities, 63 Fed. Reg. 8546 (1998). OMB Circular A-119 states: "All federal agencies must use voluntary consensus standards in lieu of government-unique standards in their procurement and regulatory activities, except where inconsistent with law or otherwise impractical."
(143.) See FCC, Budget Estimates Submitted to Congress, Fiscal Year 2009 (2008), http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-279991A1.pdf; Nat'l Institute of Standards and Tech., NIST Budget, Planning and Economic Analysis, Fiscal Year 2009, http://www.nist.gov/public_affairs/budget.htm (last visited Dec. 20, 2009).
(144.) See Hubert Zimmermann, OSI Reference Model--The ISO Model of Architecture for Open Systems Interconnection, 28 IEEE Transactions on Comm. 425, 425 (1980).
(145.) See Kai Jacobs, Even Much Needed Standards Can Fail--The Case of E-Mail, 5 J. Comm. Network 93 (2006); ITU-T, Message Handling System and Service Overview, http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-F.400-199906-I!!PDFE&type=items.
(146.) See NIST, Federal Information Processing Standards Publication 146-2, Profiles for Open Systems Internetworking Technologies (POSIT) (1995), http://www.itl.nist.gov/fipspubs/fip146-2.htm (authorizing the use of Internet standards for government systems).
(147.) See Nuechterlein & Weiser, supra note 26, at 387.
(148.) See Werbach, supra note 60, at 919-20.
(149.) See id.
(150.) 47 U.S.C. § 256(b) (2006).
(151.) See Nuechterlein & Weiser, supra note 26, at 208, 217.
(152.) See Wallman, supra note 113, at 243-47.
(153.) See Martin Fackler, Toshiba Acknowledges Defeat as Blu-ray Wins Format Battle, N.Y. Times, Feb. 20, 2008, at C2.
(154.) See Wallman, supra note 113, at 246-47; Jason B. Meyer, Note, The FCC and AM Stereo: A Deregulatory Breach of Duty, 133 U. Pa. L. Rev. 265, 266 (1984).
(155.) See Connection of Terminal Equipment to the Telephone Network, 47 C.F.R. § 68 (2008); Bickerstaff, supra note 61, at 21-23.
(156.) See 47 C.F.R. § 68.
(157.) See In re Use of the Carterfone Device in Message Toll Telephone Service, 13 F.C.C.2d 420, 423 (1968).
(158.) Gerald W. Brock, The Telecommunications Industry: The Dynamics of Market Structure 241-42 (1981) (observing that Carterfone was consistent with the Supreme Court's 1936 decision in an antitrust case involving IBM).
(159.) Use of Carterfone, 13 F.C.C.2d at 422.
(160.) Hush-a-Phone Corp. v. United States, 238 F.2d 266, 267-69 (D.C. Cir. 1956).
(161.) See Gerald W. Brock, Telecommunications Policy for the Information Age: From Monopoly to Competition 86-87 (1994).
(162.) See Tim Wu, Wireless Carterfone, 1 Int'l J. Comm. 389, 395 (2007).
(163.) See Bickerstaff, supra note 61, at 21-23, 46; see also Werbach, supra note 61, at 21-22.
(164.) See Brock, supra note 161, at 86-87.
(165.) See id.
(166.) See id. at 88.
(167.) See id.
(168.) Id. at 87.
(169.) See id. at 88.
(170.) See id.
(171.) In re Proposals for New or Revised Classes of Interstate and Foreign Message Toll Telephone Service (MTS) and Wide Area Telephone Service (WATS), First Report and Order, 56 F.C.C.2d 593, 598-99 (1975).
(172.) Connection of Terminal Equipment to the Telephone Network, 47 C.F.R. § 68 (1998).
(173.) See Brock, supra note 161, at 92-93.
(174.) Am. Tel. and Tel. Co. v. FCC, 434 U.S. 874 (1977).
(175.) See Brock, supra note 161, at 93-98.
(176.) See id.
(177.) Some functions of the FCC, such as promoting universal service, are outside the standards framework, but these are the exception rather than the rule.
(178.) Philip J. Weiser & Dale N. Hatfield, Policing the Spectrum Commons, 74 Fordham L. Rev. 663, 689 (2005) ("In setting telecommunications standards, the FCC should be careful to institute only functional requirements and, where possible, to utilize the experience of established standard-setting bodies to define and enforce the relevant criteria.").
(179.) See Letter from Russell Lefevre to Kevin Martin, supra note 79.
(180.) See id.
(181.) The standards organizations would likely be reluctant to take sides on controversial issues. They might be concerned about maintaining objectivity, or they might have members with differing views. Standards organizations could inform the FCC, however, about the general status of certain standards processes and the outstanding technical questions in certain areas.
(182.) This approach differs from "audited self-regulation," in which government formally deputizes a particular private standards-setting body to perform a public function. See Freeman, supra note 9, at 649-50 (describing audited self-regulation). The FCC would be certifying individual standards, not completely deferring to specific standards bodies.
(183.) Cf. Weiser, supra note 46, at 824-25.
(184.) See Weiser & Hatfield, supra note 178, at 690 ("Significantly, the FCC is also in a position to ensure that standard-setting bodies develop standards based on a fair process that provides a collective benefit that would not be internalized fully by any individual user of spectrum.").
(185.) See Lemley, supra note 126, at 1962.
(186.) See Office of Mgmt. & Budget, supra note 142.
(187.) ANSI is the official US representative to the global International Organization for Standardization ("ISO").
(188.) See In re 2000 Biennial Regulatory Review of Part 68 of the Commission's Rules and Regulations, 15 F.C.C.R. 24944, 24952-56 (2000).
(189.) See Harold I. Abramson, A Fifth Branch of Government: The Private Regulators and Their Constitutionality, 16 Hastings Const. L.Q. 165, 173 (1989); Freeman, supra note 9, at 642-43.
(190.) See Froomkin, supra note 121, at 809-10.
(191.) See, e.g., Jay P. Kesan & Andres A. Gallo, Optimizing Regulation of Electronic Commerce, 72 U. Cin. L. Rev. 1497, 1603 (2004).
(192.) See Envtl. Prot. Agency, The Power To Make a Difference: Energy Star and Other Partnership Programs 12 (2000) (listing companies participating in the Energy Star program).
(193.) But cf. Kesan & Gallo, supra note 191, at 1603, 1606-07 & n.303 (noting that Internet fraud is nevertheless growing).
(194.) Schwarcz, supra note 9, at 324-25.
(195.) Jonathan R. Macey, Public and Private Ordering and the Production of Legitimate and Illegitimate Legal Rules, 82 Cornell L. Rev. 1123, 1140 (1997) ("Moreover, because informal norms generate outcomes that are generally welfare-enhancing, while law at best generates outcomes that are mixed (and tend strongly towards the welfare-reducing), informal norms should come with a strong presumption of legitimacy."). This analysis draws on the work of Robert Ellickson, showing how privately developed social norms can take the place of formal law. See Robert C. Ellickson, Order Without Law: How Neighbors Settle Disputes 131-32 (1991).
(196.) See Schwarcz, supra note 9, at 334-37.
(197.) See id. at 337 ("The foregoing discussion demonstrates, however, that commercial private ordering will not be economically viable unless these non-efficiency goals are safeguarded in a cost-effective manner. To accomplish this, I propose that the goals be safeguarded directly by imposing constraints on the private actors.").
(198.) See Isaiah Berlin, Two Concepts of Liberty 7-16 (1958).
(199.) See infra Part IV.B.
(200.) See Lessig, supra note 89, at 120.
(201.) 47 U.S.C. § 230 (1998); see also Zeran v. Am. Online, Inc., 129 F.3d 327, 330-33 (4th Cir. 1997) (applying Section 230).
(202.) 17 U.S.C. § 512 (1999).
(203.) See Mark A. Lemley, Rationalizing Internet Safe Harbors, 6 J. on Telecomm. & High Tech. L. 101, 113-15 (2008).
(204.) 47 U.S.C. § 230(c)(1), (e)(3).
(205.) See Lemley, supra note 203, at 102.
(206.) There are limits to the scope of immunity. In particular, the courts in the Napster and Grokster cases rejected claims that the Section 512 immunity under the DMCA should shield peer-to-peer file-sharing services that knew their primary use was copyright infringement. See Metro-Goldwyn-Mayer Studios, Inc. v. Grokster, Ltd., 545 U.S. 913 (2005); A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
(207.) Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, 70 Fed. Reg. 60222, 60223-25 (Oct. 17, 2005) (to be codified at 47 C.F.R. pts. 51, 63, 64).
(208.) Inquiry Concerning High-Speed Access to the Internet Over Cable and Other Facilities, 17 F.C.C.R. 4798, 4802 (2002).
(209.) 47 U.S.C. § 153 (2006).
(210.) 47 U.S.C. § 251 (2006).
(211.) See Werbach, supra note 28, at 1266-70.
(212.) See In re Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, Policy Statement (Internet Policy Statement), 20 F.C.C.R. 14986, 14987-88 (2005).
(213.) Id. at 14988.
(214.) Id. at 14988 n.15.
(215.) Id. at 14988 & n.15.
(216.) See Kevin Werbach, Off the Hook, 95 Cornell L. Rev. (forthcoming Mar. 2010) (manuscript at 10-11), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1371222.
(217.) See FCC v. Midwest Video Corp., 440 U.S. 689, 713-14 (1979); United States v. Midwest Video Corp., 406 U.S. 649, 675-76, 680-81 (1972); United States v. Sw. Cable Co., 392 U.S. 157, 178, 181-82 (1968); Werbach, supra note 216 (manuscript at 38-51).
(218.) Nat'l Cable & Telecomm. Ass'n v. Brand X Internet Servs., 545 U.S. 967, 1001-02 (2005).
(219.) Peter Svensson, Comcast Blocks Some Internet Traffic: Tests Confirm Data Discrimination by Number 2 U.S. Service Provider, MSNBC.com, Oct. 19, 2007, http://www.msnbc.msn.com/id/21376597/.
(220.) Letter from David L. Cohen, Executive Vice President, Comcast Corporation, to Kevin J. Martin, Chairman, FCC, at 2 (Mar. 28, 2008); Letter from Kathryn A. Zachem, Vice President of Regulatory Affairs, Comcast Corporation, to Marlene H. Dortch, Secretary, FCC, at 8 n.28 (July 10, 2008).
(221.) Tim Wu, Network Neutrality, Broadband Discrimination, 2 J. on Telecomm. & High Tech. L. 141, 149-54 (2003).
(222.) See Werbach, supra note 28, at 1260.
(223.) See Wu, supra note 221, at 152.
(224.) The major example involved Madison River, a small rural telephone company. See In re Madison River Commc'ns, LLC, 20 F.C.C.R. 4295, 4297 (2005).
(225.) F.C.C. Chairman Favors Penalizing Comcast for Internet Blocking, N.Y. Times, July 11, 2008, at C2.
(226.) Formal Complaint of Free Press and Public Knowledge Against Comcast Corporation for Secretly Degrading Peer-to-Peer Applications, Memorandum Opinion and Order (Comcast P2P Order), 23 F.C.C.R. 13028, 13028 (2008).
(227.) See id.
(228.) See id. at 13035-36.
(229.) See id. at 13048-49.
(230.) See id. at 13054.
(231.) See id. at 13047, 13060.
(232.) The Commission's jurisdictional theory was also suspect. See Werbach, supra note 216 (manuscript at 3).
(233.) See Roy Mark, Comcast Sues FCC over Network Neutrality Ruling, eWeek.com, Sept. 4, 2008, http://www.eweek.com/c/a/IT-Infrastructure/Comcast-Sues-FCC-Over-Network-Neutrality-Ruling.
(234.) See id.
(235.) See In re Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, Policy Statement (Internet Policy Statement), 20 F.C.C.R. 14986, 14987-88 (2005).
(236.) See In re Formal Complaint of Free Press and Public Knowledge Against Comcast Corporation for Secretly Degrading Peer-to-Peer Applications, Memorandum Opinion and Order (Comcast P2P Order), 23 F.C.C.R. 13028 (2008).
(237.) Chloe Albanesius, Comcast to Cap Data Transfers at 250 GB in Oct., PCMag.com, Aug. 28, 2008, http://www.pcmag.com/article2/0,2817,2329170,00.asp; Posting of Om Malik to GigaOM Blog, Memo to Comcast: Show Us the Meter for Metered Broadband, http://gigaom.com/2008/08/28/memo-to-comcast-show-me-the-meter-for-metered-broadband/ (Aug. 28, 2008, 23:37 PDT).
(238.) Comcast P2P Order, 23 F.C.C.R. at 13054.
(240.) One of the experts who testified at a field hearing, David Reed, cited an IETF Request for Comment ("RFC") entitled "Inappropriate TCP Resets Considered Harmful." Id. at 13054 n.212. An RFC is an informational document that can be influential, but does not have the consensus status of a standard. Common practices such as the use of Network Address Translation ("NAT") contravene IETF RFCs. While the FCC's evidence is persuasive that Internet engineering experts objected to Comcast's practices, the conclusory statement that those practices "contravene" IETF standards is too strong.
(241.) See M. Kaat, Overview of 1999 IAB Network Layer Workshop 10 (IETF Network Working Group, RFC 2956, 2000), available at ftp://ftp.ietf.org/rfc/rfc2956.txt.
(242.) See Comcast Corporation, Description of Current Network Management Practices, http://downloads.comcast.net/docs/Attachment_A_Current_Practices.pdf (last visited Oct. 1, 2009).
(243.) See Comcast P2P Order, 23 F.C.C.R. at 13028.
(244.) See Weiser, supra note 46, at 833-35.
(245.) See Matthew Fagin, Frank Pasquale & Kim Weatherall, Beyond Napster: Using Antitrust Law to Advance and Enhance Online Music Distribution, 8 B.U. J. Sci. & Tech. L. 451, 501-03 (2002) (describing efficiencies of P2P distribution).
(246.) See Brad Reed, How ISPs Learned to Stop Worrying and Love P2P, Network World, Mar. 27, 2008, http://www.networkworld.com/news/2008/032708-p2p.html.
(247.) See Larry Hardesty, Internet Gridlock, Tech. Rev., July-Aug. 2008, at 56; Posting of Nate Anderson to Ars Technica, Comcastic P4P Trial Shows 80% Speed Boost for P2P Downloads, http://arstechnica.com/old/content/2008/11/comcastic-p4p-trial-shows-80-speed-boost-for-p2p-downloads.ars (Nov. 3, 2008 18:20 CST).
(248.) See Reed, supra note 246.
(249.) See In re Formal Complaint of Free Press and Public Knowledge Against Comcast Corporation for Secretly Degrading Peer-to-Peer Applications, Memorandum Opinion and Order (Comcast P2P Order), 23 F.C.C.R. 13028, 13081-84 (2008) (Statement of Commissioner Adelstein).
(250.) See id. at 13065-68 (Statement of Chairman Martin).
(251.) See Comcast Corporation, Description of Planned Network Management Practices to be Deployed Following the Termination of Current Practices, http://downloads.comcast.net/docs/Attachment_B_Future_Practices.pdf (last visited Dec. 20, 2009) (describing "protocol-agnostic" management techniques).
(252.) See Comcast P2P Order, 23 F.C.C.R. at 13082 (Statement of Commissioner Adelstein).
(254.) See Anderson, supra note 247.
(255.) See Werbach, supra note 60, at 870.
(256.) See id. at 871.
(257.) See Yochai Benkler, Overcoming Agoraphobia: Building the Commons of the Digitally Networked Environment, 11 HARV. J.L. & TECH. 287, 325-26 (1998); Werbach, supra note 60, at 871-73.
(258.) See Werbach, supra note 60, at 894.
(259.) See id. at 895-97.
(260.) See id. at 872-73.
(261.) See id. at 878, 919-20.
(262.) In the broadcast model, the licensee does not control the market for receiver devices. In the mobile phone model, the licensee approves and certifies devices, even if third parties build them.
(263.) See Werbach, supra note 60, at 919-20.
(264.) See 47 C.F.R. § 15 (2008).
(265.) See LAN/MAN Standards Comm., IEEE, Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications (2007), http://standards.ieee.org/getieee802/download/802.11-2007.pdf.
(266.) See Wi-Fi: It's Fast, It's Here--and It Works, BusinessWeek.com, Apr. 1, 2002, http://www.businessweek.com/technology/content/apr2002/tc2002041_1823.htm.
(267.) See Werbach, supra note 60, at 925-26.
(268.) See Hernando De Soto, The Mystery of Capital: Why Capitalism Triumphs in the West and Fails Everywhere Else 6-7 (2000).
(269.) Unlicensed Operation in the TV Broadcast Bands, 69 Fed. Reg. 34103 (proposed June 18, 2004).
(270.) See Free Press, Measuring the TV "White Space" Available for Unlicensed Wireless Broadband 1 (2005), http://www.freepress.net/docs/whitespace_analysis.pdf; Shared Spectrum Company, Spectrum Occupancy Measurements (2005), http://www.sharedspectrum.com/measurements/download/NSF_Chicago_2005-11_measurements_v12.pdf.
(271.) See Tech Companies, Broadcasters Battle over TV 'White Space', supra note 58.
(272.) See id.
(273.) See Unlicensed Operation in the TV Broadcast Bands, 71 Fed. Reg. 66897 (proposed Nov. 17, 2006).
(274.) See id.
(275.) Steven K. Jones et al., FCC, Evaluation of the Performance of Prototype TV-Band White Space Devices Phase II (Oct. 15, 2008), http://hraunfoss.fcc.gov/edocs_public/attachmatch/DA-08-2243A3.pdf.
(276.) See Unlicensed Operation in the TV Broadcast Bands, Second Report and Order and Memorandum Opinion and Order, ET Docket No. 04-186 and 08-260 (Nov. 14, 2008).
(277.) See id.
(278.) See id.
(279.) See id.
(280.) See Posting of Matthew Lasar to Ars Technica, Broadcasters Sue FCC Over White Space Broadband Decision, http://arstechnica.com/tech-policy/news/2009/03/broadcasters-sue-fcc-over-white-space-broadband-decision.ars (Mar. 3, 2009 11:56 CST).
(281.) See Cheryl A. Tritt, Telecommunications Future, in PLI's 23rd Annual Institute on Telecommunications Policy & Regulation 85, 95-97 (2005).
(282.) Technically, interference is an artifact of devices, not the airwaves themselves. The term is shorthand for the limited ability of devices to communicate efficiently.
(283.) See Weiser & Hatfield, supra note 178, at 690 ("If managed optimally, the FCC's use of standard-setting bodies to develop the necessary etiquette standards can leverage the expertise of such standard-setting bodies as well as maintain a degree of oversight to be sure that such standards are adopted.").
(284.) See Weiser, supra note 46, at 842.
(285.) See Lasar, supra note 280.
Kevin Werbach, Assistant Professor of Legal Studies and Business Ethics, The Wharton School, University of Pennsylvania. Thanks to Kyle Dixon, Richard Bennett, and Tomoaki Watanabe for comments on earlier drafts, and to Erfat Sinister for research assistance. The author served as co-lead of the Federal Communications Commission Agency Review for the Obama-Biden Transition Project. The views expressed herein are entirely his own, and do not represent those of the Obama Administration, the FCC, or any of its Commissioners. Contact: firstname.lastname@example.org.
Publication: Harvard Journal of Law & Technology, Sept. 22, 2009.