
The spectrum opportunity: sharing as the solution to the wireless crunch.

"Sharing" has become the latest and greatest buzzword in the somewhat interminable policy debates regarding the notion that more spectrum, lots more spectrum, be available to meet the exponentially growing demand for broadband and cellular wireless systems. The proximate cause of this notoriety was the PCAST Report to the President (2012), put together by a group of distinguished technologists who, inter alia, thoroughly described the reasons and the urgency to make large amounts of spectrum (1,000 MHz) available and suggested that the means to accomplish this was to share spectrum now controlled exclusively by the federal government with the private sector, under terms and conditions in which both private and public sector needs could be met via sharing.

The PCAST recommendation is but the latest in a long history of policy proposals for more efficient allocation and use of spectrum, starting with Coase (1959) and, more recently, this author (Faulhaber & Farber, 2003) and many others. The policy debate is focused on the use of licensed spectrum, such as what is currently used for cellular wireless service, versus unlicensed spectrum, such as WiFi. Broadly speaking, licensed spectrum is based on exclusive use of spectrum bands (for example, Verizon Wireless controls the spectrum bands over which their cellular service is offered), while unlicensed spectrum is based on sharing spectrum bands (for example, WiFi is a low-powered service in which bands are re-used by many consumers). Both sides of the debate understand that the fundamental issue is the control and management of radio interference. And both sides of the debate recognize that both forms of property rights are needed. But the sides differ dramatically on how much of each type is needed to achieve the maximum benefit from the available spectrum.

In their superbly written article "The Spectrum Opportunity: Sharing as the Solution to the Wireless Crunch," my Wharton colleague Kevin Werbach and Aalok Mehta use the "sharing" meme to recast the licensed/unlicensed policy debate as not-sharing/sharing. As we all learned in kindergarten, sharing is good; Werbach and Mehta position sharing and the PCAST report as weighing in on the side of unlicensed spectrum in this longer debate. Their review of the debate is quite well written and strongly supports their view that unlicensed is, or at least should be, the wave of future policy:

The evidence suggests that not only is spectrum sharing becoming more important and feasible, but that a framework that makes sharing the default approach offers significant political, economic, and societal benefits. Exclusive-use licenses will still be desirable in many circumstances, but they should have the burden of proof. (Werbach & Mehta, 2013, p. 2)

One might be excused for thinking that the debate is now over, and PCAST's support of sharing has won the day for unlicensed. A more careful examination of the evidence, however, suggests that the debate is far from over, and indeed the current enthusiasm for sharing, even in the world of federal spectrum, is unlikely to have the positive results the federal government expects.

While Werbach and Mehta conflate sharing a la PCAST with licensed/unlicensed a la the by-now-traditional spectrum debate, it is perhaps more useful to separate the two rather different concepts and consider each in turn.

Sharing a la PCAST

The PCAST report is quite thorough and well-grounded. While most scholars have focused on the report's sharing recommendation, the document also highlights other critical factors of spectrum use, including (a) the importance of receiver standards (currently neglected); (b) the role of public safety radio, much talked-about but oft-neglected (Faulhaber, 2007); (c) the absence of any incentive for federal users to utilize their spectrum efficiently (since they receive and hold it for free); and (d) the need to make more spectrum, and lots of it, available very quickly.

The PCAST report rather quickly dismisses the concept that federal spectrum bands should be "cleared" (i.e., the federal user vacates its use of the band and is moved to another band) and reallocated to private use: "PCAST finds that clearing and reallocation of Federal spectrum is not a sustainable basis for spectrum policy due to the high cost, lengthy time to implement, and disruption of the Federal mission" (PCAST, 2012, p. vi). The report supported this statement by referencing an undocumented National Telecommunications and Information Administration (NTIA) finding and an ambiguous example from 2006 in which cleared federal spectrum resulted in a return to the government (in auction revenues, not social welfare) that was apparently not large enough, in PCAST's view. The report's dismissal of clearing and reallocating seems cursory and weak indeed.

The PCAST report immediately thereafter states its key policy conclusion: "The essential element of this new Federal system architecture is that the norm for spectrum use should be sharing, not exclusivity" (PCAST, 2012, p. vi). In essence, the conclusion is that some form of government-mandated and controlled sharing is to be implemented. There is no evidence presented for this conclusion. The remainder of the report essentially explicates how that government mandate and control should be exercised.

The report recognizes the urgency of getting on with the job as soon as possible. It recommends that "the Secretary of Commerce immediately identify 1,000 MHz of Federal spectrum in which to implement the new architecture and thereby create the first shared use spectrum superhighways" (PCAST, 2012, p. vii). Almost a year later, the White House issued a memorandum, "Expanding America's Leadership in Wireless Innovation" (Obama, 2013), that directed the relevant agencies (NTIA, FCC) to implement most of the recommendations of the PCAST report posthaste. Unfortunately, the 1,000-MHz mandate did not appear in this memorandum. And to date, nothing seems to have happened, although meetings are apparently being held. Evidently, the term "immediate" has a much more relaxed meaning in the public policy environment than in the private sector, where the total amount of wireless data traffic grew from 866.7 billion MB to 1.468 trillion MB during 2012 (CTIA, 2013), a considerably brisker pace.
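For perspective, a quick calculation shows just how brisk that pace is. The sketch below uses only the CTIA figures quoted above and assumes they are the year-end totals for consecutive years; nothing else is from the report:

# Year-over-year growth implied by the CTIA (2013) figures cited in the text;
# treating 866.7 billion MB as the prior year-end total is an assumption.
traffic_prior_year_mb = 866.7e9   # total U.S. wireless data traffic, start of period (MB)
traffic_2012_mb = 1.468e12        # total U.S. wireless data traffic, year-end 2012 (MB)
growth = traffic_2012_mb / traffic_prior_year_mb - 1
print(f"Year-over-year growth: {growth:.0%}")  # roughly 69%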

What exactly does the PCAST report mean by sharing spectrum? For example, suppose the military controls a spectrum band that is used to operate unmanned aerial vehicles ("drones"). Clearly, the military's use of this spectrum meets a critical national defense need, but its use is occasional (drones are not always flying), may be confined to some parts of the country (e.g., the East and West Coasts), and usually does not occupy the entire band (there may be only one or two drones in the air, as opposed to dozens). Even this important spectrum might be suitable for sharing with the private sector. In this case, the military would be the primary user, able to use the spectrum on a priority basis. A licensed cellular operator would be a secondary user, able to use parts of the spectrum that are not in use for military purposes. The report even suggests that the spectrum may support a third tier of use, General Authorized Access, which includes machine-to-machine communications, remote meter reading, environmental device monitoring, and so on.
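A minimal sketch of this three-tier priority logic may make the idea concrete. The tier names follow the PCAST report; everything else here (the data structures, the function, the example) is an illustrative assumption, not anything the report specifies:

# Illustrative three-tier access check: a federal primary user preempts everyone,
# licensed secondary users preempt General Authorized Access (GAA) devices, and
# GAA devices may transmit only when the band is otherwise idle.
from enum import IntEnum

class Tier(IntEnum):
    PRIMARY = 3      # e.g., military drone control
    SECONDARY = 2    # e.g., licensed cellular operator
    GAA = 1          # e.g., machine-to-machine telemetry

def may_transmit(requesting_tier: Tier, active_tiers: set) -> bool:
    """A device may transmit only if no higher-priority tier is active in the band."""
    return all(requesting_tier >= active for active in active_tiers)

# While a drone controller (primary) is active, neither the cellular secondary
# nor a GAA meter reader may use the band.
active = {Tier.PRIMARY}
print(may_transmit(Tier.SECONDARY, active))  # False
print(may_transmit(Tier.GAA, active))        # False

# Once the primary goes quiet, the secondary may transmit.
print(may_transmit(Tier.SECONDARY, set()))   # True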

But how can the myriad of radios avoid interfering with the drone controllers (primary) or the cellular voice calls (secondary)? In the nonfederal consumer world, WiFi, cordless phones, garage door openers, and baby monitors all seem to avoid interference because they are very low-powered devices that are unlikely to interfere with nearby receivers. But in a world of high-powered devices, avoiding interference becomes much more difficult.
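A rough back-of-the-envelope calculation, not from the article, illustrates why transmit power matters so much. Under the standard free-space path-loss formula, a transmitter radiating 1,000 times more power delivers 30 dB more interference at the same distance, which is why WiFi-class devices coexist easily while high-powered sharing requires active coordination. The power levels, distance, and frequency below are illustrative assumptions:

# Compare received interference from a 100 mW WiFi-class device and a 100 W
# transmitter at the same distance, using free-space path loss.
from math import log10

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Standard free-space path loss (distance in km, frequency in MHz)."""
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

def received_dbm(tx_power_mw: float, distance_km: float, freq_mhz: float) -> float:
    tx_dbm = 10 * log10(tx_power_mw)
    return tx_dbm - free_space_path_loss_db(distance_km, freq_mhz)

# Both transmitters 1 km away at 700 MHz (hypothetical numbers).
print(received_dbm(100, 1.0, 700))        # about -69 dBm
print(received_dbm(100_000, 1.0, 700))    # about -39 dBm: 30 dB more interference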

Simple methods of sharing spectrum have been available for a number of years. For example, cordless phones have used spread spectrum technology quite successfully for 30 years to avoid interference between transceivers. WiFi, a wildly popular and useful technology for in-home broadband, uses simple sharing methods. Both these examples involve low-powered devices, but do illustrate that using technology to implement sharing is nothing new. However, if spectrum is shared by high-powered devices, the demands on devices to implement sharing increase substantially. One approach is the use of "cognitive" radios, which are aware of the spectrum environment around them and can switch spectrum bands in real time to avoid interfering with existing users. A second approach is to use a listening network that is aware of who is transmitting on what bands and communicates with other devices to let them use idle spectrum. There are several other means of sharing spectrum, generally under the rubric of Dynamic Spectrum Access.
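To make the first approach concrete, here is a minimal, purely illustrative sketch of the cognitive-radio idea: the device measures energy on a set of candidate bands and moves to the quietest band whose measured power falls below an interference threshold. The band list, the threshold, and the sense_power() stand-in for real sensing hardware are all assumptions for illustration:

# Toy cognitive-radio band selection: sense each candidate band and pick the
# quietest one that appears unoccupied.
import random

CANDIDATE_BANDS_MHZ = [3550, 3560, 3570, 3580]   # hypothetical band centers
INTERFERENCE_THRESHOLD_DBM = -80                  # assumed occupancy threshold

def sense_power(band_mhz: int) -> float:
    """Placeholder for a hardware measurement of received power on a band (dBm)."""
    return random.uniform(-100, -60)

def pick_band():
    """Return the quietest band that appears unoccupied, or None if all are busy."""
    readings = {band: sense_power(band) for band in CANDIDATE_BANDS_MHZ}
    idle = {b: p for b, p in readings.items() if p < INTERFERENCE_THRESHOLD_DBM}
    return min(idle, key=idle.get) if idle else None

band = pick_band()
print(f"Switching to {band} MHz" if band else "No idle band; backing off")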

While these technologies have been known in the laboratories for decades, very few are actually used commercially. Some are so complex that practitioners don't expect them to be deployed for many years. (1)

The PCAST report gets to the key problem in short order: Federal agencies have no interest in sharing their spectrum with anyone. "Federal users currently have no incentives to improve the efficiency with which they use their own spectrum allocation, nor does the Federal system as a whole have incentives to improve its overall efficiency" (PCAST, 2012, p. ix).

And here is the nub of the problem. If people don't want to share, they are not going to share, White House memoranda or not. Federal agencies gain nothing from sharing; they received the spectrum for free, no one monitors their use of it, and no one apparently has the power to take it away from them. Bottom line: Actual sharing of federal spectrum is highly unlikely.

PCAST was well aware of this problem, and of the necessity of providing federal agencies with incentives and oversight to ensure sharing actually happens. To this end, the report recommends establishing a series of committees and initiatives (which the Obama memorandum directs to be established):

* Spectrum Management Team, led by White House Chief Technology Officer

* Spectrum Sharing Partnership Steering Committee, consisting of industry CEOs

* Federal Spectrum Access System, a computer-based system for identifying each parcel of federal spectrum, its registration, and conditions of use

* Public Private Partnership, made up of federal, state, local, private, and academic representatives

It is not clear that any of these bureaucratic committees actually solves the problem of lack of incentives for sharing. But the report goes rather further toward providing such incentives. While again recognizing that "Federal users obtain no reward for reducing their own need for spectrum, for sharing spectrum with other agencies, or for sharing spectrum use rights with non-Federal users even when such sharing would be socially optimal" (PCAST, 2012, p. 55), the report recommends the creation of an ersatz currency, or "spectrum currency," for federal agencies to use in valuing spectrum, allowing them to trade spectrum with other agencies using this spectrum currency and even offer spectrum to private users. Whether this scheme actually provides incentives for federal agencies to share their spectrum remains to be seen; there are no apparent efforts to establish this currency or a market for sharing federal spectrum. (2)

A report from the Government Accountability Office (2012) examines the problem of incentives, reaching conclusions rather less optimistic about the feasibility of implementing incentive schemes, but nevertheless suggests the FCC and NTIA move forward and report to Congress on the results of their deliberations.

Even if incentives for sharing can be put in place, is there hope that this potential source of spectrum will be available soon to meet skyrocketing demands? The PCAST report's strong recommendation is that the Commerce Department must make 1,000 MHz of spectrum available immediately, although this recommendation did not make it into the president's memorandum (Obama, 2013) and apparently there is no current effort to make this happen. The report is rather more realistic in its assessment of actual spectrum sharing rollout:

Although complete accomplishment of this transformation, in all Federal spectrum, will take time--perhaps two or three decades--we stress that implementing our recommendations will lead to rapid results. The long-term direction ... can start to be operational in 1-3 years. [emphasis added] (PCAST, 2012, p. ix)

Again, the meaning of the word "rapid" in the public policy context seems a good deal more relaxed than in the real commercial world in which the rest of us live.

What are the prospects for the sharing of federal spectrum with commercial and general access users in a timely fashion? At the current pace of events, "timely" seems out of the question. The current rate of growth of wireless broadband suggests that the needed timescale for new spectrum is months; the current rate of progress on sharing suggests the actual timescale is years or possibly decades. In spite of the popularity of sharing in current policy debates, optimism that something good might actually happen is unwarranted.

Sharing a la Werbach and Mehta

The Werbach and Mehta article uses the buzzword "sharing" primarily as a means of recasting the long-running licensed/unlicensed debate; their references to the PCAST report emphasize its general support for sharing: "The core conclusion of the PCAST report is that 'the norm for spectrum use should be sharing, not exclusivity'" (Werbach & Mehta, 2013, p. 9).

Werbach and Mehta go a bit overboard to make their case for unlicensed spectrum. For example, they point to the well-known result that licensed spectrum is vastly underutilized and imply that it is licensing that is responsible for this outcome (Werbach & Mehta, 2013, p. 3). In fact, underutilization has been known for decades; also well known is the cause: FCC regulation strictly limits the uses to which each spectrum band can be put. Any band in the "beachfront" spectrum of broadcast television can only be used (by FCC regulation) for television; a cellular carrier that was willing to pay for such bandwidth could not use it for wireless service. In fact, almost all the usable spectrum is subject to such restrictive use regulation; as a result, licensees of uses no longer so popular (such as UHF TV) occupy spectrum they cannot usefully sell. Hence, underutilization.

Werbach and Mehta (2013, p. 4) also set up a dichotomy between sharing and exclusivity, similar to the PCAST report. The use of the term "exclusivity" (or "exclusive use") seems to imply, pejoratively, that the licensee will exclude all others from using that spectrum band. This is false; almost every individual in the world has access to spectrum bands licensed to wireless phone companies simply by buying service from those companies, much like hotels and apartment buildings rent or lease space in their properties to short- or long-term residents. In the commercial world that we are all familiar with, "sharing" is implemented by renting or leasing part of an owned facility. Cellular spectrum is also shared among wireless companies on a regular basis, subject to commercial contracts. There is nothing about exclusive use or commercial ownership that precludes sharing; in fact, sharing is often the commercial norm. Licensing does not preclude sharing; it often facilitates it.

Werbach and Mehta (2013) also note that "In recent years there has been an explosion of technical innovation around spectrum sharing" (p. 5) including cognitive radios, mesh networks, and smart antennas. In fact, these technologies have been well-known for well over 15 years. But in spite of the hoopla by unlicensed advocates, actual deployment of these technologies has been minimal to none. In fact, over a decade ago a similar list was published in The Economist (2002), promising these whizzy gadgets were right around the corner. Today the list of whizzy gadgets is about the same, and it is still "right around the corner." (3)

Werbach and Mehta (2013) make the excellent point that most cellular smartphones use WiFi (when available) as a means of accessing data and thereby conserve scarce cellular spectrum. Wireless firms are certainly not averse to sharing, as noted above, and WiFi admirably fills an important niche.

In the section "Shortcomings of Spectrum Clearing," Werbach and Mehta (2013) correctly note that there is no "new" spectrum and that clearing spectrum is "slow and expensive." This is certainly true, but it is an artifact of the regulatory process of clearing spectrum. As long as the regulations require the government to clear the spectrum, it will always be slow and expensive. Suppose, however, we treated all licensed spectrum like we treat any other property and let licensees buy and sell the licenses, subject only to interference restrictions (possibly including receiver restrictions). Then we can let the market take care of ensuring that the most valued use of the spectrum will be achieved. Clearly, establishing robust interference restrictions that work in a world of open spectrum markets would require serious technical efforts on the part of the FCC, but the result would be well worth it. If the market for spectrum worked as well as the market for, say, real estate, it would cease to be "slow and expensive" and would respond well to the needs of users. We are in a world of "slow and expensive" spectrum clearing because we have put ourselves there, via regulations. We are certainly capable of removing ourselves from that world.

Skorup (2013) suggests a number of alternatives to substantially lower the cost of clearing federal spectrum, but one very interesting proposal could be used for all spectrum, federal or nonfederal. It is based on the strategy used in the 1980s to close military bases. It was agreed that the post-Cold War United States had far too many bases, but closing any particular base provoked strong lobbying by the affected districts. Congress established the Base Realignment and Closure Commission (BRAC), a group of independent experts who decided which bases needed to be closed and presented their findings to Congress for a vote on the entire package, with no unbundling permitted. Its success at solving the collective action problem led Senator Larry Pressler (author of the Telecommunications Act of 1996) to introduce legislation to "BRAC the spectrum," establishing an independent commission to reallocate federal spectrum, possibly with compensation. Unfortunately, the idea went nowhere, although it has recently been reintroduced by Representative Adam Kinzinger and Senator Mark Kirk. Although this is intended for reallocating federal spectrum, there is no reason it could not be applied to nonfederal spectrum (presumably not including spectrum already auctioned off by the FCC). Just as BRAC solved the political problem of special interests blocking base closures, a similar solution could solve the political problem of special interests blocking band clearing and reallocation. "Slow and expensive" doesn't have to happen; there are solutions available.

Werbach and Mehta (2013) also fault the auction process of allocating licensed spectrum for several not-very-compelling reasons.

1. "Auctions favor ... large providers"? Certainly not true of all auctions; perhaps we have large providers because this is a business with scale economies. And how do we explain new entrants such as MetroPCS and Cricket?

2. "Auctions favor ... certain business models" in which Werbach and Mehta include charging for airtime. This seems to be how virtually every other business in the world works, by charging customers for what they use. Is this supposed to be bad? The authors also resurrect the old chestnut about licensees "potentially implementing restrictions on certain applications." The evidence that any licensee actually does place restrictions on any applications is nil, and the authors have no new evidence to bring to the table. They are merely re-asserting an old and unsupported assertion.

3. "Auctions create incentives for anti-competitive behavior." Really? If so, how come there are no antitrust suits based on such anticompetitive behavior? The authors also resurrect yet another old chestnut, that the incumbents are "warehousing" spectrum, for which (again) there is no evidence whatsoever.

4. "Sharing ensures that spectrum is available, to more people." Virtually every person on the planet has available cellular service via licensed wireless carriers. It is their job to make spectrum available, and far more people have cellular service available to them than have WiFi available to them (to use the most popular unlicensed service).

Perhaps most surprising is the assertion by Werbach and Mehta that unlicensed sharing solutions will lead to greater innovation than licensed spectrum. The evidence suggests the complete opposite. Since the above-referenced article in The Economist, there has been rather little diffusion of existing innovations and rather modest new innovation specifically designed for unlicensed spectrum. Meanwhile, wireless cellular, which uses licensed spectrum, has seen a true explosion of innovation. In the handset market, we've moved from the Motorola StarTac to the iPhone, Droid Ultra, and Galaxy S4, with far more powerful processors, superior displays, and better Internet connectivity. In the application market, we've moved from virtually no apps to tens of thousands of apps, on multiple operating systems. And in the service provider market, we've moved from limited 1G coverage to far more extensive coverage using 4G LTE, yielding far higher bandwidth than we could have dreamed of a decade ago. The rather modest technical gains in the unlicensed arena have been overwhelmed by the tsunami of innovation in cellular wireless. (4)

White Space: Unlicensed's Next Success ... or Not?

When the FCC established spectrum for broadcast television in the late 1940s, the analog technology of transmission and reception required television channels of 6 MHz each, and another 6 MHz of spectrum between active channels, so that the noisy transmitters and receivers of the day would not interfere with each other, creating a so-called white space. Currently, television receivers and transmitters are much better at eliminating out-of-band interference, so this white space is no longer necessary to prevent interference.

Following the report of the FCC's Spectrum Policy Task Force, the FCC initiated a Notice of Inquiry (Federal Communications Commission, 2002) inviting comments on the use of white space spectrum for unlicensed use. After much discussion among unlicensed advocates (enthusiastic), licensed advocates (who wanted this spectrum auctioned for licensed use), and television broadcasters (who claimed they would suffer interference), much testing of devices, and much planning of technical requirements and systems to prevent interference (with both television broadcasters and new users), the FCC eventually issued its Report and Order (Federal Communications Commission, 2008), which approved rules governing unlicensed use of white space spectrum.

The plan eventually adopted mandated the use of white space radios with geolocation capability, combined with access to a database identifying vacant television channels at specific locations. Indeed, this is exactly the type of system that unlicensed advocates had longed for: real-time management of unlicensed spectrum resources to control interference. All that was required was to designate which firms could operate as database administrators and for developers to introduce actual white space devices, so that real people could communicate using white space spectrum.
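A minimal sketch of that geolocation-database lookup, for illustration only: the device reports its location, and the database returns the TV channels with no protected broadcaster at that point. The station list, coordinates, and circular protection radius below are invented assumptions; the real database administrators compute protected service contours under FCC rules:

# Toy white space database: report a location, get back the channels with no
# protected broadcaster covering that location.
from math import hypot

# Hypothetical registered stations: (channel, x_km, y_km, protected_radius_km)
PROTECTED_STATIONS = [
    (21, 0.0, 0.0, 80.0),
    (27, 10.0, 5.0, 80.0),
    (39, 200.0, 150.0, 80.0),
]
TV_CHANNELS = range(21, 52)

def vacant_channels(x_km: float, y_km: float):
    """Channels whose protected areas do not cover the device's location."""
    blocked = {ch for ch, sx, sy, r in PROTECTED_STATIONS
               if hypot(x_km - sx, y_km - sy) <= r}
    return [ch for ch in TV_CHANNELS if ch not in blocked]

# A device 30 km from the first two stations must avoid channels 21 and 27
# but may use the remaining channels as white space.
print(vacant_channels(30.0, 0.0))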

But the story was not over. The National Association of Broadcasters sued the FCC, and more delay was introduced into the process. Eventually, the FCC issued another Report and Order (Federal Communications Commission, 2010), fine-tuning the previous Report and Order, and moved toward implementation. The FCC quickly certified a number of firms to be database administrators, including Spectrum Bridge, Telcordia, and Microsoft (and, more recently, Google). The FCC also approved its first white space device, the Koos Technical Services (KTS) Agility White Space Radio, near the end of 2011; shortly thereafter, it approved another white space device from Adaptrum. Meanwhile, the FCC forged ahead with a trial of the Spectrum Bridge database and the KTS device in Wilmington/New Hanover County, North Carolina, eventually offering commercial service on January 26, 2012. Services offered included video surveillance in public parks, as well as public Internet access at speeds of 1.5 to 3.1 Mbps (McAdams, 2012). (5)

This rather underwhelming first commercial operation has been followed by a more substantial launch of a white space system on the West Virginia University (WVU) campus in Morgantown, West Virginia. The AIR.U system, introduced on July 9, 2013, is used primarily to connect trams running on the WVU campus to the Internet; students riding the trams connect to the device on their tram using WiFi, which then connects to the Internet via white space spectrum. Other uses of AIR.U are being planned by WVU.

So is unlicensed white space a success or not? If we define "success" as a small commercial operation in a local niche market, after a decade of effort by the FCC and numerous industry actors, then success it must be. Certainly the hype generated by proponents of unlicensed spectrum as evidenced in the Werbach and Mehta article seems a bit overblown.

* On the basis of markets, white space systems are trivial compared to WiFi (introduced in 2000) or cellular systems over the last decade. Both cellular and WiFi have been revolutionary in their public impact during this 10-year period; during the same period, white space systems have been quite modest indeed.

* On the basis of technology, the current white space systems are a modest advance in the state of the art for operational systems. This is a good thing, in that a technically more ambitious plan would have likely failed. But we see no cognitive radios, no mesh networks, no smart antennas--none of the whizzy systems promised by that long-ago Economist article (see above).

Conclusions

Perhaps white space systems will grow, albeit slowly; potential applications in machine-to-machine Internet access are often touted as ready for white space. However, there are still no white space devices in Radio Shack. The evidence strongly suggests that our expectations should be modest, certainly as compared to cellular and WiFi. The promised revolution will certainly be a quiet one.

The PCAST report looked favorably on the white space effort, viewing it as a possible model for sharing federal spectrum. But the report wisely noted how long white space took to mature: "Recently ... 700 MHz whitespace devices ... have been approved for dynamic spectrum sharing ... [and] took many years to reach commercial viability" (PCAST, 2012, p. 65). Enthusiasm for sharing and unlicensed spectrum needs to be tempered by the modest white space experience thus far.

While the present of white space unlicensed spectrum is modest, aficionados predict a much brighter future, based on the "explosion of technical innovation around spectrum sharing" (Werbach & Mehta, 2013, p. 5). Are these innovations field-ready? A more sober view emerged at the recent Wireless Spectrum Research and Development Workshop IV:

Making spectrum sharing a reality will mean identifying what types of systems can be shared, negotiating and stipulating access rights, determining the market for such shared systems, developing specification and standards to allow sharing architectures, developing infrastructure and devices to implement the sharing, certifying equipment using new test procedures and equipment for compliance, and finally enforcing compliance. This process could easily take ten years or possibly even much longer. (Rysavy, 2012, p. 3)

And further:

[The sharing systems] being used today solve relatively simple problems, e.g., geographic sharing. ... More complex problems, such as how a carrier class mobile technology could share with multiple ... systems will take many years to develop, test, and implement in an economically rational manner. (Rysavy, 2012, p. 3)

As we get more ambitious in deploying unlicensed spectrum, the problems become much more difficult, and the technology is far less straightforward than unlicensed advocates portray. Again, modest expectations are in order.

References

Coase, R. H. (1959). The Federal Communications Commission. Journal of Law and Economics, 2, 1-40.

CTIA. (2013). Wireless quick facts: Year-end figures. Retrieved from http://www.ctia.org/advocacy/research/index.cfm/aid/10323

The Economist. (2002, June 20). Watch this airspace. Technology Quarterly Review [Special section]. Retrieved from http://www.economist.com/node/1176136/print?Story_ID=1176136

Faulhaber, G. (2007). Solving the interoperability problem: Are we on the same channel? Federal Communications Law Journal, 59(3), 493-515.

Faulhaber, G., & Farber, D. (2003). Spectrum management: Property rights, markets, and the commons In L. F. Cranor & S. S. Wildman (Eds.), Rethinking rights and regulations: Institutional response to new communications technologies. Cambridge, MA: MIT Press. Also published in 2003 as [TEXT NOT REPRODUCIBLE IN ASCII]GLOCOM Review, 8(1), 73-2).

Faulhaber, G., & Farber, D. (2010). Innovation in the wireless ecosystem: A customer-centric framework International Journal of Communication, 4, 73-112. Retrieved from http://ijoc.org/index.php/ijoc/article/view/670/388

Federal Communications Commission. (2002, December 11). In the matter of additional spectrum for unlicensed devices below 900 MHz and in the 3 GHz band (Notice of Inquiry, ET Docket No. 02-380). Retrieved from http://hraunfoss.fcc.gov/edocs_public/attachmatch/FCC-02-328A1.pdf

Federal Communications Commission. (2008, November 4). In the matter of unlicensed operation in the TV broadcast bands/Additional spectrum for unlicensed devices below 900 MHz and in the 3 GHz band: Second report and order and memorandum opinion and order (FCC 08-260). Retrieved from http://hraunfoss.fcc.gov/edocs_public/attachmatch/FCC-08-260A1.pdf

Federal Communications Commission. (2010, September 23). In the matter of unlicensed operation in the TV broadcast bands/Additional spectrum for unlicensed devices below 900 MHz and in the 3 GHz band: Second memorandum opinion and order (FCC 10-174). Retrieved from http://hraunfoss.fcc.gov/edocs_public/attachmatch/FCC-10-174A1.pdf

Government Accountability Office. (2012, November). Spectrum management: Incentives, opportunities and testing needed to enhance spectrum sharing (GAO 13-7). Retrieved from http://www.gao.gov/assets/660/650019.pdf

McAdams, D. (2012, January 26). First commercial white-fi network launched. TVTechnology. Retrieved from http://www.tvtechnology.com/news/0086/first-commercial-white-fi-network-launched/211530

Obama, B. (2013, June 14). Expanding America's leadership in wireless innovation. White House, Office of the Press Secretary. Retrieved from http://assets.fiercemarkets.net/public/newsletter/fiercewireless/whitehousememo.pdf

PCAST (President's Council of Advisors on Science and Technology). (2012). Report to the President: Realizing the full potential of government-held spectrum to spur economic growth. Retrieved from http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast_spectrum_report_final_july_20_2012.pdf

Rysavy, P. (2012). Spectrum sharing: The promise and the reality. Presented at Wireless Spectrum Research and Development Workshop IV, MIT, Cambridge, MA. Retrieved from http://www.rysavy.com/Articles/2012_07_Spectrum_Sharing.pdf

Skorup, B. (2013). Reclaiming federal spectrum: Proposals and recommendations (Mercatus Center Working Paper 13-10, George Mason University). Retrieved from http://mercatus.org/publication/reclaiming-federal-spectrum-proposals-and-recommendations

Werbach, K., & Mehta, A. (2013). The spectrum opportunity: Sharing as the solution to the spectrum wireless crunch. International Journal of Communication, 8, 2014.

GERALD R. FAULHABER *

Wharton School, University of Pennsylvania

Penn Law School

* Professor Emeritus, Wharton School, University of Pennsylvania, and Penn Law School.

(1) For a thorough analysis of the technical challenges of sharing radio spectrum, see Rysavy (2012).

(2) Observing technologists attempt to create a currency and a market is much like observing economists attempt to create a radio. Their hearts are in the right place, but don't expect either group to produce results. Clearly, the "spectrum currency" idea needs much work.

(3) The author assigned this reading to his MBA students in 2002, and kept it in the course syllabus until 2007, when students began to ask, "This article is five years old, and I can't buy any of them at Radio Shack. Where are these gadgets?" Now the list is more than a decade old, and the question is still relevant.

(4) For a richer description of innovation in wireless cellular, see Faulhaber and Farber (2010).

(5) By way of comparison, the author's 4G LTE smartphone achieves Internet connection speeds of 13 to 19 Mbps.