Open wireless vs. licensed spectrum: evidence from market adoption.

I. INTRODUCTION

The next generation of connectivity will be marked by ever more
ubiquitous computing and communications. (1) From health monitoring to
inventory management, handheld computing to automobile computers and
payment systems, pervasive computing is everywhere–and everywhere
depends on wireless communications and therefore on wireless policy. We
see this increasing importance in several contexts. The National
Broadband Plan calls for the identification of an additional 500 MHz of
spectrum for new wireless applications. (2) The new report by the
President’s Council of Advisors on Science and Technology
(“PCAST Report”) calls for extensive sharing of government
spectrum with civilian users; (3) this report, in turn, informs an
already existing broad effort to release federally-controlled
frequencies for use by non-federal users, coordinated and managed by the
National Telecommunications and Information Administration
(“NTIA”). (4) Moreover, a series of bills introduced in
Congress (5) and proposed by the White House (6) in 2011 ultimately
resolved into legislation that gave the Federal Communications Commission
(“FCC”) a new incentives auction authority to reallocate some
of the TV bands to support new wireless data services. (7)

The primary contribution of this Article is to provide evidence in
aid of these ongoing efforts to refine spectrum policy in both civilian
and federal spectrum. The Article surveys the experience of several
leading-edge wireless markets, examining the relative importance of the
major policy alternatives available to support the provisioning of
wireless communications capacity. I review evidence from seven wireless
markets: mobile broadband, wireless healthcare, smart grid
communications, inventory management, access control, mobile payments,
and fleet management. I also review how secondary markets in spectrum
have fared and evaluate both the failures and successes of different
approaches to open wireless policy.

I find that markets are adopting open wireless strategies in
mission-critical applications, in many cases more so than they are
building on licensed strategies. Eighty percent of wireless healthcare,
70% of smart grid communications, over 90% of tablet mobile data, and
40-70% of mobile broadband data to all devices use open wireless
strategies to get the capacity they require. (8) Open technologies are
dominant in inventory management and access control. For mobile
payments, current major applications use open wireless, and early
implementations of mobile phone payments suggest no particular benefit
to exclusive-license strategies. (9) Fleet management is the one area
where licensed technologies are predominant. However, UPS–owner of the
largest commercial fleet in the United States–has implemented its fleet
management system (of trucks, not packages) with an open wireless
strategy, suggesting that even here open wireless may develop attractive
alternatives. By contrast to these dynamic markets, secondary markets in
flexibly licensed spectrum have been sluggish. Most of the clear
successes of open wireless strategies have come from devices and
services that use general purpose open wireless bands, like those that
support Wi-Fi. Meanwhile, efforts to provide more narrowly tailored
unlicensed allocations, such as for transportation or medicine, have
been only ambiguously successful. Some more tightly regulated and
balkanized allocations, in particular unlicensed personal communications
services (“U-PCS”), have been outright failures. Policy is
important, then, both to the choice between open wireless and licensed
spectrum and to the choice among different approaches to open wireless
allocations.

Ever since 1922, when then-Secretary of Commerce Herbert Hoover
first seized the power to manage spectrum (illegally, as it turned out),
(10) policy has been a critical determinant of the rate and direction of
innovation in wireless communications. The same will be true in the
coming decades. A discussion draft circulated by House of
Representatives staff on July 13, 2011 provides a clear example of how
government decisions driven by ideology can cut off crucial innovation
paths and destroy markets. (11) The bill would have prohibited the FCC
from permitting unlicensed devices to operate in any new band unless the
FCC conducted an auction in which a coalition of device manufacturers
bid to keep those bands a “commons.” (12) The
collective action problems associated with getting a group of actors to
bid on making it legal for anyone to innovate in a band are
overwhelming. It is the equivalent of saying that cities may only
dedicate a block for a public park or a street if the public at large
outbids any developer who would want to build an office building or a
mall over that land. Overcoming the collective action problems
associated with creating these kinds of classic public spaces and
infrastructures, which are then open to all on equal terms, is the
paradigmatic case of public use that even the most ardent critics of
takings accept as the proper province of eminent domain. (13) As a
practical matter, these collective action problems would cut off future
innovation on the Wi-Fi model in any bands other than those where such
innovation and markets are already permitted to operate. Had the
discussion draft been the law of the land in 1985, there would have been
no Wi-Fi. None of the predominant pathways for data transmission today
used for handhelds and tablets, smart grid communications and
healthcare, inventory management, or security would have been legal.
(14)

The core question in wireless policy, broadly recognized for at
least the last decade, (15) has been how much of the future of wireless
innovation will depend on exclusively-licensed spectrum–whether
allocated under (1) a command and control system or (2) auction and
secondary markets–and how much will be developed in bands where it is
permissible to deploy (3) open wireless systems. Some frequencies will
almost certainly remain under a command and control system. (16) This
will likely be the case for the TV bands (although the incentives
auction and TV White Spaces (17) suggest that a mixed-model with
auctions and open wireless is preferable). The same can probably be said
for military or public safety uses (although again, dynamic frequency
sharing in the 5 GHz band (18) and the strong emphasis on spectrum
sharing in the PCAST Report (19) suggest that even in those bands there
is significant drive to incorporate aspects of both auctions and open
wireless). Some will remain under auction and secondary markets, such as
the already-auctioned bands dedicated to cellular providers. And some
will remain under open wireless systems, as in current Wi-Fi. The
question is what policy to adopt for future allocations and how to
regulate current allocations.

A particularly crisp example of wireless policy’s importance
is the difference between U.S. and European regulation of industrial,
scientific, and medical bands (“ISM band”) and that
regulation’s effect on markets for smart grid communications.
Comparing these two jurisdictions suggests that providing substantial
space for open wireless experimentation can result in a significantly
different innovation path. Europe uses very little wireless smart grid
communication, almost all of it licensed-cellular. U.S. smart grid
communications systems, by contrast, overwhelmingly rely on wireless,
and three-quarters of these systems use open wireless mesh networks. One
obvious difference between the two systems is that Europe has very
little open wireless spectrum allocated below 1 GHz. (20) What little
remains is balkanized and subject to highly restrictive power limits.
(21) Europe also imposes severe power constraints on devices using its
2.4 GHz bands. (22) The United States, by contrast, has a contiguous 26
MHz band, 902-928 MHz, with less restrictive power limits, which plays a
central role in U.S. smart grid communications markets. (23)

The past decade has seen a gradual emergence of what was, fifteen
years ago, literally unbelievable: spectrum commons are becoming the
basic model for wireless communications, while various exclusive
models–both property-like and command-and-control–are becoming a
valuable complement for special cases that require high mobility and
tolerate little latency. Consider wireless patient monitoring, once
thought the epitome of critical applications that could never be allowed
to fail and therefore require dedicated spectrum. In the actual market
for remote monitoring, open wireless technologies, either general
purpose like Wi-Fi, or specific purpose like wireless medical telemetry,
cover almost the entire market. How can this be? After all, to quote the
most vocal critique of open wireless policy, with open wireless, as with
the Internet, “[c]lassically, the brain surgeon cannot read the
life-or-death CT-scan because the Internet backbone is clogged with junk
e-mail.” (24) Eppur si muove. Hospitals rely on Wi-Fi extensively,
or for some applications on license-by-rule Wireless Medical Telemetry
Service (“WMTS”) sharing. Cellular machine-to-machine
(“M2M”) appears to be receding as a viable competitor to these
diverse open wireless approaches. It turns out that the rate of
innovation in open wireless, the growing capacity of each node, the
improvements in shared access over diverse infrastructures, and the
design of data flows to be less latency-sensitive have all contributed
to making yesterday’s unthinkable into tomorrow’s inevitable.

In the 1990s, we spoke of the “Negroponte Switch”: personal
services like voice would move from fixed wire to wireless, becoming
pervasive, while single-location services like video would move to wire. (25)
The evidence we see in many markets now suggests a very different kind
of epochal “switch” in the coming decade. This switch will see
most applications moving from generally integrated, proprietary,
sparse-infrastructure, latency-indifferent architectures, like mobile
cellular networks, to open networks built on “Shared Access Nomadic
Gateway” architectures. Shared access architectures exploit the
lumpiness of the communications needs of any given application to
deliver the kind of connection needed, when it is needed (as opposed to
continuously, whether continuity is needed or not). They run on dense
infrastructures that share not only open wireless spectrum allocations
but also access to high-capacity nodes from diverse wired platforms
offered by diverse organizations and individuals, using
cross-organizational sharing to make the hops as short as feasible.
Sparse architectures will continue to have value, but only as
complements to a baseline that will be implemented over the shared
access architectures.

After Part I’s general introduction, Part II offers a
background primer on the policy debate, and Part III focuses on the
academic discourse. If you know the landscape of the discussion, you are
encouraged to skip those Parts. Part IV describes the new evidence
offered in this Article. It surveys seven markets, the performance of
secondary markets, and various cases of failure or ambiguous success of
special-purpose open wireless allocations. Part V outlines policy
implications and offers observations on the political economy of
spectrum auctions and the risk it poses to reasoned policy. The market
evidence calls for a shift in policy toward supporting
dense-infrastructure, nomadic gateway architectures, but that shift is
hampered by a skewed political economy that treats auction revenues as paramount. I
also identify some implications for how open wireless allocations should
be designed in those bands designated for open wireless use. Part VI
concludes.

II. POLICY APPROACHES

A century has passed since August 13, 1912, when Congress enacted
An Act To Regulate Radio Communication. (26) The dominant problem that
spectrum regulation has sought to address ever since then is
interference: the risk that if more than one radiator transmits at a
given frequency, no one will be heard properly. (27) In 1912, licensing
and regulation were introduced as a condition of operating a radio, but
the licensing was non-exclusive. (28) From 1912-1922, driven primarily
by war production and later by massive amateur and commercial
experimentation, radio innovation exploded, focusing from November of
1920 on broadcast. (29) As the number of broadcast stations exploded in
1922, then-Secretary of Commerce Herbert Hoover tried to graft more
extensive control over licenses onto the 1912 Act. (30) His core effort
was to provide preferred channel access to well-capitalized commercial
stations, while concentrating amateur and smaller-scale nonprofit
broadcasters in less desirable frequencies. (31) This approach
ultimately collapsed with the United States v. Zenith decision in
1926. (32) It took Congress a mere two months after it returned to session
to pass the Radio Act of 1927, (33) which laid the foundation for our
present model.

A large chunk of the available spectrum is reserved for government
use; this is the part that the NTIA manages. (34) Other parts of the
spectrum are regulated by a federal commission. This commission
regulates radio communications by (a) dividing the spectrum into
distinct channels, each defined over a range of frequencies, (b)
allocating specific communications uses to stated sets of channels, (c)
determining which private party will control transmissions over each
channel in a given geographic region, and (d) determining at what power
that party can radiate on that channel, using what kind of antenna. (35)
The 1934 Act did not alter that model, but replaced the Federal Radio
Commission with the Federal Communications Commission and consolidated
in the FCC’s hands power over both radio and wireline
communications. (36) The 1996 Telecommunications Act also did not change
the basic model. (37)

This basic command-and-control model of wireless communications
regulation continues to be the dominant approach governing the majority
of bands available for use. (38) Throughout the twentieth century,
however, there were precursors of what are now seen as the two primary
alternatives to command-and-control: markets in licenses and in
unlicensed devices. Secondary markets in spectrum have existed since the
Radio Act of 1927 permitted transfers but conditioned them on “the
consent in writing of the licensing authority.” (39) FCC approval
shifted in its form and intensity, (40) but over time the agency came to
view license transfers as more or less routine and imposed fewer
constraints, preferring to rely on markets to determine the best use of
spectrum. (41) In effect, secondary markets in spectrum assignments
(i.e. to determine who gets the license) have existed since the creation
of radio, and to some extent–in the limited sense that format
regulation is a matter of allocation fine-tuning (i.e. determining the
use of a particular band)–even allocation was subject to such markets.
(42) Similarly, the roots of the unlicensed wireless regime are located
in the FCC’s 1938 decision to allow the operation of low-power
devices without an individual license. (43)

The 1995 personal communications services auctions marked two
important advances in the use of a market-based approach to wireless
regulation, which has become the FCC’s primary means of allocating
spectrum. First, and most importantly from the perspective of
efficiency, the licenses were defined in broad and loose terms. This
meant that as uses and technology changed, licensees could reallocate
their spectrum to the new approaches. (44) This basic flexibility and
fluidity for users received a substantial regulatory boost when the FCC
created the framework for secondary markets in 2003. (45) Second, and more
widely discussed but less critical to efficiency, these were the first
licenses to be auctioned using the then-new authority Congress had given
the FCC to assign licenses through competitive bidding rather than through
comparative hearings or lotteries. (46) Auctions can improve efficiency to some extent if they
avoid transaction costs or are designed to assure the creation of a
competitive market, but flexible licenses play the more important
long-term role. And whatever gains they offer, auctions have enormous
costs in terms of political economy. Because they are treated as a
politically easy source of revenue, they are dealt with as part of
budget processes rather than as part of planning for infrastructure
development. Efforts to make reasonable long-term policy decisions with
regard to wireless communications and innovation can get swamped by the
effort to receive a slightly more favorable score from the Congressional
Budget Office (“CBO”). In 2012, Congress, for the first time,
empowered the FCC to share some of the auction revenue with incumbents
who are cleared from the spectrum designated for auction. This was done
in order to entice broadcasters to clear some of their spectrum. (47)

The most important advances in unlicensed policy were achieved
early and without real expectation of their significance. In 1985, the
FCC expanded Part 15 (48) to authorize the operation of unlicensed spread
spectrum devices in the 902-928 MHz, 2400-2483.5 MHz, and 5725-5850 MHz
bands. (49) The FCC also substantially increased the permissible power
level of spread spectrum systems to one watt. (50) These bands were wide
enough and their frequency high enough to support high data rate
transmissions. (51) The FCC later updated and revised these rules in
1989. (52) In 1993, the FCC tried to build on this experience by
dedicating 20 MHz to unlicensed PCS, a service that failed;
from that failure, we need to learn lessons about the design of
unlicensed services. (53) In 1997, the FCC passed the Unlicensed
National Information Infrastructure (“U-NII”) Band Rules,
opening up for unlicensed use the bands at 5.15-5.35 GHz and 5.725-5.825
GHz. (54) In 1999, the Institute of Electrical and Electronics Engineers
(“IEEE”) defined the first Wi-Fi standard, an event followed
by explosive growth in the number of unlicensed devices that the FCC
approved. (55) In 2002, a spectrum task force appointed by then-Chairman
Michael Powell issued the first comprehensive report from the regulatory
agency that described unlicensed spectrum as one of the two major
alternatives to command-and-control, albeit in a secondary role to
auctions and flexible licenses. (56) Following this report, the FCC has
sought to expand permissions for various forms of unlicensed operation,
including the approval of extremely low power, wide bandwidth devices in
the Ultrawideband (“UWB”) Order, (57) assignment of the
3.65-3.7 GHz range for license-by-rule operation for wireless Internet
service providers (“WISPs”), (58) and coordination with the
NTIA to permit unlicensed devices to share spectrum with federal radar
systems in the 5 GHz band. (59) Most recently, the FCC has moved over
the past four years to permit operation of “white spaces”
devices in the band allocated to television stations but not used for
that purpose. (60) Now, as the NTIA seeks to open up more federal bands
to civilian uses, the cost and complexity of clearing federal users and
auctioning off the spectrum increasingly suggest that the net revenue
of such clearances would be minimal and their lead times may be as long
as a decade. (61) Thus, the 2012 PCAST Report suggests a fundamental
reorientation of policy to one that sees various forms of shared access
as the baseline, while auctions of more-or-less perpetual property-like
rights will be rare: “The essential element of this new Federal
spectrum architecture is that the norm for spectrum use should be
sharing, not exclusivity.” (62)

As a matter of practical policy, this brief overview suggests that
the FCC has, over the past two decades, moved to enable markets in both
licenses and unlicensed devices. Both markets have flourished and
provide us with increasing amounts of evidence to help guide future
regulatory decisions about auctions, secondary markets, and open
wireless approaches.

III. THE ACADEMIC DEBATE

A. Background

Throughout most of the twentieth century, academic attention,
insofar as it dealt with the FCC’s policy for wireless
communications, was dominated by broadcast law. (63) Debates over
regulation versus market mechanisms tended to focus on the markets in
programming, affiliate relations with networks, or vertical integration
with programmers. (64) Most took either a standard economics orientation
or a critical stance based on the relationship between economic
structure and democratic discourse. (65) The command-and-control
approach to spectrum allocation was a background fact in most of this
literature, but one extremely influential critique took on spectrum
allocation itself.

The market-based approach was anchored in work done in the 1950s by
Ronald Coase, (66) which itself built on work by Leo Herzel earlier that
decade and was followed up with sporadic work in the 1960s and 1970s.
(67) It was only after the broader victory of the Chicago school in
antitrust and the broad shift toward market-based mechanisms, however,
that work by Evan Kwerel, Gregory Rosston and others who advocated
spectrum property approaches really made inroads in the policy debate.
(68) The broader intellectual and political sentiments of the Reagan era
were translated into spectrum allocation policy as well and, just as in
areas as diverse as banking regulation and welfare reform, were
implemented as part of the Clinton Administration’s embrace of this
market-based approach–in this case the PCS auctions conducted by the
FCC under then-Chairman Reed Hundt. (69)

Just as the introduction of auctions moved spectrum property from
“yesterday’s heresy” to “today’s
orthodoxy,” as Eli Noam called it at the time, (70) technological
developments in digital processing and wireless communications gave
birth to a new critique. One version of the critique was Noam’s
own: the new technologies made spectrum property obsolete
because they allowed use-rights defined in frequency, power, and
geography to be cleared through a dynamic spot market rather than
through a market in long-term property holdings. (71) Noam’s
argument is a clear precursor to both the secondary markets efforts of
Spectrum Bridge and Cantor Fitzgerald, (72) as well as the proposals
advanced in the PCAST report to permit intermediate-term rental of
federal spectrum. (73)

The more fundamental critique, however, posited that technological
developments made obsolete the whole idea of defining discrete channels
for exclusive control and then allocating and assigning them, whether by
regulation or prices. “The central question … is no longer how to
allocate spectrum channels–how to decide who makes unilateral decisions
about who may communicate using a frequency band and for what types of
communications–but whether to coordinate by defining channel
allocations.” (74) Markets in equipment, not in spectrum
clearances, were to become primary. The argument was that as computation
becomes very cheap, the wireless equipment market can provide solutions
that will allow devices to negotiate clearance of their communications
without anyone asserting exclusivity over a defined channel, whether
that exclusivity is long-term or dynamically leased. The choice becomes
one between (1) the Internet model of markets built on smart devices and
the services that can be built from networking them and (2) the
telecommunications services model of markets built on exclusive
proprietary claims to frequencies. (75)

Over the course of the past fifteen years, substantial literature
has developed addressing the basic choice between a “spectrum
property” model of exclusive licenses defined primarily in terms of
frequency and power, and a model based on equipment and services that do
not depend on exclusive access to any frequency but rather share a given
range of frequencies under a set of generally-applicable coordination
rules. (76) The unlicensed/open commons approach to wireless policy has
drawn its fair share of critique, (77) but experience, rather than
better modeling, will show which of these two approaches should be the
baseline and which should be a useful modifier to that baseline where
appropriate.

Before reviewing the new evidence in Part IV, I offer a quick
overview of the major elements of the argument for open wireless and a
response to some of the past decade’s more persistent lines of
critique.

B. The Arguments in Favor of Open Wireless Models

1. The Core Scarcities are Computation and Electric Power, Not
“Spectrum”

The anchor of both the command-and-control and property approaches
is the idea that wireless communications “use” spectrum and
that given many potential users, not all of whom can “use” the
spectrum at the same time, spectrum is “scarce” in the
economic sense. (78) Someone has to control who “uses” that
spectrum, or else no one can “use” it. As a study published in
March of 2011 by the National Research Council’s Computer Science
and Telecommunications Board explained, however, this view is not a
correct description of what happens when multiple transmitters transmit
at the same frequency. (79) If a thousand transmitters transmit, the
“waves” don’t destroy each other. No information is
destroyed; the only thing that happens is that it becomes harder and
harder for receivers to figure out who is saying what to whom. (80) The
limitation, or the real economic scarcity, is computation and the
(battery) power to transmit and run calculations. (81) The regulatory
model of command-and-control was created at a time when machine
computation was practically impossible. Exclusive licensing was a way to
use regulation to limit the number of transmitters in a band, in order
to enable very stupid devices to understand who was saying what. The
economic models on which auctions are based were developed in the 1950s
and 1960s, when computation was still prohibitively expensive. In that
era, thinking about “spectrum” as the relevant scarce input
made sense as shorthand for the policy problem.

The core claim of the scholarship developing the open wireless
approach has been that, as computation becomes dirt cheap, the
assumption that spectrum is a stable, scarce resource is no longer the
most useful way of looking at optimizing wireless communications
systems. (82) Rather, the question is: which configuration of smart
equipment, wired and wireless infrastructure, network algorithms, and
data processing will allow the largest number of people and machines to
communicate? It is possible that a network that includes exclusive
control over the radio-frequency channel being used will achieve that
result. But it is no longer necessarily so. It may be that the
flexibility that open wireless strategies provide–to deploy as and
where you please equipment and networks made of devices capable of
identifying the communications they are seeking in the din of a large
crowd–will do so more effectively.

The most recent effort to rebut the above is an article by spectrum
property advocates Tom Hazlett and Evan Leo. Hazlett and Leo write:

   In fact, radios dispatch streams of energy from their
   antennas, and that energy propagates through the surroundings
   at the speed of light. These fluxes are not
   legal constructs, but physical things. In a microwave
   oven, they heat soup ... Thus, for example, microwave
   ovens cause 'noticeable' interference with
   Bluetooth devices operating nearby. (83)

In response to the claim that computation, rather than
“spectrum,” is the scarce input, Hazlett and Leo state
unequivocally: “No amount of additional intelligence embedded in
the receiver can reverse the process when interference transforms
information into chaos.” (84) They conclude:

   The most common form of interference arises when
   an emission from a single transmitter interferes with
   itself. This can occur when part of a signal travels directly
   from the tower to the television, and part travels
   indirectly, reflecting off (say) a nearby
   skyscraper. Two different electromagnetic signals of
   the same frequency cannot in fact coexist at exactly
   the same place and time. (85)

Hazlett and Leo’s argument is simple and deceptively
attractive. Radio waves are physical. They can interact with each other
to such an extent that they heat food or cut through steel (lasers). And
when they interact, they “interfere” with each other, creating
“chaos.” But their example of “the most common form of
interference” is actually a beautiful instance of exactly the
mistake their argument exhibits. They describe the well-known phenomenon
of multi-path, the “ghost” image that bedeviled television in
the era of rabbit ears antennae. Radio signals would be emitted by a
transmitter antenna. They would then travel through space; some would
reach the rabbit ears directly, while others would “reflect[] off
(say) a nearby skyscraper.” (86) The result was that the receiver
antenna would get two or more “signals” and would interpret
this as “noise,” the ghostly figure or the grains on the
screen.

The mistake in their argument is that with new technologies
multi-path has become a desirable feature in radio signals, actively
used to enhance the quality of the signal or the capacity of a band,
rather than a challenge to be avoided. First explored theoretically in
the mid-1990s, (87) equipment and network architectures that use multiple
input, multiple output (“MIMO”) have become some of the most
widely used means of increasing capacity, speed, or both. (88) Wi-Fi
802.11n, WiMax, and LTE or 4G cellular systems all incorporate MIMO.
(89) By having multiple antennae on the transmitter and the receiver,
and building better computation on both ends, the receiver now treats
multi-path as additional information rather than as noise. When
receivers were stupid, the additional flows of radiation bouncing off
walls or objects were necessarily confusing. Now, smart receivers know
that there will be several streams with slight variations in their
arrival times and angles, and they use that diversity of flows of energy
as additional bits of information from which to calculate the original.
The same exact physical phenomenon that used to increase noise and
reduce capacity is now quality and capacity enhancing. (90)
“Diversity gain,” or “cooperation gain” as David
Reed has called it, (91) is a critical feature of open wireless systems.
(92) Far from proving that “[n]o amount of additional intelligence
embedded in the receiver can reverse the process when interference
transforms information into chaos,” (93) Hazlett and Leo illustrate
the opposite.
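
To make the comparison concrete, the following minimal sketch (in
Python; the path gains and noise level are illustrative assumptions,
not measurements) simulates a receiver that combines two independently
received copies of the same signal and compares its bit-error rate
with that of a receiver that uses only the strongest copy. This is the
essence of the diversity gain that MIMO systems exploit.

   # Illustrative sketch only: maximal-ratio combining of two noisy
   # copies of a BPSK stream. Path gains and noise level are assumed
   # values chosen for demonstration.
   import numpy as np

   rng = np.random.default_rng(0)
   n = 100_000
   bits = rng.integers(0, 2, n)
   symbols = 2 * bits - 1                     # BPSK: 0 -> -1, 1 -> +1

   h1, h2 = 0.9, 0.6                          # direct and reflected path gains
   r1 = h1 * symbols + rng.normal(0, 1.0, n)  # copy 1, with receiver noise
   r2 = h2 * symbols + rng.normal(0, 1.0, n)  # copy 2, with receiver noise

   def bit_error_rate(received):
       return np.mean((received > 0).astype(int) != bits)

   print("BER, strongest path alone:", bit_error_rate(r1))
   print("BER, both paths combined: ", bit_error_rate(h1 * r1 + h2 * r2))

The combined receiver consistently outperforms the single-path
receiver: the second, “interfering” copy adds information rather than
noise.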

The physical nature of radio waves is not questioned. When they
interact, they superimpose and make extracting information out of them
more complex. That complexity, however, is amenable to calculation and
does not need to be removed by regulatory decisions, whether implemented
as command-and-control or as a cap-and-trade regime (the so-called
“spectrum property”). The core design problem for wireless
policy is not how to avoid the presence of multiple radiators in a given
frequency, time, or location. It is how to assure an innovation path
that makes that question no longer the primary source of capacity
constraint. The argument in support of open wireless innovation has
always been that a market in devices and services built on an Internet
innovation model will take advantage of Moore’s law, growing more
rapidly than a market defined in spectrum allocations that take millions
or billions of dollars to exchange. Cap-and-trade carbon markets may or
may not be the most efficient regulatory approach to achieving
sustainable carbon dioxide emissions; they are not the answer for
radiofrequency emissions.

2. Transaction Costs and the Dynamic Shape of Demand for Wireless
Capacity Make It Unlikely that Markets Defined in Spectrum Allocations
Could Achieve Optimality

The wireless communications capacity and demand of any given set of
potential communicators is highly local and temporally dynamic. (94)
Imagine two pairs of users: A and B, and X and Y. How much spectrum each
pair needs to “use”–in the sense that a communication between
A and B would prevent X and Y from communicating using that frequency in
that geographic location at that time, which would require the
comparative value of the two uses to be crystallized and cleared–cannot be
defined ex ante. Instead, whether any X/Y pair will be excluded and
therefore whether communication between A and B imposes any social cost
that needs to be priced depends on the instantiated system that A and B
are using, that X and Y are using, and how those systems interact with
the found and built environment in which the two pairs operate. If A and
B use very sophisticated devices and are embedded in a cooperative
network of repeaters and cooperative antennae, and X and Y use
reasonably robust antennae or systems themselves, then no
“interference” occurs. X and Y will not fail to communicate
when they want to simply because A and B have communicated. This can be
true one minute and change the next, such as when A and B are in a built
environment rich in multi-path that their equipment uses to enhance
communication or are driving through a neighborhood with dense repeater
networks and then drive to an area that doesn’t have these
beneficial characteristics.

The complexity of the necessary transactions is even clearer when
one considers the most sophisticated effort to define what property
rights in spectrum should look like. Improving on the major work done by
De Vany et al. in the late 1960s that focused on time, area, and
frequencies, (95) Robert Matheson developed what he called the
“electrospace” model for defining property rights to improve
wireless communications. (96) This seven-dimensional definition of a
spectrum right would include: (1) frequency; (2) time; space defined in
the dimensions of (3) latitude, (4) longitude, and (5) elevation; and
angle of arrival defined in the dimensions of (6) azimuth and (7)
elevation angles. (97) This more complex and realistic characterization
of the dimensions necessary for more efficient property rights
definitions helps to underscore the severe limitations that transaction
costs impose on the feasibility of an efficient market.
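
A minimal sketch (in Python; the field names, units, and example
values are illustrative assumptions rather than part of Matheson’s
formal definition) of what a single electrospace right would have to
specify makes the transaction-cost point concrete: every marginal
reallocation would require pricing and clearing tuples of this kind.

   # Illustrative sketch of a seven-dimensional "electrospace" right.
   # Field names, units, and the example values are assumptions for
   # demonstration only.
   from dataclasses import dataclass

   @dataclass(frozen=True)
   class ElectrospaceRight:
       freq_low_mhz: float         # (1) frequency band, lower edge
       freq_high_mhz: float        #     frequency band, upper edge
       start_utc: str              # (2) start of the time window
       end_utc: str                #     end of the time window
       latitude_deg: float         # (3) latitude
       longitude_deg: float        # (4) longitude
       elevation_m: float          # (5) elevation
       azimuth_deg: float          # (6) angle of arrival: azimuth
       elevation_angle_deg: float  # (7) angle of arrival: elevation angle

   # Even one short, local transmission requires a fully specified tuple;
   # a dynamic market would have to price and clear many such tuples
   # every second.
   example = ElectrospaceRight(902.0, 928.0, "2012-06-01T09:00Z",
                               "2012-06-01T09:05Z", 37.77, -122.42,
                               10.0, 45.0, 5.0)
   print(example)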

The transaction costs of negotiating the allocation and reallocation of
capacity on this dynamic basis are prohibitive. They include
the entire communications overhead associated with efficient utilization
in open wireless systems (in order to figure out whether any cost is
incurred at all) plus a market mechanism to map that determination onto
a transaction. (98) These transaction costs would be predicted to lead
to the state of affairs we in fact observe: larger-scale allocations to
sets of users, who are consumers of a service that bought spectrum and
clears at the margin not with prices but with queuing (i.e. dropping
calls, losing service). The market in spectrum underwrites the existence
of the cellular industry in its relatively concentrated form, but it
cannot and does not replace managerial decisionmaking with spot pricing
of spectrum clearances. Spectrum auctions and secondary markets, rather
than the FCC commissioners and staff, decide whether the managers are
those of Verizon or AT&T, rather than T-Mobile or Sprint; the
engineers then are those of Verizon and AT&T, rather than those of
T-Mobile or the Office of Engineering and Technology. The institution of
market forces for deciding who will run the hierarchical managerial
system that governs marginal allocation decisions–not dynamic pricing
that clears the most valued calls at any given time, location, and
band–is the major achievement of spectrum property markets.

Open wireless systems mean that the markets for equipment and
services incorporate incentives to design equipment and networks that
can operate with limited exclusion of others and are robust to
radiation by others. If Linksys can find a way of achieving
higher throughput and lower latency without increasing power (say, by
adding multiple antennae to its Wi-Fi equipment), it has an incentive to
do so to outcompete Netgear. If Silver Spring Networks can avoid
interference by deploying a dense proprietary mesh network for its
neighborhood smart grid, and can do so in a way that it gains market
share and becomes the largest provider of smart grid communications, it
will develop and deploy that mesh. (99) If ExxonMobil wants to implement
a touchless payment system, it can do so without having to wait for the
cellular carriers to negotiate the standard that would allow them to
extract the highest rents from their users. (100) In all these cases,
and many others, companies operate in markets and drive innovation and
investment in devices and services using those devices, without having
to negotiate permission from spectrum owners. Ironically, even AT&T,
when faced with capacity constraints posed by the introduction of the
iPhone, reverted to Wi-Fi as the more flexible response to data capacity
constraints, rather than obtaining spectrum on secondary markets. (101)
The freedom to innovate around simple, shared standards that do not
require permission to deploy makes open wireless innovation
Internet-like; spectrum property innovation ends up, in the best case,
running on the Bell Labs model.

C. Rebutting the Primary Arguments Against Open Wireless Systems

1. Open Wireless is Not a Form of Deregulation, but Merely Another
Form of Regulation

Stuart Benjamin and others attack the claim that open wireless
approaches represent a market-based solution. (102) Instead, Benjamin
argues that “spectrum commons” necessarily require regulation
such as maximum power limits or spectrum etiquette rules. (103) These,
in turn, become the focus of lobbying and agency capture. Unless
technology makes any form of regulation unnecessary, spectrum commons is
merely a cover-up for continued regulation with all its warts and
failures. (104) A less nuanced but nonetheless succinct way of capturing
the flow of the argument is to list the subsection titles of Jerry
Brito’s 2007 article: “Given a Commons, a Controller; Given a
Controller, the Government; Given Government, Inefficiency.” (105)

None of the scholars or advocates writing in support of open
wireless approaches suggests abandoning all regulation of any kind. Cars
on highways must follow the rules of the road; visitors to national
parks must obey campsite and fire rules; ships using ocean navigation
lanes have to comply with minimal safety rules. Kevin Werbach proposed a
universal access privilege coupled with a tort law system to constrain
harmful devices and uses. (106) Stuart Buck proposed that the FCC’s
certification authority is the best means of enforcing sharing rules.
(107) Phil Weiser and Dale Hatfield provided their own nuanced critique
of spectrum commons with a proposal for a mixed model of collaborative
regulation. (108) My own proposal, though underdeveloped, sought to
minimize direct regulation by combining some utterly unregulated spaces,
a dedicated public trust, and a requirement that FCC device
certification be coupled with fast track approval for devices that
comply with standards set in open standards-setting processes. (109)

The critique that these approaches invite lobbying during the
definition of the sharing regime identifies a genuine concern with the
design of open wireless approaches. The first generations of unlicensed
spectrum allocations had the benefit of being passed before any
significant market actors knew or predicted that they could use
unlicensed strategies to make money. The early rules therefore passed
with no serious lobbying, and even some of the later rules (such as the
U-NII band), while driven by a coalition of companies, (110) were naive
in retrospect. Since 2002, lobbying around unlicensed spectrum
rulemaking has been extensive. As critics have described exhaustively,
the designation of 3.65-3.7 GHz for WISP services was rife with
lobbying; (111) the White Spaces Order was almost abandoned because of
Dolly Parton’s microphone; (112) and Cisco, caught flat-footed on
TV-band devices because of its major investments in 5 GHz, spent 2011
fighting tooth and nail to deny its competitors open access to the TV
white spaces. (113) This experience certainly lends credence to the
concerns about lobbying and agency capture associated with open wireless
approaches. But it is an argument for vigilance in the design of these
systems, not a refutation of the idea that open wireless allocations are
instances of deregulation. Certainly, the FCC regulated power levels for
Part 15 permissions in the ISM bands for spread spectrum systems, (114)
but doing so did not obviate the fact that the orders in the 1980s were
major deregulatory successes. They permitted the deployment of millions
of devices that form the basic infrastructure over which massive amounts
of data now flow in the form of Wi-Fi, (115) and on which the majority
of smart grid communications networks are built. (116) They did so by
imposing minimal rules of the road when defining standards and
certifying equipment, and then by mostly getting out of the way.

The major fallacy of the critique that “spectrum commons means
regulation,” however, is that it fails to account for the fact that
spectrum property is equally susceptible to the same criticism. Commons
are no different from property in this regard. Both systems depend on
government decisions and rulemaking, and both require resistance to
these pressures in the design of the system. For example, Hazlett and
Leo write: “When the FCC unlicenses spectrum, carriers and
consumers must choose Intel’s Centrino chips over Qualcomm’s
CDMA chips and Wi-Fi access points over data networks provided by GSM
UMTS/HSDPA, CDMA 1xEV-DV, or WiMax optimized for licensed radio
spectrum.” (117) In other words, when the FCC dedicates a band to
unlicensed use, it is picking winners in the market for chips. But
Hazlett and Leo ignore the obvious fact that the inverse statement could
be written with equal truth (or, rather, equal half-truth): When the FCC
licenses spectrum, carriers and consumers must choose Qualcomm’s
CDMA chips over Intel’s Centrino, and data networks provided by GSM
UMTS/HSDPA, CDMA 1xEV-DV, or WiMax optimized for licensed radio spectrum
over Wi-Fi access points. There is no neutral baseline by which a
decision to license does not benefit some market actors at the expense
of others. The two statements are mirror images. More generally, when
the FCC decides to package and auction allocations in two 5 MHz channels
separated by other channels, it is optimizing for incumbent cellular
providers for whom this configuration makes upstream and downstream
communications with cell towers easier to manage with less expensive
hand sets. This is a perfectly fine decision for an agency that sees
cellular architectures as dominant in the foreseeable future. But it is
not neutral. It prefers cellular architectures of this model over models
that rely on, and can benefit from, broad contiguous bands, which the
allocation model that the FCC has used in most of its recent auctions
makes extremely expensive to reassemble.

Defining exclusive rights for spectrum is extremely difficult, and
different definitions will benefit different actors. As Phillip Weiser
and Dale Hatfield have shown in detail, the best-designed property
systems necessarily require ongoing refinement and supervision through
zoning-like and nuisance-like regulations, (118) just as they do for
property in land. Government power and public policy have pervaded
common law property ever since the Domesday Book. (119) It would take a
remarkably naive view of how modern property law functions to imagine
that common law courts are not political, not subject to lobbying,
politics, distortion, and plain error, when they develop the rule
against perpetuities, decide nuisance cases, pick the American rule over
the English rule for ground water as opposed to oil, or decide what to
do about a cattle feed lot when the city expands next to it. The naivete
is even more pronounced when one considers the ways in which state and
local politics enter land use and property law. To imagine a property
regime free from lobbying when parties have many billions of dollars at
stake and sophisticated lobbying machines geared up is either wishful
thinking or purposeful obfuscation.

We cannot escape some level of government regulation over wireless
communications and therefore must bear the risks of control, corruption,
and error. “Spectrum property” tries to address this weakness
by advocating property rights defined in frequency bands that are as
broad and flexible as possible and hoping that fluid secondary markets
in assignments and allocations will allow companies to reassemble
transmission rights to a level that is more or less efficient. Open
wireless strategies try to address the same problem by proposing minimal
device-level rules, symmetrically applied to all devices and
applications, with a privileged position for open standards-setting
processes as a backstop against agency capture. Neither approach will
completely succeed, and both require vigilance by their respective
proponents against corrupt and flawed implementations. Imagining that
one is systematically more resistant to the failures of government
regulation than the other will not advance either approach.

2. “Tragedy of the Commons” and Technology Will Always
Drive Demand Faster than Supply

A common mischaracterization of the spectrum commons argument
is that it depends on a false notion of spectrum abundance, while in
reality technology will always drive demand to surpass supply, requiring
a price-based allocation mechanism to avoid tragedy of the commons. The
Spectrum Policy Task Force Report, for example, sought to dedicate
commons where demand was low, as though that approach were particularly
suited for instances of abundance, but sought to reserve bands where
there was higher demand for property-like regimes. (120) As in many other
cases, Hazlett and Leo offer a particularly crisp version of this
argument:

   The commons advocates insist that when the technology
   is smart enough, things never get crowded.
   That story is exactly backwards. Setting aside regulatory
   barriers, it is the lack of technology that has left
   some bands relatively empty. Bands that were empty
   a decade ago are crowded today in large measure because
   affordable new products have arrived to fill
   them. In our frame of experience, technology is not
   the solution to spectrum scarcity, but its cause. (121)

Hazlett and Leo do not explain why they think technological
developments that increase the supply of services people then demand
will always necessarily lag behind fulfilling that demand. Technology
creates supply (of computation power on a chip) that allows new demands
to emerge (people can run new programs that could not run on the prior
generation of chips); these new demands ultimately crowd the computation
capacity of the last generation of chips just in time to make people
want to upgrade to the new generation. Moore’s Law describes the
technological pattern that has repeated roughly every eighteen months for
the past half-century. Open wireless technology has followed a similar
pattern, based on roughly the same rapid increase in the computation
capacity of devices. Indeed, if we take theoretical speeds, Wi-Fi
equipment increased in capacity from 2 Mbps in 1998 with 802.11 legacy
devices to 1.3 Gbps under the current 802.11ac, whose first units were
introduced in the spring of 2012, roughly consistent with Moore’s
Law. (122) However, Hazlett and Leo merely state that “[t]he
spectrum always looks uncrowded to pioneers at the very top of the
ladder. Then, when costs drop and regulatory barriers fall, crowds
follow.” (123) They fail to recognize that, in computation-intensive
fields, “crowds follow” implies a beneficial cycle of
obsolescence and upgrade.
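
The arithmetic behind the Wi-Fi comparison is simple enough to check
directly. The short sketch below (in Python, using the 1998 and 2012
theoretical rates cited above) shows that the implied doubling period
is roughly eighteen months, in line with Moore’s Law.

   # Back-of-the-envelope check using the theoretical rates cited in
   # the text.
   import math

   rate_1998_mbps = 2      # 802.11 legacy devices
   rate_2012_mbps = 1300   # first 802.11ac units
   years = 2012 - 1998

   doublings = math.log2(rate_2012_mbps / rate_1998_mbps)   # about 9.3
   months_per_doubling = years * 12 / doublings             # about 18
   print(f"{doublings:.1f} doublings in {years} years, "
         f"roughly one every {months_per_doubling:.0f} months")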

And this, of course, is directly tied to the failure of Hazlett and
others of his persuasion to understand that technology does increase the
desired resource (wireless capacity), even though it does not create
property (spectrum). This is not a new argument for Hazlett, and his
earlier predictions based on it ought to give one pause. In 2001, he
wrote in almost identical words: “When unlicensed entry thrives,
the characteristic pattern is that over-crowding ensues.” (124)
Quoting extensively from a Department of Commerce report, he argued:

   The use of the ISM [unlicensed industrial, scientific,
   medical] bands for high reliability communications
   is problematic, mainly because there is no assurance
   that today's adequate performance will remain free
   of interference in the future ... Eventually there
   may be too many additional systems to expect interference-free
   operation in crowded locations. (125)

As an example of the rent seeking and conflicts that will require
FCC intervention, Hazlett explained in 2000:

   [I]n the unlicensed 2.4 GHz band, opposing interests
   recently battled over standards ... The "HomeRF"
   coalition argued that Proxim's RangeLan2 technology
   be allowed use of up to 5 MHz in the band ...
   Rival companies supporting "Wi-Fi" technology run
   up to 11 Mbps, and adamantly opposed the HomeRF
   proposal ... Spectrum scarcity leads to a highly
   contentious "mess" at 2.4 GHz, a "tug-of-war" between
   mutually incompatible demands. (126)

The actual market experience of the past decade has shown that
Hazlett’s concerns that spectrum scarcity would lead to a highly
contentious mess in 2.4 GHz were misplaced. That he continues to make
the same arguments, sometimes almost verbatim–that technology will lead
demand to outstrip increases in capacity and that standards-setting
requires a band manager who owns the spectrum–and ignores the actual
experience of equipment markets in the past decade, should lead to some
skepticism in assessing present reiterations of the same argument.

More generally, the argument in favor of open wireless was never
that there is a spectrum abundance. It was, rather, that markets in open
wireless devices and the services one can build with them will create
better incentives to innovate over time so as to create a supply of new
applications and uses more rapidly than would a spectrum-property
market. The innovation model was the model of the Internet: open
standards together with robust markets in applications and devices
connected to an open network foster extensive innovation. Hazlett again
recognized this Internet-like model and understood the relevant market
analogy. And once more he misdiagnosed its meaning. He wrote in 2001:

   The spectrum commons idea is motivated by analogy
   to the Internet. Yet, the architecture of the Internet
   ... seriously misallocates scarce bandwidth ...
   High value communications are jammed in congested
   arteries with massive volumes of data of only
   marginal significance ... The problems thus have
   been described by financial analysts:...Flat-rate
   pricing and no financial settlement led to inefficient
   usage and reduced incentive to eliminate bottlenecks
   ...Many customers who were willing to pay
   for performance couldn't get it where/when they
   wanted it, whether it was voice IP (latency), ecommerce
   (reliability) or entertainment (burstable
   bandwidth). (127)

Quoting Noam, Hazlett suggested that perhaps these failures could
be solved by packet pricing, but he then argued that this would
undermine the commons analogy. (128) Hazlett’s reliance on claims
that the absence of packet pricing would prevent the Internet from
developing reliable Voice over IP, e-commerce, and entertainment was
based on the same assumptions that underlay his prediction that
standards battles between competing device manufacturers would prevent
Wi-Fi at 2.4 GHz from being useful.

As the National Research Council study emphasized, the core scarcities of
wireless communications are processing and battery power. (129) With
enough devices, computation, and cooperative network design, a wireless
system can scale demand without exclusivity in spectrum bands. In a
system that offers both licensed and unlicensed models, as ours does,
unlicensed models scale to meet demand more flexibly. That is why when
mobile carriers faced a major data crunch with the introduction of
smartphones, they were able to scale their capacity through Wi-Fi
offloading more rapidly than by increasing cellular network capacity.
(130) That is why MasterCard, Mobil, and E-ZPass were able to develop
their own mobile payment systems, (131) and why Silver Spring Networks
could deploy its mesh smart grid solutions, (132) without waiting
for cellular carriers to get around to offering the capabilities. This
is not an argument from abundance, but rather from innovation and
flexibility about the comparative agility of two systems to adapt to
increasing demand and to develop solutions to that growing demand.

3. Market Adoption and Failures to Thrive

Perhaps the most significant argument that critics present, and the
one that ought to guide our analysis most, is based on levels of market
adoption and case studies of failures. Hazlett and Leo emphasize the
size and economic value of licensed wireless as compared to unlicensed:
“[M]ore than 130 million subscribers receive high-speed data
service (fixed and mobile) via exclusively owned bandwidth, as compared
to just a few hundred thousand subscribers–at most–to WISPs and those
accessing the Internet via a ‘spectrum commons.'” (133)
As we will see in the mobile broadband case study below, this statement
ignores the fact that 40% of mobile handheld data traffic and 92% of
tablet data are carried over Wi-Fi. (134) Indeed, it was Wi-Fi that in a
sense saved AT&T’s system from crashing with the introduction
of the iPhone. (135) Comparing mobile cellular to WISPs while neglecting
the importance of Wi-Fi offloading stacks the deck against unlicensed in
a way that severely understates its centrality to how actual markets
handle mobile data services.

Hazlett and Leo further write, “[e]quipment sales tell a
similar story. In 2006, global sales for WWANs using liberal licenses
were about $225 billion (including handsets), while wireless local area
networks (‘WLANs’), using unlicensed frequencies, totaled
about $3.8 billion.” (136) In the past few years, several studies
have attempted to place more sympathetic estimates on the
value of Wi-Fi. A Microsoft-funded study suggested that it is in the
range of $4.3 to $12.6 billion per year in homes alone. (137) Similarly,
a different analysis commissioned by Google placed the value of Wi-Fi at
about $12 billion based on an imputed value for speed or $25 billion
based on the share of cellular carrier traffic carried over Wi-Fi. (138)
Mark Cooper of Consumer Federation of America offers a more expansive
approach that includes both imputed value of unlicensed bundled as part
of cellular service and savings from Wi-Fi offloading on the supply side
and arrives at about $50 billion per year. (139) And in light of efforts
to quantify specifically the data-carriage side of Verizon and
AT&T’s business that suggest a revenue more on the order of $50
to $55 billion per year for licensed mobile data in the United States,
(140) Hazlett and Leo’s claim of a vast disparity in value appears
to be inflated.

Independent of the competing valuations, Hazlett’s argument
incorporates a major fallacy: that it is reasonable to compare the
social values of a technology and the disruptive technology that
displaces it by comparing the revenue from each. Consider, for example,
classified ads. In 2000, the year that Craigslist first expanded from
San Francisco to nine other major cities, U.S. newspaper classified ad
revenue was $8.7 billion. (141) By 2007, the last full year before the
Great Recession, that number was $3.8 billion, and by 2009 it was $787
million. (142) During the same period, Craigslist, the largest and most
significant online replacement for newspaper personal ads, had been
reported to have revenues ranging from about $10 million in 2004
to a speculated $100 million in 2009. (143) For 2006, the year in which
Hazlett compares the $225 billion in licensed-wireless equipment sales
to the $3.8 billion in Wi-Fi equipment to the detriment of the latter,
Craigslist had $25 million in sales, while the newspaper classified ads
business had revenues of $4.75 billion–about 190 times more
revenue. (144) Hazlett’s logic would have us believe that the
revenue advantage of newspapers supports the proposition that newspaper
personals are clearly the superior modality and would thus suggest that
government policy should aim to optimize the markets in newspaper
classifieds. If one industry completely disrupts the way that another
makes money and captures revenue by delivering equivalent or better
value at a cost that is orders of magnitude lower, then this pattern of
revenues would be exactly the one we would observe. That is precisely
what innovation is best at. Comparing the revenues of the two approaches
to delivering a human desideratum where each is built on completely
different cost models and completely different competition models is
simply nonsensical. It would be like trying to value Wikipedia by
comparing its revenues to those of Encarta or Grolier. Instead, one
needs to compare the human desideratum served, the adoption rate by
consenting adults, and the organizations or social processes that serve
each. That is the approach I pursue in Part IV.

The fallacy becomes clearer when one realizes that customers who
buy wireless data service from Verizon or AT&T are not getting their
service delivered exclusively over licensed spectrum. If 92% of data to
tablets and 42% of data to handsets is delivered over Wi-Fi, and
customers pay for carriage of bits, not for “use of spectrum,”
a more reasonable approach would be to take the money customers pay for
mobile data carriage and equipment and apportion it based on the amount
of traffic carried. (145) A different way of saying this would be to
underscore that if all the payments to wireless carriers, services and
equipment, are attributed to carriage of data over licensed cellular
networks, then we see that 99% of what people are spending goes to
support carriage of between 8% and 58% of their data. Again, this seems
like a weak argument in support of the relative efficiency of the
modality of carriage that costs so much more per bit carried. In an
ideal market one could imagine arguing that the kinds of bits that
cellular carries–highly mobile, latency-intolerant–are so much more
valuable than the kinds of bits unlicensed carries–more nomadic and
delay tolerant–as to account for the difference in payment. But in a
concentrated market with high switching costs and high entry barriers,
it is much harder to pin down how much of the revenue represents actual
value, and how much represents rent extraction and slow responsiveness
of customers with sticky habits. After all, by 2007 newspapers still had
personal ad revenues of close to $4 billion. It was only in 2009 that
habits changed and revenue fell to $787 million.
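
The apportionment argument can be illustrated with a short sketch (in
Python; the Wi-Fi traffic shares are the figures cited above, while
the even handset/tablet traffic mix and the attribution of all
mobile-data spending to licensed carriage are simplifying assumptions
for illustration only).

   # Illustrative apportionment: what share of mobile data actually
   # travels over licensed networks, given the Wi-Fi shares cited in
   # the text? The 50/50 handset/tablet traffic mix is an assumption.
   wifi_share = {"handsets": 0.42, "tablets": 0.92}
   traffic_mix = {"handsets": 0.5, "tablets": 0.5}

   licensed_share = sum(traffic_mix[d] * (1 - wifi_share[d])
                        for d in wifi_share)
   print(f"Licensed networks carry about {licensed_share:.0%} of the data,")
   print("yet essentially all mobile-data spending is attributed to them.")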

Although the particular comparisons and conclusions offered by
critics of unlicensed wireless are flawed, it is a sound approach to
look for the most significant markets where wireless communications
capacity is a core component of the service or equipment purchased, and
assess the relative success of licensed spectrum and open wireless
approaches in these markets. This is what Parts IV and V do. As Part IV
outlines, using these measures, it appears that in these leading-edge
markets, units and services that rely on open wireless techniques to
deliver the wireless communications capacity component of their product
are being more widely adopted than approaches based on licensed
spectrum.

Distinct from the “market adoption” argument are the
failure stories: in particular, the failures of the U-PCS and 3.65-3.7
GHz unlicensed allocations. (146) In both cases, the FCC allocated a
band of spectrum to unlicensed use; in both cases, there were some
efforts to implement equipment and services using these allocations; and
in both cases these efforts either failed outright, as in the case of
U-PCS, or have been largely anemic, in the case of 3.65 GHz. (147)
Indeed, these case studies may suggest that efforts to improve on basic
minimal rules like those used in the ISM bands may do more harm than
good, and that a basic minimal-rules commons is preferable to more
elaborate attempts to solve tragedy-of-the-commons problems with more
detailed rules. Accordingly, I return to these case studies in
Part IV, when I discuss several markets that emerged, or failed to
emerge, around special-purpose unlicensed regimes, and consider the
implications for the design of open wireless allocations going forward
in Part V.

D. Conclusion

To conclude this Part, the core academic argument in favor of open
wireless strategies is that they implement the innovation model of the
Internet in wireless communications capacity. Spectrum licenses,
particularly when cleared through secondary markets, can offer great
flexibility and innovation space, but they are limited by transaction
costs and strategic interventions in the design and ongoing enforcement
of the rights. Open wireless strategies will tend to innovate and deploy
more rapidly in techniques that increase the wireless carrying capacity
in any given time, location, or system context. They harness the
personal computer market and Internet innovation models to the
provisioning of wireless communications capacity. Which of these
approaches is better as a baseline, and what mix of them we should adopt
as policy, has been a longstanding academic debate. I have sought here
to respond to the main criticisms of open wireless policy that emerged
over the past decade. But the ultimate arbiter should be experience, and
it is to experience that I turn for the remainder of the Article.
