THE NEXT DIGITAL DECADE
ESSAYS ON THE FUTURE OF THE INTERNET
Edited by Berin Szoka & Adam Marcus
NextDigitalDecade.com
TechFreedom
techfreedom.org
Washington, D.C.

This work was published by TechFreedom (TechFreedom.org), a non-profit public policy think tank based in Washington, D.C. TechFreedom’s mission is to unleash the progress of technology that improves the human condition and expands individual capacity to choose. We gratefully acknowledge the generous and unconditional support for this project provided by VeriSign, Inc.

More information about this book is available at NextDigitalDecade.com

ISBN 978-1-4357-6786-7

© 2010 by TechFreedom, Washington, D.C.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

Cover designed by Jeff Fielding.
TABLE OF CONTENTS

Foreword
Berin Szoka

25 Years After .COM: Ten Questions
Berin Szoka

Contributors

Part I: The Big Picture & New Frameworks

CHAPTER 1: The Internet’s Impact on Culture & Society: Good or Bad?

Why We Must Resist the Temptation of Web 2.0
Andrew Keen

The Case for Internet Optimism, Part 1: Saving the Net from Its Detractors
Adam Thierer

CHAPTER 2: Is the Generative Internet at Risk?

Protecting the Internet Without Wrecking It: How to Meet the Security Threat
Jonathan Zittrain

A Portrait of the Internet as a Young Man
Ann Bartow

The Case for Internet Optimism, Part 2: Saving the Net from Its Supporters
Adam Thierer

CHAPTER 3: Is Internet Exceptionalism Dead?

The Third Wave of Internet Exceptionalism
Eric Goldman

A Declaration of the Dependence of Cyberspace
Alex Kozinski and Josh Goldfoot

Is Internet Exceptionalism Dead?
Tim Wu
Section 230 of the CDA: Internet Exceptionalism as a Statutory Construct
H. Brian Holland

Internet Exceptionalism Revisited
Mark MacCarthy

CHAPTER 4: Has the Internet Fundamentally Changed Economics?

Computer-Mediated Transactions
Hal R. Varian

Decentralization, Freedom to Operate & Human Sociality
Yochai Benkler

The Economics of Information: From Dismal Science to Strange Tales
Larry Downes

The Regulation of Reputational Information
Eric Goldman

CHAPTER 5: Who Will Govern the Net in 2020?

Imagining the Future of Global Internet Governance
Milton Mueller

Democracy in Cyberspace: Self-Governing Netizens & a New, Global Form of Civic Virtue, Online
David R. Johnson

Who’s Who in Internet Politics: A Taxonomy of Information Technology Policy & Politics
Robert D. Atkinson

Part II: Issues & Applications

CHAPTER 6: Should Online Intermediaries Be Required to Police More?

Trusting (and Verifying) Online Intermediaries’ Policing
Frank Pasquale

Online Liability for Payment Systems
Mark MacCarthy
Fuzzy Boundaries: The Potential Impact of Vague Secondary Liability Doctrines on Technology Innovation
Paul Szynol

CHAPTER 7: Is Search Now an “Essential Facility?”

Dominant Search Engines: An Essential Cultural & Political Facility
Frank Pasquale

The Problem of Search Engines as Essential Facilities: An Economic & Legal Assessment
Geoffrey A. Manne

Some Skepticism About Search Neutrality
James Grimmelmann

Search Engine Bias & the Demise of Search Engine Utopianism
Eric Goldman

CHAPTER 8: What Future for Privacy?

Privacy Protection in the Next Digital Decade: “Trading Up” or a “Race to the Bottom”?
Michael Zimmer

The Privacy Problem: What’s Wrong with Privacy?
Stewart Baker

A Market Approach to Privacy Policy
Larry Downes

CHAPTER 9: Can Speech Be Policed in a Borderless World?

The Global Problem of State Censorship & the Need to Confront It
John G. Palfrey, Jr.

The Role of the Internet Community in Combating Hate Speech
Christopher Wolf
CHAPTER 10: Will the Net Liberate the World?

Can the Internet Liberate the World?
Evgeny Morozov

Internet Freedom: Beyond Circumvention
Ethan Zuckerman
Foreword

Berin Szoka
This book is both a beginning and an end. Its publication marks the beginning of TechFreedom, a new non-profit think tank that will launch alongside this book in January 2011. Our mission is simple: to unleash the progress of technology that improves the human condition and expands individual capacity to choose. This book also marks an end, having been conceived while I was Director of the Center for Internet Freedom at The Progress & Freedom Foundation—before PFF ceased operations in October 2010, after seventeen years.

Yet this book is just as much a continuation of the theme behind both PFF and TechFreedom: “progress as freedom.” As the historian Robert Nisbet so elegantly put it: “the condition as well as the ultimate purpose of progress is the greatest possible degree of freedom of the individual.”1 This book’s twenty-six contributors explore this theme and its interaction with relentless technological change from a wide variety of perspectives.
Personally, this book is the perfect synthesis of the themes and topics that set me down the path of studying Internet policy in the late 1990s, and it weaves together most of the major books and authors that have influenced the evolution of my own thinking on cyberlaw and policy. I hope this collection of essays will offer students of the field the kind of authoritative survey that would have greatly accelerated my own studies. Even more, I hope this volume excites and inspires those who may someday produce similar scholarship of their own—perhaps to be collected in a similar volume celebrating another major Internet milestone.

I am deeply grateful to Shane Tews, Vice President for Global Public Policy and Government Relations at VeriSign, who first suggested publishing this sort of collection to commemorate the 25th anniversary of the first .COM domain name (registered in 1985) by asking what the future might bring for the Internet. Just as I hope readers of this book will be, she had been inspired by reading Who Rules the Net? Internet Governance & Jurisdiction, a collection of cyberlaw essays edited by Adam Thierer and Clyde Wayne Crews and published by the Cato Institute in 2003. This book would not exist without the unconditional and generous support of VeriSign, the company that currently operates the .COM registry.

1 ROBERT NISBET, HISTORY OF THE IDEA OF PROGRESS 215 (1980).
Nor would the book exist without the superb intellectual contributions and patience of our twenty-six authors, and all those who assisted them. I must also thank PFF Summer Fellows Alexis Zayas, Jeff Levy and Zach Brieg for their invaluable assistance with editing and organization, and Jeff Fielding for the book’s stunning cover artwork and design.

Most of all, I must thank Adam Thierer and co-editor Adam Marcus. The two and a half years I spent working closely with them on a wide range of technology policy topics at PFF were the highlight of my career thus far.

I look forward to helping, in some small way, to discover the uncertain future of progress, freedom, and technology in the next digital decade—and beyond.

Berin Szoka
December 16, 2010
25 Years After .COM: Ten Questions

Berin Szoka
While historians quibble over the Internet’s birth date, one date stands out as the day the Internet ceased being a niche for a limited number of universities, governments and military organizations, and began its transformation into a medium that would connect billions: On March 15, 1985, Symbolics, a Massachusetts computer company, registered symbolics.com, the Internet’s first commercial domain name.2 This book celebrates that highly “symbolic” anniversary by looking not to the Internet’s past, but to its future. We have asked twenty-six thought leaders on Internet law, philosophy, policy and economics to consider what the next digital decade might bring for the Internet and digital policy.
Our ten questions are all essentially variations on the theme at the heart of TechFreedom’s mission: Will the Internet, on its own, “improve the human condition and expand individual capacity to choose”? If not, what is required to assure that technological change does serve mankind? Do the benefits of government intervention outweigh the risks? Or will digital technology itself make digital markets work better? Indeed, what would “better” mean? Can “We the Netizens,” acting through the digital equivalent of what Alexis de Tocqueville called the “intermediate institutions” of “civic society,” discipline both the Internet’s corporate intermediaries (access providers, hosting providers, payment systems, social networking sites, search engines, and even the Domain Name System operators) and our governments?
Part I focuses on five “Big Picture & New Frameworks” questions:

1. Has the Internet been good for our culture and society?

2. Is the open Internet at risk from the drive to build more secure, but less “generative,” systems and devices? Will the Internet ultimately hinder innovation absent government intervention?

3. Is the Internet really so exceptional after all, or will—and should—the Internet be regulated more like traditional communications media?

4. To focus on one aspect of Internet exceptionalism: has the Internet fundamentally changed economics? What benefits and risks does this change create?

5. Who—and what ideas—will govern the Net in 2020, at the end of the next digital decade?

2 John C Abell, Dot-Com Revolution Starts With a Whimper, WIRED MAGAZINE, March 15, 2010, http://www.wired.com/thisdayintech/2010/03/0315-symbolics-first-dotcom/
Part II tackles five “Issues & Applications” questions:

6. Should intermediaries be required to police more—or be disciplined in how they police their networks, systems and services? Whether one thinks the Internet is truly exceptional, and whether it has changed economics, largely determines one’s answer to these questions.

7. While debates about the role of online intermediaries and the adequacy of their self-regulation focused on net neutrality in the last digital decade, the battle over “search neutrality” may be just as heated in the next. Are search engines now the “essential facilities” of the speech industry, tamable only by regulation? Or are they engines of empowerment that will, through ongoing innovation, address the very concerns they raise?

8. As the Internet accelerates the flow of information, what future is there for privacy? Is privacy a right? How should it be protected—from both governments and private companies?

9. The book concludes with two chapters regarding the Internet in a borderless world. The first focuses on governments’ regulation of speech.

10. The second focuses on the potential for governments’ “disruption” by speech—by unfettered communication and collaboration among the citizenry. In both cases, our authors explore the consequences—and limits—of the Internet’s empowerment of users for democracy, dissent and pluralism.
Part I: Big Picture & New Frameworks

The Internet’s Impact on Culture & Society: Good or Bad?
Andrew Keen, the self-declared “Anti-Christ of Silicon Valley,”3 is scathing in his criticism of the Internet, especially “Web 2.0.” Keen declares that we must avoid the siren song of “democratized media,” citizen journalism, and, as the title of his first book puts it, the Cult of the Amateur. He laments the “technology that arms every citizen with the means to be an opinionated artist or writer” as producing a techno-utopian delusion little different from Karl Marx’s fantasies of a communist society—“where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes.”
Keen recognizes the reality of Moore’s Law—the doubling of computing capability every two years—but refuses to accept the idea that “each advance in technology is accompanied by an equivalent improvement in the condition of man.” Information technology, he argues, is leading us into an oblivion of cultural amnesia, narcissism, and a childish rejection of the expertise, wisdom and quality of creative elites. For Keen, a “flatter” world is one in which genius can no longer rise above a sea of mediocrity, noise and triviality. His message on the verge of the next digital decade might as well be: “Abandon all hope, ye who enter here!” Keen’s pessimism is as strident as a certain Pollyannaish utopianism on the other side.

3 Tim Dowling, I don’t think bloggers read, THE GUARDIAN, July 20, 2007, http://www.guardian.co.uk/technology/2007/jul/20/computingandthenet.books
Is there a middle ground? Adam Thierer, Senior Research Fellow at George Mason University’s Mercatus Center, insists there must be. In two related essays, Thierer describes two schools of Internet pessimism: Net skeptics, who are generally pessimistic about technology, and “Net lovers,” who think the “good ol’ days” of the Internet were truly great but are nonetheless pessimistic about the future. His first essay responds to Net skeptics like Keen—putting him in the context of centuries of techno-pessimism, beginning with the tale of Theuth and Thamus from Plato’s Phaedrus. Thierer’s response is Pragmatic Optimism: “We should embrace the amazing technological changes at work in today’s Information Age but with a healthy dose of humility and appreciation for the disruptive impact and pace of that change. We need to think about how to mitigate the negative impacts associated with technological change without adopting the paranoid tone or Luddite-ish recommendations of the pessimists.”
Is the Generative Internet at Risk?
Harvard Law Professor Jonathan Zittrain summarizes the themes of his influential 2008 book, The Future of the Internet—And How to Stop It. Zittrain is Thierer’s prototypical Net-loving pessimist, worrying about how technology will evolve absent intervention by those capable of steering it in better directions. Zittrain worries that consumer demand for security will drive the developers and operators of computer networks, services and devices to reduce what he calls the “generativity” of their offerings. Thus, unregulated markets will tend to produce closed systems that limit experimentation, creativity and innovation. In particular, Zittrain decries the trend toward “appliancized” devices and services—which, unlike the traditional personal computer, can load only those applications or media authorized by the developer. Not only does this diminish user control in the immediate sense; greater “regulability” also creates the potential for the Internet’s “gatekeepers” to abuse their power. Thus, Zittrain echoes the prediction made by Larry Lessig in Code—without a doubt the most influential Internet policy book ever—that “Left to itself, cyberspace will become a perfect tool of control.”4

4 LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE 5–6 (1999).
In the end, Zittrain proposes essentially two kinds of solutions for “Protecting the Internet Without Wrecking It.” The first is an appeal to the civic virtues of “netizenship.” Second, regulation may be required to force companies to “provide basic tools of transparency that empower users to understand exactly what their machines are doing,” as well as “data portability policies.” More radically, he proposes imposing liability on device manufacturers who do not respond to takedown requests regarding vulnerabilities in their code that could harm users. And, returning to his core fear of appliancized devices, he proposes that “network neutrality-style mandates” be imposed on “that subset of appliancized systems that seeks to gain the generative benefits of third-party contribution at one point in time while reserving the right to exclude it later.”
Ann Bartow, Professor at the University of South Carolina School of Law, offers a stinging rebuke of Zittrain’s The Future of the Internet, which she summarizes as follows: “We have to regulate the Internet to preserve its open, unregulated nature.” Her essay draws an analogy to James Joyce’s 1916 novel, A Portrait of the Artist as a Young Man—emphasizing Zittrain’s desire for the independence of his digital homeland, much as Joyce wrote about Ireland. But as a leading cyber-feminist, she is especially critical of what she characterizes as Zittrain’s call for “an elite circle of people with computer skills and free time who share his policy perspective” to rule his preferred future (which she calls the “Zittrainet”) as “Overlords of Good Faith.”

As Bartow characterizes Zittrain’s philosophy: “The technologies should be generative, but also monitored to ensure that generativity is not abused by either the government or by scoundrels; elite Internet users with, as one might say today, ‘mad programming skilz’ should be the supervisors of the Internet, scrutinizing new technological developments and establishing and modeling productive social norms online; and average, non–technically proficient Internet users should follow these norms, and should not demand security measures that unduly burden generativity.” In the end, she finds Zittrain’s book lacking in clear definitions of “generativity” and in specific proposals for “how to avoid a bad future for people whose interests may not be recognized or addressed by what is likely to be a very homogeneous group of elites”—a group composed primarily of male elites like Zittrain.
Like Bartow, Adam Thierer rejects Zittrain’s call for rule by a Platonic elite of philosopher/programmer kings in “The Case for Internet Optimism, Part 2: Saving the Net from Its Supporters.” Thierer connects the work of Larry Lessig, Jonathan Zittrain and Tim Wu as the dominant forces in cyberlaw, all united by an over-riding fear: “The wide-open Internet experience of the past decade is giving way to a new regime of corporate control, closed platforms, and walled gardens.” Thierer argues that they overstate the threats to openness and generativity: because “companies have strong incentives to strike the right openness/closedness balance,” things are getting more open all the time anyway—even though the Internet was never quite so open or generative as the “Openness Evangelicals” imagine. In the end, he concludes it is “significantly more likely that the [regulated] ‘openness’ they advocate will devolve into expanded government control of cyberspace and digital systems than that unregulated systems will, as the Openness Evangelicals fear, become subject to ‘perfect control’ by the private sector.” Thus, Thierer rejects what Virginia Postrel called, in her 1998 book The Future and Its Enemies, the “stasis mentality.”5 Instead, he embraces Postrel’s evolutionary dynamism: “the continuum [between openness and closedness] is constantly evolving and … this evolution is taking place at a much faster clip in this arena than it does in other markets.” Ultimately, he argues for the freedom to experiment—a recurring theme of this collection.
Is Internet Exceptionalism Dead?
Eric Goldman, professor at Santa Clara University School of Law, provides a three-part historical framework for understanding the Internet Exceptionalism debate. In the mid-1990s, Internet Utopianism reigned triumphant, exemplified in the 1996 “Declaration of the Independence of Cyberspace” by John Perry Barlow, lyricist for the Grateful Dead.6 Despite its radicalism, this First Wave of Internet Exceptionalism succeeded in getting Congress to add the only section of the Communications Decency Act that would survive when the Supreme Court struck down the rest of the Act on First Amendment grounds: Section 230, which “categorically immunizes online providers from liability for publishing most types of third party content” and thus “is clearly exceptionalist because it treats online providers more favorably than offline publishers—even when they publish identical content.” That law lies at the heart of the philosophical debate in this chapter and in Chapter 6: “Should Online Intermediaries Be Required to Police More?” The Second Wave (“Internet Paranoia”) led regulators to treat the Internet more harshly than analogous offline activity. The Third Wave (“Exceptionalism Proliferation”) proposed laws treating specific sites and services differently, especially social networks.
The Deadhead Barlow was dead wrong, declare—essentially—the Hon. Alex Kozinski, Chief Judge of the Ninth Circuit Court of Appeals, and Josh Goldfoot, a Department of Justice litigator—each writing only in a private capacity—in “A Declaration of the Dependence of Cyberspace.” While they agree that online anonymity and long-distance communications indeed make it harder for governments to punish law-breakers, governments are not helpless: “By placing pressure on [intermediaries like hosting companies, banks and credit card companies] to cut off service to customers who break the law, we can indirectly place pressure on Internet wrong-doers.” They illustrate their point with the examples of secondary liability for copyright infringement and Judge Kozinski’s Roommates.com decision. Indeed, they reject “the conceit that [cyberspace] exists at all” as a distinct, let alone exceptional, place, as well as arguments that the costs to Internet companies of handling traditional regulations are too high.

5 VIRGINIA POSTREL, THE FUTURE AND ITS ENEMIES (1998).

6 Declaration of John P. Barlow, Cognitive Dissident, Co-Founder, Elec. Frontier Found., A Declaration of the Independence of Cyberspace (Feb. 8, 1996), available at http://w2.eff.org/Censorship/Internet_censorship_bills/barlow_0296.declaration.
Columbia Law Professor Tim Wu concurs that governments can, and do, regulate the Internet because of what he and Jack Goldsmith called, in their 2006 book Who Controls the Internet?, the “persistence of physicality.” This is not necessarily something to be celebrated, he notes, pointing to China’s very innovativeness in finding ways to repress its citizens online—a subject addressed in this collection’s final chapter. Another of Thierer’s “Net-Loving Pessimists,” Wu professes Internet optimism but insists we must be “realistic about the role of government.”

Wu summarizes the lengthy account in his 2010 book The Master Switch of how government is both responsible for creating information monopolists and yet also the only force ultimately capable of dethroning them. For Wu, the Internet is not exceptional—not exempt from “The Cycle” of alternation between centralization/closedness and decentralization/openness. Yet Wu agrees the Internet is indeed an exception to the general trend of traditional media: “[t]echnologically, and in its effects on business, culture and politics.” Thus, he compares the Internet’s “ideology as expressed in its technology” with the American exceptionalism of Alexis de Tocqueville. Yet such exceptionalism, Wu warns, “cannot be assumed, but must be defended.” Wu closes with a very useful bibliography of leading works in this ongoing debate.
H. Brian Holland, Professor at Texas Wesleyan School of Law, responds with a full-bore defense of what he calls the “modified Internet Exceptionalism” encapsulated in Section 230—“modified” to be less audacious than Goldman’s First Wave (“the Internet is inherently unregulable”), but still bold in its insistence that granting broad immunity to online intermediaries for the conduct of their users is vital to the flourishing of “cyber-libertarian” Web 2.0 communities—such as wikis and social networks, capable of evolving their own norms and enforcement mechanisms for policing behavior. Holland provides a history of Section 230 and the debate over Internet exceptionalism that frames the discussion of intermediary deputization in Chapter 6. He explains how Larry Lessig’s conviction that private power leads to perfect control, as mentioned above, ultimately split the Internet Exceptionalist consensus against regulation of the 1990s into two camps. Both camps carried the banner of Internet freedom but reached opposite conclusions about whether the real threat comes from government or the private sector—most notably, regarding Net Neutrality. Despite this fracturing, Holland notes that the exceptional deregulation made possible by Section 230 has grown, not contracted, in its interpretation by the courts since 1996.
Similarly, Mark MacCarthy, Adjunct Professor in the Communication, Culture and Technology Program at Georgetown University, explains how “[t]he initial demand from Internet exceptionalists that the online world be left alone by governments has morphed into the idea that governments should create a global framework to protect and spur the growth of the Internet.” Once the exaggerated claims about the impossibility of regulating the Net made by First Wave Internet Exceptionalists proved false, the question became not whether “[i]ntermediaries can control illegal behavior on the Internet and governments can control intermediaries, but should they?”

Based on his first-hand experience at Visa (described in Chapter 6), MacCarthy seems willing to accept more intermediary deputization than Holland, but he insists that “[t]he establishment of these laws needs to follow all the rules of good policymaking, including imposing an obligation only when the social benefits exceed the social costs.” Furthermore, he warns that “a bordered Internet in which each country attempts to use global intermediaries to enforce its local laws will not scale. This is the fundamentally correct insight of the Internet exceptionalists.” Thus, MacCarthy concludes, “If governments are going to use intermediaries to enforce local laws, they are going to have to harmonize the local laws they want intermediaries to enforce.”
Has the Internet Fundamentally Changed Economics?
Google’s chief economist Hal Varian provides a coda to Information Rules: A Strategic Guide to the Network Economy, the 1998 book he wrote with Carl Shapiro of the University of California at Berkeley, which pioneered the exploration of the unique aspects of information economics and their implications for both business and policy. Here, Varian argues that the Internet’s most underappreciated impact on our economy lies in the obvious yet overlooked ubiquity of computers in our economic transactions, which facilitates four broad categories of “combinatorial innovation”: new forms of contract; data extraction and analysis; controlled experimentation; and personalization and customization. Varian celebrates the transformative potential of cloud computing technology to allow even tiny companies working internationally to launch innovative new applications and services that, in turn, “can serve as building blocks for new sorts of combinatorial innovation in business processes that will offer a huge boost to knowledge worker productivity in the future.”
Harvard Law Professor Yochai Benkler is best known for his book The Wealth of Networks—a clear allusion to Adam Smith’s 1776 classic The Wealth of Nations. 7 Those familiar only with that part of Smith’s work view him narrowly as an economist focused solely on what has traditionally been characterized as economic exchange. But Smith was in fact equal parts economist, moral philosopher, and jurisprudentialist—and so is Benkler. Benkler’s essay, “Decentralization, Freedom to Operate, and Human Sociality,” harkens back to Smith’s other key work, The Theory of Moral Sentiments (1759). For both Smith and Benkler, man’s natural sociability means that our distributed interactions tend to benefit society from the bottom up—as if by Smith’s “invisible hand.” For Benkler, the Internet is “a global network of communications and exchange that allows much greater flow and conversation, so that many new connections are possible on scales never before seen.” Like Varian, Benkler celebrates the potential for cloud computing to facilitate accelerating and unprecedented collaboration.
But the keys to Benkler’s future are sociality, voluntarism, widespread experimentation, and the freedom to experiment. That latter insistence makes him highly critical of intellectual property—copyright, patent, etc. Yet he does not address the dangers of propertizing personal data as another form of intellectual property. What does privacy-property mean for data-driven experimentation and the freedom to experiment? This question, unanswered here, offers perhaps the most tantalizing organizing theme for a future successor to this collection of essays.
Larry Downes closes this Chapter with an expanded version of the discussion of digital economics from his 2009 book The Laws of Disruption—a book in the same tradition as Varian and Shapiro’s Information Rules (1998), Postrel’s The Future and Its Enemies (1998), and Clayton Christensen’s The Innovator’s Dilemma (1997). Here, Downes proposes five principles of information economics that make the digital economy different: (1) Renewability: “information cannot be used up”; (2) Universality: “everyone has the ability to use the same information simultaneously”; (3) Magnetism: “information value grows exponentially as new users absorb it”; (4) Friction-free: “the more easily information flows, the more quickly its value increases”; and (5) Vulnerability: the value of information can be destroyed through misuse or even through its own success—information overload.
7 ADAM SMITH, AN INQUIRY INTO THE NATURE AND CAUSES OF THE WEALTH OF NATIONS 18-21 (Edwin Cannan, ed., Methuen & Co., Ltd. 1904) (1776), http://www.econlib.org/library/Smith/smWN.html.
For Downes, the Internet has changed economics in a second sense: by relentlessly and ruthlessly cutting transaction costs—e.g., the costs of search, information, bargaining, decision, policing and enforcement. Thus, the computer mediation Varian describes promises to dramatically flatten our economy: “As transaction costs in the open market approach zero, so does the size of the firm—if transaction costs are nonexistent, then there is no reason to have large companies”—what Downes calls “The Law of Diminishing Firms.”

Downes echoes Postrel’s critique of the stasis mentality: “the old rules do little more than hold back innovation for the benefit of those who cannot or do not know how to adapt to the economics of digital life.” Like Benkler, Downes particularly worries about copyright law’s ability to keep pace, but he also explores the implications of lower transaction costs for privacy, asking: “What happens when the cost of deleting information is higher than the cost of retaining it? The answer is that nothing gets deleted.” In Chapter 7, both Downes and Stewart Baker explore the costs and benefits of privacy regulation.
Finally, Eric Goldman offers another three-part conceptual framework—this time, for understanding how the Internet has revolutionized markets for reputational information. Goldman argues that “well-functioning marketplaces depend on the vibrant flow of accurate reputational information.” The Internet may allow markets to regulate themselves better: if reputational information that was previously “locked in consumers’ heads” can flow freely, it can “play an essential role in rewarding good producers and punishing poor ones.” Smith’s invisible hand alone is not enough, but “reputational information acts like an invisible hand guiding the invisible hand”—the “secondary invisible hand.” A “tertiary invisible hand” allows “the reputation system to earn consumer trust as a credible source… or to be drummed out of the market for lack of credibility….”

Goldman cautions against interventions that suppress reputational information, but also highlights the potential unintended consequences of interventions intended to make reputation markets work better—like anti-gaming rules and a right of reply. Like Holland, Goldman emphasizes the central importance of Section 230’s immunity in allowing reputation systems to flourish without being crushed by intermediary liability or policing obligations.
Who Will Govern the Net in 2020?

Each of the three authors in this Chapter wisely resists the temptation to make overly specific prophecies and instead considers the broad themes likely to shape the policy debate over the Internet’s future. New York Law School Professor David Johnson and Syracuse Information Studies Professor Milton Mueller focus on who should govern the Net in 2020—and could just as easily have responded to our question about Internet Exceptionalism—while Rob Atkinson, President of the Information Technology and Innovation Foundation, provides a “field guide” to the eight major camps in Internet policy.
Echoing Postrel’s dynamist/stasist theme, like Thierer, Mueller predicts that “[t]he future of Internet governance will be driven by the clash between its raw technical potential and the desire of various incumbent interests—most notably nation-states—to assert control over that potential.” He hopes the Internet will be governed by a “denationalized liberalism” based on “a universal right to receive and impart information regardless of frontiers, and sees freedom to communicate and exchange information as fundamental, primary elements of human choice and political and social activity.” This will require that the authority of national and subnational governments be contained to “domains of law and policy suited to localized or territorialized authority,” and that Internet governance institutions be completely detached from nation-state institutions. Defenders of free speech, he argues, will ultimately have to use global free trade institutions to strike down censorship.
Mueller finds strong grounds for optimism in the Internet’s empowering and democratizing nature, and in the rise of new access technologies like unlicensed wireless broadband capable of disrupting existing Internet access bottlenecks. But he worries about the growing technological capabilities of broadband providers to manage and potentially censor traffic on their networks, and admits that a darker future of strife, industrial consolidation, censorship and cyber-warfare is possible. Like Zittrain, Mueller fears a splintering of the Internet driven by conflicts over the Internet’s “Root Server,” and that such conflicts are bound to sharpen as the drive to secure the Internet against cyber-threats and cyber-warfare intensifies.
Like Wu, David Johnson reaches back to Tocqueville’s Democracy in America (1835). While Mueller proposes a new liberalism, Johnson proposes “Democracy in Cyberspace: Self-Governing Netizens and a New, Global Form of Civic Virtue, Online.” Paraphrasing Tocqueville, Johnson argues: “The Internet establishes a new equality of condition and enables us to exercise liberty to form associations to pursue new civic, social, and cultural goals.” Thus, the Internet is “inherently democratic”—in ways well beyond politics. But the Internet’s nature as an “engine of democratic civic virtue” must be defended daily by “netizens—the global polity of those who collaborate online, seek to use the new affordances of the Internet to improve the world, and care about protecting an Internet architecture that facilitates new forms of civic virtue.” Johnson argues against Wu’s apparent resignation to some degree of government meddling online: “A world in which every local sovereign seeks to control the activities of netizens beyond its borders violates the true meaning of self-governance and democratic sovereignty.” Johnson predicts that technology
will empower users to sidestep the traditional controls imposed by governments—not perfectly, but well enough. Thus, the Internet can fulfill the more modest ambitions of First Wave Internet Exceptionalists: by making the Internet exceptionally democratic and pluralistic.
Johnson’s approach resembles the Pragmatic Optimism staked out by Adam Thierer: “the trajectory of freedom and even civic virtue has been, in broad terms, over time, constantly upward—because everyone who gets a chance to experience an increased level of democratic self-government—a new ‘equality of condition.’” Like Varian, Benkler and Downes, Johnson sees the Internet’s facilitation of collaboration and communication as the keys to democratic empowerment.
As a think tank veteran, Rob Atkinson offers a “Taxonomy of Information Technology Policy and Politics,” describing eight camps and their positions on four key issues. The first camp is perhaps the strongest, yet also the hardest to define: the Internet Exceptionalists, the “Netizens” who “believe that they launched the Internet revolution,” prefer informal Internet governance, and generally oppose government intervention online—especially copyright enforcement. By contrast, Social Engineers distrust large corporations even more than government, leading them to advocate regulatory solutions. Though Atkinson doesn’t draw the connection, this camp might well be unified by Lessig’s concept of “code as law”—updated as “choice architecture” in the highly influential 2008 book Nudge: Improving Decisions about Health, Wealth, and Happiness by Cass Sunstein and Richard Thaler. Free Marketers are those who believe the “Internet empowers people, liberates entrepreneurs, and enables markets”—especially by reducing transaction costs. Atkinson’s proposed tent may be rather too large, potentially encompassing some who advocate regulations like net neutrality or antitrust intervention they believe are the key to freeing markets. The term cyber-libertarian seems both narrower and broader than Atkinson’s conception of “free marketers.” 8 Indeed, it was originally the term Atkinson used for the “Internet Exceptionalist” camp, focused primarily on cyber-libertinism and a fanatic rejection of copyright.
Moral Conservatives, on the other hand, “have no qualms about enlisting governments to regulate the Internet” to stamp out sin and sedition. Old Economy Regulators reject Internet exceptionalism absolutely and insist on continuing to regulate the Internet like all media in the “public interest.” Tech Companies & Trade Associations are united not by philosophical approach but by their ultimate duty to shareholders, while Bricks-and-Mortars
8 See Adam Thierer & Berin Szoka, Cyber-Libertarianism: The Case for Real Internet Freedom, THE TECHNOLOGY LIBERATION FRONT, Aug. 12, 2009, http://techliberation.com/2009/08/12/cyber-libertarianism-the-case-for-real-internet-freedom/.
companies, professional groups, and unions generally work to thwart the Internet’s disruption of their business models—exemplifying Virginia Postrel’s “stasis mindset.” Atkinson’s own camp is that of the Moderates, who want government to “do no harm” to information technology innovations, but also to “actively do good” by adopting policies to promote digital transformation of the economy.
Part II: Issues & Applications

Should Online Intermediaries Be Required to Police More?
Seton Hall Law Professor Frank Pasquale argues that the Internet allows intermediaries to shroud their operations in what might be called “perfect opaqueness”—to extend Larry Lessig’s feared model of “perfect control.” Pasquale uses the example of Google to illustrate the many ways in which online intermediaries choose to police the Internet, even when not required to do so by governments. Given the critical policing role played by intermediaries, Pasquale proposes an “Internet Intermediary Regulatory Council” to “help courts and agencies adjudicate controversies concerning intermediary practice” and assure adequate monitoring—a “prerequisite for assuring a level playing field online.” The IIRC “could include a search engine division, an ISP division focusing on carriers, and eventually divisions related to social networks or auction sites if their practices begin to raise commensurate concerns.”
While leaving open the possibility that the IIRC could be a private entity, Pasquale is unabashed in citing Robert Hale, theoretician of the New Deal’s regulatory frenzy: “Hale’s crucial insight was that many of the leading businesses of his day were not extraordinary innovators that ‘deserved’ all the profits they made; rather, their success was dependent on a network of laws and regulation that could easily shift favor from one corporate player to another.” But rather than repealing these laws and regulations to allow the “evolutionary dynamism” of competition to play out, as Adam Thierer proposes, Pasquale is willing to “rely on competition-promotion via markets and antitrust only to the extent that (a) the intermediary in question is an economic (as opposed to cultural or political) force; (b) the ‘voice’ of the intermediary’s user community is strong; and (c) competition is likely to be genuine and not contrived.” Otherwise, competition is inadequate. “The bottom line,” Pasquale concludes, “is that someone needs to be able to look under the hood” of culturally significant automated ranking systems. Thus, the Internet is not exceptional: Pasquale believes only careful regulatory oversight can protect us from shadowy corporations, just as in Franklin Delano Roosevelt’s telephone-and-radio era.
While Pasquale seems not to object to intermediaries acting as arms of the police state so long as they are properly transparent and regulated, Mark MacCarthy cautions against the practical problems raised by intermediary policing and offers an analytical model for deciding when intermediary deputization is appropriate. Based on his experience as Senior Vice President for Public Policy at Visa Inc., MacCarthy explores how payment systems have handled Internet gambling and copyright infringement as exemplary case studies in intermediary deputization because, unlike most online intermediaries, payment systems are subject neither to Section 230’s absolute immunity for third-party content or activities nor to the notice-and-take-down conditional immunity of the Digital Millennium Copyright Act.
MacCarthy finds cause for optimism about self-regulation: “regardless of the precise legal liabilities, intermediaries have a general responsibility to keep their systems free of illegal transactions and they are taking steps to satisfy that obligation.” But he insists intermediary liability should be imposed only where real market failures exist, where supported by “an analysis of costs, benefits and equities,” where spelled out clearly, and only to the extent local laws are harmonized internationally.
The most troubling form of intermediary deputization comes from uncertain secondary copyright liability, writes independent writer, lawyer and programmer Paul Szynol in an expanded version of an essay originally written for the Electronic Frontier Foundation. He challenges the anti-exceptionalist arguments made by Judge Kozinski and Josh Goldfoot, arguing that the failure to clearly define such liability chills innovation and investment in innovative start-ups—and that this problem is unique to the Internet, given the vastly larger scale of competition facilitated by digital markets.
Most intriguingly, Szynol argues that Kozinski and Goldfoot contradict their own argument against Internet Exceptionalism by insisting on a standard for secondary liability online that is not actually applied offline. Szynol asks, “should a car company be held liable for drivers who speed? After all, it would be easy enough to add a ‘speed limit compliance chip.’ Yet auto manufacturers are not forced to pay any portion of a speeding driver’s ticket. Offline, in other words, bad actors—the users of technology—are punished for their own transgressions. Online, however, the law chases the manufacturers—and applies ad-hoc, ambiguous standards [of secondary liability] to their products.” Thus, Szynol essentially insists that, for all their denunciation of First Wave Exceptionalists like John Perry Barlow, Kozinski and Goldfoot are actually Goldman’s “Second Wave” Internet Exceptionalists who want to impose more punitive regulations online than offline.
Is Search Now an “Essential Facility”?
Frank Pasquale brings his theory of intermediary regulation to full fruition with his sweeping call for “search neutrality.” Like Tim Wu in The Master Switch, Pasquale worries that antitrust law is incapable of protecting innovation and adequately addressing “the cultural and political concerns that dominant search engines raise.” Thus, he aims to “point the way toward a new concept of ‘essential cultural and political facility,’ which can help policymakers realize the situations where a bottleneck has become important enough that special scrutiny is warranted.” In particular, Pasquale sees taming search as inextricably intertwined with protecting privacy—“Engaging in a cost-benefit analysis [as in antitrust law] diminishes privacy’s status as a right”—and with Google’s potential chokehold on information through the Google Books Settlement.
The existence of competition in search, especially from Microsoft’s Bing, and the potential for competition from Facebook and other services yet to be invented, are essentially irrelevant to Pasquale, while the First Amendment’s protection of search engine operators is a complication to be addressed down the road. He concludes by insisting that regulation should be supplemented by a publicly funded alternative to the dominant private sector search engine—something like the European “Quaero” search engine, which the French government has heavily subsidized. Similarly, in Chapter 6, Pasquale proposes to model his Internet Intermediary Regulatory Council on the French Data Protection Authority. Thus, Pasquale’s over-arching vision seems to be that of a Digital New Deal—à la française.
Geoffrey Manne, Professor at Lewis & Clark Law and Executive Director of the International Center for Law & Economics, explains that search engines are not the bottlenecks Pasquale suggests—and thus why even the traditional essential facilities doctrine, which he says “has been relegated by most antitrust experts to the dustbin of history,” should not apply to them. In essence, he argues that “search neutrality” would protect only competitors, not consumers, because even a popular search engine like Google cannot foreclose advertisers’ access to consumers’ attention. Google, like any company, has no legal duty to help its rivals. More to the point, even if Google entirely dominated search, it could not block consumers’ access to its competitors. This, argues Manne, is the relevant market to analyze—quoting Supreme Court Justice Abe Fortas’s famous admonition about excessively narrow market definitions: “This Court now approves this strange red-haired, bearded, one-eyed man-with-a-limp classification.”
Like Manne, New York Law School Professor James Grimmelmann expresses “Skepticism about Search Neutrality” and highlights the significant practical problems it
would create. As the author of the definitive law review article, The Structure of Search Engine Law, 9 Grimmelmann is keenly aware of the concerns raised by search, yet he concludes that “the case for search neutrality is a muddle” because its “ends and means don’t match.” Echoing Johnson, Mueller, Holland, and Thierer’s view of the Internet as a liberating, democratizing force, Grimmelmann is clear that the lodestar of search is user autonomy: “If search did not exist, then for the sake of human freedom it would be necessary to invent it.” He deconstructs eight search neutrality principles—equality, objectivity, bias, traffic, relevance, self-interest, transparency and manipulation—and finds each lacking, but cautions that “it doesn’t follow that search engines deserve a free pass under antitrust, intellectual property, privacy, or other well-established bodies of law,” and that some other “form of search-specific legal oversight” might be appropriate.
Eric Goldman once again puts the debate in the context of its intellectual history. Always focused on questions of exceptionalism, Goldman concludes that search engines are neutral only in theory (“Search Engine Utopianism”) but must “make editorial judgments just like any other media company.” He explains that, while “search engine bias sounds scary, … such bias is both necessary and desirable”—and the remedy of “search neutrality” is probably worse than whatever adverse consequences come with search engine bias. Ultimately, he predicts that “emerging personalization technology will soon ameliorate many concerns about search engine bias.”
What Future for Privacy Online?

Michael Zimmer, Professor of Information Studies at the University of Wisconsin-Milwaukee, concedes that “the Internet has become a platform for the open flow of personal information—flows that are largely voluntarily provided by users.” Yet Zimmer discusses lingering reasons for concern about the Internet as a “potent infrastructure for the flow and capture of personal information.”
Zimmer explores the conflicts among privacy laws in the U.S., Europe, Canada and elsewhere, but concludes that “Companies are, on the whole, not moving around in order to avoid strict privacy regulations… instead, there has been a gradual increase in awareness and action on the issue of privacy.” Still, Zimmer worries that the “‘trading up’ to an increased level of protection of personal information flows on our transnational digital networks has not materialized as quickly or clearly as one might expect.” Zimmer’s answer is to demand a “renewed commitment to the rights of data subjects embodied in the Canadian and European Union approach to data protection.”
9 James Grimmelmann, The Structure of Search Engine Law, 93 IOWA L. REV. 1 (2007).
Zimmer writes from a perspective that views privacy as a “right.” This is, to put it mildly, not a perspective shared by the other two authors in this Chapter: Stewart Baker, a Partner at Steptoe & Johnson LLP and former Assistant Secretary for Policy at the Department of Homeland Security (DHS), and Larry Downes, who has expanded his essay from his 2009 book The Laws of Disruption. Baker spent his time at DHS battling privacy advocates over programs he felt were justified to protect Americans against terrorism—leading him to ask, “What’s Wrong with Privacy?” He traces the answer back to the 1890 law review article “The Right to Privacy,” by Samuel Warren and future Supreme Court Justice Louis Brandeis, which gave birth to modern privacy law. Baker rejects their “reactionary defense of the status quo” as that of Boston elites who didn’t much like the news media reporting on the details of their private parties. In essence, Baker finds in the privacy “movement” the same “stasis mentality” defined by Virginia Postrel.
Like Postrel, Baker argues for dynamism: “Each new privacy kerfuffle inspires strong feelings precisely because we are reacting against the effects of a new technology. Yet as time goes on, the new technology becomes commonplace. Our reaction dwindles away. The raw spot grows a callous. And once the initial reaction has passed, so does the sense that our privacy has been invaded. In short, we get used to it.”
Baker rejects the concept of “predicates” for government access to data (e.g., requiring “probable cause” for a warrant), the “Brandeisian notion that we should all ‘own’ our personal data,” and attempts to limit uses of information. Baker has little to say about the private sector’s use of data but proposes a system of auditing government employees to rigorously monitor their use of private information.
Larry Downes, too, rejects the concept of intellectual property in personal information—but is willing to concede that Warren and Brandeis “weren’t entirely wrong” in that “‘private’ information can also be used destructively.” He thus leaves open the possibility of narrow laws tailored to limiting specific, destructive uses of information—such as anti-discrimination laws. But Downes is highly skeptical about governmental enforcement of “privacy rights,” and ultimately echoes John Perry Barlow’s optimism about the potential for Netizens to solve their own problems: “Where there are real conflicts, where there are wrongs, we will identify them and address them by our means.” 10 Specifically, Downes argues that “the same technologies that create the privacy problem are also proving to be the source of its solution. Even without government intervention, consumers increasingly have the ability to organize, identify their common demands, and enforce their will on enterprises”—detailing examples of how reputational pressure can discipline corporate privacy
10 Barlow, supra note 6.
practices. Three cheers for the sound of Eric Goldman’s three invisible hands clapping, perhaps? Ultimately, Downes vests his greatest hope in the Internet’s potential to create new markets by lowering transaction costs—this time, a market for private data in which an explicit quid pro quo rewards consumers for sharing their personal data for beneficial, rather than destructive, uses.
Can Speech Be Policed in a Borderless World?
John Palfrey, Harvard Law Professor and co-director of Harvard’s influential Berkman Center for Internet & Society, speaks with unique authority on censorship as a co-author of the exhaustive surveys of global censorship conducted with Jonathan Zittrain and others at Berkman. These studies confirm Tim Wu’s conclusion that governments can and do censor speech effectively, contrary to the hopes of First Wave Internet Exceptionalists. Palfrey provides a beginner’s guide to the techniques used in, goals of, and practical problems created by content filtering. Most disturbingly, he notes the growing use of “soft controls” through governmental pressure and government-fostered social norms intended to squelch dissent.
Like Zittrain, Mueller and Johnson, Palfrey fears “we may be headed toward a localized version of the Internet, governed in each instance by local laws.” He thus demands a greater international debate about speech controls, one that forces states to discuss whether they “actually want their citizens to have full access to the Internet or not.” In particular, he echoes Mueller’s call for international free trade institutions to strike down censorship barriers to free speech.
Christopher Wolf, Partner at Hogan Hartson LLP, focuses not on speech that<br />
governments hate, but on “hate speech” we all—or nearly all—would find<br />
objectionable. Yet he notes how difficult it can be to distinguish these two<br />
categories of censorship. Furthermore, he concludes, after much crusading<br />
against hate speech, that “laws against hate speech have not demonstrably<br />
reduced hate speech or deterred haters.” Instead, he argues, “Hate speech<br />
can be ‘policed’ in a borderless world, but not principally by the traditional<br />
police of law enforcement. The Internet community must continue to serve as<br />
a ‘neighborhood watch’ against hate speech online, ‘saying something when it<br />
sees something,’ and working with online providers to enforce community<br />
standards.” Thus, like Johnson, Mueller and Barlow, Wolf looks to Netizens to<br />
combat hate speech.
26 25 YEARS AFTER .COM: TEN QUESTIONS<br />
Can the Net Liberate the World?<br />
The book closes by discussing the most tragic disappointment of the First Wave<br />
Internet Exceptionalists’ vision. Where John Perry Barlow insisted, defiantly,<br />
that governments, those “weary giants of flesh and steel… [did not] possess any<br />
methods of enforcement we have true reason to fear,” the reality is that<br />
oppressive governments continue to reign, sometimes even using the Internet<br />
to serve their agenda. Can the Net liberate the world—or will it, too, become<br />
another tool of “perfect control,” <strong>as</strong> Larry Lessig feared? Or will imperfect<br />
controls work well enough to allow tyrants to hang on to power?<br />
Evgeny Morozov is a leading commentator on foreign affairs, a visiting scholar<br />
at Stanford University and a Schwartz fellow at the New America Foundation.<br />
He praises the Internet’s ability to quickly disseminate information and allow<br />
dissidents to organize. Yet, having grown up in the Soviet Union, he is deeply<br />
skeptical that Web media can live up to the hype about democratization. He<br />
rejects two critical assumptions underlying this hype. First, he concludes that<br />
the legitimacy of undemocratic regimes derives less from “brainwashing” that<br />
can be cured by exposure to alternative views online and more from popular<br />
support for authoritarian regimes that promise to deliver economic growth or<br />
play effectively on other concerns, such as nationalism or religion. Second, he<br />
suggests the Internet can actually facilitate<br />
surveillance, fuel genuine support for existing regimes, allow government to<br />
subtly manipulate public opinion, or simply make authoritarianism more<br />
efficient.<br />
John Palfrey’s acid observation in the previous Chapter bolsters Morozov’s<br />
suggestion that much of the world may not actually want to be liberated: “In<br />
China and in parts of the former Soviet Union, very often the most fearsome<br />
enforcer of the state's will is the old woman on one's block, who may or may<br />
not be on the state's payroll.”<br />
Optimists like Johnson, Mueller, Thierer and Holland would likely differ with<br />
Morozov—and the U.S. State Department has tended in this direction, too. In<br />
January 2010, Secretary of State Hillary Clinton gave a bold speech embracing<br />
this optimism about the liberating potential of the Internet, and announcing a<br />
commitment to “supporting the development of new tools that enable citizens<br />
to exercise their rights of free expression by circumventing politically motivated<br />
censorship.” 11<br />
11 Hillary Rodham Clinton, Remarks on Internet Freedom, Jan. 21, 2010,<br />
http://www.state.gov/secretary/rm/2010/01/135519.htm
Internet entrepreneur Ethan Zuckerman is a senior researcher at the Berkman<br />
Center and founder of Geekcorps, a non-profit dedicated to building computer<br />
infrastructure in developing countries. He joined John Palfrey in the study of<br />
censorship circumvention tools mentioned above. 12 Despite his passionate<br />
commitment to promoting such tools, as Secretary Clinton proposed, he<br />
concludes that “We can’t circumvent our way around Internet censorship”<br />
because of the costs and practical challenges of attempting to circumvent<br />
censorship on a scale sufficient to make a real difference. Thus, he views<br />
circumvention as just one of many tools required to thwart “soft censorship,<br />
website blocking, and attacks on dissident sites.” But ultimately, what is most<br />
required is building the right “theory of change” to inform the multi-pronged<br />
strategy necessary for the Internet to achieve its democratizing potential.<br />
Conclusion: Discovering the Future<br />
of the Internet & Digital Policy<br />
In these thirty-one essays, our authors paint a complex picture of the future of<br />
the Internet and digital policy: Technological change inevitably creates new<br />
problems, even as it solves old ones. In the end, one’s perspective depends on<br />
whether one thinks the “net” effect of that change is positive or negative—and<br />
that, in turn, depends on how much, and in what ways, government intervenes<br />
online.<br />
Personally, this collection brings me back to where I started my study of<br />
Internet policy—reading John Perry Barlow’s “Declaration of the Independence<br />
of Cyberspace” in 1996, and Virginia Postrel’s The Future and Its Enemies in 1998.<br />
Despite its now obviously excessive utopian naïveté about the Internet’s<br />
crippling of the State, Barlow’s poetry still resonates deeply with many,<br />
including myself, as a powerful synthesis of Internet exceptionalism and cyber-libertarianism,<br />
a vision of progress as empowerment and uplifting of the user.<br />
Yet, like my former colleague Adam Thierer, I am guided most by Postrel’s<br />
evolutionary dynamism, with its emphasis not on a “carefully outlined<br />
future” or “build[ing] a single bridge from here to there, for neither here nor<br />
there is a single point,” but on the process of discovery by which the future<br />
evolves. 13 Like Postrel, I do not imagine that the disruption and transformation<br />
wrought by the Digital Revolution will always be rosy or easy. But we cannot—<br />
12 HAL ROBERTS, ETHAN ZUCKERMAN & JOHN PALFREY, 2007 CIRCUMVENTION LANDSCAPE<br />
REPORT: METHODS, USES, AND TOOLS (March 2009), http://dash.harvard.edu/<br />
bitstream/handle/1/2794933/2007_Circumvention_Landscape.pdf?sequence=2.<br />
13 Postrel, supra note 5 at 218.
as the legendary King Canute once tried with the English Channel—command<br />
the tides of technological change to halt.<br />
Thierer’s “Pragmatic Optimism” demands much more than a resignation to the<br />
inevitability of change. At its heart, it requires a cheery confidence in what<br />
David Johnson dubs the “Trajectory of Freedom”—“in broad terms, over time,<br />
constantly upward”—but also a commitment to the process by which that<br />
trajectory is discovered. This is progress—progress as freedom. 14 But progress<br />
also requires freedom, the freedom to discover, innovate and experiment, if<br />
technology is to achieve its full potential to improve the human condition and<br />
expand individual capacity to choose.<br />
I leave it to you, the reader, to choose—to discover—your own answers to the<br />
many questions of law, economics, philosophy and policy explored in this<br />
unique book.<br />
14 ROBERT NISBET, HISTORY OF THE IDEA OF PROGRESS 215 (1980).
CONTRIBUTORS<br />
Robert D. Atkinson 31<br />
Stewart Baker 31<br />
Ann Bartow 32<br />
Yochai Benkler 32<br />
Larry Downes 33<br />
Josh Goldfoot 34<br />
Eric Goldman 34<br />
James Grimmelmann 35<br />
H. Brian Holland 35<br />
David R. Johnson 36<br />
Andrew Keen 36<br />
Hon. Alex Kozinski 37<br />
Mark MacCarthy 37<br />
Geoffrey Manne 38<br />
Evgeny Morozov 39<br />
Milton Mueller 39<br />
John Palfrey 40<br />
Frank Pasquale 40<br />
Berin Szoka 41<br />
Paul Szynol 41<br />
Adam Thierer 42<br />
Hal Varian 42<br />
Christopher Wolf 43<br />
Tim Wu 44<br />
Michael Zimmer 44<br />
Jonathan Zittrain 45<br />
Ethan Zuckerman 46
Robert D. Atkinson<br />
Robert Atkinson is the founder and president of the Information Technology<br />
and Innovation Foundation, a Washington, D.C.-based technology policy think<br />
tank. He is also author of the State New Economy Index series and the book, The<br />
Past And Future Of America’s Economy: Long Waves Of Innovation That Power Cycles<br />
Of Growth (Edward Elgar, 2005). He has an extensive background in technology<br />
policy, has conducted ground-breaking research projects on technology and<br />
innovation, is a valued adviser to state and national policy makers, and is a<br />
popular speaker on innovation policy nationally and internationally.<br />
Before coming to ITIF, Dr. Atkinson was Vice President of the Progressive<br />
Policy Institute and Director of PPI’s Technology & New Economy Project.<br />
While at PPI he wrote numerous research reports on technology and innovation<br />
policy, including on issues such as broadband telecommunications, Internet<br />
telephony, universal service, e-commerce, e-government, middleman opposition<br />
to e-commerce, privacy, copyright, RFID and smart cards, the role of IT in<br />
homeland security, the R&D tax credit, offshoring, and growth economics.<br />
Previously Dr. Atkinson served as the first Executive Director of the Rhode<br />
Island Economic Policy Council, a public-private partnership including as<br />
members the Governor, legislative leaders, and corporate and labor leaders. As<br />
head of RIEPC, he was responsible for drafting a comprehensive economic<br />
strategic development plan for the state, developing a ten-point economic<br />
development plan, and working to successfully implement all ten proposals<br />
through the legislative and administrative branches. Prior to that he was Project<br />
Director at the former Congressional Office of Technology Assessment. While<br />
at OTA, he directed The Technological Reshaping of Metropolitan America, a seminal<br />
report examining the impact of the information technology revolution on<br />
America’s urban are<strong>as</strong>.<br />
Stewart Baker<br />
Stewart A. Baker is a partner in the Washington office of Steptoe & Johnson<br />
LLP. He returned to the firm following 3½ years at the Department of<br />
Homeland Security as its first Assistant Secretary for Policy.<br />
At Homeland Security, Mr. Baker created and staffed the 250-person DHS<br />
Policy Directorate. He was responsible for policy analysis across the<br />
Department, as well as for the Department’s international affairs, strategic<br />
planning and relationships with law enforcement and public advisory<br />
committees. While at DHS, Mr. Baker led successful negotiations with<br />
European and Middle Eastern governments over travel data, privacy, visa<br />
waiver and related issues. He devised a new approach to visa-free travel, forged<br />
a congressional and interagency consensus on the plan and negotiated<br />
acceptance with key governments.
Mr. Baker manages one of the nation’s premier technology and security<br />
practices at Steptoe. Mr. Baker’s practice covers national security, electronic<br />
surveillance, law enforcement, export controls, encryption, and related<br />
technology issues. He has been a key advisor on U.S. export controls and on<br />
foreign import controls on technology. He has also advised companies on the<br />
requirements imposed by CFIUS.<br />
Mr. Baker’s practice includes issues relating to government regulation of<br />
international trade in high-technology products, and advice and practice under<br />
the antidumping and countervailing duty laws of the United States, the European<br />
Community, Canada, and Australia. He also counsels clients on issues involving<br />
foreign sovereign immunity, and compliance with the Foreign Corrupt Practices<br />
Act.<br />
Mr. Baker has handled the arbitration of claims exceeding a billion dollars, is a<br />
member of national and international rosters of arbitrators, and is the author of<br />
articles and a book on the United Nations Commission on International Trade<br />
Law arbitration rules.<br />
Mr. Baker has had a number of significant successes in appellate litigation and<br />
appearances before the United States Supreme Court. He developed—and<br />
persuaded the Court to adopt—a new theory of constitutional federalism that<br />
remains the most vibrant 10th Amendment doctrine of the past 30 years.<br />
Ann Bartow<br />
Ann Bartow is a Professor of Law at the University of South Carolina School<br />
of Law in Columbia, South Carolina. She holds a Bachelor of Science from<br />
Cornell University and a Juris Doctor from the University of Pennsylvania Law<br />
School. She currently teaches an Intellectual Property Survey, Copyright Law,<br />
Trademarks and Unfair Competition Law, Patent Law and a seminar entitled<br />
Pornography, Prostitution, Sex Trafficking and the Law. Her scholarship<br />
focuses on the intersection between intellectual property laws and public policy<br />
concerns, privacy and technology law, and feminist legal theory. She also co-administers<br />
the Feminist Law Professors blog, is a regular blogger at<br />
Madisonian.net and a contributing editor at Jotwell.com.<br />
Yochai Benkler<br />
Yochai Benkler is the Berkman Professor of Entrepreneurial Legal Studies at<br />
Harvard, and faculty co-director of the Berkman Center for Internet and<br />
Society. Before joining the faculty at Harvard Law School, he was Joseph M.<br />
Field ‘55 Professor of Law at Yale. He writes about the Internet and the<br />
emergence of the networked economy and society, as well as the organization of<br />
infrastructure, such as wireless communications.<br />
In the 1990s he played a role in characterizing the centrality of information<br />
commons to innovation, information production, and freedom in both its<br />
autonomy and democracy senses. In the 2000s, he worked more on the sources<br />
and economic and political significance of radically decentralized individual<br />
action and collaboration in the production of information, knowledge and<br />
culture. His books include THE WEALTH OF NETWORKS: HOW SOCIAL<br />
PRODUCTION TRANSFORMS MARKETS AND FREEDOM (2006), which received<br />
the Don K. Price award from the American Political Science Association for<br />
best book on science, technology, and politics.<br />
In civil society, Benkler’s work was recognized by the Electronic Frontier<br />
Foundation’s Pioneer Award in 2007, and the Public Knowledge IP3 Award in<br />
2006. His articles include Overcoming Agoraphobia (1997/98, initiating the debate<br />
over spectrum commons); Commons as a Neglected Factor of Information Production<br />
(1998) and Free as the Air to Common Use (1998, characterizing the role of the<br />
commons in information production and its relation to freedom); From<br />
Consumers to Users (2000, characterizing the need to preserve commons as a core<br />
policy goal, across all layers of the information environment); Coase’s Penguin, or<br />
Linux and the Nature of the Firm (2002, characterizing peer production as a basic<br />
phenomenon of the networked economy) and Sharing Nicely (2004,<br />
characterizing shareable goods and explaining sharing of material resources<br />
online). His work can be freely accessed at benkler.org.<br />
Larry Downes<br />
Larry Downes is an Internet analyst and consultant, helping clients develop<br />
business strategies in an age of constant disruption caused by information<br />
technology.<br />
Downes is author of Unle<strong>as</strong>hing the Killer App: <strong>Digital</strong> Strategies for Market<br />
Dominance (Harvard Business School Press, 1998), which was named by the Wall<br />
Street Journal as one of the five most important books ever published on<br />
business and technology.<br />
His new book, The Laws of Disruption: Harnessing the New Forces that Govern Life and<br />
Business in the Digital Age (Basic Books, 2009) offers nine strategies for success in<br />
navigating the accident-prone intersection of innovation and the law.<br />
From 2006 to 2010, Downes was a nonresident Fellow at the Stanford Law School<br />
Center for Internet and Society. Before that, he held faculty positions at the<br />
University of California-Berkeley, Northwestern University School of Law, and<br />
the University of Chicago Graduate School of Business. Downes is a Partner<br />
with the Bell-Mason Group, which works with Global 1000 corporations,<br />
providing corporate venturing methodologies, tools, techniques and support<br />
that accelerate corporate innovation and venturing programs.
He has written for a variety of publications, including USA Today, Harvard<br />
Business Review, Inc., Wired, CNet, Strategy & Leadership, CIO, The<br />
American Scholar and the Harvard Journal of Law and Technology. He was a<br />
columnist for both The Industry Standard and CIO Insight. He blogs for the<br />
Technology Liberation Front.<br />
Josh Goldfoot<br />
Josh Goldfoot is Senior Counsel with the Computer Crime & Intellectual<br />
Property Section of the U.S. Department of Justice. He prosecutes hackers and<br />
other computer criminals, and advises investigators and other prosecutors about<br />
privacy statutes, the Fourth Amendment, and the implications of emerging<br />
technologies for law enforcement. In 2010, he was awarded the Assistant<br />
Attorney General’s Meritorious Service Award. He is an accomplished software<br />
developer and computer technician, and received a United States patent in 2008<br />
for shape recognition technology. He is a graduate of Yale University and<br />
earned his law degree from the University of Virginia School of Law. He has<br />
worked in technology law since 1999, when he advised Internet startups in<br />
Silicon Valley on intellectual property issues. Prior to joining the Department<br />
of Justice in 2005, he did appellate and civil litigation, and clerked for Judge Alex<br />
Kozinski on the Ninth Circuit U.S. Court of Appeals. He was a Special<br />
Assistant United States Attorney in the E<strong>as</strong>tern District of Virginia for six<br />
months in 2007 and 2008.<br />
Eric Goldman<br />
Eric Goldman is an Associate Professor of Law at Santa Clara University<br />
School of Law. He also directs the school’s High Tech Law Institute. Before<br />
joining the SCU faculty in 2006, he was an Assistant Professor at Marquette<br />
University Law School, General Counsel of Epinions.com, and an Internet<br />
transactional attorney at Cooley Godward LLP.<br />
Eric teaches Cyberlaw and Intellectual Property and previously has taught<br />
courses in Copyrights, Contracts, Software Licensing and Professional<br />
Responsibility.<br />
Eric’s research focuses on Internet law, intellectual property, marketing, and the<br />
legal and social implications of new communication technologies. Recent<br />
papers have addressed topics such as search engines and online marketing<br />
practices.<br />
Eric received his BA, summa cum laude and Phi Beta Kappa, in<br />
Economics/Business from UCLA in 1988. He received his JD from UCLA in<br />
1994, where he was a member of the UCLA Law Review, and concurrently<br />
received his MBA from the Anderson School at UCLA.
James Grimmelmann<br />
James Grimmelmann is Associate Professor at New York Law School and a<br />
member of its Institute for Information Law and Policy. He received his<br />
J.D. from Yale Law School, where he was Editor-in-Chief of LawMeme and a<br />
member of the Yale Law Journal. Prior to law school, he received an A.B. in<br />
computer science from Harvard College and worked as a programmer for<br />
Microsoft. He has served as a Resident Fellow of the Information Society<br />
Project at Yale, as a legal intern for Creative Commons and the Electronic<br />
Frontier Foundation, and as a law clerk to the Honorable Maryanne Trump<br />
Barry of the United States Court of Appeals for the Third Circuit.<br />
He studies how the law governing the creation and use of computer software<br />
affects individual freedom and the distribution of wealth and power in society.<br />
As both a lawyer and a technologist, he aims to help the two professions speak<br />
intelligibly to each other. He writes about intellectual property, virtual worlds, search<br />
engines, online privacy, and other topics in computer and Internet law. Recent<br />
publications include The Internet Is a Semicommons, 78 Fordham L. Rev. 2799<br />
(2010), Saving Facebook, 94 Iowa L. Rev. 1137 (2009), and The Ethical Visions of<br />
Copyright Law, 77 Fordham L. Rev. 2005 (2009).<br />
He has been blogging since 2000 at the Laboratorium: www.laboratorium.net.<br />
H. Brian Holland<br />
Professor H. Brian Holland joined the faculty of Texas Wesleyan School of<br />
Law in 2009. Prior to his arrival, Professor Holland was a Visiting Associate<br />
Professor at Penn State University’s Dickinson School of Law.<br />
After graduating from law school, Professor Holland spent two years as a<br />
judicial clerk in the U.S. Court of Appeals for the Second Circuit in New York.<br />
He then joined the Washington, D.C. office of Jones, Day, Reavis & Pogue.<br />
His work with the firm consisted primarily of appellate work before the U.S.<br />
Supreme Court and federal courts of appeals, as well as international arbitration<br />
before the World Bank. Among the significant cases litigated during this period<br />
were those raising issues of intellectual property and constitutional law (Eldred v.<br />
Reno/Ashcroft and Luck’s Music Library, Inc. v. Reno/Ashcroft), privacy and<br />
identity theft (TRW v. Andrews), and federal bankruptcy jurisdiction and venue.<br />
Professor Holland’s scholarship reflects his interest in technology, governance<br />
and social change, with a particular emphasis on issues of authority within the<br />
online environment and the development of social norms in mediated<br />
communities. He is currently writing on privacy in social networks. His most<br />
recent work, Social Distortion: Regulating Privacy in Social Networks, has been a<br />
featured presentation at privacy conferences both in the United States and<br />
Europe.
Professor Holland received an LL.M., with honors, from Columbia University<br />
School of Law, completing a self-designed program in technology law. He<br />
holds a J.D., summa cum laude, from American University’s Washington<br />
College of Law, and a B.A. from Tufts University. Professor Holland is<br />
currently pursuing his Ph.D. in Digital Media and Mass Communications at<br />
Penn State University. His dissertation, now in progress, applies social semiotic<br />
theories to the concept of fair use in intellectual property law.<br />
David R. Johnson<br />
David Johnson joined New York Law School’s faculty in spring 2004 as a<br />
visiting professor of law. He is a faculty member of the Institute for<br />
Information Law and Policy.<br />
Professor Johnson joined Wilmer, Cutler & Pickering in 1973 and became a<br />
partner in 1980. His practice focused primarily on the emerging area of<br />
electronic commerce, including counseling on issues relating to privacy, domain<br />
names and Internet governance issues, jurisdiction, copyright, taxation,<br />
electronic contracting, encryption, defamation, ISP and OSP liability, regulation,<br />
and other intellectual property matters.<br />
Professor Johnson helped to write the Electronic Communications Privacy Act,<br />
was involved in discussions leading to the Framework for Global Electronic<br />
Commerce, and has been active in the introduction of personal computers in<br />
law practice. He co-authored, with David Post, Law and Borders: The Rise of<br />
Law in Cyberspace, 48 Stanford Law Review 1367 (1996). He is currently<br />
developing online legal games and law practice simulations.<br />
Professor Johnson graduated from Yale College with a B.A. summa cum laude<br />
in 1967. He completed a year of postgraduate study at University College,<br />
Oxford in 1968, and earned a J.D. from Yale Law School in 1972. Following<br />
graduation from law school, Professor Johnson clerked for a year for the<br />
Honorable Malcolm R. Wilkey of the United States Court of Appeals for the<br />
District of Columbia.<br />
Andrew Keen<br />
Andrew Keen is the author of the international hit CULT OF THE AMATEUR:<br />
HOW THE INTERNET IS KILLING OUR CULTURE (2008). Acclaimed by THE<br />
NEW YORK TIMES’ Michiko Kakutani as having been written “with acuity and<br />
passion” and by A.N. Wilson in the DAILY MAIL as “staggering,” CULT OF<br />
THE AMATEUR has been published in fifteen different language editions and<br />
was short-listed for the 2008 Higham’s Business Technology Book of the Year<br />
award.
As a pioneering Internet entrepreneur, Andrew founded Audiocafe.com in 1995<br />
and built it into a popular first-generation Internet music company. He was the<br />
executive producer of the new media show “MB5 2000” and, between 2001 and<br />
2007, worked as a senior sales and marketing executive at several Silicon Valley-based<br />
technology start-ups including Pulse, Santa Cruz Networks and Pure<br />
Depth. He is currently the host of the popular “Keen On” show on<br />
Techcrunch.tv as well as the host of the video interview series “The Future of<br />
Creativity” on the Harvard Business Review website.<br />
Andrew was educated at London University, where he was awarded a First Class<br />
Honors Degree in Modern History; as a British Council Fellow at the University<br />
of Sarajevo; and at the University of California at Berkeley, where he earned a<br />
Master’s Degree in Political Science.<br />
He is currently writing a second book entitled DIGITAL VERTIGO: ANXIETY,<br />
LONELINESS AND INEQUALITY IN THE SOCIAL MEDIA AGE, which will be<br />
published by St. Martin’s Press.<br />
Hon. Alex Kozinski<br />
Chief Judge Alex Kozinski was appointed United States Circuit Judge for the<br />
Ninth Circuit on November 7, 1985. He graduated from UCLA, receiving an<br />
A.B. degree in 1972, and from UCLA Law School, receiving a J.D. degree in<br />
1975. Prior to his appointment to the appellate bench, Judge Kozinski served<br />
as Chief Judge of the United States Claims Court, 1982-85; Special Counsel,<br />
Merit Systems Protection Board, 1981-1982; Assistant Counsel, Office of<br />
Counsel to the President, 1981; Deputy Legal Counsel, Office of President-<br />
Elect Reagan, 1980-81; Attorney, Covington & Burling, 1979-81; Attorney,<br />
Forry Golbert Singer & Gelles, 1977-79; Law Clerk to Chief Justice Warren E.<br />
Burger, 1976-77; and Law Clerk to Circuit Judge Anthony M. Kennedy, 1975-<br />
76.<br />
Mark MacCarthy<br />
Mark MacCarthy teaches and conducts research at Georgetown University’s<br />
Communication, Culture, and Technology Program. He teaches courses on the<br />
development of the electronic media, technology policy and Internet freedom.<br />
He is also an adjunct member of the Department of Philosophy where he<br />
teaches courses in political philosophy and philosophy and privacy. He does<br />
research and consults in the areas of information privacy and security, Internet<br />
intermediary liability, global Internet freedom, the future of the online media,<br />
consumer financial protection, open standards, electronic and mobile commerce<br />
and other technology policy issues. He is an Affiliate of Georgetown<br />
University’s McDonough School of Business Center for Business and Public<br />
Policy, an investigator with the RFID Consortium for Security and Privacy, and<br />
the appointed expert of the American National Standards Institute on the
International Organization for Standardization (ISO) Technical Management<br />
Board (TMB) Privacy Steering Committee.<br />
From 2000 to 2008, he was Senior Vice President for Global Public Policy at<br />
Visa Inc., responsible for policy initiatives affecting electronic commerce, new<br />
technology and information security and privacy. He regularly represented Visa<br />
before the U.S. Congress, the U.S. Administration, the U.S. Federal Trade<br />
Commission, the U.S. federal financial regulators and multi-governmental<br />
groups such as the Organization for Economic Cooperation and Development<br />
and Asia Pacific Economic Cooperation group.<br />
Prior to joining Visa, he spent six years as a principal and senior director with<br />
the Wexler-Walker Group, a Washington public policy consulting firm, where<br />
he worked with a variety of clients on electronic commerce, financial services,<br />
privacy and telecommunications. He was Vice President in charge of Capital<br />
Cities/ABC’s Washington office from 1988 to 1994, representing the<br />
company’s interests before Congress, the Federal Communications Commission<br />
and other administrative agencies. From 1981 to 1988 he w<strong>as</strong> a professional<br />
staff member of the House Committee on Energy and Commerce, where he<br />
handled communications policy issues. From 1978 to 1981, he worked <strong>as</strong> an<br />
economist performing regulatory analyses of safety and health regulations at the<br />
U.S. Occupational Safety and Health Administration.<br />
Mr. MacCarthy has a Ph.D. in philosophy from Indiana University and an M.A. in economics from the University of Notre Dame.
Geoffrey Manne

Currently the Executive Director of the International Center for Law & Economics (ICLE), a global think tank, Professor Manne also serves as Lecturer in Law at Lewis & Clark Law School. In this capacity he lends his expertise to various law school endeavors and teaches the school’s Law and Economics course. The ICLE’s website is at www.laweconcenter.org.
Manne was an Assistant Professor of Law at Lewis & Clark from 2003 to 2008. From 2006 to 2008 he took a leave of absence from the school to direct a law and economics academic outreach program at Microsoft, and was Director, Global Public Policy at LECG, an economic consulting firm, until founding the ICLE at the end of 2009. Prior to joining the Lewis & Clark faculty, he practiced law at Latham & Watkins in Washington, D.C., where he specialized in antitrust litigation and counseling. Before private practice, Manne was a Bigelow Fellow at the University of Chicago Law School, an Olin Fellow at the University of Virginia School of Law, and a law clerk to Judge Morris S. Arnold of the U.S. Court of Appeals for the Eighth Circuit. While clerking, he taught a seminar on Law & Literature at the University of Arkansas at Little Rock.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET
During law school, Manne was a research assistant to Judge Richard Posner, Comment Editor of the University of Chicago Law School Roundtable, and a Staff Member of the University of Chicago Legal Forum. Among his other vocational pursuits was a brief stint at the U.S. Federal Trade Commission. His research has focused broadly on the economic implications of legal constraints on business organizations, particularly in the contexts of antitrust, nonprofit organizations, and international law. Manne is a member of the Virginia bar, as well as the Bar of the U.S. Bankruptcy Court for the Eastern District of Virginia. He is also a member of the American Law and Economics Association, the Canadian Law and Economics Association, and the International Society for New Institutional Economics.

He blogs for the Technology Liberation Front.
Evgeny Morozov

Evgeny Morozov is the author of THE NET DELUSION: THE DARK SIDE OF INTERNET FREEDOM (PublicAffairs, 2011). He is also a visiting scholar at Stanford University, a fellow at the New America Foundation, and a contributing editor to Foreign Policy magazine.
Milton Mueller

Milton Mueller teaches and does research on the political economy of communication and information. He uses the theoretical tools of property rights analysis and institutional economics, along with both historical and quantitative social science methods. He has a longstanding interest in the history of communication technologies and global governance institutions. Mueller received his Ph.D. from the University of Pennsylvania in 1989.
Mueller’s most recent research projects explore the problem of governing the Internet. His new book NETWORKS AND STATES: THE GLOBAL POLITICS OF INTERNET GOVERNANCE (MIT Press, 2010) provides a comprehensive overview of the political and economic drivers of a new global politics. His acclaimed book RULING THE ROOT: INTERNET GOVERNANCE AND THE TAMING OF CYBERSPACE (MIT Press, 2002) was the first scholarly account of the debates over the governance of the domain name system. His book UNIVERSAL SERVICE: COMPETITION, INTERCONNECTION AND MONOPOLY IN THE MAKING OF THE AMERICAN TELEPHONE SYSTEM (MIT Press, 1997) set out a dramatic revision of our understanding of the origins of universal telephone service and the role of interconnection in industry development. He is on the international editorial boards of the journals TELECOMMUNICATIONS POLICY, THE INFORMATION SOCIETY, and INFO: THE JOURNAL OF POLICY, REGULATION AND STRATEGY FOR TELECOMMUNICATION, INFORMATION AND MEDIA.
John Palfrey

John Palfrey is Henry N. Ess Professor of Law and Vice Dean for Library and Information Resources at Harvard Law School. He is the co-author of Born Digital: Understanding the First Generation of Digital Natives (Basic Books, 2008) and Access Denied: The Practice and Policy of Global Internet Filtering (MIT Press, 2008). His research and teaching are focused on Internet law, intellectual property, and international law. He practiced intellectual property and corporate law at the law firm of Ropes & Gray. He is a faculty co-director of the Berkman Center for Internet & Society at Harvard University. Outside of Harvard Law School, he is a Venture Executive at Highland Capital Partners and serves on the boards of several technology companies and non-profits. John served as a special assistant at the U.S. EPA during the Clinton Administration. He is a graduate of Harvard College, the University of Cambridge, and Harvard Law School.
Frank Pasquale

Frank Pasquale is a professor of law at Seton Hall Law School and a visiting fellow at Princeton University’s Center for Information Technology Policy. He has a J.D. from Yale, was a Marshall Scholar at Oxford, and graduated from Harvard summa cum laude. He has been a visiting professor at Yale and Cardozo Law Schools and has published widely on information law and policy. In 2010, he was twice invited by the National Academy of Sciences’ Committee on Law, Science, and Technology and its Government-University-Industry Roundtable to present on the privacy and security implications of data sensor networks. He was also invited by the Department of Health and Human Services’ Office of the National Coordinator for Health Information Technology to present at a roundtable organized to inform ONC’s Congressionally mandated report on privacy and security requirements for entities not covered by HIPAA (relating to Section 13424 of the HITECH Act). In 2008, he presented Internet Nondiscrimination Principles for Competition Policy Online before the Task Force on Competition Policy and Antitrust Laws of the House Committee on the Judiciary, appearing with the General Counsels of Google, Microsoft, and Yahoo. He is the Chair of the Privacy & Defamation Section of the American Association of Law Schools for 2010.
Pasquale has been quoted in the New York Times, San Francisco Chronicle, Los Angeles Times, Boston Globe, Financial Times, and many other publications. He has appeared on CNN to comment on Google’s China policy. He has been interviewed on Internet regulation on David Levine’s Hearsay Culture podcast, WNYC’s Brian Lehrer Show, the Canadian Broadcasting Corporation’s documentary “Engineering Search,” and National Public Radio’s Talk of the Nation. His recent publications include “Beyond Innovation and Competition,” “Network Accountability for the Domestic Intelligence Apparatus” (with Danielle Citron), “Restoring Transparency to Automated Authority,” and “Data and Power.” He is presently working on a book titled “The Black Box Society,” which examines and critiques the rise of secret technology in the Internet and finance sectors.
Berin Szoka

Berin Szoka is the founder of TechFreedom, a non-profit think tank dedicated to unleashing the progress of technology that improves the human condition and expands individual capacity to choose.
Previously, he was a Senior Fellow and the Director of the Center for Internet Freedom at The Progress & Freedom Foundation. Before joining PFF, he was an Associate in the Communications Practice Group at Latham & Watkins LLP. Before joining Latham, Szoka practiced at Lawler Metzger Milkman & Keeney, LLC in Washington and clerked for the Hon. H. Dale Cook, Senior U.S. District Judge for the Northern District of Oklahoma.
Szoka received his bachelor’s degree in economics from Duke University and his juris doctor from the University of Virginia School of Law, where he served as Submissions Editor of the Virginia Journal of Law and Technology. He is admitted to practice law in the District of Columbia and California (inactive). He serves on the Steering Committee of the D.C. Bar’s Computer & Telecommunications Law Section and on the FAA’s Commercial Space Transportation Advisory Committee (COMSTAC). Szoka has chaired, and currently serves on, the Board of Directors of the Space Frontier Foundation, a non-profit citizens’ advocacy group founded in 1988 and dedicated to advancing commercial opportunity and the expansion of human civilization in space.
Paul Szynol

Paul Szynol was born in Warsaw, Poland, and moved to the United States in 1984, the year that New York City’s transit fare rose from 75 cents to 90 cents; 33 previously unknown Bach pieces were found in an academic library; and Canon demoed its first digital still camera. He has lived in New York City, San Francisco, Los Angeles, New Haven, Philadelphia, New Jersey, and Warsaw, and, during his six drives across the U.S., visited the vast majority of the contiguous states. He graduated from Columbia University, where he studied history and philosophy, and Yale University, where he studied intellectual property law. He has also taken courses at the International Center of Photography. In the past, Paul played drums and was a computer programmer, and he still tinkers with Pearl drums and Java libraries. He likes dogs, documentary photography, music, San Francisco, Linux, and depressing movies. He is currently based in New York City.
Adam Thierer

Adam Thierer is a senior research fellow at the Mercatus Center at George Mason University, where he works with the Technology Policy Program. Thierer covers technology, media, Internet, and free speech policy issues, with a particular focus on online child safety and digital privacy.

Thierer has spent almost two decades in the public policy research community. He previously served as President of The Progress & Freedom Foundation, Director of Telecommunications Studies at the Cato Institute, a Senior Fellow in Economic Policy at The Heritage Foundation, and a researcher at the Adam Smith Institute in London.
Thierer is the author or editor of seven books on diverse topics such as media regulation and child safety, mass media regulation, Internet governance and jurisdiction, intellectual property, regulation of network industries, and the role of federalism within high-technology markets. He earned his B.A. in journalism and political science at Indiana University and received his M.A. in international business management and trade theory at the University of Maryland.
Thierer has served on several distinguished online safety task forces, including Harvard Law School’s Internet Safety Technical Task Force; a “Blue Ribbon Working Group” on child safety organized by Common Sense Media, the iKeepSafe Coalition, and the National Cable & Telecommunications Association; and the National Telecommunications and Information Administration’s “Online Safety and Technology Working Group.” He is also an advisor to the American Legislative Exchange Council’s Telecom & IT Task Force. In 2008, Thierer received the Family Online Safety Institute’s “Award for Outstanding Achievement.”
Hal Varian

Hal R. Varian is the Chief Economist at Google. He started in May 2002 as a consultant and has been involved in many aspects of the company, including auction design, econometric analysis, finance, corporate strategy, and public policy.

He is also an emeritus professor at the University of California, Berkeley, in three departments: business, economics, and information management.

He received his SB degree from MIT in 1969 and his MA in mathematics and Ph.D. in economics from UC Berkeley in 1973. He has also taught at MIT, Stanford, Oxford, Michigan, and other universities around the world.
Dr. Varian is a fellow of the Guggenheim Foundation, the Econometric Society, and the American Academy of Arts and Sciences. He was Co-Editor of the American Economic Review from 1987 to 1990 and holds honorary doctorates from the University of Oulu, Finland, and the University of Karlsruhe, Germany.

Professor Varian has published numerous papers in economic theory, industrial organization, financial economics, econometrics, and information economics. He is the author of two major economics textbooks, which have been translated into 22 languages. He is the co-author of a bestselling book on business strategy, INFORMATION RULES: A STRATEGIC GUIDE TO THE NETWORK ECONOMY, and wrote a monthly column for the NEW YORK TIMES from 2000 to 2007.
Christopher Wolf

Christopher Wolf is a director of Hogan Lovells’ Privacy and Information Management practice group. Chris is widely recognized as one of the leading American practitioners in the field of privacy and data security law. The prestigious Practising Law Institute (PLI) tapped Chris to serve as editor and lead author of its first-ever treatise on the subject, and to serve as co-editor of its guide to the FACTA Red Flags identity theft regulations. Chambers USA recently heralded Chris for his “lifelong experience as a litigator,” ranking him as one of the nation’s top privacy lawyers. He also was asked to form and co-chair The Future of Privacy Forum, a think tank that focuses on modern privacy issues from a business-practical, consumer-friendly perspective, collaborating with industry, government, academia, and privacy advocates. When MSNBC labeled Chris “a pioneer in Internet law,” it was reflecting on his participation in many of the precedent-setting matters that form the framework of modern privacy law.
Chris has deep experience in the entire range of international, federal, and state privacy and data security laws, as well as the many sectoral and geographic regulations. He also counsels clients on compliance with self-regulatory regimes.

Chris has appeared as a speaker for the International Association of Privacy Professionals and the Canadian Association of Chief Privacy Officers, and he appears annually at the PLI Institute on Privacy and Security Law. He has also spoken at colleges and universities including Harvard, Stanford, Berkeley, the University of Chicago, George Washington University, Georgetown University, and the Washington & Lee University School of Law. He is a frequent television guest on privacy and related issues, appearing on PBS, NBC, MSNBC, CNN, Fox News, and others.
Chris is a fourth-generation Washingtonian who started his career in Washington, D.C. as law clerk to The Honorable Aubrey E. Robinson, Jr., of the U.S. District Court for the District of Columbia. While in law school, he was a member of the Washington & Lee Law Review.
Tim Wu

Tim Wu is a professor at Columbia Law School, where he teaches copyright and communications.

He is the chair of the media reform organization Free Press and writes for Slate magazine on law, media, culture, travel, and dumplings. He has also written as a freelancer for other publications, including the NEW YORKER, the NEW YORK TIMES, WASHINGTON POST WEEKEND, and FORBES.

He is also involved in various other projects, usually related to alternative channels of content distribution, many of them run through the Columbia Program on Law & Technology: Project Posner, AltLaw, and Keep Your Copyrights, among others.
His first book was WHO CONTROLS THE INTERNET, with Jack Goldsmith. He is writing a new book on the long patterns of media centralization and decentralization; the publisher is Knopf/Random House.

His topics of study are network neutrality; the history and structure of the media and communications industries (the subject of the book he is currently working on); international problems faced by the Internet (see WHO CONTROLS THE INTERNET); and copyright and innovation policy (“Copyright’s Communications Policy”).

His brother is David Wu, author of the Xbox 360 game Full Auto, and his mother is Gillian Wu, a scientist. He is married to Kate Judge. His best friends are the Famous Five.
Michael Zimmer

Michael Zimmer, Ph.D., is an assistant professor in the School of Information Studies at the University of Wisconsin-Milwaukee, and an associate at the Center for Information Policy Research.

With a background in new media and Internet studies, the philosophy of technology, and information policy, Zimmer studies the social, political, and ethical dimensions of new media and information technologies. His research and teaching focus on:
• Ethics and Information Technology
• Information Policy
• Web Search Engines
• Web 2.0 and Library 2.0
• Privacy and Surveillance Theory
• Information and Web Literacy
• Access to Knowledge
• Internet Research Ethics
Zimmer received his Ph.D. in 2007 from the Department of Media, Culture, and Communication at New York University under the guidance of Profs. Helen Nissenbaum, Alex Galloway, and Siva Vaidhyanathan. He was a Student Fellow at the Information Law Institute at NYU Law from 2004 to 2007, and was the Microsoft Resident Fellow at the Information Society Project at Yale Law School for 2007-2008. Zimmer joined UW-Milwaukee’s School of Information Studies in 2008.

Zimmer earned a B.B.A. in Marketing from the University of Notre Dame in 1994 and worked for an electronic payment processing company in Milwaukee, Wisconsin for several years before moving to New York City to pursue a new career in academia. He earned an M.A. in Media Ecology from NYU in 2002, and his doctoral studies were supported by the Phyllis and Gerald LeBoff Doctoral Fellowship in Media Ecology from the Steinhardt School of Education at New York University. His dissertation research was supported by an NSF SES Dissertation Improvement Grant.
Zimmer has published in international journals and delivered talks across North America and Europe. He has been interviewed in The New York Times, on National Public Radio’s Morning Edition and Science Friday programs, The Huffington Post, MSNBC.com, GQ Magazine, The Montreal Gazette, The Boston Globe, MIT Technology Review, The Milwaukee Journal Sentinel, and various other national and local print and radio outlets.

Zimmer was also featured in the “Is My Cellphone Spying on Me?” commentary accompanying the two-disc special edition DVD of the action/thriller movie Eagle Eye.
Jonathan Zittrain

Jonathan Zittrain is Professor of Law at Harvard Law School and the Harvard Kennedy School of Government, co-founder of the Berkman Center for Internet & Society, and Professor of Computer Science in the Harvard School of Engineering and Applied Sciences. He is a member of the Board of Trustees of the Internet Society and is on the board of advisors for Scientific American. Previously, he was Professor of Internet Governance and Regulation at Oxford University.
His research interests include battles for control of digital property and content, cryptography, electronic privacy, the roles of intermediaries within Internet architecture, and the useful and unobtrusive deployment of technology in education.

He performed the first large-scale tests of Internet filtering in China and Saudi Arabia in 2002, and now, as part of the OpenNet Initiative, he has co-edited a study of Internet filtering by national governments, ACCESS DENIED: THE PRACTICE AND POLICY OF GLOBAL INTERNET FILTERING, and its sequel, ACCESS CONTROLLED: THE SHAPING OF POWER, RIGHTS, AND RULE IN CYBERSPACE.

His book THE FUTURE OF THE INTERNET—AND HOW TO STOP IT is available from Yale University Press and Penguin UK, and under a Creative Commons license. His papers may be found at www.jz.org.
Ethan Zuckerman

Ethan Zuckerman served as a fellow of the Berkman Center for Internet and Society at Harvard University from 2003 through 2009. Since 2009, he has been a senior researcher at the center, working on projects that focus on the impact of technology and media on the developing world and on the quantitative analysis of media. With Hal Roberts, he is working on comparative studies of tools for censorship circumvention, techniques for blocking-resistant publishing for human rights sites, and the Media Cloud framework for the quantitative study of digital media.
Ethan and Berkman fellow Rebecca MacKinnon founded Global Voices, a global citizen media network. Beginning at a Berkman conference in 2004, Global Voices has grown into an independent Netherlands-based nonprofit with over 200 employees and volunteers in over 100 countries. Global Voices maintains an international citizen media newsroom; tracks censorship and advocates for freedom of speech online; supports grassroots citizen media efforts; and is a pioneer in the space of social translation. Global Voices’ work has been supported by the MacArthur Foundation, the Ford Foundation, the Knight Foundation, Hivos, and the Open Society Institute, as well as Google, Reuters, and private donors. Ethan chairs Global Voices’ global board of directors.
In 2000, Ethan founded Geekcorps, a non-profit technology volunteer corps. Geekcorps pairs skilled volunteers from U.S. and European high-tech companies with businesses in emerging nations for one- to four-month volunteer tours. Volunteers have served in 14 nations, including Ghana, Senegal, Mali, Rwanda, Armenia, and Jordan, completing over a hundred projects. Geekcorps became a division of the International Executive Service Corps in 2001, where Ethan served as a vice president from 2001 to 2004.
PART I

THE BIG PICTURE & NEW FRAMEWORKS
CHAPTER 1

THE INTERNET’S IMPACT ON CULTURE & SOCIETY: GOOD OR BAD?

Why We Must Resist the Temptation of Web 2.0
Andrew Keen

The Case for Internet Optimism, Part 1: Saving the Net from Its Detractors
Adam Thierer
Why We Must Resist the Temptation of Web 2.0

By Andrew Keen *

The ancients were good at resisting seduction. Odysseus fought the seductive song of the Sirens by having his men tie him to the mast of his ship as it sailed past the Sirens’ Isle. Socrates was so intent on protecting citizens from the seductive opinions of artists and writers that he outlawed them from his imaginary republic.
We moderns are less nimble at resisting great seductions, particularly those utopian visions that promise grand political or cultural salvation. From the French and Russian revolutions to the counter-cultural upheavals of the ’60s and the digital revolution of the ’90s, we have been seduced, time after time and text after text, by the vision of a political or economic utopia.

Rather than Paris, Moscow, or Berkeley, the grand utopian movement of our contemporary age is headquartered in Silicon Valley, whose great seduction is actually a fusion of two historical movements: the counter-cultural utopianism of the ’60s and the techno-economic utopianism of the ’90s. Here in Silicon Valley, this seduction has announced itself to the world as the “Web 2.0” movement.
On one occasion, I was treated to lunch at a fashionable Japanese restaurant in Palo Alto by a serial Silicon Valley entrepreneur who, back in the dot.com boom, had invested in my start-up, Audiocafe.com. The entrepreneur, a Silicon Valley veteran like me, was pitching me his latest start-up: a technology platform that creates easy-to-use software tools for online communities to publish weblogs, digital movies, and music. It is technology that enables anyone with a computer to become an author, a film director, or a musician. This Web 2.0 dream is Socrates’s nightmare: technology that arms every citizen with the means to be an opinionated artist or writer.

“This is historic,” my friend promised me. “We are enabling Internet users to author their own content. Think of it as empowering citizen media. We can help smash the elitism of the Hollywood studios and the big record labels. Our technology platform will radically democratize culture, build authentic community, and create citizen media.” Welcome to Web 2.0.
* Andrew Keen is a veteran Silicon Valley entrepreneur and digital media critic. He blogs at TheGreatSeduction.com and has recently launched AfterTV, a podcast chat show about media, culture, and technology. He is the author of THE CULT OF THE AMATEUR: HOW TODAY’S INTERNET IS KILLING OUR CULTURE (Crown, 2007).
Buzzwords from the old dot.com era—like “cool,” “eyeballs,” or “burn-rate”—have been replaced in Web 2.0 by language which is simultaneously more militant and absurd: empowering citizen media, radically democratize, smash elitism, content redistribution, authentic community. This sociological jargon, once the preserve of the hippie counterculture, has now become the lexicon of new media capitalism.

Yet this entrepreneur owns a $4 million house a few blocks from Steve Jobs’s house. He vacations in the South Pacific. His children attend the most exclusive private academy on the peninsula. But for all of this, he sounds more like a cultural Marxist—a disciple of Antonio Gramsci or Herbert Marcuse—than a capitalist with an MBA from Stanford.

In his mind, “big media”—the Hollywood studios, the major record labels, and the international publishing houses—really did represent the enemy. The promised land was user-generated online content. In Marxist terms, the traditional media had become the exploitative “bourgeoisie,” and citizen media, those heroic bloggers and podcasters, were the “proletariat.”

This outlook is typical of the Web 2.0 movement, which fuses ’60s radicalism with the utopian eschatology of digital technology. The ideological outcome may be trouble for all of us.
So what, exactly, is the Web 2.0 movement? As an ideology, it is based upon a series of ethical assumptions about media, culture, and technology. It worships the creative amateur: the self-taught filmmaker, the dorm-room musician, the unpublished writer. It suggests that everyone—even the most poorly educated and inarticulate amongst us—can and should use digital media to express and realize themselves. Web 2.0 “empowers” our creativity, it “democratizes” media, it “levels the playing field” between experts and amateurs. The enemy of Web 2.0 is “elitist” traditional media.

Empowered by Web 2.0 technology, we can all become citizen journalists, citizen videographers, or citizen musicians. Empowered by this technology, we will be able to write in the morning, direct movies in the afternoon, and make music in the evening.
Sound familiar? It’s eerily similar to Marx’s seductive promise about individual self-realization in his German Ideology:

Whereas in communist society, where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes, society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, shepherd or critic. 1
Just as Marx seduced a generation of European idealists with his fantasy of self-realization in a communist utopia, so the Web 2.0 cult of creative self-realization has seduced everyone in Silicon Valley. The movement bridges counter-cultural radicals of the ’60s such as Steve Jobs with the contemporary geek culture of Google’s Larry Page. Between the book-ends of Jobs and Page lies the rest of Silicon Valley including radical communitarians like Craig Newmark (of Craigslist.com), intellectual property communists such as Stanford Law Professor Larry Lessig, economic cornucopians like Wired magazine editor Chris “Long Tail” Anderson, journalism professor Jeff Jarvis, and new media moguls Tim O’Reilly and John Battelle.
The ideology of the Web 2.0 movement was perfectly summarized at the Technology, Entertainment and Design (TED) conference in Monterey in 2005, when Kevin Kelly, Silicon Valley’s über-idealist and author of the Web 1.0 Internet utopia New Rules for the New Economy, said:

Imagine Mozart before the technology of the piano. Imagine Van Gogh before the technology of affordable oil paints. Imagine Hitchcock before the technology of film. We have a moral obligation to develop technology. 2
But where Kelly sees a moral obligation to develop technology, we should actually have—if we really care about Mozart, Van Gogh and Hitchcock—a moral obligation to question the development of technology.

The consequences of Web 2.0 are inherently dangerous for the vitality of culture and the arts. Its empowering promises play upon that legacy of the ’60s—the creeping narcissism that Christopher Lasch described so presciently, with its obsessive focus on the realization of the self. 3

Another word for narcissism is “personalization.” Web 2.0 technology personalizes culture so that it reflects ourselves rather than the world around us. Blogs personalize media content so that all we read are our own thoughts.
1 KARL MARX & FRIEDRICH ENGELS, THE GERMAN IDEOLOGY (1845), text available at Marxist Internet Archive, http://www.marxists.org/archive/marx/works/1845/german-ideology/ch01a.htm.

2 See Dan Frost, Meeting of Minds in Monterey, SAN FRANCISCO CHRONICLE, Feb. 27, 2005, http://articles.sfgate.com/2005-02-27/business/17361312_1_digital-world-edwardburtynsky-robert-fischell/2 (quoting Kevin Kelly).

3 See CHRISTOPHER LASCH, THE CULTURE OF NARCISSISM: AMERICAN LIFE IN AN AGE OF DIMINISHING EXPECTATIONS (1978).
54 CHAPTER 1: THE INTERNET’S IMPACT ON CULTURE & SOCIETY: GOOD OR BAD?
Online stores personalize our preferences, thus feeding back to us our own taste. Google personalizes searches so that all we see are advertisements for products and services we already use.

Instead of Mozart, Van Gogh, or Hitchcock, all we get with the Web 2.0 revolution is more of ourselves.
Still, the idea of inevitable technological progress has become so seductive that it has been transformed into “laws.” In Silicon Valley, the most quoted of these laws, Moore’s Law, states that the number of transistors on a chip doubles every two years, thus doubling the memory capacity of the personal computer every two years. On one level, of course, Moore’s Law is real and it has driven the Silicon Valley economy. But there is an unspoken ethical dimension to Moore’s Law. It presumes that each advance in technology is accompanied by an equivalent improvement in the condition of man.

But as Max Weber so convincingly demonstrated, the only really reliable law of history is the Law of Unintended Consequences.
We know what happened the first time around, in the dot.com boom of the ’90s. At first there was irrational exuberance. Then the dot.com bubble popped; some people lost a lot of money and a lot of people lost some money. But nothing really changed. Big media remained big media and almost everything else—with the exception of Amazon.com and eBay—withered away.
This time, however, the consequences of the digital media revolution are much more profound. Apple, Google and Craigslist really are revolutionizing our cultural habits, our ways of entertaining ourselves, our ways of defining who we are. Traditional “elitist” media is being destroyed by digital technologies. Newspapers are in free-fall. Network television, the modern equivalent of the dinosaur, is being shaken by TiVo’s overnight annihilation of the 30-second commercial and competition from Internet-delivered television and amateur video. The iPod is undermining the multibillion-dollar music industry. Meanwhile, digital piracy, enabled by Silicon Valley hardware and justified by intellectual property communists such as Larry Lessig, is draining revenue from established artists, movie studios, newspapers, record labels, and songwriters.
Is this a bad thing? The purpose of our media and culture industries—beyond the obvious need to make money and entertain people—is to discover, nurture, and reward elite talent. Our traditional mainstream media has done this with great success over the last century. Consider Alfred Hitchcock’s masterpiece Vertigo, and a couple of other brilliant works of the same name: the 1999 book by Anglo-German writer W.G. Sebald and the 2004 song by Irish rock star Bono. Hitchcock could never have made his expensive, complex movies outside the Hollywood studio system. Bono would never have become Bono without the music industry’s super-heavyweight marketing muscle. And
W.G. Sebald, the most obscure of this trinity of talent, would have remained an unknown university professor, had a high-end publishing house not had the good taste to discover and distribute his work. Elite artists and an elite media industry are symbiotic. If you democratize media, then you end up democratizing talent. The unintended consequence of all this democratization, to misquote Web 2.0 apologist Thomas Friedman, is cultural “flattening.” 4 No more Hitchcocks, Bonos, or Sebalds. Just the flat noise of opinion—Socrates’s nightmare.
While Socrates correctly gave warning about the dangers of a society infatuated by opinion in Plato’s Republic, more modern dystopian writers—Huxley, Bradbury, and Orwell—got the Web 2.0 future exactly wrong. Much has been made, for example, of the associations between the all-seeing, all-knowing qualities of Google’s search engine and Big Brother in Nineteen Eighty-Four. 5 But Orwell’s fear was the disappearance of the individual right to self-expression. Thus Winston Smith’s great act of rebellion in Nineteen Eighty-Four was his decision to pick up a rusty pen and express his own thoughts:

The thing that he was about to do was open a diary. This was not illegal, but if detected it was reasonably certain that it would be punished by death… Winston fitted a nib into the penholder and sucked it to get the grease off…. He dipped the pen into the ink and then faltered for just a second. A tremor had gone through his bowels. To mark the paper was the decisive act. 6
In the Web 2.0 world, however, the nightmare is not the scarcity, but the over-abundance of authors. Since everyone will use digital media to express themselves, the only decisive act will be to not mark the paper. Not writing as rebellion sounds bizarre—like a piece of fiction authored by Franz Kafka. But one of the unintended consequences of the Web 2.0 future may well be that everyone is an author, while there is no longer any audience.

Speaking of Kafka, on the back cover of the January 2006 issue of Poets & Writers magazine, there is a seductive Web 2.0-style advertisement which reads:

Kafka toiled in obscurity and died penniless. If only he’d had a website … .
4 See THOMAS FRIEDMAN, THE WORLD IS FLAT: A BRIEF HISTORY OF THE TWENTY-FIRST CENTURY (2005).

5 See GEORGE ORWELL, NINETEEN EIGHTY-FOUR (1949).

6 Id. at 6.
Presumably, if Kafka had had a website, it would be located at kafka.com—which is today an address owned by a mad left-wing blog called The Biscuit Report. The front page of this site quotes some words written by Kafka in his diary:

I have no memory for things I have learned, nor things I have read, nor things experienced or heard, neither for people nor events; I feel that I have experienced nothing, learned nothing, that I actually know less than the average schoolboy, and that what I do know is superficial, and that every second question is beyond me. I am incapable of thinking deliberately; my thoughts run into a wall. I can grasp the essence of things in isolation, but I am quite incapable of coherent, unbroken thinking. I can’t even tell a story properly; in fact, I can scarcely talk … 7
One of the unintended consequences of the Web 2.0 movement may well be that we fall, collectively, into the amnesia that Kafka describes. Without an elite mainstream media, we will lose our memory for things learnt, read, experienced, or heard. The cultural consequences of this are dire, requiring the authoritative voice of at least an Allan Bloom, 8 if not an Oswald Spengler. 9 But here in Silicon Valley, on the brink of the Web 2.0 epoch, there no longer are any Blooms or Spenglers. All we have is the great seduction of citizen media, democratized content and authentic online communities. And blogs, of course. Millions and millions of blogs.
7 See The Biscuit Report, http://web.archive.org/web/20080225015716/http://www.kafka.com/.

8 See ALLAN BLOOM, THE CLOSING OF THE AMERICAN MIND (1987).

9 See OSWALD SPENGLER, THE DECLINE OF THE WEST (1918).
The Case for Internet Optimism, Part 1: Saving the Net from Its Detractors

By Adam Thierer *

Introduction: Two Schools of Internet Pessimism
Surveying the prevailing mood surrounding cyberlaw and Internet policy circa 2010, one is struck by the overwhelming sense of pessimism regarding the long-term prospects for a better future. “Internet pessimism,” however, comes in two very distinct flavors:

1. Net Skeptics, Pessimistic about the Internet Improving the Lot of Mankind: The first variant of Internet pessimism is rooted in general skepticism about the supposed benefits of cyberspace, digital technologies, and information abundance. The proponents of this pessimistic view often wax nostalgic about some supposed “good ‘ol days” when life was much better (although they can’t seem to agree when those were). At a minimum, they want us to slow down and think twice about life in the Information Age and how it’s personally affecting each of us. Occasionally, however, this pessimism borders on neo-Ludditism, with some proponents recommending steps to curtail what they feel is the destructive impact of the Net or digital technologies on culture or the economy. Leading proponents of this variant of Internet pessimism include: Neil Postman (Technopoly: The Surrender of Culture to Technology), Andrew Keen (The Cult of the Amateur: How Today’s Internet is Killing our Culture), Lee Siegel (Against the Machine: Being Human in the Age of the Electronic Mob), Mark Helprin (Digital Barbarism) and, to a lesser degree, Jaron Lanier (You Are Not a Gadget) and Nicholas Carr (The Big Switch and The Shallows).
2. Net Lovers, Pessimistic about the Future of Openness: A different type of Internet pessimism is on display in the work of many leading cyberlaw scholars today. Noted academics such as Lawrence Lessig (Code and Other Laws of Cyberspace), Jonathan Zittrain (The Future of the Internet—And How to Stop It), and Tim Wu (The Master Switch: The Rise and Fall of Information Empires) embrace the Internet and digital technologies, but argue that they are “dying” due to a lack of sufficient care or collective oversight.

* Adam Thierer is a senior research fellow at the Mercatus Center at George Mason University where he works with the Technology Policy Program.
In particular, they fear that the “open” Internet and “generative” digital systems are giving way to closed, proprietary systems, typically run by villainous corporations out to erect walled gardens and quash our digital liberties. Thus, they are pessimistic about the long-term survival of the Internet that we currently know and love.

Despite their different concerns, two things unite these two schools of techno-pessimism. First, there is an elitist air to their pronouncements; a veritable “the rest of you just don’t get it” attitude pervades much of their work. In the case of the Net Skeptics, it’s the supposed decline of culture, tradition, and economy that the rest of us are supposedly blind to, but which they see perfectly—and know how to rectify. For the Net Lovers, by contrast, we see this attitude on display when they imply that a Digital Dark Age of Closed Systems is unfolding, since nefarious schemers in high-tech corporate America are out to suffocate Internet innovation and digital freedom more generally. The Net Lovers apparently see this plot unfolding, but paint the rest of us out to be robotic sheep being led to the cyber-slaughter: We are unwittingly using services (AOL in the old days; Facebook today) or devices (the iPhone and iPad) that play right into the hands of the very corporate schemers determined to trap us in high and tight walled gardens.
Unsurprisingly, this elitist attitude leads to the second belief uniting these two variants of Net pessimism: Someone or something must intervene to set us on a better course or protect those things that they regard as sacred. The critics either fancy themselves as the philosopher kings who can set things back on a better course, or imagine that such creatures exist in government today and can be tapped to save us from our impending digital doom—whatever it may be.
Dynamism vs. the Stasis Mentality

In both cases, these two schools of Internet pessimism have (a) over-stated the severity of the respective problems they’ve identified and (b) failed to appreciate the benefits of evolutionary dynamism. I borrow the term “dynamism” from Virginia Postrel, who contrasted the conflicting worldviews of dynamism and stasis so eloquently in her 1998 book, The Future and Its Enemies. Postrel argued that:

The future we face at the dawn of the twenty-first century is, like all futures left to themselves, “emergent, complex messiness.” Its “messiness” lies not in disorder, but in an order that is unpredictable, spontaneous, and ever shifting, a pattern created by millions of uncoordinated, independent decisions. 1

1 VIRGINIA POSTREL, THE FUTURE AND ITS ENEMIES, at xv (1998).
“[T]hese actions shape a future no one can see, a future that is dynamic and inherently unstable,” Postrel noted. 2 But that inherent instability, and the uncomfortable realization that the future is, by its very nature, unknowable, leads to exactly the sort of anxieties we see on display in the works of both varieties of Internet pessimists today. Postrel made the case for embracing dynamism as follows:

How we feel about the evolving future tells us who we are as individuals and as a civilization: Do we search for stasis—a regulated, engineered world? Or do we embrace dynamism—a world of constant creation, discovery, and competition? Do we value stability and control, or evolution and learning? Do we declare with [Tim] Appelo that “we’re scared of the future” and join [Judith] Adams in decrying technology as “a killing thing”? Or do we see technology as an expression of human creativity and the future as inviting? Do we think that progress requires a central blueprint, or do we see it as a decentralized, evolutionary process? Do we consider mistakes permanent disasters, or the correctable by-products of experimentation? Do we crave predictability, or relish surprise? These two poles, stasis and dynamism, increasingly define our political, intellectual, and cultural landscape. The central question of our time is what to do about the future. And that question creates a deep divide. 3
Indeed it does, and that divide is growing deeper as the two schools of Internet pessimism—unwittingly, of course—work together to concoct a lugubrious narrative of impending techno-apocalypse. It makes little difference whether the two schools disagree on the root cause(s) of all our problems; in the end, it’s their common call for a more “regulated, engineered world” that makes them both embrace the same stasis mindset. Again, the air of elitism rears its ugly head, as Postrel notes:

2 Id.

3 Id. at xiv.
Stasist social criticism… brings up the specifics of life only to sneer at or bash them. Critics assume that readers will share their attitudes and will see contemporary life as a problem demanding immediate action by the powerful and wise. This relentlessly hostile view of how we live, and how we may come to live, is distorted and dangerous. It overvalues the tastes of an articulate elite, compares the real world of trade-offs to fantasies of utopia, omits important details and connections,
and confuses temporary growing pains with permanent catastrophes. It demoralizes and devalues the creative minds on whom our future depends. And it encourages the coercive use of political power to wipe out choice, forbid experimentation, short-circuit feedback, and trammel progress. 4
In this essay, I focus on the first variant of Internet pessimism (the Net Skeptics) and discuss their clash with Internet optimists. I form this narrative using the words and themes developed in various books published by Net optimists and pessimists in recent years. I make the dynamist case for what I call “pragmatic optimism,” in that I argue that the Internet and digital technologies are reshaping our culture, economy and society—in most ways for the better (as the optimists argue), but not without some serious heartburn along the way (as the pessimists claim). My bottom line comes down to a simple cost-benefit calculus: Were we really better off in the scarcity era, when we were collectively suffering from information poverty? Generally speaking, I’ll take information abundance over information poverty any day! But we should not underestimate or belittle the disruptive impacts associated with the Information Revolution. We need to find ways to better cope with turbulent change in a dynamist fashion, instead of embracing the stasis notion that we can roll back the clock on progress or recapture “the good ‘ol days”—which actually weren’t all that good.
In another essay in this book, I address the second variant of Internet pessimism (the Net Lovers) and argue that reports of the Internet’s death have been greatly exaggerated. Although the Net Lovers will likely recoil at the suggestion that they are not dynamists, closer examination reveals their attitudes and recommendations to be deeply stasist. They fret about a cyber-future in which the Internet might not as closely resemble its opening epoch. Worse yet, many of them agree with what Lawrence Lessig said in his seminal—but highly pessimistic—1999 book, Code and Other Laws of Cyberspace, that “we have every reason to believe that cyberspace, left to itself, will not fulfill the promise of freedom. Left to itself, cyberspace will become a perfect tool of control.” 5

Lessig and his intellectual disciples—especially Zittrain and Wu—have continued to forecast a gloomy digital future unless something is done to address the Great Digital Closing we are supposedly experiencing. I will argue that, while many of us share their appreciation of the Internet’s current nature and its early history, their embrace of the stasis mentality is unfortunate, since it forecloses the spontaneous evolution of cyberspace and invites government intervention.

4 Id. at xvii-xviii.

5 LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE 5-6 (1999).
But first let us turn to the Net Skeptics, who don’t share such an appreciation of the potential benefits of cyberspace. Rather, their pessimism cuts deep and is rooted in overt hostility to all things digital.
The Familiar Cycle of Technological Revolutions

The impact of technological change on culture, learning, and morality has long been the subject of intense debate, and every technological revolution brings out a fresh crop of both pessimists and Pollyannas. Indeed, a familiar cycle has repeated itself throughout history whenever new modes of production (from mechanized agriculture to assembly-line production), means of transportation (water, rail, road, or air), energy production processes (steam, electric, nuclear), medical breakthroughs (vaccination, surgery, cloning), or communications techniques (telegraph, telephone, radio, television) have emerged.
The cycle goes something like this: A new technology appears. Those who fear the sweeping changes brought about by this technology see a sky that is about to fall. These “techno-pessimists” predict the death of the old order (which, ironically, is often a previous generation’s hotly-debated technology that others wanted slowed or stopped). Embracing this new technology, they fear, will result in the overthrow of traditions, beliefs, values, institutions, business models, and much else they hold sacred. As Dennis Baron, author of A Better Pencil, has noted, “the shock of the new often brings out critics eager to warn us away.” 6
The Pollyannas, by contrast, look out at the unfolding landscape and see mostly rainbows in the air. Theirs is a rose-colored world in which the technological revolution du jour improves the general lot of mankind. If something must give, then the old ways be damned! For such “techno-optimists,” progress means some norms and institutions must adapt—perhaps even disappear—for society to continue its march forward.
Our current Information Revolution is no different. It too has its share of techno-pessimists and techno-optimists who continue to debate the impact of technology on human existence. 7 Indeed, before most of us had even heard of

6 DENNIS BARON, A BETTER PENCIL 12 (2009).

7 William Powers, author of Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age, reminds us that:

whenever new devices have emerged, they’ve presented the kinds of challenges we face today—busyness, information overload, that sense of life being out of control. These challenges were as real two millennia ago as they are today, and throughout history, people have been grappling with them and looking for creative ways to manage life in the crowd.
the Internet, people were already fighting about it—or at least debating what the rise of the Information Age meant for our culture, society, and economy.
Web 1.0 Fight: Postman vs. Negroponte

In his 1992 anti-technology manifesto Technopoly: The Surrender of Culture to Technology, the late social critic Neil Postman greeted the unfolding Information Age with a combination of skepticism and scorn. 8 Indeed, Postman’s book was a near-perfect articulation of the techno-pessimist’s creed. “Information has become a form of garbage,” he claimed, “not only incapable of answering the most fundamental human questions but barely useful in providing coherent direction to the solution of even mundane problems.” 9 If left unchecked, Postman argued, America’s new technopoly—“the submission of all forms of cultural life to the sovereignty of technique and technology”—would destroy “the vital sources of our humanity” and lead to “a culture without a moral foundation” by undermining “certain mental processes and social relations that make human life worth living.” 10
Postman opened his polemic with the well-known allegorical tale found in Plato’s Phaedrus about the dangers of the written word. Postman reminded us how King Thamus responded to the god Theuth, who boasted that his invention of writing would improve the wisdom and memory of the masses relative to the oral tradition of learning. King Thamus shot back, “the discoverer of an art is not the best judge of the good or harm which will accrue to those who practice it.” King Thamus then passed judgment himself about the impact of writing on society, saying he feared that the people “will receive a quantity of information without proper instruction, and in consequence be thought very knowledgeable when they are for the most part quite ignorant.” And so Postman—fancying himself a modern Thamus—cast judgment on today’s comparable technological advances and those who would glorify them:
WILLIAM POWERS, HAMLET’S BLACKBERRY: A PRACTICAL PHILOSOPHY FOR BUILDING A GOOD LIFE IN THE DIGITAL AGE 5 (2010). Similarly, Baron notes that “from the first days of writing to the present, each time a new communication technology appeared, people had to learn all over again how to use it, how to respond to it, how to trust the documents it produced.” DENNIS BARON, A BETTER PENCIL 5 (2009).

8 NEIL POSTMAN, TECHNOPOLY: THE SURRENDER OF CULTURE TO TECHNOLOGY (1992).

9 Id. at 69-70.

10 Id. at 52, xii.
we are currently surrounded by throngs of zealous Theuths, one-eyed prophets who see only what new technologies can do and are incapable of imagining what they will undo. We might call such people Technophiles. They gaze on technology as a lover does on his beloved, seeing it as without blemish and entertaining no apprehension for the future. They are therefore dangerous and to be approached cautiously. … If one is to err, it is better to err on the side of Thamusian skepticism. 11
Nicholas Negroponte begged to differ. An unapologetic Theuthian technophile, the former director of the MIT Media Lab responded on behalf of the techno-optimists in 1995 with his prescient polemic, Being Digital. 12 It was a paean to the Information Age, for which he served as one of the first high prophets—with Wired magazine’s back page serving as his pulpit during the many years he served as a regular columnist.
Appropriately enough, the epilogue of Negroponte’s Being Digital was entitled “An Age of Optimism” and, like the rest of the book, it stood in stark contrast to Postman’s pessimistic worldview. Although Negroponte conceded that technology indeed had a “dark side” in that it could destroy much of the old order, he believed that destruction was both inevitable and not cause for much concern. “Like a force of nature, the digital age cannot be denied or stopped,” he insisted, and we must learn to appreciate the ways “digital technology can be a natural force drawing people into greater world harmony.” 13 (This sort of techno-determinism is a theme found in many of the Internet optimist works that followed Negroponte.)
To Postman’s persistent claim that America’s technopoly lacked a moral compass, Negroponte again conceded the point but took the glass-is-half-full view: “Computers are not moral; they cannot resolve complex issues like the rights to life and to death. But being digital, nevertheless, does give much cause for optimism.” 14 His defense of the digital age rested on the “four very powerful qualities that will result in its ultimate triumph: decentralizing, globalizing, harmonizing, and empowering.” 15 Gazing into his techno-crystal ball in 1995, Negroponte forecast the ways in which those qualities would revolutionize society:
11 Id. at 5.
12 NICHOLAS NEGROPONTE, BEING DIGITAL (1995).
13 Id. at 229, 230.
14 Id. at 228-29.
15 Id. at 229.
CHAPTER 1: THE INTERNET’S IMPACT ON CULTURE & SOCIETY: GOOD OR BAD?
The access, the mobility, and the ability to effect change are what will make the future so different from the present. The information superhighway may be mostly hype today, but it is an understatement about tomorrow. It will exist beyond people’s wildest predictions. As children appropriate a global information resource, and as they discover that only adults need learner’s permits, we are bound to find new hope and dignity in places where very little existed before. 16
In many ways, that’s the world we occupy today: one of unprecedented media abundance and unlimited communications and connectivity opportunities.

But the great debate about the impact of digitization and information abundance did not end with Postman and Negroponte. Theirs was but Act I in a drama that continues to unfold and grows more heated and complex with each new character on the stage. “This conflict between stability and progress, security and prosperity, dynamism and stasis, has led to the creation of a major political fault line in American politics,” argues Robert D. Atkinson: “On one side are those who welcome the future and look at the New Economy as largely positive. On the other are those who resist change and see only the risks of new technologies and the New Economy.” 17 Atkinson expands on this theme in another essay in this collection. 18
Web War II

The disciples of Postman and Negroponte are a colorful, diverse lot. The players in Act II of this drama occupy many professions: journalists, technologists, business consultants, sociologists, economists, lawyers, etc. The two camps disagree with each other even more vehemently and vociferously about the impact of the Internet and digital technologies than Postman and Negroponte did.

In Exhibit 1, I have listed the Internet optimists and pessimists alongside their key works. This very binary treatment obviously cannot do justice to the varying shades of optimism or pessimism in each, but it is nonetheless helpful.
16 Id. at 231.
17 ROBERT D. ATKINSON, THE PAST AND FUTURE OF AMERICA’S ECONOMY 201 (2004). “As a result,” he says, “a political divide is emerging between preservationists who want to hold onto the past and modernizers who recognize that new times require new means.”
18 Robert D. Atkinson, Who’s Who in Internet Politics: A Taxonomy of Information Technology Policy & Politics, infra at 162.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET
Exhibit 1

Theuthian Technophiles (“The Internet Optimists”)

Nicholas Negroponte, Being Digital (1995)
Kevin Kelly, Out of Control: The New Biology of Machines, Social Systems, and the Economic World (1995)
Virginia Postrel, The Future and Its Enemies (1998)
James Surowiecki, The Wisdom of Crowds (2004)
Chris Anderson, The Long Tail: Why the Future of Business is Selling Less of More (2006)
Steven Johnson, Everything Bad is Good For You (2006)
Glenn Reynolds, An Army of Davids: How Markets and Technology Empower Ordinary People to Beat Big Media, Big Government, and Other Goliaths (2006)
Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (2006)
Clay Shirky, Here Comes Everybody: The Power of Organizing without Organizations (2008)
Don Tapscott & Anthony D. Williams, Wikinomics: How Mass Collaboration Changes Everything (2008)
Jeff Howe, Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business (2008)
Tyler Cowen, Create Your Own Economy: The Path to Prosperity in a Disordered World (2009)
Dennis Baron, A Better Pencil: Readers, Writers, and the Digital Revolution (2009)
Jeff Jarvis, What Would Google Do? (2009)
Clay Shirky, Cognitive Surplus: Creativity and Generosity in a Connected Age (2010)
Nick Bilton, I Live in the Future & Here’s How It Works (2010)
Kevin Kelly, What Technology Wants (2010)

Thamusian Technophobes (“The Internet Pessimists”)

Neil Postman, Technopoly: The Surrender of Culture to Technology (1993)
Sven Birkerts, The Gutenberg Elegies: The Fate of Reading in an Electronic Age (1994)
Clifford Stoll, High-Tech Heretic: Reflections of a Computer Contrarian (1999)
Cass Sunstein, Republic.com (2001)
Todd Gitlin, Media Unlimited: How the Torment of Images and Sounds Overwhelms Our Lives (2002)
Todd Oppenheimer, The Flickering Mind: Saving Education from the False Promise of Technology (2003)
Andrew Keen, The Cult of the Amateur: How Today’s Internet is Killing our Culture (2007)
Steve Talbott, Devices of the Soul: Battling for Our Selves in an Age of Machines (2007)
Nick Carr, The Big Switch: Rewiring the World, from Edison to Google (2008)
Lee Siegel, Against the Machine: Being Human in the Age of the Electronic Mob (2008)
Mark Bauerlein, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (2008)
Mark Helprin, Digital Barbarism: A Writer’s Manifesto (2009)
Maggie Jackson, Distracted: The Erosion of Attention and the Coming Dark Age (2009)
John Freeman, The Tyranny of E-Mail: The Four-Thousand-Year Journey to Your Inbox (2009)
Jaron Lanier, You Are Not a Gadget (2010)
Nick Carr, The Shallows: What the Internet Is Doing to Our Brains (2010)
William Powers, Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age (2010)
In Exhibit 2, I have sketched out the major lines of disagreement between these two camps and divided those disagreements into (1) Cultural / Social beliefs vs. (2) Economic / Business beliefs.
Exhibit 2

Cultural / Social beliefs

Optimists | Pessimists
Net is participatory | Net is polarizing
Net facilitates personalization (welcome of the “Daily Me” that digital tech allows) | Net facilitates fragmentation (fear of the “Daily Me”)
“a global village” | balkanization and fears of “mob rule”
heterogeneity / encourages diversity of thought and expression | homogeneity / Net leads to close-mindedness
allows self-actualization | diminishes personhood
Net a tool of liberation & empowerment | Net a tool of frequent misuse & abuse
Net can help educate the masses | dumbs down the masses
anonymous communication encourages vibrant debate + whistleblowing (a net good) | anonymity debases culture & leads to lack of accountability
welcome information abundance; believe it will create new opportunities for learning | concern about information overload; esp. impact on learning & reading

Economic / Business beliefs

Optimists | Pessimists
benefits of “Free” (increasing importance of “gift economy”) | costs of “Free” (“free” = threat to quality & business models)
mass collaboration is generally more important | individual effort is generally more important
embrace of “amateur” creativity | superiority of “professionalism”
stress importance of “open systems” of production | stress importance of “proprietary” models of production
“wiki” model = wisdom of crowds; benefits of crowdsourcing | “wiki” model = stupidity of crowds; collective intelligence is an oxymoron; + “sharecropper” concern about exploitation of free labor
When you boil it all down, there are two major points of contention between the Internet optimists and pessimists:

1. The impact of technology on learning & culture and the role of experts vs. amateurs in that process.

2. The promise—or perils—of personalization, for both individuals and society.

Each dispute is discussed in more detail below.
Differences Over Learning, Culture & “Truth”

As with Theuth and Thamus, today’s optimists and skeptics differ about who is the best judge of what constitutes progress, authority, and “truth” and how technological change will impact these things.
The Pessimists’ Critique

Consider the heated debates over the role of “amateur” creations, user-generated content, and peer-based forms of production. Pessimists tend to fear the impact of the Net and the rise of what Andrew Keen has called “the cult of the amateur.” 19 They worry that “professional” media or more enlightened voices and viewpoints might be drowned out by a cacophony of competing—but less compelling or enlightened—voices and viewpoints. Without “enforceable scarcity” and protection for the “enlightened class,” the pessimists wonder how “high quality” news or “high art” will be funded and disseminated. Some, like Keen, even suggest the need to “re-create media scarcity” to save culture or professional content creators. 20
Some of these pessimists clearly think in zero-sum terms: more “amateur” production seems to mean less “professional” content creation will be possible. For example, Lee Siegel, author of Against the Machine: Being Human in the Age of the Electronic Mob, says that by empowering the masses to have more of a voice, “unbiased, rational, intelligent, and comprehensive news … will become less
19 ANDREW KEEN, THE CULT OF THE AMATEUR: HOW TODAY’S INTERNET IS KILLING OUR CULTURE (2007).
20 Andrew Keen, Art & Commerce: Death by YouTube, ADWEEK, Oct. 15, 2007, http://web.archive.org/web/20080107024552/http:/www.adweek.com/aw/magazine/article_display.jsp?vnu_content_id=1003658204. For a response, see Adam Thierer, Thoughts on Andrew Keen, Part 2: The Dangers of the Stasis Mentality, TECHNOLOGY LIBERATION FRONT, Oct. 18, 2007, http://techliberation.com/2007/10/18/thoughts-on-andrew-keen-part-2-the-dangers-of-the-stasis-mentality.
and less available.” 21 “[G]iving everyone a voice,” he argues, “can also be a way to keep the most creative, intelligent, and original voices from being heard.” 22

The centrality of Wikipedia, the collaborative online encyclopedia, to this discussion serves as a microcosm of the broader debate between the optimists and the pessimists. Almost every major optimist and pessimist tract includes a discussion of Wikipedia; it generally serves as a hero in the works of the former and a villain in those of the latter. For the pessimists, Wikipedia marks the decline of authority, the death of objectivity, and the rise of “mobocracy” since it allows “anyone with opposable thumbs and a fifth-grade education [to] publish anything on any topic.” 23 They fear that “truth” becomes more relativistic under models of peer collaboration or crowd-sourced initiatives. 24
The pessimists also have very little good to say about YouTube, blogs, social networks, and almost all user-generated content, treating them with a combination of confusion and contempt. “[S]elf-expression is not the same thing as imagination,” or art, Siegel argues. 25 Instead, he regards the explosion of online expression as the “narcissistic” bloviation of the masses and argues it is destroying true culture and knowledge. Echoing Postman’s assertion that “information has become a form of garbage,” Siegel says that “[u]nder the influence of the Internet, knowledge is withering away into information.” 26 Our new age of information abundance is not worth celebrating, he says, because “information is powerlessness.” 27
Some pessimists argue that all the new information and media choices are largely false choices that don’t benefit society. For example, Siegel disputes what he regards as overly romanticized notions of “online participation” and “personal democracy.” Keen goes further, referring to them as “the great seduction.” He says “the Web 2.0 revolution has peddled the promise of
21 LEE SIEGEL, AGAINST THE MACHINE: BEING HUMAN IN THE AGE OF THE ELECTRONIC MOB 165 (2008). For a review of the book, see Adam Thierer, Book Review: Lee Siegel’s Against the Machine, TECHNOLOGY LIBERATION FRONT, Oct. 20, 2008, http://techliberation.com/2008/10/20/book-review-lee-siegel%E2%80%99s-against-the-machine.
22 Id. at 5.
23 Keen, supra note 19, at 4.
24 “Wikipedia, with its video-game like mode of participation, and with its mountains of trivial factoids, of shifting mounds of gossip, of inane personal details, is knowledge in the process of becoming information.” Siegel, supra note 21, at 152.
25 Id. at 52.
26 Id. at 152.
27 Id. at 148.
bringing more truth to more people … but this is all a smokescreen.” 28 “What the Web 2.0 revolution is really delivering,” he argues, “is superficial observations of the world around us rather than deep analysis, shrill opinion rather than considered judgment.” 29

Occasionally, the pessimists resort to fairly immature name-calling while critiquing Information Age culture. “It would be one thing if such a [digital] revolution produced Mozarts, Einsteins, or Raphaels,” says novelist Mark Helprin, “but it doesn’t... It produces mouth-breathing morons in backward baseball caps and pants that fall down; Slurpee-sucking geeks who seldom see daylight; pretentious and earnest hipsters who want you to wear bamboo socks so the world won’t end … beer-drinking dufuses who pay to watch noisy cars driving around in a circle for eight hours at a stretch.” 30
Some pessimists also claim that proliferating new media choices are merely force-fed commercial propaganda or that digital technologies are spawning needless consumerism. “New technologies unquestionably make purchases easier and more convenient for consumers. To this extent, they do help,” says the prolific University of Chicago law professor Cass Sunstein. “But they help far less than we usually think, because they accelerate the consumption treadmill without making life much better for consumers of most goods.” 31
In Siegel’s opinion, everyone is just in it for the money. “Web 2.0 is the brainchild of businessmen,” and the “producer public” is really just a “totalized ‘consumerist’ society.” 32 Countless unpaid bloggers—in it for the love of the conversation and debate—are, Siegel argues, merely brainwashed sheep who don’t realize the harm they are doing: “[T]he bloggers are playing into the hands of political and financial forces that want nothing more than to see the critical, scrutinizing media disappear.” 33 He reserves special scorn for Net evangelists who believe that something truly exciting is happening with the new online conversation. According to Siegel, they are simply “in a mad rush to earn profits or push a fervent idealism.” 34
The pessimists also fear that these new technologies and trends could have profound ramifications not just for entertainment culture, but also for the
28 Keen, supra note 19, at 16.
29 Id.
30 MARK HELPRIN, DIGITAL BARBARISM: A WRITER’S MANIFESTO 57 (2009).
31 CASS SUNSTEIN, REPUBLIC.COM 121 (2010).
32 Siegel, supra note 21, at 128.
33 Id. at 141.
34 Id. at 25-26.
future of news and professional journalism. They worry about the loss of trusted intermediaries and traditional authorities. For example, Keen fears that Wikipedia “is almost single-handedly killing the traditional information business.” 35 They also argue that “free culture” isn’t free at all; it’s often just parasitic copying or blatant piracy.

Similarly, Nick Carr and Jaron Lanier worry about the rise of “digital sharecropping,” where a small group of elites makes money off the back of free labor. To Carr, many new Web 2.0 sites and services “are essentially agglomerations of the creative, unpaid contributions of their members. In a twist on the old agricultural practice of sharecropping, the site owners provide the digital real estate and tools, let the members do all the work, and then harvest the economic riches.” 36 And in opening his book, Lanier says, “Ultimately these words will contribute to the fortunes of those few who have been able to position themselves as lords of the computing clouds.” 37
Finally, some pessimists worry deeply about the impact of computers and digital technologies on learning. They fear these trends will inevitably result in a general “dumbing down” of the masses or even the disappearance of reading, writing, and other arts. Typifying this view is Mark Bauerlein’s The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (2008), but similar concerns are on display in the works of Sven Birkerts, 38 Clifford Stoll, 39 Todd Gitlin, 40 and Todd Oppenheimer. 41
The Optimists’ Response

The optimists’ response is rooted in the belief that, despite their highly disruptive nature, the Internet and new digital technologies empower and enlighten individuals and, therefore, generally benefit society.
35 Keen, supra note 19, at 131.
36 NICHOLAS CARR, THE BIG SWITCH: REWIRING THE WORLD, FROM EDISON TO GOOGLE 137-38 (2008).
37 JARON LANIER, YOU ARE NOT A GADGET 1 (2010).
38 SVEN BIRKERTS, THE GUTENBERG ELEGIES: THE FATE OF READING IN AN ELECTRONIC AGE (1994).
39 CLIFFORD STOLL, HIGH-TECH HERETIC: REFLECTIONS OF A COMPUTER CONTRARIAN (1999).
40 TODD GITLIN, MEDIA UNLIMITED: HOW THE TORMENT OF IMAGES AND SOUNDS OVERWHELMS OUR LIVES (2002).
41 TODD OPPENHEIMER, THE FLICKERING MIND: SAVING EDUCATION FROM THE FALSE PROMISE OF TECHNOLOGY (2003).
The optimists tend to argue that new modes of production (especially peer-based production) will offer an adequate—if not superior—alternative to traditional modalities of cultural or artistic production. Despite displacing some institutions and cultural norms, they claim digital technologies create more opportunities. They speak of “collective intelligence,” 42 the “wisdom of crowds,” 43 the importance of peer production, 44 and the rise of what futurist Alvin Toffler first referred to as “prosumers.” 45 “There has been a fundamental shift in the balance of power between consumers and salesmen over the last generation and it points in the direction of consumers,” Tyler Cowen argues in his book, Create Your Own Economy: The Path to Prosperity in a Disordered World. 46

The peer production trend is stressed in works such as The Wealth of Networks: How Social Production Transforms Markets and Freedom, by Yochai Benkler, 47 and Wikinomics: How Mass Collaboration Changes Everything, by Don Tapscott and Anthony D. Williams. 48 “A new economic democracy is emerging in which we all have a lead role,” claim Tapscott and Williams, 49 because “the economics of production have changed significantly.” 50
Most optimists also argue that new business models will evolve to support what had previously been provided by professional content creators or news providers. Glenn Reynolds (An Army of Davids) and Dan Gillmor (We the Media) refer to the rise of “we-dia” (user-generated content and citizen journalism) as an increasingly important part of the modern media landscape. Gillmor, a former San Jose Mercury News columnist, speaks of “a modern revolution … because technology has given us a communications toolkit that allows anyone to become a journalist at little cost and, in theory, with global reach. Nothing like this has ever been remotely possible before,” he argues. 51 And the optimists generally don’t spend much time lamenting the obliteration of large media
42 HENRY JENKINS, CONVERGENCE CULTURE: WHERE OLD AND NEW MEDIA COLLIDE 4 (2006).
43 JAMES SUROWIECKI, THE WISDOM OF CROWDS (2004).
44 DON TAPSCOTT & ANTHONY D. WILLIAMS, WIKINOMICS: HOW MASS COLLABORATION CHANGES EVERYTHING 1, 67 (2008).
45 ALVIN TOFFLER, THE THIRD WAVE 265 (1980).
46 TYLER COWEN, CREATE YOUR OWN ECONOMY: THE PATH TO PROSPERITY IN A DISORDERED WORLD 117 (2009).
47 YOCHAI BENKLER, THE WEALTH OF NETWORKS: HOW SOCIAL PRODUCTION TRANSFORMS MARKETS AND FREEDOM (2006).
48 Tapscott & Williams, supra note 44, at 15.
49 Id. at 15.
50 Id. at 68.
51 DAN GILLMOR, WE THE MEDIA at xii (2004).
institutions, either because they think little of their past performance or, alternatively, believe that whatever “watchdog” role they played can be filled by others. “We are seeing the emergence of new, decentralized approaches to fulfilling the watchdog function and to engaging in political debate and organization,” Benkler claims. 52
Optimists also believe that the Information Age offers real choices and genuine voices, and they vociferously dispute charges of diminished quality by prosumers, amateur creators, new media outlets, and citizen journalists. Moreover, they do not fear the impact of these new trends and technologies on learning or culture. “Surely the technophobes who romanticize the pencil don’t want to return us to the low literacy rates that characterized the good old days of writing with pencils and quills,” Baron asks. “Still, a few critics object to the new technologies because they enable too many people to join the guild of writers, and they might paraphrase Thoreau’s objection to the telegraph: these new computer writers, it may be, have nothing to say to one another.” 53
Finally, in addressing the sharecropper concern raised by Carr and Lanier, the optimists insist most people aren’t in it for the money. Shirky notes that “Humans intrinsically value a sense of connectedness,” and much of what they do in the social media world is a true labor of love. 54 “Amateurs aren’t just pint-sized professionals; people are sometimes happy to do things for reasons that are incompatible with getting paid,” he says. 55 Mostly they do it for love of knowledge or a belief in the importance of “free culture,” the optimists claim.
The Debate Over the Promise—or Perils—of Personalization

Optimists and pessimists tend to agree that the Internet and “Web 2.0” are leading to more “personalized” media and information experiences. They disagree vehemently, however, on whether this is good or bad. They particularly disagree on what increased information customization means for participatory democracy and the future of relations among people of diverse backgrounds and ideologies. Finally, they differ on how serious a problem “information overload” is for society and individuals.
52 Benkler, supra note 47, at 11.
53 DENNIS BARON, A BETTER PENCIL 159 (2009).
54 CLAY SHIRKY, COGNITIVE SURPLUS: CREATIVITY AND GENEROSITY IN A CONNECTED AGE 58-59 (2010).
55 Id.
The Optimists’ Case

Let’s take the optimists first this time.

The optimists tend to embrace what Nicholas Negroponte first labeled “The Daily Me” (i.e., hyper-personalized news, culture, and information). In 1995, Negroponte asked us to:

Imagine a future in which your interface agent can read every newswire and newspaper and catch every TV and radio broadcast on the planet, and then construct a personalized summary. This kind of newspaper is printed in an edition of one.…

Imagine a computer display of news stories with a knob that, like a volume control, allows you to crank personalization up or down. You could have many of these controls, including a slider that moves both literally and politically from left to right to modify stories about public affairs. These controls change your window onto the news, both in terms of size and its editorial tone. In the distant future, interface agents will read, listen to, and look at each story in its entirety. In the near future, the filtering process will happen by using headers, those bits about bits. 56
That future came about sooner than even Negroponte could have predicted. We all have a “Daily Me” at our disposal today thanks to RSS feeds, Facebook, Google Alerts, Twitter, email newsletters, instant messaging, and so on. These tools, among others, can provide tailored, automated search results served up instantaneously. The optimists argue that this increased tailoring and personalization of our media experiences empowers heretofore silenced masses. This worldview is typified by the title of Glenn Reynolds’ book: An Army of Davids: How Markets and Technology Empower Ordinary People to Beat Big Media, Big Government and Other Goliaths. 57 The optimists argue that our “participatory culture” promotes greater cultural heterogeneity and gives everyone a better chance to be heard. “In a world of media convergence, every important story gets told, every brand gets sold, and every consumer gets courted across multiple media platforms,” says Henry Jenkins, author of Convergence Culture. 58
56 Negroponte, supra note 12, at 153-54.
57 GLENN REYNOLDS, AN ARMY OF DAVIDS: HOW MARKETS AND TECHNOLOGY EMPOWER ORDINARY PEOPLE TO BEAT BIG MEDIA, BIG GOVERNMENT AND OTHER GOLIATHS (2006).
58 HENRY JENKINS, CONVERGENCE CULTURE: WHERE OLD AND NEW MEDIA COLLIDE 3 (2006); Tapscott & Williams, supra note 44, at 41.
Again, they stress the empowering nature of digital technology as a good in and of itself. “The mass amateurization of publishing undoes the limitations inherent in having a small number of traditional press outlets,” Shirky claims. 59 This leads to greater openness, transparency, exposure to new thinking and opinions, and a diversity of thought and societal participation. Shirky speaks of the “cognitive surplus” unleashed by these changes and its myriad benefits for society and culture:

The harnessing of our cognitive surplus allows people to behave in increasingly generous, public, and social ways, relative to their old status as consumers and couch potatoes. The raw material of this change is the free time available to us, time we can commit to projects that range from the amusing to the culturally transformative. … Flexible, cheap, and inclusive media now offers us opportunities to do all sorts of things we once didn’t do. In the world of “the media,” we were like children, sitting quietly at the edge of a circle and consuming whatever the grown-ups in the center of the circle produced. That has given way to a world in which most forms of communication, public and private, are available to everyone in some form. 60

Shirky even suggests that “[t]he world’s cognitive surplus is so large that small changes can have huge ramifications in aggregate,” with beneficial impacts on politics, advocacy, and “generosity.”
When it comes to concerns about “information overload,” most optimists see little reason for concern. Tyler Cowen argues that using search tools like Google and other information gathering and processing technologies actually “lengthen our attention spans in another way, namely by allowing greater specialization of knowledge”: 61

We don’t have to spend as much time looking up various facts and we can focus on the particular areas of interest, if only because general knowledge is so readily available. It’s never been easier to wrap yourself up in a long-term intellectual project, yet without losing touch with the world around you.
59 CLAY SHIRKY, HERE COMES EVERYBODY: THE POWER OF ORGANIZING WITHOUT ORGANIZATIONS 65 (2008).
60 CLAY SHIRKY, COGNITIVE SURPLUS, supra note 54, at 63.
61 TYLER COWEN, CREATE YOUR OWN ECONOMY: THE PATH TO PROSPERITY IN A DISORDERED WORLD 55 (2009).
76 CHAPTER 1: THE INTERNET’S IMPACT ON CULTURE & SOCIETY: GOOD OR BAD?<br />
As for information overload, it is you who chooses how much<br />
“stuff” you want to experience and how many small bits you<br />
want to put together … . The quantity of information coming<br />
our way h<strong>as</strong> exploded, but so h<strong>as</strong> the quality of our filters. 62<br />
Chris Anderson previously made this point in his book, The Long Tail. Anderson defined filters as “the catch-all phrase for recommendations and all the other tools that help you find quality in the Long Tail” and noted that “these technologies and services sift through a vast array of choices to present you with the ones that are most right for you.” 63 “The job of filters is to screen out [the] noise” or information clutter, Anderson says. 64 Cowen argues that filtering technologies are getting better at this sifting and processing, but so too are humans. The key, he argues, is that we are getting better at “ordering” information.

On balance, therefore, the optimists argue that personalization benefits our culture and humanity. Dennis Baron concludes, “English survives, conversation thrives online as well as off, and on balance, digital communications seems to be enhancing human interaction, not detracting from it.” 65
The Pessimists’ Response

The pessimists argue that all this Pollyannaish talk about a new age of participatory democracy is bunk. Instead of welcoming increased information and media personalization, they lament it. They fear that “The Daily Me” that the optimists laud will lead to homogenization, close-mindedness, an online echo chamber, information overload, corporate brainwashing, etc. Worse, hyper-customization of websites and online technologies will cause extreme social “fragmentation,” “polarization,” “balkanization,” “extremism” and even the decline of deliberative democracy. 66

Siegel and Keen are probably the most searing in this critique. To Siegel, for example, the “Daily Me” is little more than the creation of a “narcissistic culture” in which “exaggeration” and the “loudest, most outrageous, or most extreme voices sway the crowd of voices this way; the cutest, most self-effacing, most ridiculous, or most transparently fraudulent of voices sway the crowd of voices that way.” 67 He calls Web 2.0 “democracy’s fatal turn” in that, instead of “allowing individuals to create their own cultural and commercial choices,” it has instead created “a more potent form of homogenization.” 68 Keen fears the rise of “a dangerous form of digital narcissism” and “the degeneration of democracy into the rule of the mob and the rumor mill.” 69

62 Id.

63 CHRIS ANDERSON, THE LONG TAIL 108 (2006).

64 Id. at 115.

65 DENNIS BARON, A BETTER PENCIL 135 (2009).

66 Carr worries that every little choice moves us closer toward such social isolation: “Every time we subscribe to a blog, add a friend to our social network, categorize an email message as spam, or even choose a site from a list of search results, we are making a decision that defines, in some small way, whom we associate with and what information we pay attention to.” NICHOLAS CARR, THE BIG SWITCH: REWIRING THE WORLD, FROM EDISON TO GOOGLE 160 (2008).
This echoes concerns first raised by Cass Sunstein in his 2001 book Republic.com. 70 In that book, Sunstein referred to Negroponte’s “Daily Me” in contemptuous terms, saying that the hyper-customization of websites and online technologies was causing extreme social fragmentation and isolation that could lead to political extremism. “A system of limitless individual choices, with respect to communications, is not necessarily in the interest of citizenship and self-government,” he wrote. 71 Sunstein was essentially claiming that the Internet was breeding a dangerous new creature: Anti-Democratic Man. 72 “Group polarization is unquestionably occurring on the Internet,” he proclaimed, and it is weakening what he called the “social glue” that binds society together and provides citizens with a common “group identity.” 73 If that continues unabated, Sunstein argued, the potential result could be nothing short of the death of deliberative democracy and the breakdown of the American system of government.

Some of the pessimists, like Keen, go further and claim that “the moral fabric of our society is being unraveled by Web 2.0. It seduces us into acting on our most deviant instincts and allows us to succumb to our most destructive vices. And it is corroding the values we share as a nation.” 74 Nick Carr summarizes the views of the pessimists when he says: “it’s clear that two of the hopes most dear to the Internet optimists—that the Web will create a more bountiful culture and that it will promote greater harmony and understanding—should be treated with skepticism. Cultural impoverishment and social fragmentation seem equally likely outcomes.” 75

67 Siegel, supra note 21, at 79.

68 Id. at 67.

69 Keen, supra note 19, at 54-5.

70 CASS SUNSTEIN, REPUBLIC.COM (2001).

71 Id. at 123.

72 See Adam Thierer, Saving Democracy from the Internet, REGULATION (Fall 2001) 78-9, http://www.cato.org/pubs/regulation/regv24n3/inreview.pdf.

73 Sunstein, supra, at 71, 89.

74 Keen, supra note 19, at 163.
Another common theme in the works of the pessimists is summarized by the title of Siegel’s book (Against the Machine). They fear the “mechanization of the soul” 76 or humanity’s “surrender” to “the machine revolution.” 77 In the opening of You Are Not a Gadget, Lanier fears that “these words will mostly be read by nonpersons—automatons or numb mobs composed of people who are no longer acting as individuals.” 78 “The trick is not to subject man and nature to the laws of the machine,” says Helprin, “but rather to control the machine according to the laws and suggestions of nature and human nature. To subscribe to this does not make one a Luddite.” 79

Finally, the pessimists are also concerned about the impact of online anonymity on human conduct and language. They argue anonymity leads to less accountability or, more simply, just plain bad manners. “If our national conversation is carried out by anonymous, self-obsessed people unwilling to reveal their real identities, then,” Keen argues, “community denigrates into anarchy.” 80
So Who’s Right?

On balance, the optimists generally have the better of the argument today. We really are better off in an age of information abundance than we were in the scarcity era we just exited. Nonetheless, the pessimists make many fair points that deserve to be taken seriously. But they need a more reasonable articulation of those concerns and a constructive plan for how to move forward without a call for extreme reactionary solutions.

A hybrid approach here might be thought of as “pragmatic optimism,” which attempts to rid the optimist paradigm of its kookier, Pollyannaish thinking while also taking into account some of the very legitimate concerns raised by the pessimists, but rejecting their caustic, neo-Luddite fringe elements and stasis mentality in the process.
75 Carr, supra note 36, at 167.

76 Helprin, supra note 30, at 100.

77 Id. at 9, 100.

78 Lanier, supra note 37, at 1.

79 Helprin, supra note 30, at 144.

80 Keen, supra note 30, at 80.
Thoughts on the Pessimists

First and foremost, if they hope to be taken more seriously, Net skeptics need better spokespersons. Or they at least need a more moderate, less hysterical tone when addressing valid concerns raised by technological progress. It’s often difficult to take the pessimists seriously when they exude outright hostility to most forms of technological progress. Most of them deny being high-tech troglodytes, but the tone of some of their writing, and the thrust of some of their recommendations, exhibit occasional Luddite tendencies—even if they don’t always come out and call for extreme measures to counteract dynamism.

Moreover, the name-calling they sometimes engage in, and their derision for the digital generation, can be just as insulting and immature as the online “mob” they repeatedly castigate in their works. Too often, their criticism devolves into philosophical snobbery and blatant elitism, as in the works of Helprin, Siegel, and Keen. Constantly looking down their noses at digital natives and all “amateur” production isn’t going to help them win any converts or respect for their positions. One also wonders if they have fingered the right culprit for civilization’s supposed decline, since most of the ills they identify predate the rise of the Internet.
The pessimists are often too quick to proclaim the decline of modern civilization by looking only to the baser elements of the blogosphere or the more caustic voices of cyberspace. The Internet is a cultural and intellectual bazaar where one can find both the best and the worst of humanity on display at any given moment. True, “brutishness and barbarism,” as Helprin calls it, 81 can be found on many of the Internet’s corners, but not all of them. And, contrary to Helprin’s assertion that blogging “begins the mad race to the bottom,” 82 one could just as easily cite countless instances of the healthy, unprecedented conversations that blogs have enabled about a diverse array of topics.
Their claim that the “Daily Me” and information specialization will lead to a variety of ills is also somewhat overblown. It’s particularly hard to accept Sunstein’s and Carr’s claims that increased personalization is breeding “extremism,” “fanaticism” and “radicalization.” A recent study by Matthew Gentzkow and Jesse M. Shapiro of the University of Chicago Booth School of Business lent credibility to this skepticism, finding “no evidence that the Internet is becoming more segregated over time” or leading to increased polarization as Sunstein and other pessimists fear. 83 Instead, their findings show that the Net has encouraged more ideological integration and is actually driving us to experience new, unanticipated viewpoints. 84

81 Helprin, supra note 30, at 32.

82 Id. at 42.

83 Matthew Gentzkow & Jesse M. Shapiro, Ideological Segregation Online and Offline, CHICAGO BOOTH WORKING PAPER No. 10-19, April 5, 2010, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1588920.

While it’s true the Internet has given some extremists a new soapbox to stand on and spew their hatred and stupidity, the fact is that such voices and viewpoints have always existed. The difference today is that the Internet and digital platforms have given the rest of us the means to counter such societal extremism. As the old saying goes, the answer to bad speech is more speech—not a crackdown on the underlying technologies used to convey speech. It should not be forgotten that, throughout history, most extremist, totalitarian movements rose to power by taking over the scarce, centralized media platforms that existed in their countries. The decentralization of media makes such a takeover far less plausible.
Sometimes the pessimists seem to be suffering from a bit of old-fogeyism. Lanier, for example, dismisses most modern culture as “retro” and “a petty mashup of preweb culture.” 85 “It’s as if culture froze just before it became digitally open, and all we can do now is mine the past like salvagers picking over a garbage dump.” 86 Many pessimists are guilty of such hyper-nostalgia for those mythical “good ol’ days” when all was supposedly much better. It’s a common refrain we’ve heard from many social and cultural critics before. But such cultural critiques are profoundly subjective. Many pessimists simply seem to be well past the “adventure window.” 87 The willingness of humans to try new things and experiment with new forms of culture—our “adventure window”—fades rapidly after certain key points in life, as we gradually settle into our ways. Many cultural critics and average folk alike seem convinced that the best days are behind us and that the current good-for-nothing generation and their newfangled gadgets and culture are garbage. At times this devolves into a full-blown moral panic. 88 “It’s perfectly normal and probably healthy to examine whether these changes are good or bad,” says New York Times blogger Nick Bilton, author of I Live in the Future & Here’s How It Works. “But we’ll also no doubt look back at many of the debates a generation from now and see that a lot of these fears were inflated and maybe a bit ridiculous, too.” 89

84 “This study suggests that Internet users are a bunch of ideological Jack Kerouacs. They’re not burrowing down into comforting nests. They’re cruising far and wide looking for adventure, information, combat and arousal.” David Brooks, Riders on the Storm, NEW YORK TIMES, April 19, 2010, http://www.nytimes.com/2010/04/20/opinion/20brooks.html.

85 Lanier, supra note 37, at 131.

86 Id. at 133.

87 Adam Thierer, The “Adventure Window,” Radio Formats and Media Ownership Rules, TECHNOLOGY LIBERATION FRONT, Aug. 16, 2006, http://techliberation.com/2006/08/16/the-adventure-window-radio-formats-and-media-ownership-rules.

88 See Adam Thierer, Parents, Kids & Policymakers in the Digital Age: Safeguarding Against ‘Techno-Panics,’ INSIDE ALEC (July 2009) at 16-7, http://www.alec.org/am/pdf/Inside_July09.pdf.
The “sharecropper” concern raised by Carr and Lanier is also overstated. This logic ignores the non-monetary benefits that many of us feel we extract from today’s online business models and social production processes. Most of us feel we get a lot back as part of this new value exchange. Carr and Lanier are certainly correct that Google, Facebook, MySpace, and a lot of other Net middlemen are getting big and rich based on all the user-generated content flowing across their sites and systems. On the other hand, most cyber-citizens extract enormous benefits from the existence of those (mostly free and constantly improving) platforms and services. It’s a very different sort of value exchange and business model than in the past, but we are adjusting to it.
Yet for all of Wikipedia’s value as a reference of first (but certainly not final) resort, the pessimists have almost nothing good to say about it. Much the same goes for open source software and other collaborative efforts. They don’t appear willing to accept the possibility of any benefits coming from collective efforts. And they wrongly treat the rise of collective and collaborative efforts as a zero-sum game, imagining that it represents a net loss of individual effort and “personhood.” That simply doesn’t follow. The masses have been given more of a voice thanks to the rise of Web 2.0 collaborative technologies and platforms, but that doesn’t mean that media “professionals” no longer exist. Most bloggers, for example, build their narratives around facts and stories found in respected “mainstream media” outlets. It’s true that those outlets now face many new forms of competition for human attention, but that doesn’t mean they won’t still play a lead role in the new information ecosystem.
Most of all, the pessimists can and must come to terms with the Information Revolution while offering more constructive and practical solutions to the legitimately difficult transitional problems created by the disintermediating influence of digital technologies and the Net. After all, practically speaking, what would the pessimists have us do if we can’t mitigate the problems they identify? “Whatever the mix of good and bad,” notes Wall Street Journal columnist Gordon Crovitz, “technology only advances and cannot be put back in the bottle.” 90 Would the pessimists have us attempt to put the digital genie back in the bottle with burdensome restrictions on technology or the creation of a permissions-based system of innovation? “[W]hether it’s good for society or bad … is somewhat irrelevant at this point,” argues Nick Bilton. 91 “There’s no turning back the clock.” Similarly, Ben Casnocha has correctly noted that:

the wind at the backs of all techno-optimists … [is] the forward momentum of technological development. You cannot turn back the clock. It is impossible to envision a future where there is less information and fewer people on social networks. It is very possible to envision increasing abundance along with better filters to manage it. The most constructive contributions to the debate, then, heed Moore’s Law in the broadest sense and offer specific suggestions for how to harness the change for the better. 92

90 L. Gordon Crovitz, Is Technology Good or Bad? Yes., WALL STREET JOURNAL, Aug. 23, 2010, http://online.wsj.com/article/SB10001424052748703579804575441461191438330.html.
Regrettably, most of the leading Net pessimists have failed to do this in their work. However, good templates for how to accomplish this can be found in recent books by William Powers (Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age) 93 and John Freeman (The Tyranny of E-Mail: The Four-Thousand-Year Journey to Your Inbox). 94 These authors, although somewhat pessimistic in their view of technology’s impact on life and learning, offer outstanding self-help tips and plans of action for how to reasonably assimilate new information technologies into our lives. Their key insight: the Internet and digital technologies aren’t going away, so we must figure out how to deal with them in a responsible manner—both individually and collectively. It’s essential that other pessimists come to grips with that fact.

The pessimists are at their best when highlighting the very legitimate concerns about the challenges that accompany technological change, including the impact of the digital revolution on “professional” media, the decline of authority among trusted experts and intermediaries, and the challenge of finding creative ways to fund “professional” media and art going forward.

91 Bilton, supra note 89, at 216.

92 Ben Casnocha, RSSted Development, THE AMERICAN, July 1, 2009, http://www.american.com/archive/2009/june/rssted-development. Clay Shirky has also noted that “There is never going to be a moment when we as a society ask ourselves, ‘Do we want this? Do we want the changes that the new flood of production and access and spread of information is going to bring about?’” CLAY SHIRKY, HERE COMES EVERYBODY: THE POWER OF ORGANIZING WITHOUT ORGANIZATIONS 73 (2008).

93 WILLIAM POWERS, HAMLET’S BLACKBERRY: A PRACTICAL PHILOSOPHY FOR BUILDING A GOOD LIFE IN THE DIGITAL AGE (2010). See also Adam Thierer, Coping with Information Overload: Thoughts on Hamlet’s BlackBerry by William Powers, TECHNOLOGY LIBERATION FRONT, Sept. 6, 2010, http://techliberation.com/2010/09/06/coping-with-information-overload-thoughts-on-hamlets-blackberry-by-william-powers.

94 JOHN FREEMAN, THE TYRANNY OF E-MAIL: THE FOUR-THOUSAND-YEAR JOURNEY TO YOUR INBOX (2009). For a review of the book, see Adam Thierer, Can Humans Cope with Information Overload? Tyler Cowen & John Freeman Join the Debate, TECHNOLOGY LIBERATION FRONT, Aug. 23, 2009, http://techliberation.com/2009/08/23/can-humans-cope-with-information-overload-tyler-cowen-john-freeman-join-the-debate.
Thoughts on the Optimists

Again, the optimists currently have the better of this debate: Web 2.0 is generally benefiting culture and society. It is almost impossible to accept that society has not benefited from the Internet and new digital technologies compared to the past era of information scarcity. The Digital Revolution has greatly empowered the masses and offered them more informational inputs.

But the optimists need to be less Pollyannaish and avoid becoming the “technopolists” (or digital utopians) that Postman feared were taking over our society. There’s often too much Rousseauian romanticism at work in some optimist writings. Just as the pessimists are often guilty of assuming the Net and digital technologies are responsible for far too many ills, the optimists occasionally do the opposite by engaging in what Nick Carr labels “the Internet’s liberation mythology.” The Internet isn’t remaking man or changing human nature in any fundamental way. Nor can it liberate us from all earthly constraints or magically solve all of civilization’s problems. Moreover, when it comes to economics, all this talk about the Long Tail being “the future of business” (Chris Anderson) and of “Wikinomics … changing everything through mass collaboration” (Tapscott and Williams) verges on irrational techno-exuberance.
In particular, optimists often overplay the benefits of collective intelligence, collaboration, and the role of amateur production. They are occasionally guilty of “the elevation of information to metaphysical status,” as Postman lamented. 95 The optimists would do better to frame “wiki” and peer-production models as a complement to professional media, not a replacement for it. Could the equivalent of The New York Times really be cobbled together by amateurs daily? It seems highly unlikely. And why aren’t there any compelling open source video games? Similarly, free and open source software (FOSS) has produced enormous social and economic benefits, but it would be foolish to believe that FOSS (or “wiki” models) will replace all proprietary business models. Each model or mode of production has its place and purpose, and they will continue to co-exist and compete.

We wouldn’t necessarily be better off if all the “professional” media producers and old intermediaries disappeared, even if it is no doubt true that many of them will. Some optimists play the “old media just doesn’t get it” card far too often and snobbishly dismiss many producers’ valid concerns and efforts to reinvent themselves.

95 Postman, supra note 8, at 61.
There’s also a big difference between “remix culture” and “rip-off culture.” Many optimists turn a blind eye to blatant copyright piracy, for example, or even defend it as either a positive development or simply inevitable. Remix culture generally enhances and extends culture and creativity. But blatant content piracy deprives many of society’s most gifted creators of the incentive to produce culturally beneficial works. Likewise, hacking, circumvention, and reverse-engineering all play an important and legitimate role in our new digital economy, but one need not accept the legitimacy of those activities when conducted for nefarious purposes (think identity theft or chip-modding to facilitate video game piracy).
The optimists should be cautious about predicting sweeping positive changes from the Internet or Web 2.0 technologies. Consider Shirky’s generally upbeat assessment of the impact of “cognitive surplus.” His book offers plenty of fluffy talk and anecdotal examples about how the cognitive surplus spawned by cyber-life has affected politics, advocacy, and “generosity,” but I think it’s a stretch to imply that the Net is going to upend political systems. In another essay in this collection, Evgeny Morozov challenges Shirky on some of these points, arguing that “the Internet will not automatically preserve—never mind improve—the health of democratic politics.” 96 He’s right. That digital technology and the Internet will help reshape society and politics to some degree is indisputable. But that doesn’t mean the Net will radically reshape political systems or human nature anytime soon.
Finally, the optimists would be wise to separate themselves from those extreme voices in their community who speak of the “noosphere” and “global consciousness” and long for the eventual singularity. While he doesn’t go quite so far, Wired editor Kevin Kelly often pushes techno-optimism to its extreme. In his latest book, What Technology Wants, Kelly speaks of what he calls “the technium” as a “force” or even a living organism that has a “vital spirit,” “has its own wants,” and possesses “a noticeable measure of autonomy.” 97 Treating technology as an autonomous force is silly, even dangerous, thinking. It imbues technology with attributes and feelings that simply do not exist and would probably not be desirable if they did. Yet some optimists speak in fatalistic terms and make such an outcome seem desirable. They sound as if they long for life in The Matrix—“Bring on the sentient robot masters and the Singularity!” Thus does an optimist cross over into the realm of quixotic techno-utopianism.

Optimists need to place technological progress in context and appreciate that, as Postman argued, there are some moral dimensions to technological progress that deserve attention. Not all change is good change. The optimists need to be mature enough to understand and address the downsides of digital life without dismissing its critics. On the other hand, some of those moral consequences are profoundly positive, which the pessimists usually fail to appreciate or even acknowledge.

96 Evgeny Morozov, Will the Net Liberate the World?, infra at 443.

97 KEVIN KELLY, WHAT TECHNOLOGY WANTS 198, 41, 15, 13 (2010).
Conclusion: Toward “Pragmatic Optimism”

Again, I believe the optimists currently have the better of this debate. It’s impossible for me to believe we were better off in an era of information poverty and un-empowered masses. I’ll take information overload over information poverty any day! As Dennis Baron puts it: “The Internet is a true electronic frontier where everyone is on his or her own: all manuscripts are accepted for publication, they remain in virtual print forever, and no one can tell writers what to do.” 98

The rise of the Internet and digital technologies has empowered the masses and given everyone a soapbox on which to speak to the world. Of course, that doesn’t necessarily mean all of them will have something interesting to say! We shouldn’t exalt user-generated content as a good in and of itself; it’s quality, not volume, that counts. But such human empowerment is worth celebrating, despite its occasional downsides. 99 Abundance is better than the old analog world of few choices and fewer voices.
However, the pessimists have some very legitimate concerns regarding how the<br />
p<strong>as</strong>sing of the old order might leave society without some important things. For<br />
example, one need not endorse bailouts for a dying newspaper industry to<br />
nonetheless worry about the important public service provided by investigative<br />
journalists: Who will take up those efforts if large media institutions go under<br />
because of digital disintermediation?<br />
The skeptics are also certainly correct that each of us should think about how to<br />
better balance new technologies and <strong>as</strong>similate them into our lives and the lives<br />
of our families and communities. For example, children need to learn new<br />
“digital literacy” and “cyber-citizenship” skills to be savvy Netizens.<br />
To be clear, I am not suggesting that these questions should be answered by<br />
government. There exist many other ways that society can work to preserve<br />
98 DENNIS BARON, A BETTER PENCIL 25 (2009).<br />
99 “Just <strong>as</strong> well-meaning scientists and consumers feared that trains and comic books and<br />
television would rot our brains and spoil our minds, I believe many of the skeptics and<br />
worrywarts today are missing the bigger picture, the greater value that access to new and<br />
f<strong>as</strong>ter information is bringing us.” Nick Bilton, I LIVE IN THE FUTURE & HERE’S HOW IT<br />
WORKS 136 (2010).
86 CHAPTER 1: THE INTERNET’S IMPACT ON CULTURE & SOCIETY: GOOD OR BAD?<br />
important values and institutions without embracing the stasis mentality and<br />
using coercion to accomplish that which should be pursued voluntarily.<br />
As noted, the nostalgia the pessimists typically espouse for the past is a<br />
common refrain of cultural and technological critics who fear our best days are<br />
behind us. The truth typically proves less cataclysmic, of course. The great<br />
thing about humans is that we adapt better than other creatures. When it comes<br />
to technological change, resiliency is hard-wired into our genes. “The techno-apocalypse<br />
never comes,” notes Slate’s Jack Shafer, because “cultures tend to<br />
assimilate and normalize new technology in ways the fretful never anticipate.” 100<br />
We learn how to use the new tools given to us and make them part of our lives<br />
and culture. Indeed, we have lived through revolutions more radical than the<br />
Information Revolution. We can adapt and learn to live with some of the<br />
legitimate difficulties and downsides of the Information Age.<br />
Generally speaking, the sensible middle ground position is “pragmatic<br />
optimism”: We should embrace the amazing technological changes at work in<br />
today’s Information Age but with a healthy dose of humility and appreciation<br />
for the disruptive impact and pace of that change. We need to think about how<br />
to mitigate the negative impacts associated with technological change without<br />
adopting the paranoid tone or Luddite-ish recommendations of the pessimists.<br />
I’m particularly persuaded by the skeptics’ call for all of us to exercise some<br />
restraint in terms of the role technology plays in our own lives. While pessimists<br />
from Plato to Postman certainly went too far at times, there is more than just<br />
a kernel of truth to their claim that, taken to an extreme, technology can have a<br />
deleterious impact on life and learning. We need to focus on the Aristotelian<br />
mean. We must avoid neo-Luddite calls for a return to “the good ol’ days” on<br />
the one hand, while also rejecting techno-utopian Pollyannaism on the other.<br />
We need not go to “all or nothing” extremes.<br />
In the end, however, I return to the importance of evolutionary dynamism and<br />
the importance of leaving a broad sphere for continued experimentation by<br />
individuals and organizations alike. Freedom broadly construed is valuable in its<br />
own right—even if not all of the outcomes are optimal. As Clay Shirky rightly<br />
notes:<br />
This does not mean there will be no difficulties associated with<br />
our new capabilities—the defenders of freedom have long<br />
noted that free societies have problems peculiar to them.<br />
Instead, it assumes that the value of freedom outweighs the<br />
100 Jack Shafer, <strong>Digital</strong> Native Calms the Anxious M<strong>as</strong>ses, SLATE, Sept. 13, 2010,<br />
http://www.slate.com/id/2267161.
problems, not based on calculation of net value but because<br />
freedom is the right thing to want for society. 101<br />
Finally, we cannot ignore the practical difficulties of halting or even slowing<br />
progress—assuming we somehow collectively decided we wanted to do so.<br />
Turning back the clock seems almost unfathomable at this point absent extreme<br />
measures that would sacrifice so many of the benefits the Information Age has<br />
brought us—not to mention the curtailment of freedom that it would demand.<br />
Regardless, the old Theuth-Thamus debate about the impact of technological<br />
change on culture and society will continue to rage. There is no chance this<br />
debate will die down anytime soon. (Just wait till new technologies like virtual<br />
reality go mainstream!) Despite real challenges in adapting to technological<br />
change, I remain generally optimistic about the prospects for technology to<br />
improve the human condition.<br />
101 Shirky, supra note 59, at 298.
CHAPTER 2<br />
IS THE GENERATIVE INTERNET AT RISK?<br />
Protecting the Internet Without Wrecking It:<br />
How to Meet the Security Threat 91<br />
Jonathan Zittrain<br />
A Portrait of the Internet as a Young Man 113<br />
Ann Bartow<br />
The Case for Internet Optimism, Part 2:<br />
Saving the Net from Its Supporters 139<br />
Adam Thierer<br />
Protecting the Internet Without<br />
Wrecking It: How to Meet the<br />
Security Threat<br />
By Jonathan Zittrain *<br />
On November 2, 1988, 5-10% of the 60,000 computers hooked up to the<br />
Internet started acting strangely. Inventories of affected computers revealed<br />
that rogue programs were demanding processor time. When concerned<br />
administrators terminated these programs, they reappeared and multiplied.<br />
They then discovered that renegade code was spreading through the Internet<br />
from one machine to another. The software—now commonly thought of as<br />
the first Internet worm—w<strong>as</strong> traced to a twenty-three-year-old Cornell<br />
University graduate student, Robert Tappan Morris, Jr., who had launched it by<br />
infecting a machine at MIT from his terminal in Ithaca, New York.<br />
Morris said he unleashed the worm to count how many machines were<br />
connected to the Internet, and analysis of his program confirmed his benign<br />
intentions. But his code turned out to be buggy. If Morris had done it right, his<br />
program would not have drawn attention to itself. It could have remained<br />
installed for days or months, and quietly performed a wide array of activities<br />
other than Morris’s digital headcount.<br />
The mainstream media had an intense but brief fascination with the incident. A<br />
government inquiry led to the creation of the Defense Department-funded<br />
Computer Emergency Response Team Coordination Center at Carnegie Mellon<br />
University, which serves as a clearinghouse for information about viruses and<br />
other network threats. A Cornell report on what had gone wrong placed the<br />
blame solely on Morris, who had engaged in a “juvenile act” that was “selfish<br />
and inconsiderate.” It rebuked elements of the media that had branded Morris<br />
a hero for dramatically exposing security flaws, noting that it was well known<br />
that the computers’ Unix operating systems were imperfect. The report called<br />
for university-wide committees to provide advice on security and acceptable<br />
use. It described consensus among computer scientists that Morris’s acts<br />
warranted some form of punishment, but not “so stern as to damage<br />
permanently the perpetrator’s career.”<br />
* Professor of Law, Harvard Law School and Harvard Kennedy School; Professor of<br />
Computer Science, Harvard School of Engineering and Applied Sciences; Co-Founder,<br />
Berkman Center for Internet & Society. This chapter originally appeared in the March/April<br />
2008 BOSTON REVIEW.
In the end, Morris apologized, earned three years of criminal probation,<br />
performed four hundred hours of community service, and was fined $10,050.<br />
He transferred from Cornell to Harvard, founded a dot-com startup with some<br />
friends in 1995, and sold it to Yahoo! in 1998 for $49 million. He is now a<br />
respected, tenured professor at MIT.<br />
In retrospect, the commission’s recommendations—urging users to patch their<br />
systems and hackers to grow up—might seem naïve. But there were few<br />
plausible alternatives. Computing architectures, both then and now, are<br />
designed for flexibility rather than security. The decentralized ownership and<br />
non-proprietary nature of the Internet and the computers connected to it made<br />
it difficult to implement structural improvements. More importantly, it was<br />
hard to imagine cures that would not entail drastic, wholesale, purpose-altering<br />
changes to the very fabric of the Internet. Such changes would have been<br />
wildly out of proportion to the perceived threat, and there is no record of their<br />
having even been considered.<br />
By design, the university workstations of 1988 were generative: Their users<br />
could write new code for them or install code written by others. This generative<br />
design lives on in today’s personal computers. Networked PCs are able to<br />
retrieve and install code from each other. We need merely click on an icon or<br />
link to install new code from afar, whether to watch a video newsc<strong>as</strong>t embedded<br />
within a Web page, update our word processing or spreadsheet software, or<br />
browse satellite images.<br />
Generative systems are powerful and valuable, not only because they foster the<br />
production of useful things like Web browsers, auction sites, and free<br />
encyclopedias, but also because they enable extraordinary numbers of people to<br />
devise new ways to express themselves in speech, art, or code and to work with<br />
other people. These characteristics can make generative systems very successful<br />
even though—perhaps especially because—they lack central coordination and<br />
control. That success attracts new participants to the generative system.<br />
The flexibility and power that make generative systems so attractive are,<br />
however, not without risks. Such systems are built on the notion that they are<br />
never fully complete, that they have many uses yet to be conceived of, and that<br />
the public can be trusted to invent good uses and share them. But multiplying<br />
breaches of that trust threaten the very foundations of the system.<br />
Whether through a sneaky vector like the one Morris used, or through the front<br />
door, when a trusting user elects to install something that looks interesting
without fully understanding it, opportunities for accidents and mischief abound.<br />
A hobbyist computer that crashes might be a curiosity, but when a home or<br />
office PC with years’ worth of vital correspondence and papers is compromised,<br />
it can be a crisis. And when thousands or millions of individual, business,<br />
research, and government computers are subject to attack, we may find<br />
ourselves faced with a fundamentally new and harrowing scenario. As the<br />
unsustainable nature of the current state of affairs becomes more apparent, we<br />
are left with a dilemma that cannot be ignored: How do we preserve the<br />
extraordinary benefits of generativity, while addressing the growing<br />
vulnerabilities that are innate to it?<br />
* * *<br />
How profound is today’s security threat? Since 1988, the Internet has suffered<br />
few truly disruptive security incidents. A network designed for communication<br />
among academic and government researchers appeared to scale beautifully as<br />
hundreds of millions of new users signed on during the 1990s, and three types<br />
of controls seemed adequate to address emerging dangers.<br />
First, the hacker ethos frowns upon destructive hacking. Most viruses that<br />
followed Morris’s worm had completely innocuous payloads: In 2004, Mydoom<br />
spread like wildfire and reputedly cost billions in lost productivity, but the worm<br />
did not tamper with data, and it was programmed to stop spreading at a set<br />
time. With rare exceptions like the infamous Lovebug worm, which overwrote<br />
files with copies of itself, the few highly malicious viruses that run contrary to<br />
the hacker ethos were so poorly coded that they failed to spread very far.<br />
Second, network operations centers at universities and other institutions<br />
became more professionalized between 1988 and the advent of the mainstream<br />
Internet. For a while, most Internet-connected computers were staffed by<br />
professional administrators who generally heeded admonitions to patch<br />
regularly and scout for security breaches. Less adept mainstream consumers<br />
began connecting unsecured PCs to the Internet in earnest only in the mid-<br />
1990s. Even then, transient dial-up connections greatly limited both the amount of<br />
time during which they were exposed to security threats, and the amount of<br />
time that, if compromised and hijacked, they would contribute to the problem.<br />
Finally, bad code lacked a business model. Programs to trick users into<br />
installing them, or to sneak onto the machines, were written for amusement.<br />
Bad code was more like graffiti than illegal drugs: There were no economic<br />
incentives for its creation.<br />
Today each of these controls h<strong>as</strong> weakened. With the expansion of the<br />
community of users, the idea of a set of ethics governing activity on the Internet<br />
has evaporated. Anyone is allowed online if he or she can find a way to a<br />
computer and a connection, and mainstream users are transitioning rapidly to<br />
always-on broadband connections.<br />
Moreover, PC user awareness of security issues has not kept pace with<br />
broadband growth. A December 2005 online safety study found 81% of home<br />
computers to be lacking first-order protection measures such as current<br />
antivirus software, spyware protection, and effective firewalls. 1<br />
Perhaps most significantly, bad code is now a business. What seemed genuinely<br />
remarkable when first discovered is now commonplace: Viruses that<br />
compromise PCs to create large zombie “botnets” open to later instructions.<br />
Such instructions have included directing PCs to become remotely controlled e-mail<br />
servers, sending spam by the thousands or millions to e-mail addresses<br />
harvested from the hard disk of the machines themselves or gleaned from<br />
Internet searches, with the entire process typically proceeding behind the back<br />
of the PCs’ owners. At one point, a single botnet occupied fifteen percent of<br />
Yahoo!’s search capacity, running random searches on Yahoo! to find text that<br />
could be inserted into spam e-mails to throw off spam filters. 2 Dave Dagon,<br />
who recently left Georgia Tech to start a bot-fighting company<br />
named Damballa, pegs the number of botnet-infected computers at close to 30<br />
million. 3 Dagon said, “Had you told me five years ago that organized crime<br />
would control one out of every ten home machines on the Internet, I would not<br />
have believed that.” 4 So long as spam remains profitable, that crime will persist.<br />
Botnets can also be used to launch coordinated attacks on a particular Internet<br />
endpoint. For example, a criminal can attack an Internet gambling Web site and<br />
then extort payment to make the attacks stop. The going rate for a botnet to<br />
launch such an attack is reputed to be about $5,000 per day. 5<br />
Viruses are thus valuable properties. Well-crafted worms and viruses routinely<br />
infect vast swaths of Internet-connected personal computers. Eugene Kaspersky<br />
of antivirus vendor Kaspersky Labs told an industry conference that antivirus<br />
firms “may not be able to withstand the onslaught.” 6 IBM’s Internet Security Systems<br />
1 AOL/NCSA ONLINE SAFETY STUDY 2 (Dec. 2005),<br />
http://www.bc.edu/content/dam/files/offices/help/pdf/safety_study_2005.pdf<br />
2 Tim Weber, Criminals ‘May Overwhelm the Web,’ BBC NEWS, Jan. 25, 2007,<br />
http://news.bbc.co.uk/2/hi/business/6298641.stm.<br />
3 Bob Sullivan, Is Your Computer a Criminal?, RED TAPE CHRONICLES, Mar. 27, 2007,<br />
http://redtape.msnbc.com/2007/03/bots_story.html.<br />
4 Id.<br />
5 Id.<br />
6 Id.
reported a 40% increase in software vulnerabilities reported by manufacturers<br />
and “white hat” hackers between 2005 and 2006. 7 Nearly all of those<br />
vulnerabilities could be exploited remotely, and over half allowed attackers to<br />
gain full access to the machine and its contents.<br />
As the supply of troubles has increased, the capacity to address them has<br />
steadily diminished. Patch development time increased throughout 2006 for all<br />
of the top operating system providers. 8 Times shortened modestly across the<br />
board in the first half of 2007, but, on average, enterprise vendors were still<br />
exposed to vulnerabilities for 55 days—plenty of time for hazardous code to<br />
make itself felt. 9 (The patch intervals for browsers tend to be shorter than<br />
those for operating systems.) What is more, antivirus researchers and firms<br />
require extensive coordination efforts simply to agree on a naming scheme for<br />
viruses as they emerge. 10 This is a far cry from a common strategy for battling<br />
them.<br />
In addition, the idea of casually cleaning a virus off a PC is gone. When<br />
computers are compromised, users are now typically advised to reinstall<br />
everything on them. For example, in 2007, some PCs at the U.S. National<br />
Defense University fell victim to a virus. The institution shut down its network<br />
servers for two weeks and distributed new laptops to instructors. 11 In the<br />
absence of such drastic measures, a truly “mal” piece of malware could be<br />
programmed to, say, erase hard drives, transpose numbers inside spreadsheets<br />
randomly, or intersperse nonsense text at arbitrary intervals in Word documents<br />
found on infected computers—and nothing would stand in the way.<br />
Recognition of these basic security problems has been slowly growing in<br />
Internet research communities. Nearly two-thirds of academics, social analysts,<br />
and industry leaders surveyed by the Pew Internet & American Life Project in<br />
2004 predicted serious attacks on network infrastructure or the power grid in<br />
7 IBM INTERNET SECURITY SYSTEMS, X-FORCE 2006 TREND STATISTICS (Jan. 2007),<br />
http://www.iss.net/documents/whitepapers/X_Force_Exec_Brief.pdf.<br />
8 SYMANTEC, GLOBAL INTERNET SECURITY THREAT REPORT, TRENDS FOR JULY-DECEMBER<br />
2007 at 24-28 (April 2008),<br />
http://eval.symantec.com/mktginfo/enterprise/white_papers/bwhitepaper_internet_security_threat_report_xiii_04-2008.en-us.pdf.<br />
9 Id. at 6.<br />
10 See, e.g., Common Malware Enumeration: Reducing Public Confusion During Malware<br />
Outbreaks, http://cme.mitre.org/ (last visited June 1, 2007).<br />
11 Bill Gertz & Rowan Scarborough, Inside the Ring—Notes from the Pentagon, WASH. TIMES, Jan. 5,<br />
2007, at A5, available at http://www.gertzfile.com/gertzfile/ring011207.html.
the coming decade. 12 Security concerns will lead to a fundamental shift in our<br />
tolerance of the status quo, either by a cat<strong>as</strong>trophic episode, or, more likely, a<br />
glacial death of a thousand cuts.<br />
Consider, in the latter scenario, the burgeoning realm of “badware” (or<br />
“malware”) beyond viruses and worms: Software that is often installed at the<br />
user’s invitation. The popular file-sharing program KaZaA, though advertised<br />
as “spyware-free,” contains code that users likely do not want. It adds icons to<br />
the desktop, modifies Microsoft Internet Explorer, and installs a program that<br />
cannot be closed by clicking “Quit.” Uninstalling the program does not<br />
uninstall all these extr<strong>as</strong>, and the average user does not know how to get rid of<br />
the code itself. What makes such badware “bad” has to do with the level of<br />
disclosure made to a consumer before he or she installs it. The most common<br />
responses to the security problem cannot e<strong>as</strong>ily address this gray zone of<br />
software.<br />
Many technologically savvy people think that bad code is simply a Microsoft<br />
Windows issue. They believe that the Windows OS and the Internet Explorer<br />
browser are particularly poorly designed, and that “better” counterparts<br />
(GNU/Linux and Mac OS, or the Firefox and Opera browsers) can help shield<br />
a user. But the added protection does not get to the fundamental problem,<br />
which is that the point of a PC—regardless of its OS—is to enable its users to<br />
easily reconfigure it to run new software from anywhere. When users make<br />
poor decisions about what software to run, the results can be dev<strong>as</strong>tating to<br />
their machines and, if they are connected to the Internet, to countless others’<br />
machines <strong>as</strong> well.<br />
The cybersecurity problem defies easy solution because any of its most obvious<br />
fixes will undermine the generative essence of the Internet and PC. Bad code is<br />
an inevitable side effect of generativity, and as PC users are increasingly<br />
victimized by bad code, consumers are likely to reject generative PCs in favor of<br />
safe information appliances—digital video recorders, mobile phones, iPods,<br />
BlackBerrys, and video game consoles—that optimize a particular application<br />
and cannot be modified by users or third parties. It is entirely reasonable for<br />
consumers to factor security and stability into their choice. But it is an<br />
undesirable choice to have to make.<br />
* * *<br />
On January 9, 2007, Steve Jobs introduced the iPhone to an eager audience<br />
crammed into San Francisco’s Moscone Center. A beautiful and brilliantly<br />
12 Susannah Fox et al., The Future of the Internet: In a Survey, Technology Experts and Scholars Evaluate<br />
Where the Network Is Headed in the <strong>Next</strong> Few Years, Jan. 9, 2005, at i,<br />
http://www.pewinternet.org/PPF/r/145/report_display.<strong>as</strong>p.
engineered device, the iPhone blended three products into one: an iPod, with<br />
the highest-quality screen Apple had ever produced; a phone, with cleverly<br />
integrated functionality, such as voicemail that came wrapped as separately<br />
accessible messages; and a device to access the Internet, with a smart and<br />
elegant browser, and built-in map, weather, stock, and e-mail capabilities.<br />
This was Steve Jobs’s second revolution. Thirty years earlier, at the First West<br />
Coast Computer Faire in nearly the same spot, the twenty-one-year-old Jobs,<br />
wearing his first suit, exhibited the Apple II personal computer to great buzz<br />
amidst “ten thousand walking, talking computer freaks.” 13 The Apple II was a<br />
machine for hobbyists who did not want to fuss with soldering irons: all the<br />
ingredients for a functioning PC were provided in a convenient molded plastic<br />
case. Instead of puzzling over bits of hardware or typing up punch cards to<br />
feed into someone else’s mainframe, Apple owners faced only the hurdle of a<br />
cryptic blinking cursor in the upper left corner of the screen: the PC awaited<br />
instructions. But the hurdle w<strong>as</strong> not high. Some owners were inspired to<br />
program the machines themselves, but beginners, too, could load software<br />
written and then shared or sold by their more skilled counterparts. The Apple<br />
II was a blank slate, a bold departure from previous technology that had been<br />
developed and marketed to perform specific tasks.<br />
The Apple II quickly became popular. And when programmer and<br />
entrepreneur Dan Bricklin introduced the first killer application for the Apple II<br />
in 1979—VisiCalc, the world’s first spreadsheet program—sales of the ungainly<br />
but very cool machine took off. An Apple running VisiCalc helped to convince<br />
a skeptical world that there was a place for the PC on everyone’s desk.<br />
The Apple II was quintessentially generative technology. It was a platform. It<br />
invited people to tinker with it. Hobbyists wrote programs. Businesses began<br />
to plan on selling software. Jobs (and Apple) had no clue how the machine<br />
would be used. They had their hunches, but, fortunately for them (and the rest<br />
of us), nothing constrained the PC to the hunches of the founders.<br />
The iPhone—for all its startling inventiveness—is precisely the opposite.<br />
Rather than a platform that invites innovation, the iPhone comes<br />
preprogrammed. In its first version, you were not allowed to add programs to<br />
the all-in-one device that Steve Jobs sells you except via the Siberia of its Web<br />
13 David H. Ahl, The First West Coast Computer Faire, in 3 THE BEST OF CREATIVE COMPUTING<br />
98 (David Ahl & Burchenal Green eds., 1980), available at<br />
http://www.atariarchives.org/bcc3/showpage.php?page_98.
browser. Its functionality was locked in, though Apple could change it through<br />
remote updates. Indeed, those who managed to tinker with the code and enable<br />
iPhone support for more or different applications were on the receiving end of<br />
Apple’s threat to transform the iPhone into an iBrick. 14 A threat, to be sure,<br />
that Apple later at least partially disavowed. The machine was not to be<br />
generative beyond the innovations that Apple (and its exclusive carrier, AT&T)<br />
wanted. In its second version a year later, the iPhone boasted the App Store.<br />
Software developers could code for the phone—but the developers, and then<br />
each piece of software, would require approval from Apple before it could be<br />
made available to iPhone users. Apple would receive a 30% cut of sales,<br />
including “in-app” sales of upgrades, and an app could be banned retroactively<br />
after initial approval. This made the iPhone “contingently generative,” a hybrid<br />
status that, depending on how you look at it, is either the best or the worst of<br />
both worlds: a melding of the sterile and the generative.<br />
Jobs was not shy about these restrictions. As he said at the iPhone launch: “We<br />
define everything that is on the phone …. You don’t want your phone to be<br />
like a PC. The last thing you want is to have loaded three apps on your phone<br />
and then you go to make a call and it doesn’t work anymore.” 15<br />
In the arc from the Apple II to the iPhone, we learn something important about<br />
where the Internet has been, and something even more important about where<br />
it is going. The PC revolution was launched with PCs that invited innovation<br />
by others. So, too, with the Internet. Both were designed to accept any<br />
contribution that followed a basic set of rules (either coded for a particular<br />
operating system, or respecting the protocols of the Internet). Both<br />
overwhelmed their respective proprietary, non-generative competitors: PCs<br />
crushed stand-alone word processors and the Internet displaced such<br />
proprietary online services as CompuServe and AOL.<br />
But the future is looking very different because of the security situation—not<br />
generative PCs attached to a generative network, but appliances tethered to a<br />
network of control. These appliances take the innovations already created by<br />
Internet users and package them neatly and compellingly, which is good—but<br />
only if the Internet and PC can remain sufficiently central in the digital<br />
ecosystem to compete with locked-down appliances and facilitate the next<br />
round of innovations. The balance between the two spheres is precarious, and<br />
it is slipping toward the safer appliance. For example, Microsoft’s Xbox 360<br />
14 Michael, Apple Says It May “Brick” Unlocked iPhones With <strong>Next</strong> Software Update, Apple Gazette,<br />
Sep. 24, 2007, http://www.applegazette.com/iphone/apple-says-it-may-brick-unlocked-iphones-with-next-software-update/.<br />
15 See John Markoff, Steve Jobs Walks the Tightrope Again, N.Y. TIMES, Jan. 12, 2007, available at<br />
http://www.nytimes.com/2007/01/12/technology/12apple.html.
video game console is a powerful computer, but, unlike Microsoft’s Windows<br />
operating system for PCs, it does not allow just anyone to write software that<br />
can run on it—games must be licensed by Microsoft. Bill Gates sees the Xbox<br />
at the center of the future digital ecosystem, rather than its periphery: “It is a<br />
general purpose computer . . . [W]e wouldn’t have done it if it w<strong>as</strong> just a<br />
gaming device. We wouldn’t have gotten into the category at all. It w<strong>as</strong> about<br />
strategically being in the living room.” 16<br />
Devices like iPhones and Xbox 360s may be safer to use, and they may seem<br />
capacious in features so long as they offer a simple Web browser. But by<br />
focusing on security and limiting the damage that users can do through their<br />
own ignorance or carelessness, these appliances also limit the beneficial tools<br />
that users can create or receive from others—enhancements they may be<br />
clueless about when they are purch<strong>as</strong>ing the device.<br />
Security problems related to generative PC platforms may propel people away<br />
from PCs and toward information appliances controlled by their makers. If we<br />
eliminate the PC from many dens or living rooms, we eliminate the test bed and<br />
distribution point of new, useful software from any corner of the globe. We<br />
also eliminate the safety valve that keeps those information appliances honest.<br />
If TiVo makes a digital video recorder that has too many limits on what people<br />
can do with the video they record, people will discover DVR software like<br />
MythTV that records and plays TV shows on their PCs. If mobile phones are<br />
too expensive, people will use Skype. But people do not buy PCs <strong>as</strong> insurance<br />
policies against appliances that limit their freedoms, even though PCs serve<br />
exactly this vital function. People buy them to perform certain t<strong>as</strong>ks at the<br />
moment of acquisition. If PCs cannot reliably perform these t<strong>as</strong>ks, most<br />
consumers will not see their merit, and the safety valve will be lost. If the PC<br />
ce<strong>as</strong>es to be at the center of the information technology ecosystem, the most<br />
restrictive <strong>as</strong>pects of information appliances will come to the fore.<br />
In fact, the dangers may be more subtly packaged. PCs need not entirely disappear as people buy information appliances in their stead. PCs can themselves be made less generative. Users tired of making the wrong choices about installing code on their PCs might choose to let someone else decide what code should be run. Firewalls can protect against some bad code, but they also complicate the installation of new good code. As antivirus, antispyware,

16 Ryan Block, A Lunchtime Chat with Bill Gates, ENGADGET, Jan. 8, 2007, http://www.engadget.com/2007/01/08/a-lunchtime-chat-with-bill-gates-at-ces/.
100 CHAPTER 2: IS THE GENERATIVE INTERNET AT RISK?<br />
and anti-badware barriers proliferate, there are new barriers to the deployment of new good code from unprivileged sources. And in order to guarantee effectiveness, these barriers are becoming increasingly paternalistic, refusing to allow users easily to overrule them. Especially in environments where the user of the PC does not own it—offices, schools, libraries, and cyber-cafés—barriers are being put in place to prevent the running of any code not specifically approved by the relevant gatekeeper. Users may find themselves limited to using a Web browser. And while “Web 2.0” promises many more uses for a browser—consumers can now write papers and use spreadsheets through a browser, and software developers now write for Web platforms like Facebook instead of PC operating systems—these Web platforms are themselves tethered to their makers, their generativity contingent on the continued permission of the platform vendors.
Short of completely banning unfamiliar software, code might be divided into first- and second-class status, with second-class, unapproved software allowed to perform only certain minimal tasks on the machine, operating within a digital sandbox. This technical solution is safer than the status quo but imposes serious limits. It places the operating system creator or installer in the position of deciding what software will and will not run. The PC will itself have become an information appliance, not easily reconfigured or extended by its users.
The key to avoiding such a future is to give the market a reason not to abandon or lock down the PCs that have served it so well, while also giving most governments reason to refrain from major intervention into Internet architecture in the name of public safety. The solutions to the generative dilemma will rest on social and legal as much as technical innovation, and the best guideposts can be found in other generative successes in those arenas. Mitigating abuses of openness without resorting to lockdown will depend on a community ethos embodied in responsible groups with shared norms and a sense of public purpose, rather than in the hands of a single gatekeeper, whether public or private.
In the medium term, the battle between generative and sterile will be played out between the iPhone and Android, which, despite its own version of an App Store, also allows outside code that doesn’t come from the store to run; and with projects like Boxee and Google TV, which are seeking to bridge the gap between the PC and the living room. Each device sets the dial at a different point between completely “open” and completely “closed.” And those dials can shift: after a security “spill,” Android could be reprogrammed overnight to be more restrictive in the code it runs; and by the same token, Apple could decide to loosen its restrictions on iPhone code.

* * *
We need a strategy that addresses the emerging security troubles of today’s Internet and PCs without killing their openness to innovation. This is easier said than done, because our familiar legal tools are not particularly attuned to maintaining generativity. A simple regulatory intervention—say, banning the creation or distribution of deceptive or harmful code—will not work, because it is hard to track the identities of sophisticated wrongdoers, and, even if found, many may not be in cooperative jurisdictions. Moreover, such intervention may have a badly chilling effect: Much of the good code we have seen has come from unaccredited people sharing what they have made for fun, collaborating in ways that would make business-like regulation of their activities burdensome for them. They might be dissuaded from sharing at all.
We can find a balance between needed change and undue restriction if we think about how to move generative approaches and solutions that work at one “layer” of the Internet—content, code, or technical—to another. Consider Wikipedia, the free encyclopedia whose content—the entries and their modifications—is fully generated by the Web community. The origins of Wikipedia lie in the open architecture of the Internet and Web. This allowed Ward Cunningham to invent the wiki: generic software that offers a way of editing or organizing information within an article, and spreading this information to other articles. Unrelated non-techies then used wikis to form Web sites at the content layer, including Wikipedia. People are free not only to edit Wikipedia, but to take all of its contents and experiment with different ways of presenting or changing the material, perhaps by placing the information on otherwise unrelated Web sites in different formats. When abuses of this openness beset Wikipedia with vandalism, copyright infringement, and lies, it turned to its community—aided by some important technical tools—as the primary line of defense, rather than copyright or defamation law. Most recently, this effort has been aided by the introduction of Virgil Griffith’s Wikiscanner, a simple tool that uses Wikipedia’s page histories to expose past instances of article whitewashing by interested parties.
Unlike a form of direct regulation that would have locked down the site, the Wikipedian response so far appears to have held many of Wikipedia’s problems at bay. Why does it work so well? Generative solutions at the content layer seem to have two characteristics that suggest broad approaches to lowering the risks of the generative Internet while preserving its openness. First, much participation in generating Web content—editing Wikipedia entries, blogging, or even engaging in transactions on eBay and Amazon that ask for reviews and ratings to establish reputations—is understood to be an innately social activity. These services solicit and depend upon participation from the public, and their participation mechanisms are easily mastered. The same possibility for broad participation exists one level down at the technical layer, but it has not yet been as fully exploited: Mainstream users have thus far been eager to have someone else solve underlying problems, which they perceive as technical rather than social. Second, many content-layer enterprises have developed technical tools to support collective participation, augmenting an individualistic ethos with community-facilitating structures. In the Internet and PC security space, on the other hand, there have been few tools available to tap the power of groups of users to, say, distinguish good code from bad.
The effectiveness of the social layer in Web successes points to two approaches that might save the generative spirit of the Net, or at least keep it alive for another interval. The first is to reconfigure and strengthen the Net’s experimentalist architecture to make it fit better with the vast expansion in the number and types of users. The second is to develop new tools and practices that will enable relevant people and institutions to help secure the Net themselves instead of waiting for someone else to do it.
Generative PCs with Easy Reversion
Wikis are designed so that anyone can edit them. This creates a genuine and ongoing risk of bad edits, through either incompetence or malice. The damage that can be done, however, is minimized by the wiki technology, because it allows bad changes to be quickly reverted. All previous versions of a page are kept, and a few clicks by another user can restore a page to the way it was before later changes were made. So long as there are more users (and automated tools they create) detecting and reverting vandalism than there are users vandalizing, the community wins. (Truly, the price of freedom is eternal vigilance.)
Our PCs can be similarly equipped. For years Windows XP (and now Vista) has had a system restore feature, where snapshots are taken of the machine at a moment in time, allowing later bad changes to be rolled back. The process of restoring is tedious, restoration choices can be frustratingly all-or-nothing, and the system restoration files themselves can become corrupted, but it represents progress. Even better would be the introduction of features that are commonplace on wikis: a quick chart of the history of each document, with an ability to see date-stamped sets of changes going back to its creation. Because our standard PC applications assume a safer environment than really exists, these features have never been demanded or implemented. Because wikis are deployed in environments prone to vandalism, their contents are designed to be easily recovered after a problem.
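The revert mechanism described here can be sketched in a few lines of code. This is a purely illustrative toy, not any real wiki engine’s implementation: every revision is kept, so undoing a bad edit simply means re-adding an old snapshot.

```python
from datetime import datetime, timezone

class VersionedDocument:
    """Toy model of wiki-style history: all revisions are retained,
    so reverting vandalism is cheap and itself leaves a record."""

    def __init__(self):
        self.history = []  # list of (timestamp, text), oldest first

    def edit(self, text):
        self.history.append((datetime.now(timezone.utc), text))

    def current(self):
        return self.history[-1][1]

    def revert(self, steps_back=1):
        # Restore an earlier snapshot by appending it as a new revision,
        # so the revert stays visible in the audit trail.
        self.edit(self.history[-(steps_back + 1)][1])

doc = VersionedDocument()
doc.edit("Paris is the capital of France.")
doc.edit("Paris is the capital of SPAM.")  # vandalism
doc.revert()                               # a few clicks later...
print(doc.current())  # → Paris is the capital of France.
```

The design choice worth noticing is that reversion is an append, not a deletion: nothing is ever lost, which is exactly what makes “eternal vigilance” affordable.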
The next stage of this technology lies in new virtual machines, which would obviate the need for cyber cafés and corporate IT departments to lock down their PCs.
In an effort to satisfy the desire for safety without full lockdown, PCs can be designed to pretend to be more than one machine, capable of cycling from one personality to the next. In its simplest implementation, we could divide a PC into two virtual machines: “Red” and “Green.” The Green PC would house reliable software and important data—a stable, mature OS platform and tax returns, term papers, and business documents. The Red PC would have everything else. In this setup, nothing that happens on one PC can easily affect the other, and the Red PC could have a simple reset button that restores a predetermined safe state. Someone could confidently store important data on the Green PC and still use the Red PC for experimentation. This isn’t rocket science – there is already software that amounts to a Green/Red divide on a Windows machine – but it’s not so easy for the average user to deploy and use.
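The Red/Green split can be illustrated with a minimal sketch. The class and method names below are invented for illustration; they stand in for what a real hypervisor or snapshotting tool would provide:

```python
class VirtualPC:
    """Illustrative stand-in for one virtual machine holding a saved
    baseline snapshot and a one-click reset back to it."""

    def __init__(self, name, baseline):
        self.name = name
        self.baseline = dict(baseline)  # the predetermined safe state
        self.state = dict(baseline)

    def install(self, program):
        self.state[program] = "installed"

    def reset(self):
        # The "simple reset button": discard everything since the snapshot.
        self.state = dict(self.baseline)

green = VirtualPC("Green", {"os": "stable", "tax_returns.pdf": "stored"})
red = VirtualPC("Red", {"os": "stable"})

red.install("untrusted-screensaver")  # experiment freely on Red...
red.reset()                           # ...and wipe the slate if it misbehaves

assert "untrusted-screensaver" not in red.state
assert green.state["tax_returns.pdf"] == "stored"  # Green is never touched
```

The point of the sketch is the isolation: the two machines share no mutable state, so nothing the Red PC runs can reach the Green PC’s data.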
Easy, wiki-style reversion, coupled with virtual PCs, would accommodate the experimentalist spirit of the early Internet while acknowledging the important uses for those PCs that we do not want to disrupt. Still, this is not a complete solution. The Red PC, despite its experimental purpose, might end up accumulating data that the user wants to keep, occasioning the need for what Internet architect David D. Clark calls a “checkpoint Charlie” to move sensitive data from Red to Green without also carrying a virus or anything else undesirable. There is also the question of what software can be deemed safe for Green—which is just another version of the question of what software to run on today’s single-identity PCs.
For these and related reasons, virtual machines will not be panaceas, but they might buy us some more time. And they implement a guiding principle from the Net’s history: an experimentalist spirit is best maintained when failures can be contained as learning experiences rather than expanding to catastrophes.

A Generative Solution to Bad Code
The Internet’s original design relied on few mechanisms of central control. This lack of control has the generative benefit of allowing new services to be introduced, and new destinations to come online, without any up-front vetting or blocking by either private incumbents or public authorities. With this absence of central control comes an absence of measurement. The Internet itself cannot say how many users it has, because it does not maintain user information. There is no awareness at the network level of how much bandwidth is being used by whom. From a generative point of view this is good, because it allows initially whimsical but data-intensive uses of the network to thrive (remember goldfish cams?)—and perhaps to become vital (now-routine videoconferencing through Skype, from, unsettlingly, the makers of KaZaA).
But limited measurement is starting to have generative drawbacks. Because we cannot easily measure the network and the character of the activity on it, we cannot easily assess and deal with threats from bad code without laborious and imperfect cooperation among a limited group of security software vendors. The future of the generative Net depends on a wider circle of users able to grasp the basics of what is going on within their machines and between their machines and the network.
What might this system look like? Roughly, it would take the form of toolkits to overcome the digital solipsism that each of our PCs experiences when it attaches to the Internet at large, unaware of the size and dimension of the network to which it connects. These toolkits would run unobtrusively on the PCs of participating users, reporting back—to a central source, or perhaps only to each other—information about the vital signs and running code of that PC, which could help other PCs determine the level of risk posed by new code. When someone is deciding whether to run new software, the toolkit’s connections to other machines could tell the person how many other machines on the Internet are running the code, what proportion of machines belonging to self-described experts are running it, whether those experts have vouched for it, and how long the code has been in the wild.
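The group signal such a toolkit might compute can be sketched as follows. The report format and field names are assumptions made up for illustration, not part of any deployed system:

```python
def risk_report(code_hash, peer_reports):
    """Summarize what participating peers report about a piece of code.
    Each report is a dict: {"running": set of code hashes, "expert": bool}."""
    total = len(peer_reports)
    running = [r for r in peer_reports if code_hash in r["running"]]
    experts = [r for r in peer_reports if r["expert"]]
    expert_running = [r for r in running if r["expert"]]
    return {
        # How widely adopted is this code across all participants?
        "adoption": len(running) / total if total else 0.0,
        # What share of self-described experts run it?
        "expert_adoption": len(expert_running) / len(experts) if experts else 0.0,
    }

peers = [
    {"running": {"abc123"}, "expert": True},
    {"running": {"abc123"}, "expert": False},
    {"running": set(), "expert": True},
    {"running": set(), "expert": False},
]
report = risk_report("abc123", peers)
print(report)  # → {'adoption': 0.5, 'expert_adoption': 0.5}
```

A user deciding whether to run unfamiliar code would see these aggregates rather than raw peer data, turning thousands of individual machines into one rough collective judgment.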
Building on these ideas about measurement and code assessment, Harvard University’s Berkman Center and the Oxford Internet Institute—multidisciplinary academic enterprises dedicated to charting the future of the Net and improving it—have begun a project called StopBadware (www.stopbadware.org), designed to assist rank-and-file Internet users in identifying and avoiding bad code. The idea is not to replicate the work of security vendors like Symantec and McAfee, which, for a fee, seek to bail new viruses out of our PCs faster than they pour in. Rather, these academic groups are developing a common technical and institutional framework that enables users to devote some bandwidth and processing power for better measurement of the effect of new code. A first step in the toolkit was developed as “Herdict PC,” a small piece of software that assembled vital signs like the number of pop-up windows or crashes per hour and incorporated that data into a dashboard usable by mainstream PC owners. Efforts like Herdict – including such ventures as Soluto (www.soluto.com) – will test the idea that solutions that have worked for generating content might also be applicable to the technical layer. Such a system might also illuminate Internet filtering by governments around the world, as people participate in a system where they can report when they cannot access a Web site, and such reports can be collated by geography.
A full adoption of the lessons of Wikipedia would give PC users the opportunity to have some ownership, some shared stake, in the process of evaluating code, especially because they have a stake in getting it right for their own machines. Sharing useful data from their PCs is one step, but this may work best when the data goes to an entity committed to the public interest of solving PC security problems and willing to share that data with others. The notion of a civic institution here does not necessarily mean cumbersome governance structures and formal lines of authority so much as it means a sense of shared responsibility and participation. Think of the volunteer fire department or neighborhood watch: While not everyone is able to fight fires or is interested in watching, a critical mass of people are prepared to contribute, and such contributions are known to the community more broadly.
The success of tools drawing on group generativity depends on participation, which helps establish the legitimacy of the project both to those participating and those not. Internet users might see themselves only as consumers whose purchasing decisions add up to a market force, but, with the right tools, users can also see themselves as participants in the shaping of generative space—as netizens.
Along with netizens, hardware and software makers could also get involved. OS makers could be asked or required to provide basic tools of transparency that empower users to understand exactly what their machines are doing. These need not be as sophisticated as Herdict. They could provide basic information on what data is going in and out of the box and to whom. Insisting on getting better information to users could be as important as providing a speedometer or fuel gauge on an automobile—even if users do not think they need one.
Internet Service Providers (ISPs) can also reasonably be asked or required to help. Thus far, ISPs have been on the sidelines regarding network security. The justification is that the Internet was rightly designed to be a dumb network, with most of its features and complications pushed to the endpoints. The Internet’s engineers embraced the simplicity of the end-to-end principle for good reasons. It makes the network more flexible, and it puts designers in a mindset of making the system work rather than designing against every possible thing that could go wrong. Since this early architectural decision, “keep the Internet free” advocates have advanced the notion of end-to-end neutrality as an ethical ideal, one that leaves the Internet without filtering by any of its intermediaries, routing packets of information between sender and recipient without anyone looking along the way to see what they contain. Cyberlaw scholars have taken up end-to-end as a battle cry for Internet freedom, invoking it to buttress arguments about the ideological impropriety of filtering Internet traffic or favoring some types or sources of traffic over others.
End-to-end neutrality has indeed been a crucial touchstone for Internet development. But it has limits. End-to-end design preserves users’ freedom only because the users can configure their own machines however they like. But this depends on the increasingly unreliable presumption that whoever runs a machine at a given network endpoint can readily choose how the machine should work. Consider that in response to a network teeming with viruses and spam, network engineers recommend more bandwidth (so the transmission of “deadweights” like viruses and spam does not slow down the much smaller proportion of legitimate mail being carried by the network) and better protection at user endpoints. But users are not well positioned to painstakingly maintain their machines against attack, and intentional inaction at the network level may be self-defeating, because consumers may demand locked-down endpoint environments that promise security and stability with minimum user upkeep.
Strict loyalty to end-to-end neutrality should give way to a new principle asking that any modifications to the Internet’s design or the behavior of ISPs be made in such a way that they will do the least harm to generative possibilities. Thus, it may be preferable in the medium term to screen out viruses through ISP-operated network gateways rather than through constantly updated PCs. To be sure, such network screening theoretically opens the door to undesirable filtering. But we need to balance this speculative risk against the growing threat to generativity. ISPs are in a good position to help in a way that falls short of undesirable perfect enforcement facilitated through endpoint lockdown, by providing a stopgap while we develop the kinds of community-based tools that can promote salutary endpoint screening.
Even search engines can help create a community process that has impact. In 2006, in cooperation with the Harvard and Oxford StopBadware initiative, Google began automatically identifying Web sites that had malicious code hidden in them, ready to infect browsers. Some of these sites were set up for the purpose of spreading viruses, but many more were otherwise-legitimate Web sites that had been hacked. For example, visitors to chuckroast.com can browse fleece jackets and other offerings and place and pay for orders. However, Google found that hackers had subtly changed the chuckroast.com code: The basic functionalities were untouched, but code injected on the home page would infect many visitors’ browsers. Google tagged the problem, and appended to the Google search result: “Warning: This site may harm your computer.” Those who clicked on the results link anyway would get an additional warning from Google and the suggestion to visit StopBadware or pick another page.
The site’s traffic plummeted, and the owner (along with the thousands of others whose sites were listed) was understandably anxious to fix it. But cleaning a hacked site takes more than an amateur Web designer. Requests for specialist review inundated StopBadware researchers. Until StopBadware could check each site and verify it had been cleaned of bad code, the warning pages stayed up. Prior to the Google/StopBadware project, no one took responsibility for this kind of security. Ad hoc alerts to the hacked sites’ webmasters—and their ISPs—garnered little reaction. The sites were fulfilling their intended purposes even as they were spreading viruses to visitors. With Google/StopBadware, Web site owners have experienced a major shift in incentives for keeping their sites clean.
The result is perhaps more powerful than a law that would have directly regulated them, and it could in turn generate a market for firms that help validate, clean, and secure Web sites. Still, the justice of Google/StopBadware and similar efforts remains rough, and market forces alone might not direct the desirable level of attention to those wrongly labeled as people or Web sites to be avoided, or properly labeled but with no place to seek help.
The touchstone for judging such efforts is whether they reflect the generative principle: Do the solutions arise from and reinforce a system of experimentation? Are the users of the system able, so far as they are interested, to find out how the resources they control—such as a PC—are participating in the environment? Done well, these interventions can encourage even casual users to have some part in directing what their machines will do, while securing those users’ machines against outsiders who have not been given permission by the users to make use of them. Automatic accessibility by outsiders—whether by vendors, malware authors, or governments—can deprive a system of its generative character as its users are limited in their own control.
Data Portability

The generative Internet was founded and cultivated by people and institutions acting outside traditional markets, and later carried forward by commercial forces. Its success requires an ongoing blend of expertise and contribution from multiple models and motivations. Ultimately, a move by the law may also be in order: allocating responsibility to commercial technology players who are in a position to help but lack economic incentive to do so, and to those among us, commercially inclined or not, who step forward to solve the pressing problems that elude simpler solutions. How can the law be shaped if one wants to reconcile generative experimentation with other policy goals beyond continued technical stability? The next few proposals are focused on this question about the constructive role of law.
One important step is making locked-down appliances and Web 2.0 software-as-a-service more palatable. After all, they are here to stay, even if the PC and Internet are saved. The crucial issue here is that a move to tethered appliances and Web services means that more and more of our experiences in the information space will be contingent: A service or product we use at one moment could act completely differently the next, since it can be so quickly reprogrammed by the provider without our assent. Each time we power up a mobile phone, video game console, or BlackBerry, it might have gained some features and lost others. Each time we visit a Web site offering an ongoing service like e-mail access or photo storage, the same is true.
As various services and applications become more self-contained within particular devices, there is a minor intervention the law could make to avoid undue lock-in. Online consumer protection law has included attention to privacy policies. A Web site without a privacy policy, or one that does not live up to whatever policy it posts, is open to charges of unfair or deceptive trade practices. Similarly, makers of tethered appliances and Web sites keeping customer data ought to be asked to offer portability policies. These policies would declare whether users will be allowed to extract their data should they wish to move their activities from one appliance or Web site to another. In some cases, the law could create a right of data portability, in addition to merely insisting on a clear statement of a site’s policies.
A requirement of data portability is a generative insurance policy applying to individual data wherever it might be stored. And the requirement need not be onerous. It could apply only to uniquely provided personal data such as photos and documents, and mandate only that such data ought to be readily extractable by the user in some standardized form. Maintaining data portability will help people pass back and forth between the generative and the non-generative, and, by permitting third-party backup, it will also help prevent a situation in which a non-generative service suddenly goes offline, with no recourse for those who have used the service to store their data.
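What such a minimal export might look like can be sketched as follows. The record layout and field names are hypothetical, invented for illustration rather than drawn from any real service’s schema:

```python
import json

PORTABLE_FIELDS = {"photos", "documents"}  # uniquely provided personal data

def export_user_data(user_record):
    """Return only the user's own uploads, serialized in a standardized
    form (JSON here); service-internal data stays behind."""
    portable = {k: v for k, v in user_record.items() if k in PORTABLE_FIELDS}
    return json.dumps(portable, indent=2, sort_keys=True)

record = {
    "photos": ["beach.jpg", "graduation.jpg"],
    "documents": ["term_paper.txt"],
    "internal_ad_profile": "opaque-service-data",  # not the user's to take
}
print(export_user_data(record))
```

The narrow scope is the point: the mandate covers only what the user uniquely supplied, in a format any competing service or third-party backup tool can parse.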
Appliance Neutrality

Reasonable people disagree on the value of defining and legally mandating network neutrality. But if there is a present worldwide threat to neutrality in the movement of bits, it comes from enhancements to traditional and emerging “appliancized” services like Google mash-ups and Facebook apps, in which the service provider can be pressured to modify or kill others’ applications on the fly. Surprisingly, parties to the network neutrality debate—who have focused on ISPs—have yet to weigh in on this phenomenon.
In the late 1990s, Microsoft was found to possess a monopoly in the market for<br />
PC operating systems. 17 Indeed, it was found to be abusing that monopoly to<br />
favor its own applications—such as its Internet Explorer browser—over third-party<br />
software, against the wishes of PC makers who wanted to sell their<br />
hardware with Windows preinstalled but adjusted to suit the makers’ tastes.<br />
Microsoft was forced by the law to meet ongoing requirements to maintain a<br />
17 United States v. Microsoft Corp., 84 F. Supp. 2d 9, 19 (D.D.C. 1999).
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 109<br />
level playing field between third-party software and its own by allowing third-party<br />
software to be pre-installed on new Windows computers.<br />
We have not seen the same requirements arising for appliances that do not<br />
allow, or strictly control, the ability of third parties to contribute from the start.<br />
So long as the market’s favorite video game console maker never opens the<br />
door to generative third-party code, it is hard to see how the firm could be<br />
found to be violating competition law. A manufacturer is entitled to make an<br />
appliance and to try to bolt down its inner workings so that they cannot be<br />
modified by others. So when should we consider network neutrality-style<br />
mandates for appliancized systems? The answer lies in that subset of<br />
appliancized systems that seeks to gain the generative benefits of third-party<br />
contribution at one point in time while reserving the right to exclude it later.<br />
The common law recognizes vested expectations. For example, the law of<br />
adverse possession dictates that people who openly occupy another’s private<br />
property without the owner’s explicit objection (or, for that matter, permission)<br />
can, after a lengthy period of time, come to legitimately acquire it. More<br />
commonly, property law can find prescriptive easements—rights-of-way across<br />
territory that develop by force of habit—if the owner of the territory fails to<br />
object in a timely fashion as people go back and forth across it. These and<br />
related doctrines point to a deeply held norm: Certain consistent behaviors can<br />
give rise to obligations, sometimes despite fine print that tries to prevent those<br />
obligations from coming about.<br />
Applied to the idea of application neutrality, this norm of protecting settled<br />
expectations might suggest the following: If Microsoft wants to make the Xbox<br />
a general purpose device but still not open to third-party improvement, no<br />
regulation should prevent it. But if Microsoft does welcome third-party<br />
contribution, it should not be able to subsequently impose barriers to outside<br />
software continuing to work. Such behavior is a bait-and-switch that is not easy<br />
for the market to anticipate and that stands to allow a platform maker to exploit<br />
habits of generativity to reach a certain plateau, dominate the market, and then<br />
make the result proprietary—exactly what the Microsoft Web browser c<strong>as</strong>e<br />
rightly was brought to prevent.<br />
Generative Software<br />
At the code layer, it is not easy for the law to maintain neutrality between the<br />
two models of software production that have emerged with the Net: Proprietary
110 CHAPTER 2: IS THE GENERATIVE INTERNET AT RISK?<br />
software whose source code recipe is nearly always hidden, and free software—<br />
free not in terms of the price, but the openness of its code to public review and<br />
modification. The free software movement has produced some great works,<br />
but under prevailing copyright law even the slightest bit of “poison,” in the<br />
form of code from a proprietary source, could amount to legal liability for<br />
anyone who copies or even uses the software. These standards threaten the<br />
long-term flourishing of the free software movement: The risks are more<br />
burdensome than need be.<br />
But there are some changes to the law that would help. The kind of law that<br />
shields Wikipedia and Web site hosting companies from liability for<br />
unauthorized copyrighted material contributed by outsiders, at least so long as<br />
the organization acts expeditiously to remove infringing material once it is<br />
notified, ought to be extended to the production of code itself. Code that<br />
incorporates infringing material ought not be given a free pass, but those who<br />
have promulgated it without knowledge of the infringement would have a<br />
chance to repair the code or cease copying it before becoming liable.<br />
Modest changes in patent law could help <strong>as</strong> well. If those who see value in<br />
software patents are correct, infringement is rampant. And to those who think<br />
patents chill innovation, the present regime needs reform. To be sure, amateurs<br />
who do not have houses to lose to litigation can still contribute to free software<br />
projects—they are judgment proof. Others can contribute anonymously,<br />
evading any claims of patent infringement since they simply cannot be found.<br />
But this turns coding into a gray market activity, eliminating what otherwise<br />
could be a thriving middle class of contributing firms should patent warfare<br />
ratchet into high gear.<br />
The law can help level the playing field. For patent infringement in the United<br />
States, the statute of limitations is six years; for civil copyright infringement it is<br />
three. Unfortunately, this limit has little meaning for computer code because<br />
the statute of limitations starts from the time of the last infringement. Every<br />
time someone copies (or perhaps even runs) the code, the clock starts ticking<br />
again on a claim of infringement. This should be changed. The statute of<br />
limitations could be clarified for software, requiring that anyone who suspects<br />
or should suspect his or her work is being infringed sue within, for instance, one<br />
year of becoming aware of the suspect code. For example, the acts of those<br />
who contribute to free software projects—namely, releasing their code into a<br />
publicly accessible database like SourceForge—could be enough to start the<br />
clock ticking on that statute of limitations. In the absence of such a rule,<br />
lawyers who think their employers’ proprietary interests have been<br />
compromised can wait to sue until a given piece of code has become wildly<br />
popular—essentially sandbagging the process in order to let damages rack up.
Generative Licenses<br />
There is a parallel to how we think about balancing generative and sterile code<br />
at the content layer: Legal scholars Lawrence Lessig and Yochai Benkler, as well<br />
as others, have stressed that even the most rudimentary mixing of cultural icons<br />
and elements, including snippets of songs and video, can accrue thousands of<br />
dollars in legal liability for copyright infringement without harming the market<br />
for the original proprietary goods. 18 Benkler believes that the explosion of<br />
amateur creativity online has occurred despite this system. The high costs of<br />
copyright enforcement and the widespread availability of tools to produce and<br />
disseminate what he calls “creative cultural bricolage” currently allow for a<br />
variety of voices to be heard even when what they are saying is theoretically<br />
sanctionable by fines up to $30,000 per copy made, $150,000 if the infringement<br />
is done “willfully.” 19 As with code, the status quo shoehorns otherwise laudable<br />
activity into a sub-rosa gray zone.<br />
As tethered appliances begin to take up more of the information space, making<br />
information that much more regulable, we have to guard against the possibility<br />
that content produced by citizens who cannot easily clear permissions for all its<br />
ingredients will be squeezed out. Even the gray zone will constrict.<br />
* * *<br />
Regimes of legal liability can be helpful when there is a problem and no one has<br />
taken ownership of it. No one fully owns today’s problems of copyright<br />
infringement and defamation online, just as no one fully owns security problems<br />
on the Net. But the solution is not to conscript intermediaries to become the<br />
Net police.<br />
Under prevailing law, Wikipedia could get away with much less stringent<br />
monitoring of its articles for plagiarized work, and it could leave plainly<br />
defamatory material in an article but be shielded in the United States by the<br />
Communications Decency Act provision exempting those hosting material from<br />
responsibility for what others have provided. Yet Wikipedia polices itself<br />
according to an ethical code that encourages contributors to do the right thing<br />
rather than the required thing or the profitable thing.<br />
To harness Wikipedia’s ethical instinct across the layers of the generative<br />
Internet, we must figure out how to inspire people to act humanely in digital<br />
environments. This can be accomplished with tools—some discussed above,<br />
others yet to be invented. For the generative Internet to come fully into its<br />
18 LAWRENCE LESSIG, REMIX: MAKING ART AND COMMERCE THRIVE IN A HYBRID ECONOMY<br />
(2008); YOCHAI BENKLER, THE WEALTH OF NETWORKS (2006).<br />
19 YOCHAI BENKLER, THE WEALTH OF NETWORKS 275 (2006).
own, it must allow us to exploit the connections we have with each other. Such<br />
tools allow us to express and live our civic instincts online, trusting that the<br />
expression of our collective character will be one at least as good as that<br />
imposed by outside sovereigns—sovereigns who, after all, are only people<br />
themselves.<br />
Our generative technologies need technically skilled people of goodwill to keep<br />
them going, and the fledgling generative activities—blogging, wikis, social<br />
networks—need artistically and intellectually skilled people of goodwill to serve<br />
as true alternatives to a centralized, industrialized information economy that<br />
asks us to identify only as consumers of meaning rather than as makers of it.<br />
The deciding factor in whether our current infrastructure can endure will be the<br />
sum of the perceptions and actions of its users. Traditional state sovereigns,<br />
pan-state organizations, and formal multi-stakeholder regimes have roles to<br />
play. They can reinforce conditions necessary for generative blossoming, and<br />
they can also step in when mere generosity of spirit cannot resolve conflict. But<br />
that generosity of spirit is a society’s crucial first line of moderation.<br />
Our fortuitous starting point is a generative device on a neutral Net in tens of<br />
millions of hands. Against the trend of sterile devices and services that will<br />
replace the PC and Net stand new architectures like those of Boxee and<br />
Android. To maintain that openness, the users of those devices must<br />
experience the Net as something with which they identify and belong. We must<br />
use the generativity of the Net to engage a constituency that will protect and<br />
nurture it.
A Portrait of the Internet<br />
as a Young Man<br />
By Ann Bartow *<br />
Introduction<br />
The core theory of Jonathan Zittrain’s 2008 book The Future of the Internet—And<br />
How to Stop It is this: Good laws, norms, and code are needed to regulate the<br />
Internet, to prevent bad laws, norms, and code from compromising its creative<br />
capabilities and fettering its fecund flexibility. A far snarkier, if less alliterative,<br />
summary would be “We have to regulate the Internet to preserve its open,<br />
unregulated nature.”<br />
Zittrain uses brief, informal accounts of past events to build two main theories<br />
that dominate the book. First, he claims that open access, which he calls<br />
generativity, is under threat by a trend toward closure, which he refers to as<br />
tetheredness, which is counterproductively favored by proprietary entities.<br />
Though consumers prefer openness and the autonomy it confers, few take<br />
advantage of the opportunities it provides, and therefore undervalue it and too<br />
readily cede it in favor of the promise of security that tetheredness brings.<br />
Second, he argues that if the Internet is to find salvation it will be by the grace<br />
of “true netizens,” volunteers acting collectively in good faith to cultivate<br />
positive social norms online.<br />
One of the themes of the James Joyce novel first published in 1916, A Portrait of<br />
the Artist as a Young Man, 1 is the Irish quest for autonomous rule. Jonathan<br />
Zittrain’s The Future of the Internet—And How to Stop It is similarly infused with<br />
the author’s desire for principled, legitimate governance—only of the place<br />
called cyberspace, rather than the author’s meatspace homeland.<br />
Portrait’s protagonist, Stephen Dedalus, internally defines himself as an artist<br />
through a nonlinear process of experiences and epiphanies. He consciously<br />
decides that it should be his mission to provide a voice for his family, friends,<br />
and community through his writing. Though Dedalus opts out of the<br />
* Professor of Law, University of South Carolina School of Law. This essay was adapted<br />
from A Portrait of the Internet as a Young Man, 108 MICH. L. REV. 1079 (2010), available at<br />
http://www.michiganlawreview.org/articles/a-portrait-of-the-internet-as-a-young-man.<br />
The author dedicates this essay to her son Casey, and to the memory of C. Edwin<br />
Baker.<br />
1 JAMES JOYCE, A PORTRAIT OF THE ARTIST AS A YOUNG MAN (1916).
traditional forms of participation in society, he envisions his writing as a way to<br />
productively influence society. Jonathan Zittrain charts the development of the<br />
Internet as a nonlinear process wrought by both conscious hard work and<br />
sweeping serendipity. He also strives to provide a voice for technologically elite<br />
Internet users, and to influence the development of online culture. He paints a<br />
portrait of the future Internet as chock full of so many enigmas and puzzles that<br />
it will keep the cyberlaw professors busy for decades, even though according to<br />
Zittrain, law <strong>as</strong> traditionally conceptualized will not be important.<br />
In addition to invoking Joyce, I chose the title of this essay for its decisive<br />
invocation of maleness. Embedded within Zittrain’s theories of generativity,<br />
there is also a perplexing gender story, in which men are fertile, crediting<br />
themselves with helping to “birth” the field of cyberlaw, 2 and engaging in<br />
stereotypically domestic pursuits such as “baking” restrictions into gadgetry. 3<br />
Non-generative appliances are deemed “sterile” 4 by Zittrain, sterility being the<br />
conceptual opposite of generativity. His deployment of reproductive imagery is<br />
odd. A metaphor equating an author’s creative output to a child is often<br />
invoked in the context of copyright law by people arguing that authors should<br />
have extensive control over the works they create. 5 Zittrain’s variation<br />
characterizes controlled technological innovations as unable to produce progeny<br />
at all. The metaphor works better if tetheredness is instead envisaged as a form<br />
of birth control, preventing unwanted offspring only. Certainly the producers<br />
of closed devices or locked software are able to provide, and generally<br />
enthusiastic about providing, new and improved versions of their goods and<br />
services to paying customers.<br />
2 See, e.g., Lawrence Lessig, Amazon.com Customer Review of THE FUTURE OF THE<br />
INTERNET—AND HOW TO STOP IT, Cyberlaw 2.0,<br />
http://www.amazon.com/review/R131R71HS3YJVG/ref=cm_cr_rdp_perm (Dec. 4,<br />
2008) (“The field of cyberlaw, or the law of the Internet—a field I helped birth … has suffered<br />
because people like me have spent too much time cheerleading, and not enough time<br />
focusing the world on the real problems and threats that the Internet h<strong>as</strong> produced.”)<br />
(emphasis added); see also Daniel J. Solove, Privacy and Power: Computer Databases and Metaphors<br />
for Information Privacy, 53 STAN. L. REV. 1393, 1416–17 (2001) (noting that Roger Clarke is<br />
credited with coining the term “dataveillance”). Roger Clarke published suggestions for<br />
Internet regulations as early as 1988. See Roger A. Clarke, Information Technology and<br />
Dataveillance, 31 COMM. ACM 498, 508–11 (1988).<br />
3 JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET—AND HOW TO STOP IT 2 (2008)<br />
(“Jobs was not shy about these restrictions baked into the iPhone.”) [hereinafter ZITTRAIN,<br />
THE FUTURE OF THE INTERNET].<br />
4 See, e.g., id. at 2 (“The iPhone is the opposite. It is sterile.”), 73 (“Generative tools are not<br />
inherently better than their non-generative (‘sterile’) counterparts.”).<br />
5 See Malla Pollack, Towards a Feminist Theory of the Public Domain, or Rejecting the Gendered Scope of<br />
the United States Copyrightable and Patentable Subject Matter, 12 WM. & MARY J. WOMEN & L. 603,<br />
606–07 (2006); see William Patry, Gender and Copyright, THE PATRY COPYRIGHT BLOG, Jun. 20,<br />
2008, http://williampatry.blogspot.com/2008/06/gender-and-copyright.html.
Zittrain offers a well-executed collection of stories that are intended to anchor<br />
his global theories about how the Internet should optimally function, and how<br />
two classes of Internet users should behave: The technologies should be<br />
generative, but also monitored to ensure that generativity is not abused by either<br />
the government or by scoundrels; elite Internet users with, as one might say<br />
today, “mad programming skilz” should be the supervisors of the Internet,<br />
scrutinizing new technological developments and establishing and modeling<br />
productive social norms online; and average, non–technically proficient Internet<br />
users should follow these norms, and should not demand security measures that<br />
unduly burden generativity.<br />
The anecdotes are entertaining and educational, but they do not constructively<br />
cohere into an instruction manual on how to avoid a bad future for people<br />
whose interests may not be recognized or addressed by what is likely to be a<br />
very homogeneous group of elites manning (and I do mean man-ning, given the<br />
masculine dominance of the field) the virtual battlements they voluntarily design<br />
to defend against online forces of evil. And some of the conclusions Zittrain<br />
draws from his stories are questionable. So, I question them below.<br />
Generativity Versus Tetheredness<br />
Is a False Binary<br />
Pitting generativity against tetheredness creates a false binary that drives a lot of<br />
Zittrain’s theorizing. The book was published in May of 2008, but its origins<br />
can be found in his earlier legal scholarship and mainstream media writings. In<br />
2006, Jonathan Zittrain published an article entitled The Generative Internet. 6 In it,<br />
he asserted the following:<br />
Cyberlaw’s challenge ought to be to find ways of regulating—<br />
though not necessarily through direct state action—which code<br />
can and cannot be readily disseminated and run upon the<br />
generative grid of Internet and PCs, lest consumer sentiment<br />
and preexisting regulatory pressures prematurely and tragically<br />
terminate the grand experiment that is the Internet today. 7<br />
Like the article, the book is useful for provoking thought and discussion, and it<br />
teaches the reader many disparate facts about the evolution of a number of<br />
different technologies. But it does not provide much direction for activists,<br />
especially not those who favor using laws to promote order. Zittrain has come<br />
6 Jonathan L. Zittrain, The Generative Internet, 119 HARV. L. REV. 1974 (2006) [hereinafter<br />
Zittrain, The Generative Internet].<br />
7 Id. at 1979.
to bury cyberspace law as promulgated by governments, not to praise it.<br />
“Cyberlaw” as redefined by Zittrain is no longer the science of adapting existing<br />
real-space legal constructs to the online environment. Instead it is a collection<br />
of best practices chosen by people with the technological proficiency to impose<br />
them, top down, on the ignorant folks who are selfishly driven by their shallow<br />
consumer sentiments (viz., a desire for simplicity and security over openness and<br />
generativity).<br />
An abstract for the book, featured at its dedicated website, states:<br />
The Internet’s current trajectory is one of lost opportunity. Its<br />
salvation, Zittrain argues, lies in the hands of its millions of<br />
users. Drawing on generative technologies like Wikipedia that<br />
have so far survived their own successes, this book shows how<br />
to develop new technologies and social structures that allow<br />
users to work creatively and collaboratively, participate in<br />
solutions, and become true “netizens.” 8<br />
I will bluntly state (splitting an infinitive in the process) that I did not learn how<br />
to develop new technologies or new social structures from reading this book. It<br />
convinced me that new technologies and new social structures could contribute<br />
productively to the Internet if they develop appropriately, but Zittrain does not<br />
provide road maps or an instruction manual for developing them. He calls for<br />
“[c]ivic technologies [that] seek to integrate a respect for individual freedom and<br />
action with the power of cooperation,” but doesn’t paint a clear picture of<br />
which precise qualities these technologies or social structures would have,<br />
beyond cultivating generativity. 9<br />
Zittrain relentlessly informs the reader that generativity is a very good thing—<br />
except when it is abused by malefactors. But what, exactly, is generativity?<br />
Zittrain invokes the terms generative, non-generative, and generativity<br />
constantly throughout the book (over 500 times), but the definition of<br />
generative doesn’t remain constant. Sometimes it means creative or innovative,<br />
while other times it connotes openness, accessibility, or freedom. 10<br />
8 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3.<br />
9 Jonathan Zittrain, How to Get What We All Want, CATO UNBOUND, May 6, 2009,<br />
http://www.cato-unbound.org/2009/05/06/jonathan-zittrain/how-to-get-what-we-all-want/.<br />
10 Compare ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 84 (“Generative systems<br />
allow users at large to try their hands at implementing and distributing new uses, and to fill a<br />
crucial gap when innovation is undertaken only in a profit-making model …”), with id. at 113<br />
(“[T]he PC telephone program Skype is not amenable to third-party changes and is tethered<br />
to Skype for its updates. Skype’s distribution partner in China has agreed to censor words<br />
Zittrain had written previously that “Generativity denotes a technology’s overall<br />
capacity to produce unprompted change driven by large, varied, and<br />
uncoordinated audiences.” 11 Similarly, in the book he says, “Generativity is a<br />
system’s capacity to produce unanticipated change through unfiltered contributions from broad<br />
and varied audiences.” 12 He lists five elements of generativity:<br />
(1) how extensively a system or technology leverages a set of<br />
possible tasks; (2) how well it can be adapted to a range of<br />
tasks; (3) how easily new contributors can master it; (4) how<br />
accessible it is to those ready and able to build on it; and (5)<br />
how transferable any changes are to others—including (and<br />
perhaps especially) non-experts. 13<br />
Generative also seems to mean idiot-resistant. In his article The Generative<br />
Internet he explains that PCs are highly adaptable machines that are connected to<br />
a network with little centralized control, resulting in “a grid that is nearly<br />
completely open to the creation and rapid distribution of the innovations of<br />
technology-savvy users to a mass audience that can enjoy those innovations<br />
without having to know how they work.” 14 In the book, he makes the same<br />
point repeatedly—that most “mainstream” or “rank-and-file” computer users<br />
are either p<strong>as</strong>sive beneficiaries or victims of generativity, rather than generative<br />
actors. 15 There is a highly influential generative class of individuals who use<br />
generativity in socially productive ways. There is a nefarious group of<br />
reprobates who abuse generativity to create online havoc. And then there are<br />
the rest of the people online, sending and receiving emails, reading and writing<br />
blogs, participating on social-networking sites, renewing antivirus subscriptions,<br />
banking, shopping, and reading newspapers online. These users are blithely<br />
unaware of the generativity that provided this vast electronic bounty and<br />
complacently believe that, as long as they continue to pay an Internet service<br />
provider (“ISP”) for Internet access, its delivery will remain relatively smooth<br />
like ‘Falun Gong’ and ‘Dalai Lama’ in its text messaging for the Chinese version of the<br />
program. Other services that are not generative at the technical layer have been similarly<br />
modified …”).<br />
11 Zittrain, The Generative Internet, supra note 6, at 1980.<br />
12 JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET—AND HOW TO STOP IT, supra note 3,<br />
at 70 (emphasis in original).<br />
13 Id. at 71.<br />
14 Zittrain, The Generative Internet, supra note 6.<br />
15 See, e.g., id. at 3; see also ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 4, 8, 43,<br />
44–45, 51, 56, 59, 78, 100, 102, 130, 151–52, 155, 159–60, 198, 243, 245.<br />
and uninterrupted. When they call for more security for electronic devices, they<br />
themselves are the “damage” that generativity has to “route around.” 16<br />
The anti-generative concept of tetheredness also does some definitional shape-shifting<br />
throughout the tome. Sometimes it means unmodifiable, while other<br />
times it means controlled by proprietary entities, who may or may not facilitate,<br />
or even tolerate, alterations of their wares by end users. According to Zittrain,<br />
the dangers of tethers are twofold: Private companies can regulate how<br />
consumers use their products and services, and governments can use them to<br />
censor or spy on their citizens. 17<br />
Tethers can be good things if you are a mountain climber, or if you don’t want<br />
your horse to run off without you. And far more pertinently, tethers facilitate<br />
software updating for flaw-fixing and hole-patching purposes. Untethered<br />
software would require manual updates, a labor-intensive prospect that would<br />
require a degree of technical proficiency that many Internet users may lack.<br />
How many people are prepared to give up the advantages of tetheredness in the<br />
interest of preserving generativity is unclear. Without tethered appliances, the<br />
functionality of the Internet will be compromised. Try using a program that is<br />
no longer updated or supported by its vendor. Its obsolescence may render it<br />
untethered, but unless you have some pretty good programming chops, its<br />
usefulness will decline rapidly. Zittrain fears people will exchange generativity<br />
for security in binary f<strong>as</strong>hion, but the relationship between tetheredness and<br />
convenience needs to be taken into account, <strong>as</strong> these variables will also affect<br />
consumer preferences and behaviors.<br />
The fundamental security most people seek is probably operability. Any threat<br />
to serviceability, whether from too much generativity or too many tethers, will<br />
provoke a call for action from users. I couldn’t have accessed the downloadable<br />
version of Zittrain’s book without a host of tethered utilities, including my<br />
computer’s operating system, my Internet browser, and Adobe Acrobat, which<br />
all update automatically with great frequency, as I consented to allow them to<br />
do when I agreed to the terms of use laid out in the associated end user license<br />
agreements (“EULAs”). The same with my printer software, my antivirus<br />
program, my online media players, the online games I play, and every other<br />
Internet-related utility I use. In a sense, this proves Zittrain’s assertion that we<br />
have ceded control over the mechanisms of online interface to electronic leash-<br />
16 This is a sideways reference to the John Gilmore quote, “The Net interprets censorship <strong>as</strong><br />
damage and routes around it.” See Philip Elmer-DeWitt, First Nation in Cyberspace, TIME,<br />
Dec. 6, 1993, at 62, 64, available at<br />
http://www.time.com/time/magazine/article/0,9171,979768,00.html.<br />
17 See, e.g., ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 56–57, 113 (discussing<br />
Skype), 109–10, 113 (discussing OnStar), 113 (discussing China’s use of Google.cn), 210–14<br />
(discussing mobile phones).
wielding tyrants. But he may have the timing as well as the motivation wrong. I<br />
suspect most of us deferred to tethering commercial enterprises very early in the<br />
evolution of the mainstream Internet, rather than recently. Zittrain references<br />
pioneering ISPs CompuServe and AOL <strong>as</strong> proprietary services that were<br />
overwhelmed by the generativity of PCs and the Internet. 18 My initial<br />
nonacademic experiences with the Internet comprised waiting anxiously for<br />
CompuServe and then AOL to finish installing updates when I needed to check<br />
my e-mail, and I had to pay for my Internet time by the minute. Things only<br />
went downhill when AOL went to an “all you can eat” payment structure,<br />
providing unlimited Internet for a fixed monthly fee. Users surged but AOL’s<br />
capacity could not meet the demand. 19 Users didn’t want security, they wanted<br />
performance. Tetheredness, or something similar, may have been linked in<br />
some way to AOL’s difficulties meeting its customers’ demand, but overselling<br />
and insufficient server capacity were the true culprits in terms of inhibiting<br />
operability. In addition, if Zittrain is correct that CompuServe and AOL<br />
exemplify the evils of tethering, it’s pretty clear the market punished those<br />
entities pretty harshly without Internet governance-style interventions.<br />
Software and electronic devices can be simultaneously generative and tethered. And it is unfair to criticize people who quite reasonably rely on tetheredness to keep their computers and electronic equipment updated and fully functional. Many average Internet users might like more transparency about the nature and extent of the tethers that connect their computers to large multinational corporations, but short of actual laws that require relevant disclosures, this consumer desire is unlikely to be met. For them, generativity is unlikely to be helpful or enlightening, as Zittrain correctly notes, because they are not skilled enough to take advantage of it. In the absence of helpful laws, they are at the mercy of business models.
Generativity: The Good, the Bad & the Ugly
Zittrain’s stories are intended to show that generative technologies are better than tethered ones. But another strand of his narrative illustrates that generativity can be used destructively, which supports the contention that it cannot
18 The PC revolution was launched with PCs that invited innovation by others. So too with the Internet. Both were generative; they were designed to accept any contribution that followed a basic set of rules (either coded for a particular operating system, or respecting the protocols of the Internet). Both overwhelmed their respective proprietary, non-generative competitors, such as the makers of stand-alone word processors and proprietary online services like CompuServe and AOL. ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 23–25.
19 See, e.g., Timothy C. Barmann, Judge to rule this week on AOL service, CYBERTALK, Oct. 26, 1997, http://www.cybertalk.com/102697b.htm.
120 CHAPTER 2: IS THE GENERATIVE INTERNET AT RISK?
be unfettered. At its worst, he warns, generativity will enable bad actors to exploit tethers for nefarious purposes, while tethers will simultaneously restrain positive generative responses to these challenges. His accounts of degenerate generativity rest uneasily with his exhortation that facilitating generativity should be the guiding principle of Internet governance.
He also suggests deploying the “generative principle to determine whether and when it makes sense to violate the end-to-end principle” in the context of debates about network neutrality. 20 And the quantum of generativity promoted becomes the measure for assessing the legitimacy and effectiveness of what he characterizes as the intrusions of cyberlaw. He writes:
The touchstone for judging such efforts should be according to the generative principle: do the solutions encourage a system of experimentation? Are the users of the system able, so far as they are interested, to find out how the resources they control—such as a PC—are participating in the environment? 21
Fostering generativity thus becomes the Prime Directive of Internet governance. 22 But there are problems he raises elsewhere in the book that generativity may not address, or may in fact exacerbate. For example, Zittrain references OnStar a number of times, warning that because it is tethered and can be accessed remotely, it can be used by law enforcement for surveillance purposes. 23 Putting aside questions about whether OnStar is accurately described as part of the Internet, one wonders of what practical use OnStar would be to its clients if it weren’t tethered. OnStar seems to be a service that caters to people who want higher levels of proactive information and security when they are driving than the combination of a GPS unit and mobile phone can provide. OnStar customers don’t want generativity; they want someone to call the police and an ambulance or tow truck if they have an accident so they
20 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 185.
21 Id. at 173.
22 “The Prime Directive is a plot device cooked up by a patently optimistic TV writer (either Trek producer Gene L. Coon or writer Theodore Sturgeon, depending on who you ask) in the mid-1960s. It’s a freshman-year philosophy student’s reaction to the Cold War, when America and the Soviets were playing out their hostilities by proxy third-world conflicts. Effectively, they were interfering in the ‘development’ of underprivileged countries to further their own ends with some awful immediate and long-term results. In Roddenberry’s vision, humanity had evolved beyond such puppeteering and become an ‘advanced’ race.” See Jay Garmon, Why Star Trek’s Prime Directive Is Stupid, TECHREPUBLIC.COM, Feb. 12, 2007, http://blogs.techrepublic.com.com/geekend/?p=533.
23 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 109–10, 113, 117–18, 187.
don’t have to, or to track down the location of their vehicle if it is stolen. Security means more to them than privacy, and if they don’t consciously realize they are exchanging one for the other when they sign up with OnStar, it seems to me the best solution is to require OnStar to inform them of this trade-off in simple and unambiguous terms. The law could also require OnStar to provide further information, perhaps including a primer on the search and seizure jurisprudence of Fourth Amendment law. Making OnStar generative, so that private citizens can readily discern incursions by government actors, would not give OnStar customers any more of what they appear to want—a high level of security overtly linked to constant, dedicated supervision. Enhanced generativity might also provide opportunities for private spying or intentional service disruptions by the very villains Zittrain spills so much ink warning against.
Many of his examples of useful online-governance initiatives rely on extensive amounts of volunteer labor. But the important technological innovations related to the Internet were motivated by some form of self-interest. The U.S. Defense Department developed the Internet as a decentralized communications system that would be difficult to disrupt during wartime. 24 Tim Berners-Lee invented the World Wide Web as a way to facilitate communications with other physicists. 25 Pornographers have long used spam, browser hijacking, and search-engine manipulation to reach the eyeballs of potential customers. 26 All may have relied on generativity (though one might question how open and accessible the Defense Department was), but not all are socially beneficial. 27

Sometimes Internet users may donate their labor involuntarily. Their online activities are harvested and bundled into what Zittrain applauds as the mediated wisdom of the masses. For example, he notes as follows:
24 See Joseph D. Schleimer, Protecting Copyrights at the “Backbone” Level of the Internet, 15 UCLA ENT. L. REV. 139, 149 (2008); see also JANET ABBATE, INVENTING THE INTERNET 7–41 (1999).
25 ABBATE, supra; see also Dick Kaser, The Guy Who Did the WWW Thing at the Place Where He Did It, INFO. TODAY, Feb. 2004, at 30.
26 See, e.g., Pornographers Can Fool You With Hi-Tech, FILTERGUIDE.COM, http://www.filterguide.com/pornsfool.htm (setting forth various ways in which pornographers use technology to fool children) (last visited Oct. 21, 2009); PEW INTERNET & AMERICAN LIFE PROJECT, SPAM IS STARTING TO HURT EMAIL (2003), http://www.pewinternet.org/Press-Releases/2003/Spam-is-starting-to-hurt-email.aspx (accounting for pornography-related spam’s impact on email).
27 See generally Ann Bartow, Pornography, Coercion, and Copyright Law 2.0, 10 VAND. J. ENT. & TECH. L. 799, 800 (2008) (“Pornography is a dominant industrial force that has driven the evolution of the Internet.”).
The value of aggregating data from individual sources is well known. Yochai Benkler approvingly cites Google Pagerank algorithms over search engines whose results are auctioned, because Google draws on the individual linking decisions of millions of Web sites to calculate how to rank its search results. If more people are linking to a Web site criticizing Barbie dolls than to one selling them, the critical site will, all else equal, appear higher in the rankings when a user searches for “Barbie.” 28
But all else is unlikely to be equal. Mattel can hire reputation-defense companies like ReputationDefender 29 to bury the critical sites about Barbie using search engine-optimization techniques and to surreptitiously edit Wikipedia entries. 30 For-profit entities don’t just want to spy on and control their customers with tethers. They also want to manipulate as much of the Internet as possible to their benefit, and this logically includes taking steps to highlight positive information and minimize the visibility of disparagement by third parties.
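The link-counting idea Zittrain describes, and the manipulation risk discussed above, can both be seen in a toy sketch of PageRank-style ranking. The graph, site names, and parameters below are invented for illustration; Google’s actual algorithm is far more elaborate and guarded against exactly this kind of gaming.

```python
# Toy sketch of link-based ranking (PageRank-style power iteration).
# A page's score comes from the scores of the pages that link to it.
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank over all pages
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Three independent sites link to the critical site; one links to the seller.
web = {
    "fan1": ["critic"], "fan2": ["critic"], "fan3": ["critic"],
    "ad": ["seller"], "critic": [], "seller": [],
}
ranks = pagerank(web)
assert ranks["critic"] > ranks["seller"]  # more inbound links, higher rank
```

The manipulation the text describes amounts to changing the `web` graph itself: a motivated party that creates enough pages linking to its preferred site (or deletes links to a critical one) shifts the ranking without any change to the algorithm.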
Additionally, collective actions by the online masses can be oppressive. If more people link to websites glorifying sexual violence against women than to websites where women are treated as if they are fully human, those sites appear higher in the rankings when a user searches for a wide variety of things related to sex. The same is potentially true for racist and homophobic sites and other content that depicts discrete groups in derogatory ways. In this way, negative stereotypes can be reinforced and spread virally. 31
Finally, in the Google PageRank example, the power and input of the masses are being harnessed, for profit, by a large corporation. Google is doubtless happy to use generative tools when they are effective. But contrast the Google search
28 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 160 (footnote omitted).
29 See id. at 230 (asserting that ReputationDefender uses “moral suasion” as its primary technique for manipulating search-engine results). I offer a very different perspective on this. See Ann Bartow, Internet Defamation as Profit Center: The Monetization of Online Harassment, 32 HARV. J.L. & GENDER 383 (2009).
30 Zittrain himself noted something similar, writing, “If the Wikipedia entry on Wal-Mart is one of the first hits in a search for the store, it will be important to Wal-Mart to make sure the entry is fair—or even more than fair, omitting true and relevant facts that nonetheless reflect poorly on the company.” See ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 139.
31 See ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 147. Zittrain tacitly acknowledges this: “There are plenty of online services whose choices can affect our lives. For example, Google’s choices about how to rank and calculate its search results can determine which ideas have prominence and which do not.”
engine with Google’s Gmail, and it becomes apparent that the same company will keep a service tethered and proprietary when doing so best suits its purposes. 32
The idiosyncratic online juggernaut that is Wikipedia, to which Zittrain devotes virtually an entire chapter, also illustrates some of the downsides of excessive generativity. 33 Wikipedia is an online encyclopedia that, at least in theory, anyone can edit. Zittrain is clearly enamored of it, writing, “Wikipedia stands at the apex of amateur endeavor: an undertaking done out of sheer interest in or love of a topic, built on collaborative software that enables a breathtakingly comprehensive result that is the sum of individual contributions, and one that is extraordinarily trusting of them.” 34 Zittrain provides a lot of information about Wikipedia, and the vast majority of it skews positive. He writes, “Wikipedia has charted a path from crazy idea to stunning worldwide success”; 35 and “Wikipedia is the canonical bee that flies despite scientists’ skepticism that the aerodynamics add up”; 36 and asserts that the manner in which Wikipedia operates “is the essence of law.” 37 Perhaps echoing Zittrain’s enthusiasm, one researcher determined that Wikipedia has been cited in over 400 U.S. court opinions. 38
Among myriad other facts and anecdotes, Zittrain notes that Wikipedia co-founder Larry Sanger is controversial because he is possibly given too much credit for his limited contributions to Wikipedia. 39 He also notes that another person involved with Wikipedia, former Wikimedia Foundation member Angela
32 See generally Paul Boutin, Read My Mail, Please, SLATE, Apr. 15, 2004, http://slate.msn.com/id/2098946; Deane, Critics Release the Hounds on GMail, GADGETOPIA, Apr. 10, 2004, http://gadgetopia.com/post/2254; Google Watch, http://www.google-watch.org/gmail.html; Brian Morrissey, An Early Look at How Gmail Works, DMNEWS, Apr. 19, 2004, http://www.dmnews.com/an-early-look-at-how-gmail-works/article/83946.
33 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, chapter six.
34 Id. at 96.
35 Id. at 136.
36 Id. at 148.
37 Id. at 144.
38 Lee F. Peoples, The Citation of Wikipedia in Judicial Opinions, 12 YALE J.L. & TECH. (forthcoming 2009), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1272437.
39 See ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 143 (“At times—they are constantly in flux—Wikipedia’s articles about Wikipedia note that there is controversy over the ‘co-founder’ label for Sanger.”); see also ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 142, 145.
Beesley Starling, unsuccessfully fought to have her Wikipedia entry deleted. 40 That a man who wants undeserved credit and a woman who wants no attention at all have likely both been thwarted by Wikipedians is something Zittrain seems to view as a positive indicator. Angela Beesley Starling probably feels very differently, especially if her reasons for wanting her Wikipedia entry deleted included pressing personal safety concerns. The “talk” page of her Wikipedia biography quotes her as saying, “I’m sick of this article being trolled. It’s full of lies and nonsense.” 41 The forced publicity of Wikipedia entries is something all women may encounter under Wikipedia’s “system of self-governance that has many indicia of the rule of law without heavy reliance on outside authority or boundary.” 42 Research suggests that women, though 51% of the population, comprise a mere 13% of Wikipedia contributors, 43 for reasons that probably have to do with the culture of this entity, which women may experience more negatively than men do.
Certainly notable living feminists have been on the receiving end of a campaign of nasty and untruthful edits to Wikipedia entries they would probably prefer not to have. Many entries on feminism have been written or edited by people who are actively hostile toward feminists, but those people prevail because they seem to have a lot of free time, and the few feminists who enter the wikifray seem to get driven out or edited into oblivion. To take just one example, the entries about Melissa Farley, 44 Catharine MacKinnon, 45 and Sheila Jeffreys 46 have all been
40 Id. at 143.
41 See Angela Beesley Starling Talkpage, WIKIPEDIA, http://en.wikipedia.org/wiki/Talk:Angela_Beesley_Starling (last visited Sept. 4, 2009) (“Angela Beesley has tried to have her biography on Wikipedia deleted, saying ‘I’m sick of this article being trolled. It’s full of lies and nonsense.’ The Register and Wikitruth claim that her objections are ironic in light of the generally liberal policy of Wikipedia administrators to the accuracy and notability of biographies in Wikipedia of living people. Seth Finkelstein, who tried to have his own entry from Wikipedia removed, called it ‘a pretty stunning vote of no-confidence. Even at least some high-ups can’t eat the dog food.’”) (footnotes omitted).
42 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 143.
43 See, e.g., Andrew LaVallee, Only 13% of Wikipedia Contributors Are Women, Study Says, WALL ST. J., Aug. 31, 2009, http://blogs.wsj.com/digits/2009/08/31/only-13-of-wikipedia-contributors-are-women-study-says; Jennifer Van Grove, Study: Women and Wikipedia Don’t Mix, MASHABLE, Sept. 1, 2009, http://mashable.com/2009/09/01/women-wikipedia; Cathy Davidson, Wikipedia and Women, HASTAC, Sept. 2, 2009, http://www.hastac.org/blogs/cathy-davidson/wikipedia-and-women.
44 See Melissa Farley, WIKIPEDIA, http://en.wikipedia.org/wiki/Melissa_Farley (last visited July 28, 2009).
45 See Catharine MacKinnon, WIKIPEDIA, http://en.wikipedia.org/wiki/Catharine_MacKinnon (last visited July 28, 2009).
heavily edited 47 by a rabid pornography proponent named Peter G. Werner 48 who sometimes also uses the pseudonym Iamcuriousblue. 49 Each entry is the first result returned after a Google search of their names. He has deleted, or attempted to have deleted, entries about other feminists. 50 He shows up under one identity or another in virtually every entry in which feminism is mentioned. And he successfully convinced the Wikipedia community to ban a feminist activist who vigorously contested his edits. 51 Any group that is not well represented within the Wikipedia editing community is likely to experience similar marginalization.
Recently, Wikipedia announced that the entries of living people will receive a mandatory layer of intermediation. A new feature called “flagged revisions” will require that an experienced volunteer editor sign off on any changes before they become permanent and publicly accessible. 52 A New York Times report noted that this would “divide Wikipedia’s contributors into two classes—experienced, trusted editors, and everyone else—altering Wikipedia’s implicit notion that everyone has an equal right to edit entries.” 53 This seems to be one realization of what Zittrain broadly desires—control over the ignorant wikimasses by a designated elite. But the project became significantly less collaborative and open when this change was made.
Wikipedia entries are generated by a massive assemblage of volunteers with unknown motivations and agendas. Group behavior is always unpredictable, a fact that Zittrain acknowledges but under-appreciates. One somewhat organized assemblage that calls itself Anonymous launches cyber-attacks that
46 See Sheila Jeffreys, WIKIPEDIA, http://en.wikipedia.org/wiki/Sheila_Jeffreys (last visited July 28, 2009).
47 See, e.g., Catharine MacKinnon Talkpage, WIKIPEDIA, http://en.wikipedia.org/wiki/Talk:Catharine_MacKinnon (last visited July 28, 2009).
48 See Peter G Werner Userpage, WIKIPEDIA, http://en.wikipedia.org/wiki/User:Peter_G_Werner (last visited July 28, 2009).
49 See Iamcuriousblue Userpage, WIKIPEDIA, http://en.wikipedia.org/wiki/User:Iamcuriousblue (last visited July 28, 2009).
50 See, e.g., Articles for deletion/Cheryl Lindsey Seelhoff, WIKIPEDIA, http://en.wikipedia.org/w/index.php?title=Wikipedia:Articles_for_deletion/Cheryl_Lindsey_Seelhoff&oldid=150110815 (last visited Sept. 25, 2009); see also Nikki Craft Talkpage, WIKIPEDIA, http://en.wikipedia.org/wiki/Talk:Nikki_Craft (last visited Sept. 25, 2009).
51 Telephone interview with Nikki Craft; see also Nikki Craft Talkpage, supra (containing conversation in which user Iamcuriousblue discredits Nikki Craft’s Wikipedia article).
52 Noam Cohen, Wikipedia to Limit Changes to Articles on People, N.Y. TIMES, Aug. 25, 2009, at B1.
53 Id.
online norms do not seem to have any cognizable role in addressing. 54 As with Wikipedians, Anonymous is hostile to others and outsiders. One blogger noted:
Interestingly … Anon never seems to take down the big sites. Walmart.com and the Pentagon are safe from his attentions. It’s not that Anon is a big fan of Walmart or the government. It’s just so much easier to attack the vulnerable. Big business and big government aren’t vulnerable on the Internet. They can afford not to be.

Small discussion boards and blogs, particularly ones that advocate unpopular points of view, are often run by individuals who put up their own funds, if they can scrape them together, and who must be their own IT departments. They can’t afford the type of security that requires the big bucks. And since they have jobs (unlike Anon, apparently), they have to put their desire to maintain an Internet presence in the balance with supporting themselves and their families. When the crunch comes and time pressures set in, it’s not the Internet presence that wins out.

So the actions of these “apolitical” hackers do have a political end: They remove unpopular, radical, fringe viewpoints from
54 See, e.g., Shaun Davies, ‘No Cussing’ Teen Faces Net Hate Campaign, NINEMSN NEWS, Jan. 18, 2009, http://news.ninemsn.com.au/technology/720115/no-cussing-teen-faces-net-hate-campaign (stating “McKay Hatch’s No Cussing Club, which encourages teens to ‘chill on the profanity’, claims to have over 20,000 members worldwide. Hatch, a 15-year-old from South Pasadena in California, garnered wide media coverage for his anti-swearing campaign, including an appearance on Dr Phil. But at the beginning of the year, Hatch’s email inbox began clogging up with hate mail from an unknown source. Pizza and porn deliveries became commonplace for his family, who eventually called in the FBI after numerous receiving[sic] death threats and obscene phone calls. Anonymous appears to be behind the attacks, with threads on sites such as 4chan.org and 711chan.org identifying their members as the culprits. And the pain may not yet be over for the Hatch family—Anonymous appears to be planning future raids and has threatened to ‘wipe this cancer [the No Cussing Club] from the face of the internet’.[sic] In one 4chan thread, a number of users boasted about sending bogus pizza deliveries and even prostitutes to the Hatchs’ house, although it was impossible to verify if these claims were genuine. The same thread also contained a credit card number purported to be stolen from Hatch’s father, phone numbers, the family’s home address and Hatch’s instant messenger address.”); see also Behind the Façade of the “Anonymous” Hate Group, RELIGIOUS FREEDOM WATCH, July 6, 2009, http://www.religiousfreedomwatch.org/media-newsroom/behind-the-facade-of-the-%E2%80%9Canonymous%E2%80%9D-hate-group/; see also Alex Wuori, ENCYCLOPAEDIA DRAMATICA, http://encyclopediadramatica.com/Alex_Wuori (last visited July 28, 2009).
the web. Big government doesn’t have to eliminate the subversive websites; Anon will do it. 55

The activities of Anonymous have been characterized as domestic terrorism. 56 And Anonymous certainly takes advantage of generative technologies, just as Wikipedians with reprehensible agendas do. Zittrain asserts that bad actors like Anonymous are driving the demand for increased security, 57 but he doesn’t provide any targeted mechanisms for hindering them, or explain why increasing security necessarily compromises productive generativity.
The Zittrainnet’s Netizens: Overlords of Good Faith
As with a James Joyce novel, there are a variety of transactions that the careful reader negotiates with the author. Each section has to be read independently of the others because, while it may cohere internally, it may not combine with other delineated portions to paint a consistent picture of Zittrain’s preferred future for the Internet, which will hereafter be called the “Zittrainnet.”

Some of his recommendations invite broad democratic participation in Zittrainnet governance, while at other times he warns against it and suggests ways to decrease the threats posed “by outsiders—whether by vendors, malware authors, or governments.” 58 One wonders how something as disaggregated as the Internet can have outsiders, until recognition dawns about what Zittrain is truly suggesting, at least part of the time, in terms of who should control the Internet to best ensure its evolution into the Zittrainnet: an elite circle of people with computer skills and free time who share his policy perspective.
Technologists Rule

Zittrain doesn’t contemplate “anyone” developing serviceable code. His view is that only a select few can take productive advantage of generativity, and within this elite group are bad actors as well as good. He thinks that cyberlaw is the appropriate mechanism to encourage positive uses of generativity while
55 VeraCity, Dominator Tentacles, http://vera.wordpress.com/2007/08/24/dominator-tentacles/ (Aug. 24, 2007).
56 VA. FUSION CTR., VA. DEP’T OF STATE POLICE, 2009 VIRGINIA TERRORISM THREAT ASSESSMENT 48 (2009), available at http://www.infowars.com/virginia-fusion-center-releases-homegrown-terrorism-document/.
57 See generally ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, chapter 3. This is one of the central claims of the book.
thwarting the troublesome ones, cyberlaw being computer-code construction and norm entrepreneurship within Internet communities, as well as more traditionally recognized modes of law formation such as statutes and regulations. 59 As to who exactly will divine good generativity from bad, and wield the mighty sword of cyberlaw to defend the former and defeat the latter, Zittrain is decidedly vague. In the “Solutions” section of the tome, Zittrain lists “two approaches that might save the generative spirit of the Net”:

The first is to reconfigure and strengthen the Net’s experimentalist architecture to make it fit better with its now-mainstream home. The second is to create and demonstrate the tools and practices by which relevant people and institutions can help secure the Net themselves instead of waiting for someone else to do it. 60

By “relevant people and institutions” Zittrain seems to mean technologically skilled Internet users of good will. 61 But as to who will “reconfigure and strengthen the Net’s experimentalist architecture” or who will “create and demonstrate the tools and practices” on behalf of these relevant people and institutions (shall we call them “generativators”?), Zittrain offers few specifics. He mentions universities generally, 62 and two organizations he is affiliated with specifically: Harvard University’s Berkman Center (where he is one of 13 Directors—all male, of course 63) and the Oxford Internet Institute (where he is a Research Associate 64), which he describes as “multidisciplinary academic enterprises dedicated to charting the future of the Net and improving it.” 65 Those who share his visions for the Zittrainnet are supposed to function as norm entrepreneurs, guiding lights for the undereducated, inadequately skilled online masses to follow, sheep-like.
Less-relevant people are described as “[r]ank-and-file Internet users [who] enjoy its benefits while seeing its operation as a mystery, something they could not
59 Id. chapter 5.
60 Id. at 152.
61 Id. at 246.
62 Id. at 198, 245.
63 See People, Berkman Center for Internet and Society at Harvard University, http://cyber.law.harvard.edu/people.
64 See People, OXFORD INTERNET INSTITUTE, UNIVERSITY OF OXFORD, http://www.oii.ox.ac.uk/people/?status=current&type=&keywords=zittrain.
65 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 159.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 129<br />
possibly hope to affect.” 66 These ignorant non-generativators frighten Zittrain,<br />
because when he fears that, a crisis comes, they will pressure the government to<br />
enhance Internet security at the expense of Internet generativity, out of shortsighted,<br />
ill-informed perceptions of their own self-interest. 67 He knows better<br />
than they do what’s best for them.<br />
In a related article he published in Legal Affairs to promote the book, Zittrain explains:

If the Internet does have a September 11 moment, a scared and frustrated public is apt to demand sweeping measures to protect home and business computers—a metaphorical USA Patriot Act for cyberspace. Politicians and vendors will likely hasten to respond to these pressures, and the end result will be a radical change in the technology landscape. The biggest casualty will likely be a fundamental characteristic that has made both the Internet and the PC such powerful phenomena: their “generativity.” 68
Many of the stories Zittrain tells in the book are intended to persuade readers that unless somebody does something, the Internet will do what the book’s cover suggests: derail and drive over a cliff. But after repeatedly and ominously warning his audience that “Steps Must Be Taken Immediately,” he never makes explicit who that somebody is or what, exactly, s/he should be doing.
In addition, the law component of cyberlaw gets surprisingly little attention in the book, given that Zittrain is a law professor. According to Larry Lessig, “This book will redefine the field we call the law of cyberspace.” 69 This is worrisome to anyone still struggling to ascertain the parameters of cyberlaw in the first instance, beyond the macro concerns about top-down versus bottom-up approaches to governance identified by the scholars mentioned above. The role of law in Zittrainnet’s rule of law is extremely limited. Laws concerning jurisdiction, privacy, free speech, copyrights, and trademarks often transmogrify into cyberlaw when they are invoked in an Internet context, but they exist and evolve offline too, which prevents their total capture by cyberlaw scholars. Zittrain’s redefinition of cyberlaw compresses debates that engage complicated, intersecting bodies of law into a much narrower conversation about the value of generativity, and how best to secure the appropriate level of it. In general Zittrain seems quite pessimistic about whether cyberlaw can achieve anything positive beyond somehow—he never tells us how—fostering generativity. At one point in the book he even describes the enforcement of laws online as something that could result in net social losses, and therefore a mechanism of Internet governance that is inferior to “retention of generative technologies.” 70

Zittrain seems to have a lot more confidence in technologists than in attorneys. He waxes rhapsodic about the wisdom and forethought of the “framers” of the Internet throughout the tome. 71 One of “the primary” ways he proposes to address tetheredness and its associated ills is “a series of conversations, arguments, and experiments whose participants span the spectrum between network engineers and PC software designers, between expert users with time

66 Id. at 245.
67 Id.
68 See Jonathan Zittrain, Without a Net, LEGAL AFFAIRS, Jan./Feb. 2006, at 34, available at http://www.legalaffairs.org/issues/January-February-2006/feature_zittrain_janfeb06.msp; see also Lawrence Lessig, Z’s Book Is Out, LESSIG 2.0, May 1, 2008, http://lessig.org/blog/just_plain_brilliant/ [hereinafter Lessig, Z’s Book Is Out]; Lawrence Lessig, The state of Cyberlaw, 2005, LESSIG 2.0, Dec. 30, 2005, http://lessig.org/blog/read_this/ (stating “Legal Affairs has a fantastic collection of essays about various cyberspace related legal issues by some of my favorite writers about the subject. Zittrain’s piece outlines the beginning of his soon to be completed book. It shall be called Z-theory.”).
69 See Lessig, Z’s Book Is Out, supra. Lessig explains his thoughts regarding the importance of Zittrain’s book in his blog:

This book will redefine the field we call the law of cyberspace. That sounds like a hokey blurb no doubt. But hokeness [sic] does not mean it is not true. It is true. The field before this book was us cheerleaders trying to convince a skeptical (academic) world about the importance and value of certain central features of the network. Zittrain gives these features a name—generativity—and then shows us an aspect of this generative net that we cheerleaders would rather you not think much about: the extraordinary explosion of malware and the like that the generative net has also generated.

130 CHAPTER 2: IS THE GENERATIVE INTERNET AT RISK?
70 See ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 113–14. Zittrain states:

Technologies that lend themselves to an easy and tightly coupled expression of governmental power simply will be portable from one society to the next. It will make irrelevant the question about how firms like Google and Skype should operate outside their home countries.

This conclusion suggests that although some social gain may result from better enforcement of existing laws in free societies, the gain might be more than offset by better enforcement in societies that are less free—under repressive governments today, or anywhere in the future. If the gains and losses remain coupled, it might make sense to favor retention of generative technologies to put what law professor James Boyle has called the “Libertarian gotcha” to authoritarian regimes: if one wants technological progress and the associated economic benefits, one must be prepared to accept some measure of social liberalization made possible with that technology.

71 See ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 7, 27, 31, 33, 34, 69, 99.
to spend tinkering and those who simply want the system to work—but who appreciate the dangers of lockdown” (p. 173). On the Zittrainnet, with the exception of a select few cyberlaw professors, academics in disciplines other than law, particularly computer science, are going to be the true benevolent dictators of cyberlaw, mediating disputes with technological innovations and enforcing their judgments through code.
The Private Sector

Zittrain quite understandably doubts that for-profit entities will selflessly prioritize the well-being of the Internet over their own commercial gain. So, they are unlikely to consistently adhere to pro-generative business plans unless they can be convinced that doing so will benefit them. One of Zittrain’s objectives in writing the book was to educate the reader about the ways that extensive generativity can serve commercial goals. However, while corporate actors may find Zittrain’s book of interest, I suspect actual experiences in the marketplace will be what drives their decisions about tethers and generativity.
Zittrain opens his book with what is framed as an apocryphal tale: Apple II computers were revolutionary because they facilitated the development of new and original uses by outsiders; but thirty years later the same company launched an anti-generativity counterrevolution of sorts by releasing its innovative iPhone in a locked format intended to discourage the use of applications that were not developed or approved by Apple. 72
But how would Zittrain change this? Surely when the company made this decision, it knew even more than Zittrain about the role that generativity played in the success of the Apple II, but still chose a different strategy for the iPhone. Affirmative curtailment of its generativity initially lowered the risk that iPhones would be plagued by viruses or malware, and allowed Apple to control the ways that most consumers use them. Would Zittrain have forced generativity into the mechanics of the iPhone by law? Or, would he strip Apple of its ability to use the law to interfere when others hack the iPhone and make it more customizable? Or, would he instead simply wait for the market to show Apple the error of its degenerative ways? He never specifies. What he says at the end
of his iPhone discussion is:

A lockdown on PCs and a corresponding rise of tethered appliances will eliminate what today we take for granted: a world where mainstream technology can be influenced, even revolutionized, out of left field. Stopping this future depends on some wisely developed and implemented locks, along with new technologies and a community ethos that secures the keys to those locks among groups with shared norms and a sense of public purpose, rather than in the hands of a single gatekeeping entity, whether public or private. 73

72 See generally ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 86–87 (summarizing work by Eric von Hippel on the subject).
It sounds like Zittrain wants to prevent Apple from interfering when consumers modify their iPhones. But how he proposes to achieve this is addressed only generally, much later in the book, when he suggests vague, persuasion-based solutions. My inner pragmatist thinks strong consumer protection laws might be a viable answer to this and many other problems he articulates in the book, but Zittrain mentions that possibility only glancingly, in the context of maintaining data portability. 74
In July of 2008, Apple began allowing software developers to sell software for the iPhone, and tens of thousands of applications have subsequently been independently developed for it, 75 suggesting either the successful deployment of a strategic multistep product rollout Apple had planned all along, or a midcourse marketing correction. In either event, after the App Store the iPhone cannot accurately be described as non-generative, at least as I understand the concept, 76 and what Zittrain characterized as a problem seems to have been largely solved without the intervention of cyberlaw. The iPhone is still tethered, of course, possibly giving consumers just enough rope to hang themselves if Apple decides to interfere with the contents or operation of any given phone. But tethering also facilitates positive interactions, such as updates and repairs. It is now, to use a phrase Zittrain uses in a different context, “[a] technology that splits the difference between lockdown and openness.” 77

It is true that Apple could alter the iPhone’s balance between generativity and tetheredness without notice or reason. But there is every reason to expect that Apple will try to keep its customers happy, especially given increased competition by devices running Google’s Android operating system—with its
73 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 5.
74 Id. at 177.
75 See, e.g., Jon Fortt, iPhone apps: For fun and profit?, FORTUNE TECH DAILY, July 6, 2009, http://money.cnn.com/2009/07/06/technology/apple_iphone_apps.fortune/index.htm.
76 See, e.g., Adam Thierer, iPhone 2.0 cracked in hours … what was that Zittrain thesis again?, THE TECHNOLOGY LIBERATION FRONT, July 10, 2008, http://techliberation.com/2008/07/10/iphone-20-cracked-in-hours-what-was-that-zittrain-thesis-again/.
77 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 155.
even more open apps marketplace. 78 A recent short review of the book in The Observer noted:

The problem facing books about the internet is that by the time they have hit the shelves, they are already dated. This is clear on the second page of The Future of the Internet, where Jonathan Zittrain writes that the iPhone is purposefully resistant to “applications” (programmes allowing the phone to do clever things apart from make calls). 79
The problem facing this book is deeper than datedness. Zittrain is wrong in his assumptions about rigidity and fixedness. 80 In the abstract, generativity and tetheredness may be opposites, but in reality they can exist within a single appliance. He actually makes this point when he describes computers with dual applications designated “red” and “green,” one generative and the other secure. 81 But he does not acknowledge that many technological devices already
78 Yi-Wyn Yen & Michal Lev-Ram, Google’s $199 Phone to Compete with the iPhone, TECHLAND, Sept. 17, 2008, http://techland.blogs.fortune.cnn.com/2008/09/17/googles-199phone-to-compete-with-the-iphone/.
79 Helen Zaltzman, The Future of the Internet by Jonathan Zittrain, OBSERVER (London), June 14, 2009, at 26, available at http://www.guardian.co.uk/books/2009/jun/14/future-internet-zittrain-review.
80 See Adam Thierer, Review of Zittrain’s “Future of the Internet”, THE TECHNOLOGY LIBERATION FRONT, Mar. 23, 2008, http://techliberation.com/2008/03/23/review-of-zittrains-future-of-the-internet/. Thierer writes:
My primary objection to Jonathan’s thesis is that (1) he seems to be overstating things quite a bit; and in doing so, (2) he creates a false choice of possible futures from which we must choose. What I mean by false choice is that Jonathan doesn’t seem to believe a hybrid future is possible or desirable. I see no reason why we can’t have the best of both worlds—a world full of plenty of tethered appliances, but also plenty of generativity and openness.

See also Timothy B. Lee, Sizing Up “Code” With 20/20 Hindsight, FREEDOM TO TINKER, May 14, 2009, http://www.freedom-to-tinker.com/blog/tblee/sizing-code-2020-hindsight. Lee writes:

I think Jonathan Zittrain’s The Future of the Internet and How to Stop It makes the same kind of mistake Lessig made a decade ago: overestimating regulators’ ability to shape the evolution of new technologies and underestimating the robustness of open platforms. The evolution of technology is mostly shaped by engineering and economic constraints. Government policies can sometimes force new technologies underground, but regulators rarely have the kind of fine-grained control they would need to promote “generative” technologies over sterile ones, any more than they could have stopped the emergence of cookies or DPI if they’d made different policy choices a decade ago.
81 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 154–57.
shift between tethered and generative functions, driven by the demands of their users.

Making assumptions about consumer preferences can be hazardous, especially for folks who tend to associate mostly with people who share common interests, common backgrounds, a common race, a common gender. The Zittrainnet’s netizens, being human, are likely to engage in all manner of typecasting and generalizing when they redesign their Internet sectors of interest. If the leading netizens echo the demographic pattern of the cyberlaw scholars, white men with elite educations will be making most of the calls. 82 And Internet governance will be exceedingly top-down.
At present companies can dramatically alter the levels of tetheredness and generativity in their products and services for any reason or no reason at all, and Zittrain never explains what sort of regulations or market interventions he thinks are necessary to achieve or preserve the Zittrainnet. He is critical of companies that assist totalitarian governments with surveillance or censorship initiatives, 83 but fails to acknowledge the reason that many technologies that can be readily employed to spy on people are developed: Companies want to be able to shadow and scrutinize their customers themselves. Consumers usually agree to this scrutiny in nonnegotiable EULA terms and conditions. For companies, closely following the acts and omissions of their customers or client base is generative behavior, even though it relies on tethers. Information about consumers can lead to innovations in goods and services as well as in marketing them.
Governments

Zittrain expresses grave concerns about government intervention on the Internet. He does not seem to believe that government actors can competently safeguard users, or effectively regulate technology. And he fears governments will further harness the Internet to advance surveillance and censorship agendas that are anathema to freedom. Zittrain writes with deep foreboding:

The rise of tethered appliances significantly reduces the number and variety of people and institutions required to apply the state’s power on a mass scale. It removes a practical check on the use of that power. It diminishes a rule’s ability to attain
82 See Anupam Chander, Whose Republic?, 69 U. CHI. L. REV. 1479, 1484–85 (2002) (reviewing CASS SUNSTEIN, REPUBLIC.COM (2001)).
83 ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 112–13.
legitimacy as people choose to participate in its enforcement, or at least not stand in its way. 84

So it seems strange to learn that his solution to too much tethering is “a latter-day Manhattan Project.” 85 The Manhattan Project was, of course, the code name for the U.S. government’s secret project to develop a nuclear bomb. It may have been staffed by scientists, many of whom were academics, but it was organized, funded, and strictly controlled by the government. 86 An analogous initiative to formulate the Zittrainnet would hardly be open and accessible to the online public. Moreover, governments generally take some kind of proprietary interest in the outcomes of projects they fund. Even under the Bayh-Dole Act, 87 which allows universities in the United States to patent inventions developed with federal funding, the U.S. government retains march-in rights. 88 Zittrain seems to want the resources that governments can provide without any of the restrictions or obligations governments will, as experience suggests, inevitably impose. It’s possible that a well-crafted Zittrainnet Project could receive the unconditional support of government actors, but I don’t think this is terribly likely to happen.
Surprisingly, one of the success stories for generativity that Zittrain references is the Digital Millennium Copyright Act of 1998. 89 Not only did this require government intervention in the form of traditional law, but it also relied on tethering. Web sites could not take down potentially infringing material without retaining a level of control that enables this.

In addition to generativity, one of the defining principles of the Zittrainnet will be adherence to First Amendment principles. Zittrain’s descriptions of online freedom and autonomy suggest a strong belief that all the countries of the world
84 Id. at 118.
85 Id. at 173.
86 U.S. DEP’T OF ENERGY, OFFICE OF HISTORY & HERITAGE RES., Early Government Support, in THE MANHATTAN PROJECT: AN INTERACTIVE HISTORY, http://www.cfo.doe.gov/me70/manhattan/1939-1942.htm (last visited July 30, 2009); The Manhattan Project (and Before), in THE NUCLEAR WEAPON ARCHIVE, http://nuclearweaponarchive.org/Usa/Med/Med.html (last visited Oct. 4, 2009); U.S. DEP’T OF ENERGY, OFFICE OF HISTORY & HERITAGE RES., A Tentative Decision to Build the Bomb, in THE MANHATTAN PROJECT: AN INTERACTIVE HISTORY, http://www.cfo.doe.gov/me70/manhattan/tentative_decision_build.htm (last visited July 30, 2009).
87 35 U.S.C. §§ 200–212 (2006).
88 Id. § 203.
89 See Pub. L. No. 105–304, 112 Stat. 2860 (1998). See also ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 119–20 (discussing the DMCA).
should honor and implement the free-speech values of the First Amendment, whether they want to or not. 90 This raises complicated issues of state sovereignty and international law that Zittrain does not address.
Conclusion

I’ve been very hard on The Future of the Internet in this review, but I truly did enjoy reading it. The book is very informative, if you can sift through the portions contrived to illustrate an unconvincing macro theory of the Internet. I wish Zittrain had written a book that set out only to describe the history and state of the Internet, rather than one that was formulated to support questionable generalizations and grandiose prescriptions. He could have told many of the same extremely interesting stories, but with more balance and less of a blatant “big think” agenda.
The book is woefully lacking in specifics about how to advance the reforms Zittrain asserts are necessary. Even if I were willing to buy into Zittrain’s claim that preserving and enhancing generativity should be the organizing principle of Internet governance interventions, the mechanics of how this could be pursued holistically are never revealed. And the technicalities by which good generativity could be fostered while bad generativity was simultaneously repressed are similarly unstated. The only extensively developed account of a generative system Zittrain unabashedly admires is Wikipedia, which he admits is undemocratic. 91 It is also a system that facilitates repression of unpopular viewpoints, and this is likely to affect outsider groups most dramatically.
Who will step forward to somehow cultivate the Zittrainnet is a mystery. The future of the Internet, Zittrain asserts, would be much safer in the hands of those who can competently safeguard it. He describes these people in very general terms as being skilled and of good faith. These hands do not belong to people who are affiliated with dot-coms, because they use tethering to constrain generativity when doing so is profitable. Nor do they belong to dot-gov bureaucrats, who are at best uninformed and at worst eager to use the Internet to enforce regimes of totalitarian rule. Readers of the book learn a lot more about who Zittrain thinks should not be in control of the Internet than who should be. But there are a number of hints and suggestions scattered throughout its pages that he believes he and his colleagues are capable of directing the Internet’s future wisely and beneficently. If they are going to attempt to do this by writing books, perhaps Zittrain’s offering makes sense as a
90 Contra Joel R. Reidenberg, Yahoo and Democracy on the Internet, 42 JURIMETRICS 261 (2002).
91 See ZITTRAIN, THE FUTURE OF THE INTERNET, supra note 3, at 141 (“And Wikipedia is decidedly not a democracy: consensus is favored over voting and its head counts.”).
declaration of first principles. Maybe his next book will describe the steps along the path to the Zittrainnet more concretely.
The Case for Internet Optimism, Part 2: Saving the Net from Its Supporters

By Adam Thierer *
In an earlier essay, I argued that two distinct strands of “Internet pessimism” increasingly dominate Internet policy discussions. The pessimism of “Net skeptics” is rooted in a general skepticism of the supposed benefits of cyberspace, digital technologies, and information abundance. Here, I respond to a very different strand of Internet pessimism—one expressed by fans of the Internet and cyberspace who nonetheless fear that dark days lie ahead unless steps are taken to “save the Net” from a variety of ills, especially the perceived end of “openness.”
Introduction: Is the Web Really Dying?

“The Death of the Internet” is a hot meme in Internet policy these days. Much as a famous Time magazine cover asked “Is God Dead?” in 1966, 1 Wired magazine, the magazine for the modern digerati, proclaimed in a recent cover story that “The Web is Dead.” 2 A few weeks later, The Economist magazine ran a cover story fretting about “The Web’s New Walls,” wondering “how the threats to the Internet’s openness can be averted.” 3 The primary concern expressed in both essays:
* Adam Thierer is a senior research fellow at the Mercatus Center at George Mason University, where he works with the Technology Policy Program.
1 “Is God Dead?” TIME, April 8, 1966, www.time.com/time/covers/0,16641,19660408,00.html.
2 Chris Anderson & Michael Wolff, The Web Is Dead. Long Live the Internet, WIRED, Aug. 17, 2010, www.wired.com/magazine/2010/08/ff_webrip/all/1. Incidentally, there’s a long history of pundits declaring just about everything “dead” at some point, from email, RSS, and blogging to eReaders, browsers, and even Facebook and Twitter. See Harry McCracken, The Tragic Death of Practically Everything, TECHNOLOGIZER, Aug. 18, 2010, http://technologizer.com/2010/08/18/the-tragic-death-of-practically-everything.
3 The Web’s New Walls, THE ECONOMIST, Sept. 2, 2010, www.economist.com/research/articlesBySubject/displayStory.cfm?story_id=16943579&subjectID=348963&fsrc=nwl.
The wide-open Internet experience of the past decade is giving way to a new regime of corporate control, closed platforms, and walled gardens.

This fear is given fuller elucidation in recent books by two of the intellectual godfathers of modern cyberlaw: Jonathan Zittrain’s The Future of the Internet—And How to Stop It, 4 and Tim Wu’s The Master Switch: The Rise and Fall of Information Empires. 5 These books are best understood as the second and third installments in a trilogy that began with the publication of Lawrence Lessig’s seminal 1999 book, Code and Other Laws of Cyberspace. 6
Lessig’s book framed much of how we study and discuss cyberlaw and Internet policy. More importantly, Code spawned a bona fide philosophical movement within those circles, serving as a polemic against both cyber-libertarianism and Internet exceptionalism (closely related movements), as well as a sort of call to arms for a new Net activist movement. The book gave this movement its central operating principle: Code and cyberspace can be bent to the will of some amorphous collective or public will, and it often must be if we are to avoid any number of impending disasters brought on by nefarious-minded (or just plain incompetent) folks in corporate America scheming to achieve “perfect control” over users.
It’s difficult to know what to label this school of thinking about Internet policy, and Prof. Lessig has taken offense at me calling it “cyber-collectivism.” 7 But the collectivism of which I speak is a more generic type, not the hard-edged Marxist brand of collectivism of modern times. Instead, it’s the belief that markets, property rights, and private decision-making about the future course of the Net must yield to supposedly more enlightened actors and mechanisms. As Declan McCullagh has remarked, Lessig and his students
4 JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET—AND HOW TO STOP IT (2008).
5 TIM WU, THE MASTER SWITCH: THE RISE AND FALL OF INFORMATION EMPIRES (2010).
6 LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE (1999).
7 Adam Thierer, Our Conflict of Cyber-Visions, CATO UNBOUND, May 14, 2009, www.cato-unbound.org/2009/05/14/adam-thierer/our-conflict-of-cyber-visions/.
prefer … what probably could be called technocratic philosopher kings, of the breed that Plato’s The Republic said would be “best able to guard the laws and institutions of our State—let them be our guardians.” These technocrats would be entrusted with making wise decisions on our behalf, because, according to Lessig, “politics is that process by which we collectively decide how we should live.” 8

What is it, exactly, that these cyber-collectivists seek to protect or accomplish? To the extent it can be boiled down to a single term, their rallying cry is: Openness! “Openness” is almost always The Good; anything “closed” (restricted or proprietary) in nature is The Bad. Thus, since they recoil at the “cyber-collectivist” label, we might think of adherents to this philosophy as “Openness Evangelicals,” since they evangelize in favor of “openness” and seemingly make all else subservient to it.
For example, in Future of the Internet, Zittrain argues that, for a variety of reasons,<br />
we run the risk of seeing the glorious days of “generative” devices and the<br />
“open” Internet give way to more “tethered appliances” and closed networks.<br />
He says:<br />
Today, the same qualities that led to [the success of the<br />
Internet and general-purpose PCs] are causing [them] to falter.<br />
As ubiquitous as Internet technologies are today, the pieces are<br />
in place for a wholesale shift away from the original chaotic<br />
design that has given rise to the modern information<br />
revolution. This counterrevolution would push mainstream<br />
users away from the generative Internet that fosters innovation<br />
and disruption, to an appliancized network that incorporates<br />
some of the most powerful features of today’s Internet while<br />
greatly limiting its innovative capacity—and, for better or<br />
worse, heightening its regulability. A seductive and more<br />
powerful generation of proprietary networks and information<br />
appliances is waiting for round two. If the problems associated<br />
with the Internet and PC are not addressed, a set of blunt<br />
solutions will likely be applied to solve the problems at the<br />
expense of much of what we love about today’s information<br />
ecosystem. 9<br />
8 Declan McCullagh, What Larry Didn’t Get, CATO UNBOUND, May 4, 2009, www.catounbound.org/2009/05/04/declan-mccullagh/what-larry-didnt-get<br />
9 Zittrain, supra note 4 at 8.
142 CHAPTER 2: IS THE GENERATIVE INTERNET AT RISK?<br />
In other words, Zittrain fears most will flock to tethered appliances in a search<br />
for stability or security. That’s troubling, he says, because those tethered<br />
appliances are less “open” and more likely to be “regulable,” either by large<br />
corporate intermediaries or government officials. Thus, the “future of the<br />
Internet” Zittrain is hoping to “stop” is a world dominated by tethered digital<br />
appliances and closed walled gardens because they are too easily controlled by<br />
other actors.<br />
My primary beef with these “Openness Evangelicals” is not that openness and<br />
generativity aren’t fine generic principles but that:<br />
1. They tend to significantly overstate the severity of this problem (the<br />
supposed decline of openness or generativity, that is);<br />
2. I’m more willing to allow evolutionary dynamism to run its course within<br />
digital markets, even if that means some “closed” devices and platforms<br />
remain (or even thrive); and,<br />
3. It’s significantly more likely that the “openness” advocated by Openness<br />
Evangelicals will devolve into expanded government control of cyberspace<br />
and digital systems than that unregulated systems will become subject to<br />
“perfect control” by the private sector, as they fear.<br />
More generally, my problem with this movement—and Zittrain’s book, in<br />
particular—comes down to the dour, depressing “the-Net-is-about-to-die” fear<br />
that seems to fuel this worldview. The message seems to be: “Enjoy the good<br />
old days of the open Internet while you can, because any minute now it will be<br />
crushed and closed-off by corporate marauders!” Lessig started this nervous<br />
hand-wringing in Code when he ominously predicted that “Left to itself,<br />
cyberspace will become a perfect tool of control.” 10 Today, his many disciples<br />
in academia (including Zittrain and Wu) and a wide variety of regulatory<br />
advocacy groups continue to preach this gloomy gospel of impending digital<br />
doom and “perfect control” despite plenty of evidence that supports the case<br />
for optimism.<br />
For example, Wu warns there are “forces threatening the Internet as we know<br />
it” 11 while Zittrain worries about “a handful of gated cloud communities whose<br />
proprietors control the availability of new code.” 12 At times, this paranoia of<br />
10 LESSIG, supra note 6 at 5-6.<br />
11 WU, supra note 5 at 7.<br />
12 Jonathan Zittrain, Lost in the Cloud, NEW YORK TIMES, July 19, 2009,<br />
www.nytimes.com/2009/07/20/opinion/20zittrain.html.
some in the Openness Evangelical clan borders on outright hysteria. In August<br />
2008, a Public Knowledge analyst likened Apple’s management of applications<br />
in its iPhone App Store to the tyranny of Orwell’s 1984! 13 In other words, the<br />
Big Brother they want us to fear is Corporate Big Brother. Someday very soon,<br />
we are repeatedly told, the corporate big boys will toss the proverbial “master<br />
switch,” suffocating Internet innovation and digital freedom, and making us all<br />
cyber-slaves within their commercialized walled gardens. The possibility of<br />
consumers escaping from these walled gardens or avoiding them altogether is<br />
treated as remote—if the notion is entertained at all.<br />
We might think of this fear as “The Great Closing,” or the notion that, unless<br />
radical interventions are pursued—often through regulation—a Digital Dark<br />
Age of Closed Systems will soon unfold, complete with myriad America Online-like<br />
walled gardens, “sterile and tethered devices,” corporate censorship, and<br />
gouging of consumers. Finally, the implicit message in the work of all these<br />
hyper-pessimistic critics is that markets must be steered in a more sensible<br />
direction by those technocratic philosopher kings (although the details of their<br />
blueprint for digital salvation are often scarce).<br />
Problems with “The Great Closing” Thesis<br />
There are serious problems with the “Great Closing” thesis as set forth in the<br />
high-tech threnody of Lessig, Zittrain, Wu, and other Openness Evangelicals—<br />
or, as The New York Times has called them, digital “doomsayers.” 14<br />
No Clear Definitions of Openness or Closedness;<br />
Both Are Matters of Degree<br />
“Open” vs. closed isn’t as black and white as some Openness Evangelicals<br />
make it out to be. For example, Zittrain praises the supposedly more open<br />
nature of PCs and the openness to innovation made possible by Microsoft’s<br />
Windows operating system. How ironic, since so many have blasted Windows<br />
as the Great Satan of closed code! Meanwhile, although most others think of<br />
Apple as “everyone’s favorite example of innovation,” 15 Zittrain makes the<br />
13 Alex Curtis, Benefits of iPhone App Store Tainted by 1984-like Control, Public Knowledge Blog,<br />
Aug. 11, 2008, www.publicknowledge.org/node/1703. The tech gadget website<br />
Gizmodo recently ran a similar Apple-<strong>as</strong>-Big-Brother essay: Matt Buchanan, Big Brother Apple<br />
and the Death of the Program, GIZMODO, Oct. 22, 2010, http://gizmodo.com/5670812/big-brother-apple-and-the-death-of-the-program.<br />
14 Eric Pfanner, Proclaimed Dead, Web is Showing Signs of New Life, NEW YORK TIMES, Oct. 31,<br />
2010, www.nytimes.com/2010/11/01/technology/01webwalls.html<br />
15 Amar Bhide, Don’t Expect Much From the R&D Tax Credit, WALL STREET JOURNAL, Sept. 11,<br />
2010,<br />
http://online.wsj.com/article/SB10001424052748704644404575481534193344088.html
iPhone and iPad out to be “sterile, tethered” appliances. But the company’s<br />
App Store has offered millions of innovators the opportunity to produce almost<br />
every conceivable type of mobile application the human mind could imagine for<br />
those devices. 16 Moreover, those Apple devices don’t block completely “open”<br />
communications applications or interfaces, such as Web browsers, email and<br />
SMS clients, or Twitter. “In the abstract,” notes University of South Carolina<br />
School of Law professor Ann Bartow, “generativity and tetheredness may be<br />
opposites, but in reality they can exist within a single appliance.” 17<br />
Indeed, the Apple devices seem to prove that, in reality, almost all modern digital<br />
devices and networks feature both generative and “non-generative” attributes.<br />
“No one h<strong>as</strong> ever created, and no one will ever create, a system that allows any<br />
user to create anything he or she wants. Instead, every system designer makes<br />
innumerable tradeoffs and imposes countless constraints,” note James<br />
Grimmelmann and Paul Ohm. 18 “Every generative technology faces …<br />
tradeoffs. Good system designers always restrict generativity of some kinds in<br />
order to encourage generativity of other kinds. The trick is in striking the<br />
balance,” they argue. 19 Yet, “Zittrain never fully analyzes split-generativity<br />
systems, those with generative layers built upon non-generative layers, or vice-versa.” 20<br />
The zero-sum fear that the ascendancy of mobile apps means less “generativity”<br />
or the “death of the Web” is another myth. Nick Bilton of The New York Times<br />
notes:<br />
Most of these apps and Web sites are so intertwined that it’s<br />
difficult to know the difference. With the exception of<br />
downloadable games, most Web apps for news and services<br />
require pieces of the Web and Internet to function properly. So<br />
as more devices become connected to the Internet, even if<br />
they’re built to access beautiful walled gardens, like mobile<br />
16 Apple, Apple’s App Store Downloads Top Three Billion, Jan. 5, 2010,<br />
www.apple.com/pr/library/2010/01/05appstore.html<br />
17 Ann Bartow, A Portrait of the Internet as a Young Man, 108 MICHIGAN LAW REVIEW 6, at 1102-<br />
03, www.michiganlawreview.org/assets/pdfs/108/6/bartow.pdf<br />
18 James Grimmelmann & Paul Ohm, Dr. Generative or: How I Learned to Stop Worrying and Love<br />
the iPhone, MARYLAND LAW REVIEW (2010) at 940-41.<br />
19 Id. at 941.<br />
20 Id. at 944 (emphasis in original).
apps or TV-specific interfaces, they will continue to access the<br />
Web too, enabling each platform to grow concurrently. 21<br />
Ironically, it was Chris Anderson, editor of Wired and author of the apocalyptic<br />
“Web is Dead” cover story, who best explained why fears of “The Great<br />
Closing” are largely overblown:<br />
Ecommerce continues to thrive on the Web, and no company<br />
is going to shut its Web site <strong>as</strong> an information resource. More<br />
important, the great virtue of today’s Web is that so much of it<br />
is noncommercial. The wide-open Web of peer production, the<br />
so-called generative Web where everyone is free to create what<br />
they want, continues to thrive, driven by the nonmonetary<br />
incentives of expression, attention, reputation, and the like. 22<br />
And Jeff Bertolucci of PC World makes it clear that generative computing is alive and<br />
well:<br />
The next big computing platform won’t be a version of<br />
Apple’s Mac OS, Google’s Android, or Microsoft’s Windows.<br />
It’s already here—and it’s the Web. And the drive to offer the<br />
most compelling window to the Web possible, via the browser,<br />
is intense. The browser is spreading beyond the PC and<br />
smartphone to new types of gadgetry, including TV set-top<br />
boxes and printers. This is a trend that will accelerate in the<br />
coming years. 23<br />
The Evils of Closed Systems or <strong>Digital</strong><br />
“Appliances” Are Greatly Over-Stated<br />
Openness Evangelicals often fail to appreciate that there obviously must have<br />
been a need or demand for some “closed” or “sterile” devices—or else the market<br />
wouldn’t have supplied them. Why shouldn’t people who want a simpler or more<br />
secure digital experience be offered such options? Wu worries that devices like<br />
the iPad “are computers that have been reduced to a strictly limited set of<br />
functions that they are designed to perform extremely well.” 24 Needless to say,<br />
21 Nick Bilton, Is the Web Dying? It Doesn’t Look That Way, NEW YORK TIMES BITS BLOG, Aug.<br />
17, 2010, http://bits.blogs.nytimes.com/2010/08/17/the-growth-of-the-dying-web<br />
22 Anderson & Wolff, supra note 2.<br />
23 Jeff Bertolucci, Your Browser in Five Years, PC WORLD, June 16, 2010,<br />
www.pcworld.com/article/199071/your_browser_in_five_years.html<br />
24 Wu, supra note 5 at 292.
it will be hard for many consumers to sympathize with Wu’s complaint that<br />
products work too well!<br />
However, as noted throughout this essay, it’s also not quite true that those<br />
devices are as closed or crippled as their critics suggest. As Grimmelmann and<br />
Ohm aptly note, “restricting generativity in one place (for example, by building<br />
computers with fixed circuit boards rather than a tangle of reconfigurable wires)<br />
can massively enhance generativity overall (by making computers cheap and<br />
usable enough that everyone can tinker with their software).” 25 For example, in<br />
November 2010, Damon Albarn, lead singer of the popular band “Gorillaz,”<br />
announced that the group’s next album would be recorded entirely on an iPad. 26<br />
Regardless, just how far would these critics go to keep devices or platforms<br />
perfectly “generative” or “open” (assuming we can even agree on how to define<br />
these concepts)? Do the Openness Evangelicals really think consumers would<br />
be better served if they were forced to fend for themselves with devices that<br />
arrived totally unconfigured? Should the iPhone or iPad, for example, be<br />
shipped to market with no apps loaded on the main screen, forcing everyone to<br />
go find them on their own? Should TiVos have no interactive menus out-of-the-box,<br />
forcing consumers to go online and find some “homebrew” code that<br />
someone whipped up to give users an open source programming guide?<br />
Some of us are able to do so, of course, and those of us who are tech geeks<br />
sometimes find it easy to look down our noses at those who want their hand<br />
held through cyberspace, or who favor simpler devices. But there’s<br />
nothing wrong with those individuals who seek simplicity, stability, or security<br />
in their digital devices and online experiences—even if they find those solutions<br />
in the form of “tethered appliances” or “walled gardens.” Not everyone wants<br />
to tinker or to experience cyberspace as geeks do. Not everyone wants to<br />
program their mobile phones, hack their consoles, or write their own code.<br />
Most people live perfectly happy lives without ever doing any of these things!<br />
Nonetheless, many of those “mere mortals” will want to use many of the same<br />
toys that the tech geeks use, or they may just want to take more cautious steps<br />
into the occasionally cold pool called cyberspace—one tippy toe at a time. Why<br />
shouldn’t those users be accommodated with “lesser” devices or a “curated”<br />
Web experience? Kevin Kelly argues that there’s another way of looking at<br />
these trends. Digital tools are becoming more specialized, he argues, and “with<br />
the advent of rapid fabrication … specialization will leap ahead so that any tool<br />
can be customized to an individual’s personal needs or desires.” 27 Viewed in<br />
25 Grimmelmann & Ohm, supra note 18, at 923.<br />
26 Damon Albarn Records New Gorillaz Album on an iPad, NME NEWS, November 12, 2010,<br />
http://www.nme.com/news/gorillaz/53816<br />
27 KEVIN KELLY, WHAT TECHNOLOGY WANTS (2010) at 295-96.
this light, the Openness Evangelicals would hold back greater technological<br />
specialization in the name of preserving market norms or structures they prefer.<br />
The best argument against digital appliancization is that the desire for more<br />
stable and secure systems will lead to a more “regulable” world—i.e., one that<br />
can be more easily controlled by both corporations and government. As<br />
Zittrain puts it:<br />
Whether software developer or user, volunteering control over<br />
one’s digital environment to a Manager means that the<br />
manager can change one’s experience at any time—or worse,<br />
be compelled to by outside pressures. … The famously<br />
ungovernable Internet suddenly becomes much more<br />
governable, an outcome most libertarian types would be<br />
concerned about. 28<br />
No doubt, concerns about privacy, child safety, defamation, cybersecurity,<br />
identity theft and so on, will continue to lead to calls for more intervention. At<br />
the corporate level, however, some of that potential intervention makes a great<br />
deal of sense. For example, if ISPs are in a position to help do something to<br />
help alleviate some of these problems—especially spam and viruses—what’s<br />
wrong with that? Again, there’s a happy balance here that critics like Zittrain<br />
and Wu fail to appreciate. Bruce Owen, an economist and the author of The<br />
Internet Challenge to Television, discussed it in his response to Zittrain’s recent<br />
book:<br />
Why does Zittrain think that overreaction is likely, and that its<br />
costs will be unusually large? Neither prediction is self-evident.<br />
Faced with the risk of infection or mishap, many users already<br />
restrain their own t<strong>as</strong>te for PC-mediated adventure, or install<br />
protective software with similar effect. For the most risk-averse<br />
PC users, it may be reasonable to welcome “tethered” PCs<br />
whose suppliers compete to offer the most popular<br />
combinations of freedom and safety. Such risk-averse users are<br />
reacting, in part, to negative externalities from the poor<br />
hygiene of other users, but such users in turn create positive<br />
externalities by limiting the population of PCs vulnerable to<br />
contagion or hijacking. As far as one can tell, this can as easily<br />
produce balance or under-reaction as overreaction—it is an<br />
empirical question. But, as long as flexibility has value to users,<br />
28 Jonathan Zittrain, Has the Future of the Internet Happened? Sept. 7, 2010, CONCURRING<br />
OPINIONS blog, www.concurringopinions.com/archives/2010/09/has-the-future-of-the-internet-come-about.html
suppliers of hardware and interconnection services will have<br />
incentives to offer it, in measured ways, or as options. 29<br />
Indeed, we can find happy middle-ground solutions that balance openness and<br />
stability—and platform operators must be free to discover where that happy<br />
medium is through an ongoing process of trial and error, for only through such<br />
discovery can the right balance be struck in a constantly changing landscape. A<br />
world full of hybrid solutions would offer more consumers more choices that<br />
better fit their specific needs.<br />
Finally, to the extent something more must be done to counter the supposed<br />
regulability of cyberspace, the solution should not be new limitations on<br />
innovation. Instead of imposing restrictions on code or coders to limit<br />
regulability, we should instead place more constraints on our government(s).<br />
Consider privacy and data collection concerns. While, as a general principle, it<br />
is probably wise for companies to minimize the amount of data they collect<br />
about consumers to avoid privacy concerns about data breaches, there are also<br />
benefits to the collection of that data. So rather than legislating the “right” data<br />
retention rules, we should hold companies to the promises they make about<br />
data security and breaches, and tightly limit the powers of government to access<br />
private information through intermediaries in the first place.<br />
Most obviously, we could begin by tightening up the Electronic<br />
Communications Privacy Act (ECPA) and other laws that limit government<br />
data access. 30 More subtly, we must continue to defend Section 230 of the<br />
Communications Decency Act, which shields intermediaries from liability for<br />
information posted or published by users of their systems, because (among<br />
many things) such liability would make online intermediaries more susceptible<br />
to the kind of back-room coercion that concerns Zittrain, Lessig and others. If<br />
we’re going to be legislating the Internet, we need more laws like that, not those<br />
of the “middleman deputization” model or those that would regulate code to<br />
achieve this goal.<br />
Companies Have Strong Incentives to Strike<br />
the Right Openness/Closedness Balance<br />
Various social and economic influences help ensure the scales won’t be tipped<br />
completely in the closed or non-generative direction. The Web is built on<br />
29 Bruce Owen, As Long as Flexibility Has Value to Users, Suppliers Will Have Incentives to Offer It,<br />
BOSTON REVIEW, March/April 2008, www.bostonreview.net/BR33.2/owen.php<br />
30 A broad coalition has proposed such reforms. See www.digitaldueprocess.org.
powerful feedback mechanisms and possesses an extraordinary level of<br />
transparency in terms of its operations.<br />
Moreover, the breaking news cycle for tech developments can be measured not<br />
in days, but in minutes or even seconds. Every boneheaded move meets<br />
immediate and intense scrutiny from bloggers, the tech press, pundits, gadget sites, and the like.<br />
Never has the white-hot spotlight of public attention been so effective at<br />
exposing corporate missteps and forcing their correction. We saw this<br />
dynamic at work with the Facebook Beacon incident, 31 Google’s Buzz debacle, 32<br />
the Amazon 1984 incident, 33 Apple’s Flash restrictions, 34 the Sony rootkit episode, 35<br />
and other examples.<br />
Things Are Getting More Open<br />
All the Time Anyway<br />
Most corporate attempts to bottle up information or close off their platforms<br />
end badly. The walled gardens of the p<strong>as</strong>t failed miserably. In critiquing<br />
Zittrain’s book, Ann Bartow has noted that “if Zittrain is correct that<br />
CompuServe and America Online (AOL) exemplify the evils of tethering, it’s<br />
pretty clear the market punished those entities pretty harshly without Internet<br />
governance-style interventions.” 36 Indeed, let’s not forget that AOL was the<br />
big, bad corporate boogeyman of Lessig’s Code and yet, just a decade later, it has<br />
been relegated to an also-ran in the Internet ecosystem.<br />
31 See Nancy Gohring, Facebook Faces Class-Action Suit Over Beacon, NETWORKWORLD.COM, Aug.<br />
13, 2008, http://www.networkworld.com/news/2008/081308-facebook-faces-class-action-suit-over.html.<br />
32 See Ryan Paul, EPIC Fail: Google Faces FTC Complaint Over Buzz Privacy, ARS TECHNICA, Feb.<br />
17, 2010, http://arstechnica.com/security/news/2010/02/epic-fail-google-faces-complaint-over-buzz-privacy-issues.ars.<br />
33 See John Timmer, Amazon Settles 1984 Suit, Sets Limits on Kindle Deletions, ARS TECHNICA, Oct.<br />
2, 2009, http://arstechnica.com/web/news/2009/10/amazon-stipulates-terms-of-book-deletion-via-1984-settlement.ars.<br />
34 See Rob Pegoraro, Apple iPad’s Rejection of Adobe Flash Could Signal the Player’s Death Knell, THE<br />
WASHINGTON POST, Feb. 7, 2010, http://www.washingtonpost.com/wp-dyn/content/article/2010/02/05/AR2010020501089.html.<br />
35 See Wikipedia, Sony BMG CD Copy Protection Scandal,<br />
http://en.wikipedia.org/wiki/Sony_BMG_CD_copy_protection_scandal (last<br />
accessed Dec. 9, 2010).<br />
36 Bartow, supra note 17 at 1088.
The America Online C<strong>as</strong>e Study:<br />
Remembering Yesterday’s Face of “Closed” Evil<br />
When it comes to “closed” systems, evil has a face, but it seems the face is<br />
always changing. When Lessig penned Code a decade ago, it was America<br />
Online (AOL) that was set to become the corporate enslaver of cyberspace. For<br />
a time, it was easy to see why Lessig and others might have been worried.<br />
Twenty-five million subscribers were willing to pay $20 per month to get a<br />
guided tour of AOL’s walled garden version of the Internet. Then AOL and<br />
Time Warner announced a historic mega-merger that had some predicting the<br />
rise of “new totalitarianisms” 37 and corporate “Big Brother.” 38<br />
But the deal quickly went off the rails. 39 By April 2002, just two years after the<br />
deal was struck, AOL-Time Warner had already reported a staggering $54<br />
billion loss. 40 By January 2003, losses had grown to $99 billion. 41 By September<br />
2003, Time Warner decided to drop AOL from its name altogether and the deal<br />
continued to slowly unravel from there. 42 In a 2006 interview with the Wall<br />
Street Journal, Time Warner President Jeffrey Bewkes famously declared the<br />
death of “synergy” and went so far as to call synergy “bullsh*t”! 43 In early 2008,<br />
Time Warner decided to shed AOL’s dial-up service 44 and in 2009 spun off<br />
AOL entirely. 45 Further deconsolidation followed for Time Warner, which<br />
37 Norman Solomon, AOL Time Warner: Calling The Faithful To Their Knees, Jan. 2000,<br />
www.fair.org/media-beat/000113.html<br />
38 Robert Scheer, Confessions of an E-Columnist, Jan. 14, 2000, ONLINE JOURNALISM REVIEW,<br />
www.ojr.org/ojr/workplace/1017966109.php<br />
39 Adam Thierer, A Brief History of Media Merger Hysteria: From AOL-Time Warner to Comcast-<br />
NBC, Progress & Freedom Foundation, PROGRESS ON POINT 16.25, Dec. 2, 2009,<br />
www.pff.org/issues-pubs/pops/2009/pop16.25-comcast-NBC-merger-madness.pdf<br />
40 Frank Pellegrini, What AOL Time Warner’s $54 Billion Loss Means, April 25, 2002, TIME<br />
ONLINE, www.time.com/time/business/article/0,8599,233436,00.html<br />
41 Jim Hu, AOL Loses Ted Turner and $99 billion, CNET NEWS.COM, Jan. 30, 2004,<br />
http://news.cnet.com/AOL-loses-Ted-Turner-and-99-billion/2100-1023_3-982648.html<br />
42 Id.<br />
43 Matthew Karnitschnig, After Years of Pushing Synergy, Time Warner Inc. Says Enough, WALL<br />
STREET JOURNAL, June 2, 2006,<br />
http://online.wsj.com/article/SB114921801650969574.html<br />
44 Geraldine Fabrikant, Time Warner Plans to Split Off AOL’s Dial-Up Service, NEW YORK<br />
TIMES, Feb. 7, 2008,<br />
www.nytimes.com/2008/02/07/business/07warner.html?_r=1&adxnnl=1&oref=slogin&adxnnlx=1209654030-ZpEGB/n3jS5TGHX63DONHg<br />
45 Press Rele<strong>as</strong>e, Time Warner, Time Warner Inc. Completes Spin-off of AOL Inc. (Dec. 10,<br />
2009), http://www.timewarner.com/corp/newsroom/pr/0,20812,1946835,00.html.
spun off its cable TV unit and various other properties. Looking back at the<br />
deal, Fortune magazine senior editor at large Allan Sloan called it the “turkey of<br />
the decade.” 46<br />
In the larger scheme of things, AOL’s story has already become an afterthought<br />
in our chaotic cyber-history. But we shouldn’t let those old critics forget about<br />
their lugubrious lamentations. To recap: the big, bad corporate villain of<br />
Lessig’s Code attempted to construct the largest walled garden ever, and partner<br />
with a titan of the media sector in doing so—and this dastardly plot failed miserably.<br />
The hysteria about AOL’s looming monopolization of instant messaging—and<br />
with it, the rest of the Web—seems particularly silly: Today, anyone can<br />
download a free chat client like Digsby or Adium to manage multiple IM<br />
services from AOL, Yahoo!, Google, Facebook and just about anyone else, all<br />
within a single interface, essentially making it irrelevant which chat service your<br />
friends use.<br />
From this case study one would think the Openness Evangelicals would have<br />
gained a newfound appreciation for the evolutionary and dynamic nature of<br />
digital markets and come to understand that, in markets built upon code, the<br />
pace and nature of change is unrelenting and utterly unpredictable. Indeed,<br />
contra Lessig’s lament in Code that “Left to itself, cyberspace will become a<br />
perfect tool of control,” cyberspace has proven far more difficult to “control”<br />
or regulate than any of us ever imagined. The volume and pace of technological<br />
innovation we have witnessed over the past decade has been nothing short of<br />
stunning.<br />
Critics like Zittrain and Wu, however, want to keep beating the cyber-sourpuss<br />
drum. So, the face of corporate evil had to change. Today, Steve Jobs has<br />
become the supposed apotheosis of all this closed-system evil instead of AOL.<br />
Jobs serves <strong>as</strong> a prime villain in the books of Zittrain and Wu and in many of<br />
the essays they and other Openness Evangelicals pen. It’s worth noting,<br />
however, that their enemies list is growing longer and now reads like a “Who’s<br />
Who” of high-tech corporate America. According to Zittrain and Wu’s books,<br />
we need to worry about just about every major player in the high-tech<br />
ecosystem—telcos, cable companies, wireless operators, entertainment<br />
providers, Facebook, and others.<br />
Even Google—Silicon Valley’s supposed savior of Internet openness—is not<br />
spared their scorn. “Google is the Internet’s switch,” Wu argues. “In fact, it’s<br />
46 Allan Sloan, ‘Cash for…’ and the Year’s Other Clunkers, WASHINGTON POST, Nov. 17, 2009,<br />
www.washingtonpost.com/wp-dyn/content/article/2009/11/16/AR2009111603775.html
the world’s most popular Internet switch, and as such, it might even be<br />
described as the current custodian of the Master Switch.” More ominously, he<br />
warns, “it is the switch that transforms mere communications into<br />
networking—that ultimately decides who reaches what or whom.” 47<br />
It seems, then, that the face of “closed” evil is constantly morphing. Shouldn’t<br />
that tell us something about how dynamic these markets are?<br />
There are few reasons to believe that today’s efforts to build such walled<br />
gardens would end much differently. Indeed, increasingly when companies or<br />
coders erect walls of any sort, holes form quickly. For example, it usually<br />
doesn’t take long for a determined group of hackers to find ways around<br />
copy/security protections and “root” or “jailbreak” phones and other devices. 48<br />
Once hacked, users are usually then able to configure their devices or<br />
applications however they wish, effectively thumbing their noses at the<br />
developers. This process tends to unfold in a matter of just days, even hours,<br />
after the release of a new device or operating system.<br />
Number of Days Before New Devices Were “Rooted” or “Jailbroken” 49<br />
Original iPhone: 10 days<br />
Original iPod Touch: 35 days<br />
iPhone 3G: 8 days<br />
iPhone 3GS: 1 day<br />
iPhone 4: 38 days<br />
iPad: 1 day<br />
T-Mobile G1 (first Android phone): 13 days<br />
Palm Pre: 8 days<br />
Of course, not every user will make the effort—or take the risk 50—to hack their<br />
devices in this fashion, even once instructions are widely available for doing so.<br />
47 Wu, supra note 5 at 280.<br />
48 “In living proof that as long as there’s a thriving geek fan culture for a device, it will never be<br />
long for the new version to be jailbroken: behold iOS 4.1. Most people are perfectly willing<br />
to let their devices do the talking for them, accept what’s given, and just run sanctioned<br />
software. But there are those intrepid few—who actually make up a fairly notable portion of<br />
the market—who want more out of their devices and find ways around the handicaps built<br />
into them by the manufacturers.” Kit Dotson, New iOS for Apple TV Firmware Released,<br />
Promptly Decrypted, SiliconAngle, Sept. 28, 2010, http://siliconangle.com/blog/2010/09/<br />
28/new-ios-for-apple-tv-firmware-released-promptly-decrypted<br />
49 Original research conducted by author and Adam Marcus based on news reports.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 153<br />
Nonetheless, even if copyright law might sometimes seek to restrict it, the<br />
hacking option still exists for those who wish to exercise it. Moreover, because<br />
many manufacturers know their devices are likely to be hacked, they are<br />
increasingly willing to make them more “open” right out of the gate or offer<br />
more functionality/flexibility to make users happy.<br />
Innovation Continues to Unfold Rapidly<br />
in Both Directions along the “Open”<br />
vs. “Closed” Continuum<br />
As noted above, part of Zittrain and Wu’s lament seems to be that the devices<br />
that the hoi polloi choose might crowd out those favored by tinker-happy tech<br />
geeks (among whom I proudly count myself). But we geeks need not fear<br />
such foreclosure. Just because there are some “closed” systems or devices on<br />
the market, it doesn’t mean innovation has been foreclosed among more<br />
“open” systems or platforms. A hybrid future is both possible and desirable.<br />
Again, we can have the best of both worlds—a world full of plenty of closed<br />
systems or even “tethered appliances,” but also plenty of generativity and<br />
openness. As Web 2.0 pioneer Tim O’Reilly notes:<br />
I’m not terribly taken in by the rhetoric that says that because<br />
content silos are going up, and we’re seeing more paid content,<br />
the open web is over. Individuals, small companies,<br />
entrepreneurs, artists, all have enormous ability to share and<br />
distribute their work and find an audience. I don’t see that<br />
becoming less in today’s environment. 51<br />
Consider the battle between the Apple iPhone and Google Android mobile<br />
phone operating systems. Zittrain says Android is “a sort of canary in the coal<br />
mine” 52 for open platforms, but ignores the frantic pace of its growth, now<br />
accounting for one-quarter of mobile Web traffic just three years after its<br />
inception 53 and stealing away Apple’s market share in the process. 54 Beyond<br />
50 Rooting or jailbreaking a smartphone creates the risk of “bricking” the device—rendering it<br />
completely inoperable (and thus no more useful than a brick). Additionally, hacking devices<br />
in this fashion typically voids any manufacturer warranty.<br />
51 The Web is Dead? A Debate, WIRED, Aug. 17, 2010,<br />
www.wired.com/magazine/2010/08/ff_webrip_debate/all/1<br />
52 Jonathan Zittrain, Has the Future of the Internet Happened?, Sept. 7, 2010, CONCURRING<br />
OPINIONS blog, www.concurringopinions.com/archives/2010/09/has-the-future-of-the-internet-come-about.html<br />
53 Sean Hollister, Android Accounts for One-Quarter of Mobile Web Traffic, Says Quantc<strong>as</strong>t,<br />
ENGADGET, Sept. 4, 2010, www.engadget.com/2010/09/04/android-accounts-for-one-quarter-of-mobile-web-traffic-says-qua;<br />
Android Most Popular Operating System in U.S.
downplaying Android’s success as a marketplace triumph for openness (and<br />
proof of the non-governmental forces that push toward a balance between<br />
openness and closedness), Zittrain also reverts to the “kill switch” boogeyman:<br />
He warns us that any day now Google could change its mind, close the Android<br />
platform, and “kill an app, or the entire phone” remotely. 55 But where’s the<br />
business sense in that? What’s the incentive for Google to pursue such a course<br />
of action? Would Google be able to produce all those millions of apps<br />
currently produced by independent developers? That seems both unlikely and<br />
certain to be unpopular. Meanwhile, how many times has supposedly control-minded Apple<br />
actually thrown the dreaded “kill switch” on apps? There are hundreds of<br />
thousands of apps in Apple’s App Store and billions of downloads. If Steve Jobs<br />
is supposed to be the great villain of independent innovation, he seems to be<br />
doing a pretty bad job at it! “The App Store is, by some estimates, now a multibillion-dollar-a-year<br />
business,” note Grimmelmann and Ohm. 56 “The iPhone is<br />
a hotbed of creative tinkering; people are doing amazing things with it.” 57<br />
In fact, Wu admits Apple’s App Store offers a “seemingly unlimited variety of<br />
functions” and that “Apple does allow outsiders to develop applications on its<br />
platform” since “the defeat of the Macintosh by Windows taught Jobs that a<br />
platform completely closed to outside developers is suicide.” 58 That should be<br />
the end of the story. Yet Wu’s fear of that big proverbial “kill switch” overrides<br />
all: Any day now, that switch will be thrown and Lessig’s pessimistic predictions<br />
of “perfect control” will finally come to pass, he implies. As Wu says, “all<br />
innovation and functionality are ultimately subject to Apple’s veto.” 59 And<br />
consider the lament of Tom Conlon of Popular Science: “Once we replace the<br />
personal computer with a closed-platform device such as the iPad, we replace<br />
Among Recent Smartphone Buyers, NIELSEN WIRE, Oct. 5, 2010,<br />
http://blog.nielsen.com/nielsenwire/online_mobile/android-most-popular-operating-system-in-u-s-among-recent-smartphone-buyers<br />
54 Tricia Duryee, Apple Continued To Lose U.S. Marketshare Despite Spike From iPhone 4 Sales,<br />
MOCONEWS.NET, Sept. 15, 2010, http://moconews.net/article/419-apple-continued-to-lose-u.s.-marketshare-despite-spike-from-iphone-4-sa;<br />
Miguel Helft, The iPhone<br />
Has a Real Fight on Its Hands, NEW YORK TIMES BITS, Oct. 5, 2010,<br />
http://bits.blogs.nytimes.com/2010/10/05/the-iphone-has-a-real-fight-on-its-hands/<br />
55 Jonathan Zittrain, Has the Future of the Internet Happened?, Sept. 7, 2010, CONCURRING<br />
OPINIONS blog, www.concurringopinions.com/archives/2010/09/has-the-future-of-the-internet-come-about.html<br />
56 Grimmelmann & Ohm, supra note 18 at 923.<br />
57 Id.<br />
58 Wu, supra note 5 at 292.<br />
59 Id.
freedom, choice, and the free market with oppression, censorship, and<br />
monopoly.” 60 But Apple is hardly the only game in town, and each time Apple<br />
creates a new product category (iPod, iPhone, iPad, etc.), other companies are<br />
quick to follow with their own, usually more open systems, often running<br />
Google’s Android operating system.<br />
Neither Wu nor Zittrain, however, spends much time investigating how often<br />
their proverbial kill switch is actually thrown—by Apple or anyone else. There<br />
have been a handful of examples, but those are hardly the rule. The vast<br />
majority of all applications are immediately accepted and offered on the<br />
platform. Moreover, if they were blocked, they could quickly be found on other<br />
platforms. Again, there are plenty of alternatives to Apple products if you don’t<br />
like their (somewhat) more restrictive policies regarding application<br />
development.<br />
Bottom line: Today’s supposed “walled gardens” are less “walled” than ever<br />
before, and “closed” systems aren’t really so closed.<br />
The Internet W<strong>as</strong> Never Quite<br />
So Open or Generative<br />
At times, Zittrain and others seem to have created an Internet imago: an<br />
idealized conception of a supposedly better time when cyberspace was more open<br />
and vibrant. But let’s face it, the “good ol’ days” that many Openness<br />
Evangelicals seem to be longing for weren’t really so glorious. Were you online<br />
back in 1994? Did you enjoy Trumpet Winsock and noisy 14.4 kbps modems?<br />
Did you like loading up multiple 5¼-inch floppy disks just to boot your<br />
machine? Needless to say, most of us don’t miss those days.<br />
Here’s the other forgotten factor about the Net’s early history: Until the Net<br />
was commercialized, it was an extremely closed system. As Geert Lovink<br />
reminds us:<br />
[In] [t]he first decades[,] the Internet was a closed world, only<br />
accessible to (Western) academics and the U.S. military. In<br />
order to access the Internet one had to be an academic<br />
computer scientist or a physicist. Until the early nineties it was<br />
not possible for ordinary citizens, artists, business[es] or<br />
activists, in the USA or elsewhere, to obtain an email address<br />
60 Tom Conlon, The iPad’s Closed System: Sometimes I Hate Being Right, POPULAR SCIENCE, Jan. 29,<br />
2010, www.popsci.com/gadgets/article/2010-01/ipad%E2%80%99s-closed-systemsometimes-i-hate-being-right
and make use of the rudimentary UNIX-based applications. …<br />
It w<strong>as</strong> a network of networks—but still a closed one. 61<br />
Ironically, it was only because Lessig and Zittrain’s much-dreaded AOL and<br />
CompuServe came along that many folks were even able to experience and<br />
enjoy this strange new world called the Internet. “The fact that millions of<br />
Americans for the first time experienced the Internet through services like AOL<br />
(and continue to do so) is a reality that Zittrain simply overlooks,” notes<br />
Lovink. 62 Could it be that those glorious “good ol’ days” Zittrain longs for were<br />
really due to the way closed “walled gardens” like AOL and CompuServe held<br />
our hands to some extent and gave many new Netizens a guided tour of<br />
cyberspace?<br />
Regardless, we need not relitigate that period. It’s ancient history<br />
now because the walls around those gardens came crumbling down.<br />
Summary<br />
When you peel away all the techno-talk and hand-wringing, what Zittrain and<br />
other Openness Evangelicals object to is the fact that some people are making<br />
choices that they don’t approve of. To be generous, perhaps it’s because they<br />
believe that the “mere mortals” don’t fully understand the supposed dangers of<br />
the choices they are making. But my contention here has been that things just<br />
aren’t as bad as they make them out to be. More pointedly, who are these critics<br />
to say those choices are irrational?<br />
Again, so what if some mere mortals choose more “closed” devices or<br />
platforms because they require less tinkering and “just work”? It isn’t the end of<br />
the world. Those devices or platforms aren’t really as closed as the critics suggest—<br />
in fact, they are far more open in some ways than the earlier technologies and<br />
platforms Zittrain et al. glorify. And it simply doesn’t follow that, just because<br />
some consumers choose to use “appliances,” the generative devices that others<br />
so cherish are doomed. “General-purpose computers are so useful that<br />
we’re not likely to abandon them,” notes Princeton University computer science<br />
professor Ed Felten. 63 For example, an October 2010 NPD Group survey<br />
61 Geert Lovink, Zittrain’s Foundational Myth of the Open Internet, NET CRITIQUE BY GEERT<br />
LOVINK, Oct. 12, 2008,<br />
http://networkcultures.org/wpmu/geert/2008/10/12/zittrains-foundational-myth-of-the-open-internet/<br />
62 Id.<br />
63 Ed Felten, iPad to Test Zittrain’s “Future of the Internet” Thesis, FREEDOM TO TINKER blog, Feb.<br />
4, 2010, www.freedom-to-tinker.com/blog/felten/ipad-test-zittrains-future-internet-thesis
revealed that “contrary to popular belief, the iPad isn’t causing cannibalization<br />
in the PC market because iPad owners don’t exhibit the same buying and<br />
ownership patterns as the typical consumer electronics customer.” 64 According<br />
to NPD, only 13% of iPad owners surveyed bought an iPad instead of a PC,<br />
while 24% replaced a planned e-reader purchase with an iPad. Thus, to the<br />
extent the iPad was replacing anything, it would be other “non-generative”<br />
devices like e-readers.<br />
In a similar vein, James Watters, Senior Manager of Cloud Solutions<br />
Development at VMware, argues:<br />
Innovation will be alive and well because the fundamental<br />
technologies at the core of cloud computing are designed for<br />
massive, vibrant, explosive, awesome, and amazing application<br />
innovation. There will always be a big place in the market for<br />
companies who achieve design simplicity by limiting what can<br />
be done on their platforms—Apple and Facebook may march<br />
to massive market share by this principle—but as long as the<br />
technologies underpinning the network are open,<br />
programmable, extensible, modular, and dynamic as they are<br />
and will be, innovation is in good hands. 65<br />
Thus, we can have the best of both worlds—a world full of plenty of “tethered”<br />
appliances, but also plenty of generativity and openness. We need not make a<br />
choice between the two, and we certainly shouldn’t be demanding someone else<br />
make it for us.<br />
Against the St<strong>as</strong>is Mentality<br />
& Static Snapshots<br />
There are some important practical questions that the Openness Evangelicals<br />
often fail to acknowledge in their work. Beyond the thorny question of how to<br />
define “openness” and “generativity,” what metric should be used when existing<br />
yardsticks become obsolete so regularly?<br />
This points to two major failings in the work of all the cyber-collectivists—<br />
Lessig in Code, Zittrain in Future of the Internet, and Wu in The Master Switch:<br />
64 Nearly 90 Percent of Initial iPad Sales are Incremental and not Cannibalizing the PC Market, According<br />
to NPD, NPD GROUP PRESS RELEASE, Oct. 1, 2010,<br />
www.npd.com/press/rele<strong>as</strong>es/press_101001.html<br />
65 James Watters, NYT Kicks Off Cloud Paranoia Series, SILICONANGLE blog, July 21, 2009,<br />
http://siliconangle.com/blog/2009/07/21/nyt-kicks-off-cloud-paranoia-editorial-series
1. They have a tendency to adopt a static, snapshot view of markets and<br />
innovation; and,<br />
2. They often express an overly nostalgic view of the past (without<br />
making it clear when the “good ol’ days” began and ended) while<br />
adopting an excessively pessimistic view of the present and the chances<br />
for progress in the future.<br />
This is what Virginia Postrel was referring to in The Future and Its Enemies when<br />
she criticized the stasis mentality because “It overvalues the tastes of an<br />
articulate elite, compares the real world of trade-offs to fantasies of utopia,<br />
omits important details and connections, and confuses temporary growing pains<br />
with permanent catastrophes.” 66 And it is what economist Israel Kirzner was<br />
speaking of when he warned of “the shortsightedness of those who, not<br />
recognizing the open-ended character of entrepreneurial discovery, repeatedly<br />
fall into the trap of forecasting the future against the background of today’s<br />
expectations rather than against the unknowable background of tomorrow’s<br />
discoveries.” 67<br />
Indeed, there seems to be a complete lack of appreciation among the Openness<br />
Evangelicals for just how rapid and unpredictable the pace of change in the<br />
digital realm has been and will likely continue to be. The relentlessness and<br />
intensity of technological disruption in the digital economy is truly<br />
unprecedented but often under-appreciated. We’ve had multiple mini-industrial<br />
revolutions within the digital ecosystem over the past 15 years. Again, this is<br />
“evolutionary dynamism” at work. (Actually, it’s more like revolutionary<br />
dynamism!) Nothing—absolutely nothing—that was sitting on our desks in 1995<br />
is still there today (in terms of digital hardware and software). It’s unlikely that<br />
much of what was on our desks in 2005 is still there either—with the possible<br />
exception of some crusty desktop computers running Windows XP. Thus, at a<br />
minimum, analysts of innovation in this space “should … extend the time<br />
horizon for our assessment of the generative ecosystem” 68 to ensure they are<br />
not guilty of the static snapshot problem.<br />
Speaking of Windows, it perfectly illustrates the complexity of defining<br />
generative systems. Compare the half-life of Windows PC operating systems—<br />
which Zittrain indirectly glorifies in his book as generativity nirvana—to the<br />
half-life of Android operating systems. Both Apple and Android-based devices<br />
66 VIRGINIA POSTREL, THE FUTURE AND ITS ENEMIES (1998), at xvii-xviii.<br />
67 ISRAEL KIRZNER, DISCOVERY AND THE CAPITALIST PROCESS (University of Chicago Press,<br />
1985), at xi.<br />
68 Grimmelmann & Ohm, supra note 18 at 947.
have seen multiple OS upgrades since release. Some application developers<br />
actually complain about this frantic pace of mobile OS “revolutions,” especially<br />
with the Android OS, since they must deal with multiple devices and OS<br />
versions instead of just one Apple iPhone. They’d rather see more OS<br />
consistency among the Android devices for which they’re developing to<br />
facilitate quicker and more stable rollouts. They also have to consider whether<br />
and how to develop the same app for several other competing platforms.<br />
Meanwhile, Windows has offered developers a more “stable” platform because<br />
Microsoft rolls out OS upgrades at a much slower pace.<br />
Should we consider an OS with a slower upgrade trajectory more<br />
“generative” than an OS that experiences constant upgrades if, in practice, the<br />
former allows for more “open” (and potentially rapid) independent innovation<br />
by third parties? Of course, there are other factors that play into the “generativity”<br />
equation, 69 but it would be no small irony to place the Windows PC model on<br />
a higher pedestal of generativity than the more rapidly evolving mobile OS<br />
ecosystem.<br />
Conclusion: Toward Evolutionary<br />
Dynamism & Technological Agnosticism<br />
Whether we are debating where various devices sit on a generativity continuum<br />
(of “open” versus “closed” systems), or what fits where on a “code failure”<br />
continuum (of “perfect code” versus “market failure”), the key point is that the<br />
continuum itself is constantly evolving and that this evolution is taking place at a much<br />
faster clip in this arena than it does in other markets. Coders don’t sit still.<br />
People innovate around “failure.” Indeed, “market failure” is really just the<br />
glass-is-half-empty view of a golden opportunity for innovation. Markets<br />
evolve. New ide<strong>as</strong>, innovations, and companies are born. Things generally<br />
change for the better—and do so rapidly.<br />
69 “[G]enerativity is essential but can never be absolute. No technological system is perfectly<br />
generative at all levels, for all users, forever. Tradeoffs are inevitable.” Grimmelmann &<br />
Ohm, supra note 18 at 923.
In light of the radical revolutions constantly unfolding in this space and<br />
upending existing models, it’s vitally important we avoid “defining down”<br />
market failure. This is not based on a blind faith in free markets, but rather a<br />
profound appreciation for the fact that in markets built upon code, the pace and nature<br />
of change is unrelenting and utterly unpredictable. Contra Lessig’s lament in Code that<br />
“Left to itself, cyberspace will become a perfect tool of control”—cyberspace<br />
has proven far more difficult to “control” or regulate than any of us ever<br />
imagined. Again, the volume and pace of technological innovation we have<br />
witnessed over the past decade has been nothing short of stunning.<br />
We need to give evolutionary dynamism a chance. Sometimes it’s during what<br />
appears to be a given sector’s darkest hour that the most exciting things are<br />
happening within it—as the AOL case study illustrates. It’s easy to forget all the<br />
anxiety surrounding AOL and its “market power” circa 1999-2002, when<br />
scholars like Lessig predicted that the company’s walled garden approach would<br />
eventually spread and become the norm for cyberspace. As made clear in the<br />
breakout above, however, the exact opposite proved to be the case. The critics<br />
said the sky would fall, but it most certainly did not.<br />
Similarly, in the late 1990s, many critics—including governments both here and<br />
in the EU—claimed that Microsoft dominated the browser market. Dour<br />
predictions of perpetual Internet Explorer lock-in followed. For a short time,<br />
there was some truth to this. But innovators weren’t just sitting still; exciting<br />
things were happening. In particular, the seeds were being planted for the rise<br />
of Firefox and Chrome as robust challengers to IE’s dominance—not to<br />
mention mobile browsers. Of course, it’s true that roughly half of all web surfers
still use a version of IE today. But IE’s share of the market is falling rapidly 70 as<br />
viable, impressive alternatives now exist and innovation among these<br />
competitors is more vibrant than ever. 71 That’s all that counts. The world<br />
changed, and for the better, despite all the doomsday predictions we heard less<br />
than a decade ago about Microsoft’s potential dominance of cyberspace.<br />
Moreover, all the innovation taking place at the browser layer today certainly<br />
undercuts the gloomy “death of the Net” thesis set forth by Zittrain and others.<br />
Thus, as O’Reilly argues, this case study again shows us the power of open<br />
systems and evolutionary dynamism:<br />
Just as Microsoft appeared to have everything locked down in<br />
the PC industry, the open Internet restarted the game, away<br />
from what everyone thought was the main action. I guarantee<br />
that if anyone gets a lock on the mobile Internet, the same<br />
thing will happen. We’ll be surprised by the innovation that<br />
starts happening somewhere else, out on the free edges. And<br />
that free edge will eventually become the new center, because<br />
open is where innovation happens. […] it’s far too early to call<br />
the open web dead, just because some big media companies<br />
are excited about the app ecosystem. I predict that those same<br />
big media companies are going to get their clocks cleaned by<br />
small innovators, just as they did on the web. 72<br />
In sum, history counsels patience and humility in the face of radical uncertainty<br />
and unprecedented change. More generally, it counsels what we might call<br />
“technological agnosticism.” We should avoid declaring “openness” a<br />
sacrosanct principle and making everything else subservient to it without regard<br />
to cost or consumer desires. As Anderson notes, “there are many Web<br />
triumphalists who still believe that there is only One True Way, and will fight to<br />
the death to preserve the open, searchable common platform that the Web<br />
represented for most of its first two decades (before Apple and Facebook, to<br />
name two, decided that there were Other Ways).” 73 The better position is one<br />
based on a general agnosticism regarding the nature of technological platforms<br />
and change. In this view, the spontaneous evolution of markets h<strong>as</strong> value in its<br />
70 Tim Stevens, Internet Explorer Falls Below 50 Percent Global Marketshare, Chrome Usage Triples,<br />
ENGADGET, Oct. 5, 2010, www.engadget.com/2010/10/05/internet-explorer-falls-below-50-percent-global-marketshare-chr<br />
71 Nick Wingfield & Don Clark, Browsers Get a Face-Lift, WALL STREET JOURNAL, Sept. 15, 2010,<br />
http://online.wsj.com/article/SB10001424052748704285104575492102514582856.html<br />
72 The Web is Dead? A Debate, WIRED, Aug. 17, 2010,<br />
www.wired.com/magazine/2010/08/ff_webrip_debate/all/1<br />
73 Id.
own right, and continued experimentation with new models—be they “open”<br />
or “closed,” “generative” or “tethered”—should be permitted.<br />
Importantly, one need not believe that the markets in code are “perfectly<br />
competitive” to accept that they are “competitive enough” compared to the<br />
alternatives—especially those re-shaped by regulation. “Code failures” are<br />
ultimately better addressed by voluntary, spontaneous, bottom-up, marketplace<br />
responses than by coerced, top-down, governmental solutions. Moreover, the<br />
decisive advantage of the market-driven, evolutionary approach to correcting<br />
code failure comes down to the rapidity and nimbleness of those responses.<br />
Let’s give those other forces—alternative platforms, new innovators, social<br />
norms, public pressure, etc.—a chance to work some magic. Evolution happens,<br />
if you let it.
CHAPTER 3<br />
IS INTERNET EXCEPTIONALISM DEAD?<br />
The Third Wave of Internet Exceptionalism 165<br />
Eric Goldman<br />
A Declaration of the Dependence of Cyberspace 169<br />
Alex Kozinski and Josh Goldfoot<br />
Is Internet Exceptionalism Dead? 179<br />
Tim Wu<br />
Section 230 of the CDA: Internet Exceptionalism<br />
<strong>as</strong> a Statutory Construct 189<br />
H. Brian Holland<br />
Internet Exceptionalism Revisited 209<br />
Mark MacCarthy<br />
164 CHAPTER 3: IS INTERNET EXCEPTIONALISM DEAD?
The Third Wave of Internet<br />
Exceptionalism<br />
By Eric Goldman *<br />
From the beginning, the Internet has been viewed as something special and<br />
“unique.” For example, in 1996, a judge called the Internet “a unique and<br />
wholly new medium of worldwide human communication.” 1<br />
The Internet’s perceived novelty has prompted regulators to engage in “Internet<br />
exceptionalism”: crafting Internet-specific laws that diverge from regulatory<br />
precedents in other media. Internet exceptionalism has come in three distinct<br />
waves:<br />
The First Wave: Internet Utopianism<br />
In the mid-1990s, some people fantasized about an Internet “utopia” that<br />
would overcome the problems inherent in other media. Some regulators,<br />
fearing disruption of this possible utopia, sought to treat the Internet more<br />
favorably than other media.<br />
47 U.S.C. § 230 (“Section 230”—a law still on the books) is a flagship example<br />
of mid-1990s efforts to preserve Internet utopianism. The statute categorically<br />
immunizes online providers from liability for publishing most types of third<br />
party content. It was enacted (in part) “to preserve the vibrant and competitive<br />
free market that presently exists for the Internet and other interactive computer<br />
services, unfettered by Federal or State regulation.” 2 The statute is clearly<br />
exceptionalist because it treats online providers more favorably than offline<br />
publishers—even when they publish identical content.<br />
The Second Wave: Internet Paranoia<br />
Later in the 1990s, the regulatory pendulum swung in the other direction.<br />
Regulators still embraced Internet exceptionalism, but instead of favoring the<br />
Internet, regulators treated the Internet more harshly than analogous offline<br />
activity.<br />
For example, in 2005, a Texas website called Live-shot.com announced that it<br />
would offer “Internet hunting.” The website allowed paying customers to<br />
* Associate Professor and Director, High Tech Law Institute, Santa Clara University School of<br />
Law. Email: egoldman@gmail.com. Website: http://www.ericgoldman.org.<br />
1 American Civil Liberties Union v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996).<br />
2 47 U.S.C. § 230(b)(2).
control, via the Internet, a gun on its game farm. An employee manually<br />
monitored the gun and could override the customer’s instructions. The website<br />
wanted to give people who could not otherwise hunt, such as paraplegics, the<br />
opportunity to enjoy the hunting experience. 3<br />
The regulatory reaction to Internet hunting was swift and severe. Over three<br />
dozen states banned Internet hunting. 4 California also banned Internet fishing<br />
for good me<strong>as</strong>ure. 5 However, regulators never explained how Internet hunting<br />
is more objectionable than physical space hunting.<br />
For example, California Sen. Debra Bowen criticized Internet hunting because it<br />
“isn’t hunting; it’s an inhumane, over the top, pay-per-view video game using<br />
live animals for target practice … . Shooting live animals over the Internet<br />
takes absolutely zero hunting skills, and it ought to be offensive to every<br />
legitimate hunter.” 6<br />
Sen. Bowen’s remarks reflect numerous unexpressed assumptions about the<br />
nature of “hunting” and what constitutes fair play. In the end, however,<br />
hunting may just be “hunting,” in which case the response to Internet hunting<br />
may just be a typical example of adverse Internet exceptionalism. 7<br />
The Third Wave:<br />
Exceptionalism Proliferation<br />
The past few years have brought a new regulatory trend. Regulators are still<br />
engaged in Internet exceptionalism, but each new advance in Internet<br />
technology has prompted exceptionalist regulations towards that technology.<br />
For example, the emergence of blogs and virtual worlds has helped initiate a<br />
push towards blog-specific and virtual world-specific regulation. In effect,<br />
Internet exceptionalism has splintered into pockets of smaller exceptionalist<br />
efforts.<br />
3 Sylvia Moreno, Mouse Click Brings Home Thrill of the Hunt, WASH. POST, May 8, 2005.<br />
4 Internet Hunting Bans, The Humane Society of the United States,<br />
http://www.hsus.org/web-files/PDF/internethunting_map.pdf (last visited Aug. 23,<br />
2010).<br />
5 Zachary M. Seward, Internet Hunting Has Got to Stop – If It Ever Starts, WALL ST. J., Aug. 10,<br />
2007.<br />
6 Michael Gardner, Web ‘Hunts’ in Cross Hairs of Lawmakers, S.D. UNION-TRIBUNE, Apr. 6,<br />
2005.<br />
7 Eric Goldman, A Web Site for Hunting Poses Questions About Killing, S.J. MERCURY NEWS, July<br />
25, 2005.
Regulatory responses to social networking sites like Facebook and MySpace are<br />
a prime example of Internet exceptionalism splintering. Rather than regulating<br />
these sites like other websites, regulators have sought social networking<br />
site-specific laws, such as requirements to verify users’ age, 8 combat sexual<br />
predators 9 and suppress content that promotes violence. 10 The result is that the<br />
regulation of social networking sites differs not only from offline enterprises but<br />
from other websites as well.<br />
Implications<br />
Internet exceptionalism—either favoring or disfavoring the Internet—is not<br />
inherently bad. In some cases, the Internet truly is unique, special or different<br />
and should be regulated accordingly. Unfortunately, more typically,<br />
anti-Internet exceptionalism cannot be analytically justified and instead reflects<br />
regulatory panic.<br />
In these cases, anti-Internet regulatory exceptionalism can be harmful, especially<br />
to Internet entrepreneurs and their investors. It can distort the marketplace<br />
between Web enterprises and their offline competition by hindering the Web<br />
business’ ability to compete. In extreme cases, such as Internet hunting,<br />
unjustified regulatory intervention may put companies out of business.<br />
Accordingly, before enacting any exceptionalist Internet regulation (and<br />
especially any anti-Internet regulation), regulators should articulate how the<br />
Internet is unique, special or different and explain why these differences justify<br />
exceptionalism. Unfortunately, emotional overreactions to perceived Internet<br />
threats or harms typically trump such a rational regulatory process. Knowing<br />
this tendency, perhaps we can better resist that temptation.<br />
8 Nick Alexander, Attorneys General Announce Agreement With MySpace Regarding Social Networking<br />
Safety, NAA GAZETTE, Jan. 18, 2008,<br />
http://www.naag.org/attorneys_general_announce_agreement_with_myspace_rega<br />
rding_social_networking_safety.php; Brad Stone, Facebook Settles with New York, N.Y.<br />
TIMES BITS BLOG, Oct. 16, 2007, http://bits.blogs.nytimes.com/2007/10/16/facebook-settles-with-new-york/.<br />
9 KIDS Act of 2007 (H.R. 719/S. 431) (requiring sexual predators to register their email<br />
addresses and other screen names and enabling social networking sites to access those<br />
electronic identifiers so that the sexual predators can be blocked from registering with the<br />
social networking sites).<br />
10 H. Res. 224 (2007) (resolution requesting that social networking sites proactively remove<br />
“enemy propaganda from their sites,” such <strong>as</strong> videos made by terrorists).
A Declaration of the<br />
Dependence of Cyberspace<br />
By Hon. Alex Kozinski * & Josh Goldfoot **<br />
Governments of the Industrial World, you weary giants of flesh and steel,<br />
I come from Cyberspace, the new home of Mind. On behalf of the future,<br />
I ask you of the past to leave us alone. You are not welcome among us.<br />
You have no sovereignty where we gather. 1<br />
That was the opening of “A Declaration of the Independence of Cyberspace.”<br />
The would-be Cyber-Jefferson who wrote it was John Perry Barlow, a co-founder<br />
of the Electronic Frontier Foundation, a noted libertarian and a<br />
Grateful Dead lyricist. He delivered the Declaration on February 8, 1996, the<br />
same day that President Clinton signed into law the Communications Decency<br />
Act. That Act was chiefly an early effort to regulate Internet pornography.<br />
Many had concerns about that law, and, indeed, the Supreme Court would<br />
eventually declare most of it unconstitutional. 2<br />
Barlow’s argument invoked what he believed was a more decisive criticism than<br />
anything the Supreme Court could come up with. Barlow saw the Internet as<br />
literally untouchable by our laws. Extolling the power of anonymity, he taunted<br />
that “our identities have no bodies, so, unlike you, we cannot obtain order by<br />
physical coercion.” Unlike the Declaration of Independence, this was not a<br />
declaration that cyberspace was newly independent; it was an observation that<br />
cyberspace had always been independent and would always remain independent,<br />
because its denizens were beyond the law’s reach.<br />
Needless to say, the weary giants of flesh and steel did not take kindly to the<br />
Declaration. They fought back hard and won numerous battles: witness the fall<br />
of Napster, Grokster, Aimster and innumerable other file-sharing and<br />
child-pornography-trading sites and services. Ironically, the Department of<br />
* Chief Judge, United States Court of Appeals for the Ninth Circuit.<br />
** B.A., Yale University; J.D., University of Virginia School of Law; Trial Attorney, Department<br />
of Justice, Criminal Division, Computer Crime & Intellectual Property Section. This essay<br />
was originally published in 32 COLUMBIA JOURNAL OF LAW & THE ARTS, no. 4, 2009 at 365.<br />
The views expressed in this essay are the views of the authors and do not necessarily<br />
represent the views of the U.S. Department of Justice or the United States.<br />
1 John Perry Barlow, A Declaration of the Independence of Cyberspace (Feb. 8, 1996),<br />
http://homes.eff.org/~barlow/Declaration-Final.html.<br />
2 Reno v. American Civil Liberties Union, 521 U.S. 844 (1997).
Homeland Security now has a “National Strategy to Secure Cyberspace.” 3 Even<br />
the cyber-libertarians have shifted their focus: The Electronic Frontier<br />
Foundation, which Barlow co-founded, now accepts that there may be a place<br />
for so-called “network neutrality” regulation, even though it regulates how<br />
subscribers access the Internet and how content reaches them. 4<br />
In other ways, the Declaration has proved prescient. As far back as 1996,<br />
Barlow had identified that the Internet poses a significant problem for<br />
governments. Then, <strong>as</strong> now, people used the Internet to break the law. The<br />
Internet gives those people two powerful tools that help them escape the law’s<br />
efforts to find and punish them. First, the Internet makes anonymity easy.<br />
Today any 11-year-old can obtain a free e-mail account, free website and free<br />
video hosting. The companies that provide these things ask for your name, but<br />
they make no effort to verify your answer; as a result, only Boy Scouts tell them<br />
the truth. You can be tracked through your Internet protocol (IP) address, but<br />
it is not too tough to use proxies or some neighbor’s open Wi-Fi connection to<br />
get around that problem. Thus, if your online conduct ever hurts someone, it<br />
will be difficult for the victim to ever find out who you are and sue you.<br />
Second, the Internet makes long-distance international communication cheap.<br />
This allows the world’s miscreants, con-artists and thieves easy access to our<br />
gullible citizens. When people find out they’ve been had, they often find that<br />
they have no practical recourse because of the extraordinary difficulties involved<br />
in pursuing someone overseas. The Internet’s global nature makes it easy for<br />
people to hide from our courts.<br />
These two advantages of Internet law-breakers pose a serious and recurring<br />
problem. That problem has been particularly painful for intellectual property<br />
rights holders. It is common knowledge that instead of buying music or<br />
movies, you can use the Internet to download perfect copies for free from<br />
individuals known only by their IP addresses. In some cases, wrongdoers have<br />
become so bold that they demand payment in exchange for the opportunity to<br />
download infringing material.<br />
The situation seemed unsolvable to Barlow and others in 1996. Armed with<br />
anonymity and invulnerability, Internet actors could ignore efforts to apply law<br />
to the Internet. Barlow concluded that the Internet’s nature posed an<br />
insurmountable barrier to any effort at legal enforcement. Some scholars even<br />
3 The National Strategy to Secure Cyberspace, Feb. 2003,<br />
http://www.dhs.gov/xlibrary/assets/National_Cyberspace_Strategy.pdf.<br />
4 See https://www.eff.org/files/filenode/nn/EFFNNcomments.pdf.
began work on theorizing how the diverse denizens of cyberspace might join<br />
together and go about creating their own indigenous legal system. 5<br />
But over time, a solution to Barlow’s problem appeared. Let us entertain, for a<br />
moment, the conceit that there is a “cyberspace,” populated by people who<br />
communicate online. The denizens of cyberspace exist simultaneously in<br />
cyberspace and in the real flesh-and-steel world. Their cyberspace selves can be<br />
completely anonymous; their real-life selves are easier to identify. Their<br />
cyberspace selves have no physical presence; their real-life selves both exist and<br />
have base material desires for PlayStations, Porsche Boxsters and Battlestar<br />
Galactica memorabilia. Their physical selves can be found in the real world and<br />
made to pay in real dollars or serve real time behind real bars for the damage<br />
their cyber-selves cause.<br />
The dilemma that online law-breakers face is that their cyberspace crimes have<br />
real-life motives and fulfill real-life needs. Therefore, they need some way to<br />
translate their online misdeeds into offline benefits. The teenager downloads an<br />
MP3 so that he can listen to it. The con-artist asks for money to be wired to<br />
him so that he can withdraw it and buy things with it. The fringe activist who<br />
e-mails a death threat to a judge does so in the hopes that the judge will change<br />
his behavior in the real world.<br />
These Internet actors usually rely on real-world institutions to get what they<br />
want. They use Internet Service Providers (ISPs) and hosting companies to<br />
communicate, and they use banks and credit card companies to turn online<br />
gains into c<strong>as</strong>h. Without these institutions, they either could not accomplish<br />
their online harms, or they would not be able to benefit from them in the real<br />
world. Unlike anonymous cyberspace miscreants, however, these institutions<br />
have street addresses and real, physical assets that can satisfy judgments in the<br />
United States. By placing pressure on those institutions to cut off service to<br />
customers who break the law, we can indirectly place pressure on Internet<br />
wrong-doers. Through this pressure, we have a powerful tool to promote<br />
online compliance with the law.<br />
In some cases, for some offenses, we have the legal tools to do this already. For<br />
intellectual property cases, the tool for holding those institutions liable is<br />
secondary liability: contributory and vicarious infringement. The Ninth Circuit<br />
has led the way in developing the law in this area. In Perfect 10 v. Google, the<br />
court noted the cases that had applied contributory infringement to Internet<br />
actors, and summarized their holdings as saying that “a computer system<br />
operator can be held contributorily liable if it has actual knowledge that specific<br />
infringing material is available using its system … and can take simple measures<br />
5 See, e.g., David G. Post, Anarchy, State, and the Internet: An Essay on Law-Making in Cyberspace,<br />
JOURNAL OF ONLINE LAW, Article 3, (1995), available at http://ssrn.com/abstract=943456.
to prevent further damage to copyrighted works … yet continues to provide<br />
access to infringing works.” 6 In other words, if people are using your stuff to<br />
infringe copyrights, and you know about it, and you can easily stop them, but<br />
you do not, then you are on the hook.<br />
The motive behind secondary liability is simple. Everyone agrees that the direct<br />
infringers ideally should be the ones to pay. But there might be too many of<br />
them to sue; or, they might be anonymous; or, they might be in Nigeria. This<br />
can make them apparently invulnerable to lawsuits. That invulnerability has a<br />
cause: someone is providing the tools to infringe and looking the other way.<br />
The doctrine of secondary liability says that such behavior is unacceptable.<br />
Those who provide powerful tools that can be used for good or evil have some<br />
responsibility to make sure that those tools are used responsibly.<br />
Put more directly: with some changes to the law, the institutions that enable the<br />
anonymity and invulnerability of cyberspace denizens can be held accountable<br />
for what their anonymous and invulnerable customers do. The anonymity of<br />
cyberspace is as much a creation of men as it is a creation of computers. It is<br />
the result of policy choices. We have accepted, without serious examination,<br />
that it is perfectly fine for a business to grant free Web space and e-mail to any<br />
schmuck who comes off the street with an IP address, and then either keep no<br />
record of that grant or discard the record quickly. Businesses that do this are<br />
lending their powerful and potentially harmful capabilities and demanding little<br />
accountability in return. That arrangement has obvious benefits but also<br />
obvious costs. The victims of online torts and crimes bear these costs, and<br />
those victims are, overwhelmingly, third parties. They include big movie<br />
studios, middle-aged Internet newbies and, unfortunately in some cases, young<br />
children.<br />
If the legal rules change, and companies are held liable more often for what<br />
their users do, then the cost of anonymity would shift away from victims and<br />
toward the providers. In this world, providers will be more careful about<br />
identifying users. Perhaps online <strong>as</strong>sertions of identity will be backed up with<br />
offline proof; providers will be more careful about providing potential scam<br />
artists in distant jurisdictions with the tools to practice their craft. All this<br />
would be expensive for service providers, but not as expensive as it is for<br />
injured parties today.<br />
Secondary liability should not reach every company that plays any hand in<br />
assisting the online wrong-doer, of course. Before secondary liability attaches,<br />
the plaintiff must show that the defendant provided a crucial service, knew of<br />
the illegal activity, and had a right and a cost-justified ability to control the<br />
6 Perfect 10, Inc. v. Amazon.com, Inc., 508 F.3d 1146, 1172 (9th Cir. 2007) (quotations,<br />
citations and italics omitted).
infringer’s actions. This rule will in almost every case exclude electrical utilities,<br />
landlords, and others whose contributions to illegal activity are minuscule.<br />
While we have come a long way from Barlow’s Declaration of the<br />
Independence of Cyberspace, the central idea behind it—that the Internet is a<br />
special place, separate somehow from the brick and mortar world, and thus<br />
subject to special rules and regulations, or no rules and regulations—lingers.<br />
The name itself has a powerful influence: we don’t speak of “telephone-space”<br />
or “radio-space” or “TV-space”—though we do have Television City in<br />
Hollywood. Prior technological advances that aided in connecting people were<br />
generally recognized as tools to aid life in the real world; no one claimed that<br />
they made up a separate dimension that is somehow different and separate from<br />
the real world. Every time we use the term “cyberspace” or the now-outmoded<br />
“Information Superhighway,” we buy into the idea that the world-wide network<br />
of computers that people use for electronic commerce and communication is a<br />
separate, organic entity that is entitled to special treatment.<br />
This idea of cyberspace <strong>as</strong> a separate place subject to a different set of rules—<br />
one where courts ought to tread lightly lest they disturb the natural order of<br />
things and thereby cause great harm—still arises in many court cases. 7<br />
The first of these is Perfect 10 v. Visa—a case where one of the authors of this<br />
piece was in the dissent. 8 The facts are simple: plaintiff produces and owns<br />
pictures of scantily-clad young women, which it sells online. It alleged that<br />
unknown parties had copied the pictures and were selling them online, at a<br />
lower price, using servers in remote locations where the legal system was not<br />
hospitable to copyright and trademark lawsuits, and, moreover, they could fold<br />
up their tents and open up business elsewhere if anyone really tried to pursue<br />
them. So the plaintiff didn’t try to sue the primary infringers; instead, it went<br />
after the credit card companies that were processing the payments for what they<br />
claimed were pirated photographs.<br />
7 Some disclaimers: One of the authors of this piece (Chief Judge Kozinski) sat on the panel<br />
that decided some of the cases given as examples here. He wants to make it clear that he<br />
won’t re-argue the cases here. Both involved split decisions, and his views as to how those<br />
cases should have come out are set out in his opinions in those cases. His colleagues on the<br />
other side are not present to argue their positions and, in any event, it’s unseemly to continue<br />
a judicial debate after the case is over. Furthermore, despite his disagreement with his<br />
colleagues, he respects and appreciates their views. The judges that came out the other way<br />
are some of the dearest of his colleagues, and some of the finest judges anywhere. The<br />
disagreement is troubling, because they bring a wealth of intelligence, diligence, talent,<br />
experience and objectivity to the problem, and he can’t quite figure out why they see things<br />
so differently.<br />
8 Perfect 10, Inc. v. Visa Int’l Serv. Ass’n, 494 F.3d 788 (9th Cir. 2007).
This was far from the first case that applied the doctrine of secondary<br />
infringement to electronic commerce. The cases go back at least to the 1995<br />
case of Religious Technology Center v. Netcom, 9 a case involving the liability of an ISP<br />
for damage caused when it posted copyrighted Scientology documents to<br />
USENET, at the direction of one of its users. And, of course, the Napster,<br />
Aimster and Grokster cases all dealt with the secondary liability of those who<br />
assist others in infringement. 10 Perfect 10, though, presented a novel question:<br />
how do you apply the doctrine of secondary infringement to people who help<br />
the transaction along, but never have any physical contact with the protected<br />
work?<br />
Two excellent and conscientious Ninth Circuit jurists, Judges Milan Smith and<br />
Stephen Reinhardt, said there was no liability, whereas the dissenting judge<br />
concluded that there was. Visa, the dissent argued, was no different from any<br />
other company that provided a service to infringers, knew what it was doing,<br />
and had the ability to withdraw its service and stop the infringement, but did<br />
nothing.<br />
This debate fits within a larger context. In rejecting contributory liability, the<br />
majority cited a public policy decision that the Internet’s development should be<br />
promoted by keeping it free of legal regulation. Relatedly, the majority<br />
distinguished some precedent by saying that its “tests were developed for a<br />
brick-and-mortar world” and hence “do not lend<br />
themselves well to application in an electronic commerce context.” 11<br />
This argument channels Barlow’s declaration that users of the Internet are<br />
entitled to special treatment (or, as he would have it, entitled to no treatment).<br />
The chief justification for this argument is that the Internet is so new, exotic<br />
and complicated that the imposition of legal rules will chill, stifle, discourage or<br />
otherwise squelch the budding geniuses who might otherwise create the next<br />
Google, Pets.com, or HamsterDance.com. For example, the Electronic<br />
Frontier Foundation argued to the Supreme Court during the Grokster case that<br />
if the Ninth Circuit’s opinion were reversed, the effect would “threaten<br />
innovation by subjecting product design to expensive and indeterminate judicial<br />
9 Religious Tech. Ctr. v. Netcom, 907 F.Supp. 1361 (N.D. Cal. 1995).<br />
10 See A&M Records, Inc. v. Napster, Inc., 284 F.3d 1091 (9th Cir. 2002); In re Aimster<br />
Copyright Litig., 334 F.3d 643 (7th Cir. 2003); Metro-Goldwyn-Mayer Studios Inc. v.<br />
Grokster, Ltd., 545 U.S. 913 (2005).<br />
11 Perfect 10, 494 F.3d at 798, n.9.
second-guessing.” 12 The Ninth Circuit was reversed, and if that decision slowed<br />
the pace of product design, no one seems to have noticed.<br />
This argument became particularly central in a second case, Fair Housing Council<br />
of San Fernando Valley v. Roommates.com. 13 The case involved a claim that the<br />
commercial website Roommates.com violated state and federal fair housing<br />
laws by helping to pair up roommates according to their personal preferences,<br />
the exercise of which is allegedly prohibited by law. Again, one of the authors<br />
of this piece was a judge on that case, and was in the majority at both the panel<br />
and the en banc level—despite the efforts of some conscientious and brilliant<br />
dissenting judges, of whose intellectual rigor and commitment to the rule of law<br />
no one can doubt.<br />
The majority mostly held that Roommates.com could be held liable, if the<br />
plaintiff’s allegations were proven true. The court held essentially that an online<br />
business had to be held to the same substantive law as businesses in the<br />
brick-and-mortar world. The dissenters saw things quite differently; to them, the<br />
majority placed in jeopardy the survival of the Internet. Here is a t<strong>as</strong>te of the<br />
dissent:<br />
On a daily basis, we rely on the tools of cyberspace to help us<br />
make, maintain, and rekindle friendships; find places to live,<br />
work, eat, and travel; exchange views on topics ranging from<br />
terrorism to patriotism; and enlighten ourselves on subjects<br />
from “aardvarks to Zoroastrianism.” … The majority’s<br />
unprecedented expansion of liability for Internet service<br />
providers threatens to chill the robust development of the<br />
Internet that Congress envisioned … . We should be looking<br />
at the housing issue through the lens of the Internet, not from<br />
the perspective of traditional publisher liability. 14<br />
And finally, the unkindest cut of all: “The majority’s decision, which sets us<br />
apart from five circuits, … violates the spirit and serendipity of the Internet.” 15<br />
The argument that a legal holding will bring the Internet to a standstill makes<br />
most judges listen closely. Just think of the panic that was created when the<br />
12 See Brief for Respondents, Metro-Goldwyn-Mayer Studios Inc. v. Grokster, Ltd., No. 04-480<br />
(U.S. Mar. 1, 2005), available at 2005 WL 508120 and at<br />
http://w2.eff.org/IP/P2P/MGM_v_Grokster/20050301_respondents_brief.pdf.<br />
13 Fair Housing Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157 (9th<br />
Cir. 2008).<br />
14 Id. at 1176-77 (footnote omitted).<br />
15 Id. at 1177.
Blackberry server went down for a few hours. No one in a black robe wants to<br />
be responsible for anything like that, and when intelligent, hard-working,<br />
thoughtful colleagues argue that this will be the effect of one of your rulings,<br />
you have to think long and hard about whether you want to go that way. It<br />
tests the courage of your convictions.<br />
Closely related is the argument that, even if you don’t bring down the existing<br />
structure, the threat of liability will stifle innovation, so that the progress we<br />
have seen in recent years—and the gains in productivity and personal<br />
satisfaction—will stop because the legal structure has made innovation too risky<br />
or expensive. The innovation argument is partly right but mostly wrong.<br />
Certainly, some innovators will shy away from legally murky areas. It’s hard to<br />
think of a worse recipe for creativity than having a lawyer attend every<br />
engineering meeting. But promoting innovation alone cannot be a sufficient<br />
justification for exempting innovators from the law. An unfortunate result of<br />
our complex legal system is that almost everyone is confused about what the<br />
law means, and everyone engaged in a business of any complexity at some point<br />
has to consult a lawyer. If the need to obey the law stifles innovation, that<br />
stifling is just another cost of having a society ruled by law. In this sense, the<br />
Internet is no different from the pharmaceutical industry or the auto industry:<br />
They face formidable legal regulation, yet they continue to innovate.<br />
There is an even more fundamental re<strong>as</strong>on why it would be unwise to exempt<br />
the innovators who create the technology that will shape the course of our lives:<br />
Granting them that exemption will yield a generation of technology that<br />
facilitates the behavior that our society has decided to prohibit. If the Internet<br />
is still being developed, then we should do what we can to guide its<br />
development in a direction that promotes compliance with the law.<br />
For example, what use is “innovation” in creating a job hunting site if the<br />
innovators produce a site that invites employers to automatically reject any<br />
applicant from a particular race? Perhaps the job site is a bold new innovation<br />
that makes hiring far easier and more efficient than it has ever been. But if this<br />
site is used widely, it will facilitate racial discrimination in hiring—conduct that<br />
society has already decided it must prohibit. Similarly, is a file-sharing service<br />
such as Grokster worth the harm it causes by offering no built-in tools for<br />
identifying participants or establishing that they have the right to “share” the files<br />
they copy? Far from exempting this growing industry from the law, we should<br />
vigorously enforce the law as the industry grows, so that when it is mature, its<br />
services won’t guide behavior toward conduct that society has decided to<br />
discourage. As difficult as it might be for innovators today, it is easier than the<br />
alternatives: forcing them to rebuild everything ten years down the road, or<br />
grudgingly accepting that we have surrendered key aspects of our ability to<br />
govern our society through law.
It is Barlow who is generally credited with taking the word “cyberspace” from<br />
the science fiction of William Gibson and applying it to the Internet. 16 In doing<br />
so, he launched the conceit that such a “space” exists at all. This was wholly<br />
unjustified. It is a mistake to fall into Barlow’s trap of believing that the set of<br />
human interactions that is conducted online can be neatly grouped together into<br />
a discrete “cyberspace” that operates under its own rules. Technological<br />
innovations give us new capabilities, but they don’t change the fundamental<br />
ways that humans deal with each other. The introduction of telephones and<br />
cars did create new legal questions. Those questions all revolved around what<br />
the acceptable uses of the new technologies were. How closely can you follow<br />
the car in front of you on the highway? Can you repeatedly dial someone’s<br />
phone to annoy them? Can you tap into a phone conversation or put a tape<br />
recorder in a phone booth? Over time, courts and legislatures answered these<br />
questions with new legal rules. They had to; the essence of the controversy<br />
arose from the new technological abilities. But no one thought that telephones<br />
and cars changed the legal rules surrounding what w<strong>as</strong> said on a telephone or<br />
where a car traveled. Can an oral contract be formed with a telephone call? Of<br />
course; it is still two people speaking. Is it tresp<strong>as</strong>sing to drive across my<br />
neighbor’s front yard? Of course; you are on his land.<br />
Like cars and telephones, the Internet prompts new questions about the acceptable uses of the new technology. Is port-scanning a form of hacking? When does title to a domain name legally transfer? While analogies to settled legal rules are helpful in answering these questions, they are not conclusive. Answers to these questions will look like new legal rules.
But when the Internet is involved in a controversy only because the parties happened to use it to communicate, new legal rules will rarely be necessary. When the substance of the offense is that something was communicated, then the harm occurs regardless of the tools used to communicate. If an attorney betrays a client's confidence, the duty to the client is breached regardless of whether the attorney used a telephone, a newspaper, a radio station, or the Internet. The choice of communication medium might affect the magnitude of the harm, but if it is illegal for A to communicate X to B without C's permission, there is no reason to fashion new rules of liability that depend on the mode of communication used.
There are some ways that the Internet might require courts to re-think legal rules. The Internet makes long-distance communication cheaper than it was before. To the extent that existing legal rules were premised on the assumption that communications were expensive, the Internet might require a reappraisal. Courts are already reevaluating, for example, what it means to do business
16 See John Perry Barlow, Crime and Puzzlement: In Advance of the Law on the Electronic Frontier, WHOLE EARTH REV., Sept. 22, 1990, at 44, 45.
178 CHAPTER 3: IS INTERNET EXCEPTIONALISM DEAD?
within a state, for purposes of the long-arm statute, when the defendant's "business establishment" is a server located in Uzbekistan.
Yet the vast majority of Internet cases that have reached the courts have not required new legal rules to solve them. It has been fifteen years since America Online unleashed its hordes of home-computing modem owners on e-mail and the Internet, and fifteen years since the release of the Mosaic Web browser. After all that time, we have today relatively few legal rules that apply only to the Internet. Using the Internet, people buy stocks, advertise used goods, and apply for jobs. All those transactions are governed by exactly the same laws as would govern them if they were done offline.
Those who claim the Internet requires special rules to deal with these ordinary controversies have trouble explaining this history. Despite this dearth of Internet-specific law, the Internet is doing wonderfully. It has survived speculative booms and busts, made millionaires out of many and, unfortunately, rude bloggers out of more than a few. The lack of a special Internet civil code has not hurt its development.
The Internet, it turns out, was never so independent or sovereign as early idealists believed. It was an astounding social and technological achievement, and it continues to change our lives. But it has not proven to be invulnerable to legal regulation—at least, not unless we choose to make it invulnerable. As intriguing as Barlow's Declaration of Independence was, the original 1776 Declaration is more profound in its understanding of the purpose and abilities of government: men have rights of "Life, Liberty and the pursuit of Happiness," and "to secure these rights, Governments are instituted among Men." The government that we have instituted retains its purpose of securing those rights, and it accomplishes that purpose through the law. We have seen that our government has many tools at its disposal through which it can bring law to the Internet's far reaches. The Internet might pose obstacles to that task, but those obstacles can be overcome. The question is whether we will do it.
Is Internet Exceptionalism Dead?

By Tim Wu*
In 1835, Alexis de Tocqueville published Democracy in America, the founding text of "American exceptionalism." After long study in the field, he had concluded that America was simply different from other nations. In an often-quoted passage, de Tocqueville wrote:

The position of the Americans is therefore quite exceptional, and it may be believed that no democratic people will ever be placed in a similar one. Their strictly Puritanical origin—their exclusively commercial habits—even the country they inhabit, which seems to divert their minds from the pursuit of science, literature, and the arts—the proximity of Europe, which allows them to neglect these pursuits without relapsing into barbarism—a thousand special causes, of which I have only been able to point out the most important—have singularly concurred to fix the mind of the American upon purely practical objects. His passions, his wants, his education, and everything about him seem to unite in drawing the native of the United States earthward; his religion alone bids him turn, from time to time, a transient and distracted glance to heaven.1
Is there such a thing as Internet exceptionalism? If so, just what is the Internet an exception to? It may appear technical, but this is actually one of the big questions of our generation, for the Internet has shaped the United States and the world over the last twenty years in ways people still struggle to understand. From its beginnings the Internet has always been different from the networks that preceded it—the telephone, radio and television, and cable. But is it different in a lasting way?
The question is not merely academic. The greatest Internet firms can be succinctly defined as those that have best understood what makes the Internet different. Those that failed to understand the "Network of Networks"—AOL, say—perished, while those that have understood it, like Google and Amazon, have flourished. Hence the question of Internet exceptionalism is often a multibillion-dollar question. The state of the Internet has an obvious effect on national and international culture. It is also of considerable political relevance,
* Professor, Columbia Law School; Fellow, New America Foundation
1 ALEXIS DE TOCQUEVILLE, DEMOCRACY IN AMERICA 519 (Henry Reeve trans., D. Appleton and Company 1904) (1835).
both for the enforcement of the laws and for the rise of candidates and social movements.
What makes the question so interesting is that the Internet is both obviously exceptional and unexceptional at the same time. It depends on what you might think it is an exception to. It is clear that the Internet was a dramatic revolution and an exception to the ordinary ways of designing communications systems. But whether it enjoys a special immunity to the longer and deeper forces that shape human history is, shall we say, yet to be seen.

* * *
In the early 2000s, Jack Goldsmith and I wrote Who Controls the Internet?2 The book is an explicitly anti-exceptionalist work. It addressed one particular way that the Internet might be an exception: its susceptibility, as it were, to regulation by the laws of nations. From the mid-1990s onward it was widely thought that the Internet would prove impossible to control or regulate. Some legal scholars, in interesting and provocative work, argued that in some ways the Network might be considered to have its own sovereignty, like a nation-state.3 That was the boldest claim, but the general idea that the Internet was difficult or impossible to regulate was, at the time, a political, journalistic and academic commonplace, taken for granted. For example, reflecting his times, President Clinton gave a speech in 2000 about China's efforts to control the Internet. "Now, there's no question China has been trying to crack down on the Internet—good luck," he said. "That's sort of like trying to nail Jello to the wall."4
That was the conventional wisdom. In our book we suggested that, despite the wonders of the Network, it did not present an existential challenge to national legal systems, reliant as they are on threats of physical force.5 We predicted that nations would, and to some degree already had, reassert their power over the Network, at least for matters they cared about. They would assert their power not over the Network in an abstract sense, but over the actual, physical humans and machinery that lie underneath it. Many of the book's chapters ended with people in jail; unsurprisingly, China provided the strongest example of what a State will do to try to control information within its borders.
2 JACK GOLDSMITH & TIM WU, WHO CONTROLS THE INTERNET? (2006).
3 David R. Johnson & David Post, Law and Borders—The Rise of Law in Cyberspace, 48 STAN. L. REV. 1367 (1996).
4 R. MICHAEL ALVAREZ & THAD E. HALL, POINT, CLICK AND VOTE: THE FUTURE OF INTERNET VOTING 3 (2004).
5 JOHN AUSTIN, THE PROVINCE OF JURISPRUDENCE DETERMINED (Wilfrid E. Rumble ed., Cambridge Univ. Press 1995) (1832).
Drama aside, in a deeper way we were interested in what you might call the persistence of physicality. Despite its virtual qualities, behind the concept of a global network were living human beings, flesh and blood. The human body's susceptibility to pain and imprisonment is a large part of what the nation-state bases its rule on, and that had not changed. We predicted that the nation's threat of physical force, otherwise known as laws, would therefore shape the Network as much as its founding ambitions did.
Here is how we put the point in the introduction to our book, written in about 2005:

Our age is obsessed with the search for the newest "new thing." Our story, by contrast, is about old things—ancient principles of law and politics within nations, cooperation and clashes between nations, and the enduring relevance of territory and physical coercion. It is a story where Thomas Hobbes is as important as Bill Gates. Like it or not, these old things are as important to the Net's development, if not more so, than any technological or intellectual breakthrough.

In these pages we present a strong resistance to Internet exceptionalism, or any arguments that new technologies can only be understood using novel intellectual frameworks. Like other revolutionary communication technologies, the Internet has changed the way we live, fostering undreamt-of new forms of social organization and interaction. But also like other revolutionary communication technologies, the Internet has not changed the fundamental roles played by territorial government.

We are optimists who love the Internet and believe that it can and has made the world a better place. But we are realistic about the role of government and power in that future, and realists about the prospects for the future.
I regret to say that it has been the Chinese government that has done the most to prove our basic thesis correct. The Jello was, somehow, nailed to the wall. Despite nearly a decade of Westerners (most particularly Western newspaper columnists) assuming or hoping that the Net would bring down the Chinese state, it didn't happen; indeed, it never even came close. And so, five years later, the basic ideas in our book seem hard to contest. Consequently, this one particular species of Internet exceptionalism—the idea that the network has its own sovereignty in a sense, or is an exception to law—has weakened and may be dead.
In the summer of 2010, in fact, as if to hammer the point home, the Chinese government released a new White Paper on "Internet Policy." It made its centerpiece the phrase coined by the Internet exceptionalists of the 1990s: "Internet Sovereignty." However, that phrase did not mean what it had in the 1990s. Rather, as the People's Daily, the state newspaper, explained, "Internet Sovereignty" means that "all foreign IT companies operating in China must abide by China's laws and [be] subject to Beijing's oversight."6

* * *
Leaving law aside, however, the larger questions of Internet exceptionalism remain unanswered. It is one thing for the Internet to be a living exception to the legal system, a sovereign unto itself in some way. But is the Network an exception as an information network, as a means for a nation or world to communicate? Here, surely, the exceptionalist is on far stronger ground. Whatever you might say about efforts to use the Internet to avoid law, we cannot doubt that the "Network of Networks" has changed the way we communicate in dramatic fashion. Technologically, and in its effects on business, culture and politics, the Internet seems, by almost any account, an exception, different from the way other systems of mass communications have operated, whether the telephone, radio, or television.
This point seems so obvious as to be commonplace to anyone who lived through the 1990s. Unlike television, radio and newspapers, which are all speech outlets for a privileged few, the Internet allows anyone to be a publisher. Unlike the private cable networks, the Internet is public and, in its totality, owned by no one. Unlike the telephone system, it carries video, graphics, and the Web, and supports any idea anyone can come up with. It has played host to generations of new inventions, from email and the World Wide Web to the search engine, from shops like eBay and Amazon to social networking and blogging. It has challenged and changed industries, from entertainment to banking to travel. These features and others are what have made the Network so interesting for so many years.
The question, however, is whether the Internet is different in a lasting way. What do I mean by "a lasting way"? I rely on the sense that certain ideas, once spread, seem to lodge permanently, or at least for centuries—e.g., the idea of property, civil rights, or vaccination. Each is an idea that, once received, has a way of embedding itself so deeply as to be nearly impossible to dislodge. In contrast are ideas that, while doubtlessly important, tend, in retrospect, to form a rather interesting blip in history, a revolution that came and went. Will we
6 Information Office of the State Council of the People's Republic of China, The Internet in China, 2010, http://www.china.org.cn/government/whitepaper/node_7093508.htm; White paper explains 'Internet Sovereignty', PEOPLE'S DAILY ONLINE, June 9, 2010, http://english.peopledaily.com.cn/90001/90776/90785/7018630.html.
think of the open age of the Internet the way we think of communism, or the hula-hoop?7
If the Internet is exceptional in a lasting way, it must be for its ideology as expressed in its technology. And in this sense its exceptionalism is similar to American exceptionalism. Both the Nation and the Network were founded on unusual and distinct ideologies, following a revolution (one actual, the other technological). In a typical account, Seymour Martin Lipset writes in American Exceptionalism: A Double-Edged Sword: "The United States is exceptional in starting from a revolutionary event … it has defined its raison d'être ideologically."8 Or, as one-time Columbia professor Richard Hofstadter wrote in the 20th century, "it has been our fate as a nation not to have ideologies, but to be one."9 De Tocqueville put American exceptionalism down to particular features of the United States—the religiosity of its founding, its proximity to yet freedom from Europe, and, as he wrote, "a thousand special causes."10
Looking at the Internet, its founding and its development, we can find the same pattern of a revolution, an ideology, and many "special causes." While much of it was purely technical, there were deeply revolutionary ideas, even by technological standards, at the heart of the Internet, even if sometimes they were arrived at in accidental fashion or for pragmatic reasons.
Of course, fully describing all that makes the Internet different would take another Democracy in America, and we have the benefit of many writers who have tried to do just that, whether in Katie Hafner and Matthew Lyon's Where Wizards Stay Up Late, the oral accounts of its creators, classic works like J.H. Saltzer et al.'s End-to-End Arguments in System Design, or Jonathan Zittrain's The Future of the Internet.11
7 I’ve spent some time thinking about these questions, and I want to suggest that it isn’t really<br />
possible to answer the question in full without understanding the story of the networks that<br />
preceded the Internet. My fullest answer to the question I’ve posed, then, is in THE<br />
MASTER SWITCH (Knopf 2010), an effort to try and find the patterns, over time, that<br />
surround revolutionary technologies. This time, unlike in WHO CONTROLS THE INTERNET,<br />
when it comes to the broader question of the Internet <strong>as</strong> a way of moving information, I<br />
tend to side with the exceptionalists, though it is a close call.<br />
8 SEYMOUR MARTIN LIPSET, AMERICAN EXCEPTIONALISM 18 (1996).<br />
9 JAMES M. JASPER, RESTLESS NATION 38 (2000).<br />
10 ALEXIS DE TOCQUEVILLE, DEMOCRACY IN AMERICA 519 (Henry Reeve trans., D. Appleton<br />
and Company 1904) (1831).<br />
11 KATIE HAFNER & MATTHEW LYON, WHERE WIZARDS STAY UP LATE: THE ORIGINS OF THE<br />
INTERNET (1996);J. H. Saltzer, D. P. Reed & D. D. Clark, End-To-End Arguments in System<br />
Design, 2 ACM TRANSACTIONS ON COMPUTER SYSTEMS (TOCS) 277-288 (1984); JONATHAN<br />
ZITTRAIN, THE FUTURE OF THE INTERNET—AND HOW TO STOP IT (2009).
To understand what makes the Internet different, its origins bear careful examination. First, the Network's predecessors (the telephone, cable, etc.) were all commercial enterprises first and foremost, invented and deployed (in the U.S.) by private firms. The Internet, in contrast, was founded as a research network, explicitly non-commercial and public for the first decade of its existence. Private companies were involved, yes, but it was not a commercial operation in the same sense that, say, the cable networks always were.
Perhaps thanks to its origins, the Internet was founded with an ideology far more explicit than most—a kind of pragmatic libertarianism whose influence remains. The early Internet scientists had various principles of which they were proud. One example is David Clark's memorable adage: "We reject: kings, presidents, and voting. We believe in: rough consensus and running code." Another is found in a famous Request for Comments written by Internet founder Jon Postel, setting forth the following as a principle for network operators: "Be conservative in what you do. Be liberal in what you accept from others."12
The Network constituted not just a technological advance, though it was that as well, but also a rejection of dominant theories of system design and, in a deeper sense, a revolution in information governance. The early Internet researchers were designing a radically decentralized network in an age—the mid-1960s—when highly centralized systems ran nearly every aspect of American and world life. In communications this was represented by AT&T, the great monopolist, with its mighty and near-perfect telephone network. But it could also be found in other aspects of society, from the enlarged Defense Department that ran the Cold War to the new, giant government agencies that ran social programs and the enormous corporations like General Motors, IBM, and General Electric.
So when Vint Cerf and his colleagues put the Internet on the TCP/IP protocol in 1983 (its effective "launch"), most information networks—and I don't mean this in a pejorative sense—could be described as top-down dictatorships. One entity—usually a firm or a part of the State (or both), like AT&T or the BBC—decided what the network would be. The Internet, in contrast, has long been governed more like a federation of networks and, in some respects, like a Republic of Users. That is implicit in the ability of anyone to own an IP address, set up a website, and publish information—something never true, and still not true, on any other network.
12 Paulina Borsook, How Anarchy Works, WIRED (Oct. 1995), http://www.wired.com/wired/archive/3.10/ietf.html; Jon Postel, Information Sciences Institute of the University of Southern California, DOD Standard Transmission Control Protocol 13 (1980), available at http://tools.ietf.org/html/rfc761#section-2.10.
Throughout its history, the universal Network has, true to its governance structure, seen a pattern of innovation that is unlike any other. This too is the subject of much scholarship and popular recognition—the mode of "decentralized innovation" that has led, every several years or so, to the next wonder, starting with email, through the Web, search engines, online retail, Web video, social networking, and onward. These innovations arrived in a highly disorganized fashion, often led by amateurs and outsiders. The spread of computer networking itself began with the amateur geeks glorified in 1980s films like War Games.13 It is hard to think of a truly important Internet invention that came from a firm that predated the Internet. Society-changers like Craigslist, eBay, Wikipedia and blogs are obviously the products of geeks.

* * *
Can it last? Can the Internet remain, in this sense, exceptional? Whatever the Internet's original ideas, it is easy to argue that all this, too, shall pass. The argument from transience suggests that all that seems revolutionary about the Internet is actually just a phase common to speech inventions. In other words, the Internet is following a path already blazed by other revolutionary inventions in their time, from the telephone to radio. Such disruptive innovations usually do arrive as outsiders of some kind, and pass through what you might call a "utopian" or "open" phase—which is where we are now. But that's just a phase. As time passes, even yesterday's radical new invention becomes the foundation and sole possession of one or more great firms, monopolists, or sometimes the state, particularly in totalitarian regimes like the Soviet Union or the Third Reich. The openness ends, replaced with better production values and tighter controls. It is, in other words, back to normal, or at least what passed for normal for most of human history.
We might learn from the fate of broadcast radio, the darling new technology of the 1920s.14 In the 1920s, opening a radio station was relatively easy, not quite as easy as opening a website, but within the reach of amateurs. American radio was once radically decentralized, open and rather utopian in its aspirations. But by the 1930s, broadcast in the United States was increasingly controlled by the chains—most of all the National Broadcasting Company, NBC, which brought better programming but also much less of the amateur, open spirit. And that is nothing compared to countries like Germany and the Soviet Union, where radio became the domain of the state, used to control and cajole. In Germany, every citizen was issued a "people's receiver" tuned only to Nazi channels, and within
13 War Games (Metro-Goldwyn-Mayer 1983).
14 This story of radio can be found in TIM WU, THE MASTER SWITCH chs. 3, 5 (2010).
the space of a decade, free radio had become what Joseph Goebbels called the "spiritual weapon of the totalitarian state."15

Yet I find it hard to imagine such a dramatic or immediate fate for the Internet. It seems in so many ways too established, its values too enmeshed in society, to disappear in an instant.
Perhaps it would be more accurate to suggest that some aspects of the Internet ideology are more likely than others to fade, to become yesterday's ideas. At one extreme, the Internet's core technological ideas, protocol layering and packet-switching, seem unlikely to go anywhere. The reason is that these techniques have become the basis of almost all information technology, not just the Internet itself. The telephone networks are today layered and packet-switched, even if they don't rely on the Internet Protocol.
More vulnerable, however, are the Internet's early ideas of openness and decentralized operation—putting the intelligence at the edges, as opposed to the center, of the network. Originally described by engineers as the E2E principle, and popularly captured in the catch-phrase "Net Neutrality," these principles have so far survived the arrival of broadband networks. Yet by its nature, Net Neutrality seems easier to upset, for discrimination in information systems has long been the rule, not the exception. There are, importantly, certain commercial advantages to discriminatory networking that are impossible to deny, temptations that even the Internet's most open firms find difficult to resist. So while I may personally think open networking is important for reasons related to innovation and free speech, it seems obvious to me that open networking principles can be dislodged from their current perch.
Another open question is whether some of the means of production and cultural creativity associated with the Internet are destined for lasting importance. We have recently lived through an era when it was not unusual for an amateur video or blog to gain a greater viewership than films made for tens of millions. But is that, Lessig's "remix culture,"16 a novelty of our times? We also live in an era where free software is often better than the software you pay for. These are the products of open production systems, the subject of Yochai Benkler's The Wealth of Networks, the engines behind Linux, Wikipedia and other mass projects—as discussed in Benkler's essay in this collection.17 Of course such systems have always existed, but will they retreat to secondary
15 Quoted in GARTH JOWETT & VICTORIA O'DONNELL, READINGS IN PROPAGANDA AND PERSUASION 132 (2005).
16 LAWRENCE LESSIG, REMIX: MAKING ART AND COMMERCE THRIVE IN THE HYBRID ECONOMY (2008).
17 YOCHAI BENKLER, THE WEALTH OF NETWORKS (2007).
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 187<br />
roles? Or will they perhaps become of primary importance for many are<strong>as</strong> of<br />
national life?<br />
The only honest answer is that it is too early to tell. And yet, at the same time, the transience of all systems suggests that at least some of what we take for granted right now as intrinsic to our information life and to the nature of the Internet will fade.

The reasons are many. It might simply be that the underlying ideas just discussed turn out to have their limits. Or that they are subject to an almost natural cycle—excessive decentralization begins to make centralization more attractive, and vice versa. More sinisterly, it might be because forces disadvantaged by these ideas seek to undermine their power—whether concentrated forces, like a powerful state, or more subtle forces, like the human desire for security, simplicity and ease that has long powered firms from the National Broadcasting Company to Apple, Inc.

Whatever the reasons, and while I do think the Internet is exceptional (like the United States itself), I also think it will come to resemble more “normal” information networks—indeed, it has already begun to do so in many ways. Exceptionalism, in short, cannot be assumed, but must be defended.
* * *

I began this essay with a comparison between Internet and American exceptionalism. Yet I want to close by suggesting we can learn from the comparison in a slightly different sense. I’ve suggested that there is a natural tendency for any exceptional system to fade and transition back to observed patterns. But even if that’s true, what is natural is not always normatively good, not always what we want. For example, it may very well be “natural” for a democracy, after a few decades or less, to ripen into a dictatorship of some kind, given the frustrations and inefficiencies of democratic governance. Cromwell and Napoleon are the bearers of that particular tradition, and it has certainly been the pattern over much of history.

But the idea of American Exceptionalism has included a commitment to trying to avoid that fate, even if it may be natural. Despite a few close calls, the United States remains an exception to the old rule that republics inevitably collapse back into dictatorship under the sway of a great leader. The Internet, so far, is an exception to the rule that open networks inevitably close and become dominated by the State or a small number of mighty monopolists. Twenty-five years after .COM, we might say we still have a republic of information—if we can keep it.
CHAPTER 3: IS INTERNET EXCEPTIONALISM DEAD?
Bibliography

In lieu of extensive footnotes, I thought I’d provide here the books and articles that have, implicitly or explicitly, taken on the question of Internet Exceptionalism. Notice that, for those familiar with the field, this may lead to some unusual groupings—but the fundamental question is whether the project in question tries to argue the Internet is magically different or a repeat of age-old problems.
Exceptionalist Works

• PETER HUBER, LAW AND DISORDER IN CYBERSPACE (1997).
• J. H. Saltzer, D. P. Reed & D. D. Clark, End-to-End Arguments in System Design, 2 ACM TRANSACTIONS ON COMPUTER SYSTEMS (TOCS) 277–288 (1984).
• NICHOLAS NEGROPONTE, BEING DIGITAL (1996).
• LAWRENCE LESSIG, THE FUTURE OF IDEAS (2002).
• David R. Johnson & David Post, Law and Borders—The Rise of Law in Cyberspace, 48 STAN. L. REV. 1367 (1996).
• Mark A. Lemley & Lawrence Lessig, The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era, 48 UCLA L. REV. 925 (2001).
• Susan P. Crawford, The Internet and the Project of Communications Law, 55 UCLA L. REV. 359 (2007).
• DAVID POST, IN SEARCH OF JEFFERSON’S MOOSE (2009).
• TIM WU, THE MASTER SWITCH (2010).

Anti-Exceptionalist Works

• LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE (2006).
• TIM WU & JACK GOLDSMITH, WHO CONTROLS THE INTERNET? (2008).
• Tim Wu, Cyberspace Sovereignty?, 10 HARV. J.L. & TECH. 647 (1997).
• Christopher S. Yoo, Would Mandating Broadband Network Neutrality Help or Hurt Competition? A Comment on the End-to-End Debate, 3 J. TELECOMM. & HIGH TECH. L. 23 (2004).
• YOCHAI BENKLER, THE WEALTH OF NETWORKS (2007).
• CORY DOCTOROW, LITTLE BROTHER (2008).

On the Topic / Mixed

• JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET—AND HOW TO STOP IT (2009).
Section 230 of the CDA: Internet Exceptionalism as a Statutory Construct

By H. Brian Holland *
Introduction

Since its enactment in 1996, Section 230 of the Communications Decency Act has become perhaps the most significant statute in the regulation of online content, and one of the most intensely scrutinized. Many early commentators criticized both Congress, for its apparent inability to craft the more limited statute it intended, and the courts, for interpreting the statute broadly and failing to limit its reach. Later commentators focused more clearly on policy concerns, contending that the failure to impose liability on intermediaries fails to effectuate principles of efficiency and cost avoidance. More recently, commentators have argued that Section 230 immunity should be limited because it contributes to the proliferation of anonymous hate speech, intimidation, and threats of violence against traditionally marginalized groups.
Acknowledging the validity of these concerns, this essay nevertheless takes the opposing view, defending broad Section 230 immunity as essential to the evolving structure of Internet governance. Specifically, Section 230 provides a means of working within the sovereign legal system to effectuate many of the goals, ideals, and realities of the Internet exceptionalist and cyber-libertarian movements. By mitigating the imposition of certain external legal norms in the online environment, Section 230 helps to create the initial conditions necessary for the development of a modified form of exceptionalism. With the impact of external norms diminished, Web 2.0 communities, such as wikis 1 and social network services, 2 have emerged to facilitate a limited market in norms and values and to provide internal enforcement mechanisms that allow new communal norms to emerge. Section 230 plays a vital role in this process of building heterogeneous communities that encourage collaborative production and communication. Efforts to substantially reform or restrict Section 230 immunity are therefore largely unnecessary and unwise.

* Associate Professor, Texas Wesleyan School of Law. A modified version of this essay originally appeared in the University of Kansas Law Review: In Defense of Online Intermediary Immunity: Facilitating Communities of Modified Exceptionalism, 56 U. Kan. L. Rev. 369 (2008), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=979183.

1 A wiki is a website designed to allow visitors to easily create and edit any page on the site. For more information, see Wikipedia, Wiki, http://en.wikipedia.org/wiki/Wiki (last accessed Dec. 1, 2010).

2 Social network services are online services designed for users to share messages, links, and media (photos and video) with friends or others with similar interests. Some popular social network services are Facebook, MySpace, and Twitter.
The essay begins with a brief introduction to Section 230. As interpreted and applied by the judiciary, this statute is now conceived as a broad grant of immunity from tort liability—broad not only in terms of those who can claim its protection but also in terms of predicate acts and causes of action to which such immunity extends.
Working from this foundation, I then seek to position the courts’ expansion of Section 230 immunity within the larger debate over Internet governance, suggesting that proponents of expanded immunity are successfully creating what might be characterized as a modified, less demanding form of cyber-libertarian exceptionalism than what Eric Goldman calls, in his essay in this book, the “First Wave of Internet Exceptionalism” (one of “Internet Utopianism”), as articulated in the mid-1990s. The dramatic expansion of Section 230 immunity has in a limited sense effectuated a vision of a community in which norms of relationship, thought and expression are yet to be formed. The tort liability from which Section 230 provides immunity is, together with contract, a primary means by which society defines civil wrongs actionable at law. In the near absence of these external norms of conduct regulating relationships among individuals, the online community is free to create its own norms, its own rules of conduct, or none at all. It is a glimpse of an emergent community existing within, rather than without, the sovereign legal system.
Finally, I make the case for preserving broad Section 230 immunity. As an initial matter, many of the reforms offered by commentators are both unnecessary and unwise because the costs of imposing indirect liability on intermediaries are unreasonable in relation to the harm deterred or remedied by doing so. Moreover, the imposition of liability would undermine the development of Web 2.0 communities as a form of modified exceptionalism that encourages the development of communal norms, efficient centers of collaborative production, and open forums for communication.
The Expansion of Section 230 Immunity

In May of 1995, a New York trial court rocked the emerging online industry with its decision in Stratton Oakmont, Inc. v. Prodigy Services Co., 3 holding the Prodigy computer network liable for defamatory comments posted on one of its bulletin boards by a third party. The key factor in this result was Prodigy’s attempt to create a more family-friendly environment through the exercise of editorial control over its bulletin boards and the moderation of offensive content. Prodigy was therefore treated as a publisher of the information, rather than a mere distributor, and held strictly liable for actionable third-party content.

3 No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995).
Representatives of the online industry argued that the Prodigy decision placed service providers in an untenable position by creating a “Hobson’s choice” 4 between monitoring content and doing nothing, thereby insulating the service from liability. Congress responded to the decision by amending the draft Communications Decency Act (CDA) to include a tailored immunity provision addressing the online industry’s concerns. As one element of what came to be known as the Good Samaritan provisions of the CDA, Section 230 was generally intended to provide online service providers and bulletin board hosts with immunity from tort liability for the defamatory acts of their users. This was accomplished by addressing those specific elements of common law defamation at issue in the Prodigy decision—editorial control and the distinct treatment of publishers and distributors under the law. To that end, Section 230 provided that no interactive computer service should be treated as the publisher or speaker of third-party content, and that efforts to moderate content should not create such liability.
In the years following the enactment of Section 230, courts consistently extended its application. This trend began in 1997 with the watershed decision in Zeran v. America Online, Inc., 5 in which the Fourth Circuit applied Section 230 to claims that America Online (AOL) should be held liable for the defamatory content posted by one of its users. The plaintiff claimed liability arose in part because AOL had allegedly failed to remove third-party defamatory messages from its bulletin board system within a reasonable time, refused to post retractions to defamatory messages, and failed to screen for similar defamatory messages thereafter. The court found the plaintiff’s tort claims were preempted by Section 230, which rendered AOL immune. In reaching this result, the court rejected a strict reading of Section 230 as being limited to its terms. Although the statute failed to make any explicit reference to distributor liability, which the Prodigy decision appeared to leave intact, the court read distributor immunity into the statute, finding distributor liability to be an included subset of the publisher liability foreclosed by the statute. By collapsing the publisher-distributor distinction, the Fourth Circuit adopted the most expansive reading possible of both defamation law and Section 230. Thus, even though AOL knew the statements were false, defamatory, and causing great injury, AOL could simply refuse to take proper remedial and preventative action without fear of liability.

4 SAMUEL FISHER, THE RUSTICK’S ALARM TO THE RABBIES (1660), as cited in Hobson’s choice, Wikipedia, http://en.wikipedia.org/wiki/Hobson%27s_choice (last accessed Dec. 1, 2010).

5 129 F.3d 327 (4th Cir. 1997).
Following Zeran, and building on that court’s reading of both the statute and the policies it sought to effect, courts have extended the reach of Section 230 immunity along three lines: (1) by expanding the class who may claim its protections; (2) by limiting the class statutorily excluded from its protections; and (3) by expanding the causes of action from which immunity is provided. 6

As to the first, courts have interpreted the provision of immunity to interactive computer services to include such entities as Web hosting services, email service providers, commercial websites like eBay and Amazon, individual and company websites, Internet dating services, privately-created chat rooms, and Internet access points in copy centers and libraries. The additional provision of immunity to users of those services promises similar results. Already, one decision has held that a newsgroup user cannot be held liable for re-posting libelous comments by a third party, 7 while another court found a website message board to be both a provider and a user of an interactive computer service. 8

The second line of extension results from a narrow reading of the term “information content provider,” which defines the class for whom there is no immunity. Specifically, courts have held that minor alterations to third-party content do not constitute the provision of content itself, so long as the provider does not induce the unlawful content through the provision of offending raw materials of authorship and where the basic form and message of the original is retained. 9 The third point of expansion has been to extend Section 230 immunity beyond causes of action for defamation and related claims to provide immunity from such claims as negligent assistance in the sale/distribution of child pornography, 10 negligent distribution of pornography of and to adults, 11 negligent posting of incorrect stock information, 12 sale of fraudulently autographed sports memorabilia, 13 invasion of privacy, 14 and misappropriation of the right of publicity. 15

6 But see Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (declining to extend Section 230 immunity to Roommates.com for certain categories of content solicited by the site from users in violation of federal fair housing laws).

7 Barrett v. Rosenthal, 146 P.3d 510, 527 (Cal. 2006).

8 DiMeo v. Max, 433 F. Supp. 2d 523, 531 (E.D. Pa. 2006).

9 Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir. 2003). See also Donato v. Moldow, 865 A.2d 711, 724 (N.J. Super. Ct. App. Div. 2005) (quoting Batzel v. Smith).

10 Doe v. Am. Online, Inc., 783 So. 2d 1010, 1017 (Fla. 2001).

11 Does v. Franco Prods., No. 99 C 7885, 2000 WL 816779, at *5 (N.D. Ill. June 22, 2000), aff’d sub nom. Doe v. GTE Corp., 347 F.3d 655 (7th Cir. 2003).

12 Ben Ezra, Weinstein & Co. v. Am. Online, Inc., 206 F.3d 980, 986 (10th Cir. 2000).

13 Gentry v. eBay, Inc., 121 Cal. Rptr. 2d 703, 715 (Cal. Ct. App. 2002).
Section 230, Internet Governance & Exceptionalism

Situated within the larger debate over Internet governance, the concept of Internet exceptionalism presumes that cyberspace cannot be confined by physical borders or controlled by traditional sovereign governments, and thus that cyber-libertarian communities will emerge in which norms of relationship, thought and expression are yet to be formed. Although these ideas have been subjected to intense criticism and somewhat obscured by recent developments in the governance debates, they remain a touchstone for the cyber-libertarian ideal. This part of the essay seeks to clear space in the governance debates for this vision of exceptionalism, and argues that Section 230 is in some limited way facilitating the emergence of cyber-libertarian communities in a modified, less demanding form.
Foundational Arguments of Internet Governance

The debate over Internet governance evolved in two surprisingly distinct, albeit convergent stages. The first stage of the governance debate focused on law and social norms, and whether these traditional models of regulating human relations could be validly applied to the online environment. In this context, exceptionalism was conceptualized as a state of being to which the Internet had naturally evolved, apart from terrestrial space. The second stage of the debate introduced network architecture as an important and potentially dominant means of regulating the online environment. In this context, exceptionalism became an objective to be pursued and protected as a matter of choice, rather than a natural state. At a more exacting level, these debates implicated fundamental questions of legitimacy, preference, politics, democracy, collective decision-making, and libertarian ideals.
In the early 1990s, as the Internet began to reach the masses with the advent of the World Wide Web, a particular vision of the online environment emerged to advocate and defend Internet exceptionalism. Described as digital libertarianism or cyber-libertarianism, the vision was one of freedom, liberty, and self-regulation. Cyber-libertarians believed the Internet could and would develop its own effective legal institutions through which rules would emerge.
These norms would emerge from collective discourse around behavior, relationship, and content, rather than from the control and regulation of network architecture. Control of architecture was seen almost exclusively as an instrument by which to enforce emerging social norms, and not as a means of determining the norms themselves. By the mid-1990s this process of self-regulation was well underway.

14 Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1124 (9th Cir. 2003).

15 See id. at 1122, 1125 (extending § 230 immunity to defendant in claim “alleging invasion of privacy, misappropriation of the right of publicity, defamation and negligence”). See also Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, 1118–19 (9th Cir. 2007) (finding that § 230 immunity extends to state-law intellectual property claims, including unfair competition, false advertising, and right of publicity).
At the same time, however, sovereign nations and their constituents increasingly sought to impose existing offline legal regimes on this emerging, resource-rich environment. Many in the online community resisted, perceiving this regulation as a threat to the exceptional nature of the Internet. Advocates of self-regulation envisioned cyberspace as a distinct sphere, apart from physical space. These cyber-libertarian exceptionalists saw the imposition of existing offline legal systems grounded in territorially-based sovereignty as inappropriate. They believed that the online environment should instead be permitted to develop its own discrete system of legal rules and regulatory processes. Self-regulation was preferable in its own right because it had proven so effective in creating the environment sought to be preserved, and also because the alternative seemed devastating. The imposition of external, territorially-based legal regimes would be, the exceptionalists argued, infeasible, ineffective, and fundamentally damaging to the online environment.
Faced with the attempted imposition of offline legal regimes, cyber-libertarians responded by attacking the validity of exercising sovereign authority and external control over cyberspace. According to Professors David Johnson and David Post, two leading proponents of self-governance, external regulation of the online environment would be invalid because Internet exceptionalism—the state of being to which the Internet naturally evolved—destroys the link between territorially-based sovereigns and their validating principles of power, legitimacy, effect, and notice. 16 Most importantly, the Internet’s decentralized architecture deprives territorially-based sovereigns of the power, or ability, to regulate online activity. Likewise, extraterritorial application of sovereign law fails to represent the consent of the governed, or to effectuate exclusivity of authority based on a relative comparison of local effects. The loss of these limiting principles results in overlapping and inconsistent regulation of the same activity with significant spillover effect. Deprived of these validating principles, it would be illegitimate to apply sovereign authority and external control in cyberspace.

16 David R. Johnson & David Post, Law and Borders—The Rise of Law in Cyberspace, 48 Stan. L. Rev. 1367 (1996).
A primary challenge to these cyber-libertarian arguments came from Professor Goldsmith, who engaged both their descriptive and normative aspects. 17 In terms of the legitimacy of sovereign regulation, Goldsmith criticized Johnson and Post’s limited view of sovereignty and over-reliance on the relationship between physical proximity and territorial effects. Moreover, he argued that they had overstated the impossibility of regulation, treating what was really a question of cost as one of ability; failed to recognize the deterrent effect on extraterritorial actors of local enforcement against end users and network components located within the territory; and mistakenly equated valid regulation with some measure of near-perfect enforcement. Finally, where true conflicts between sovereigns existed, Goldsmith argued that these could be resolved with the same tools used in the offline world—rules of jurisdiction, conflict of laws, enforcement, etc. Throughout, Goldsmith struck at Johnson and Post’s exceptionalist view of the Internet, implicitly rejecting the ultimate significance of both the technical and communal aspects of that ideal. This critique proved devastating to these early cyber-libertarian arguments.
The governance debate entered its second phase in 1999 with the publication of Professor Lessig’s book, Code and Other Laws of Cyberspace. 18 Prior to Lessig’s book, the governance debate had focused primarily on behavioral and property norms, with the assumption that either existing sovereign law or the law emerging from Internet self-governance would prevail. Network architecture merely provided the means to enforce these norms, particularly those emerging from self-governance. Lessig reconceived Internet exceptionalism as a two-part phenomenon, one regulatory and the other cultural. The former recognizes that many of those features that make the Internet exceptional (in the cyber-libertarian sense) are merely coding choices, and not the innate nature of cyberspace. Within the network, architecture and code are the most basic forms of regulation. Code can be easily changed. Thus, Lessig argued, to protect the cultural aspects of exceptionalism, we must first recognize the exceptional regulatory power of architecture and code within cyberspace, and its pivotal role in preserving or destroying that culture.
Lessig first pointed out that law and social norms are but two means of regulating human behavior. In cyberspace, unlike real space, it is possible for architecture to dominate regulatory structures. Architecture acts as a regulator in the offline world as well—in the form of time, nature, physics, etc.—but our laws and social norms are generally conceived with these regulators assumed. Alteration of that architecture is unusually difficult if not practically impossible.
In cyberspace, by comparison, architecture in the form of code is remarkably fluid. Code effectuates a series of choices, from data collection, to anonymity, to access. And code can be changed. Not only is code fluid, but within cyberspace it is a uniquely powerful form of regulation. Rather than regulating behavior and relationships through punishment, deterrence and post-violation corrective action, code provides the means to exercise perfect control and thus perfect regulation—regulation not just of effects, but of the very universe of choices from which an individual actor is able to select.

17 Jack L. Goldsmith, Against Cyberanarchy, 65 U. Chi. L. Rev. 1199 (1998); Jack L. Goldsmith, The Internet and the Abiding Significance of Territorial Sovereignty, 5 Ind. J. Global Legal Stud. 475 (1998).

18 LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE (1999).
With this shift in focus, the debate itself evolved. Lessig cautioned that the greatest threat to the exceptional culture of cyberspace comes from the union of perfect control and the market forces of commerce. The architectural components that provide the means of perfect control are held almost exclusively by private entities with commercial and political interests distinct from those of the collective. The invisible hand, Lessig argued, cannot resist the promise of perfect control, and has little or no motivation to protect the fundamental values promoted by cyber-libertarian exceptionalism. According to the cyber-libertarian narrative, barriers that are present in the real world do not exist or are de minimis in the online environment. In the context of Internet architecture, exceptionalism can be found in original principles of network design that rely on open protocols and non-discriminatory data transfer—a network that is decentralized, borderless, and with the potential for nearly unlimited data capacity. Indeed, the digital data flowing through this system is itself exceptional, because it is easy to create and manipulate, easy to copy with no degradation in quality, and easy to access and distribute. In the context of online relationships, exceptionalism resides (at the very least) in the interactivity, immediacy, and potential scope of interaction, as well as the opportunity for anonymity. However, the very promise of perfect control is to eliminate many of these choices, and the fundamental values they reflect, as subservient to commercial goals. In cyberspace, control over coded architecture supplies the means for making this election. Building on this assertion, Lessig argued that in order to protect fundamental values, decisions regarding architecture should emerge from the body politic and collective decision-making, rather than being concentrated in private actors.
For many cyber-libertarians, Lessig’s message presented great problems. Although many had already abandoned the argument that the exercise of sovereign authority in cyberspace was normatively invalid, they had not given up (as a matter of preference) the vision of an emergent, self-governed, digital libertarian space. Sovereign legal regimes were still seen as the greatest threat to that vision. Territorial governments should, the cyber-libertarians argued, simply leave cyberspace alone to flourish. From this perspective, Lessig’s arguments about the unique regulatory power of architecture and code in cyberspace were largely convincing. But his description of the corrupting influence of perfect control and concentrated private power, and particularly his call for government regulation to counteract those influences and preserve fundamental values, were difficult to square with most libertarian views.
The debate on net neutrality provides a glimpse of this division. Many commentators, including Lessig, are concerned that the private owners that control the physical/infrastructure layer of the network will, in pursuit of cross-layer vertical integration and increased revenues, privilege certain content or applications. They therefore endorse regulatorily mandated neutrality as a means of preserving one aspect of Internet exceptionalism. Not surprisingly, many libertarians reject this approach, endorsing instead market-based solutions for effectuating individual choice.
The irony of this debate is fairly apparent. Many who might otherwise have characterized themselves as cyber-libertarian, or at least sympathetic to that vision, are now conflicted. Net neutrality would necessarily be imposed by external sovereign legal systems rather than emerging as a common norm; without it, the network would be subordinated to the control of commercial entities. In the extremes, the issue seems to present a choice between entrenched political power and unregulated market forces, with neither providing adequate protection for individuals. Thus, many of the Internet exceptionalists who sought to segregate the Internet from territorial boundaries, who assumed existing sovereign governments and legal regimes were the greatest threat to the online community, and who believed that the computer scientist would remain in control of the network (and thus in control of enforcement), found themselves asking Congress to protect the Internet from private actors and market forces.
What’s Left of Exceptionalism?

What then is left of Internet exceptionalism? In his revolutionary essay A Declaration of the Independence of Cyberspace, John Perry Barlow described cyberspace as consisting not of computers, wires, or code, but of “transactions, relationships, and thought itself.” 19 It was this vision, this perception of an evolving social space, that guided Barlow’s ideal of the culture he sought to preserve—a distinct vision of potential worthy of protection. Indeed, to many early inhabitants of cyberspace, communal control and regulation of network architecture appeared a given, if for no other reason than that perfect external control seemed almost impossible. Freedom of choice in individual expression, human behavior, and relationships was the heart of the online cultural and social ideal that stirred Barlow and other cyber-libertarians.
As it evolved, the governance debate fractured this largely unified vision, distinguishing validity from preference, law and social norms from architecture and code, technical exceptionalism from cultural exceptionalism, government power from private commercial power, and even libertarian from libertarian.

19 John Perry Barlow, A Declaration of the Independence of Cyberspace (Feb. 8, 1996), http://homes.eff.org/~barlow/Declaration-Final.html.
Lessig argued persuasively that the greatest threat to digital libertarianism arose from private actors, unbounded by fundamental values (including constitutional values) and with the ability to exercise perfect control over choice. Lessig’s analysis, generally speaking, was focused on the treatment of data as data, based primarily on the identity of its owner and the commercial interests represented. Choice in action was to be controlled by the regulation of owned data, discriminatory treatment of data to the benefit of certain owners, restriction of network access, and similar means. These technical controls would then be bolstered by traditional sovereign law validating those measures.
What seems somewhat obscured in Lessig’s architecture-and-code approach (which clearly remains the central concern of the governance debate) is Barlow’s original vision of relational libertarianism, with its focus on expression of individual choice and the development of new communal social norms within a system of self-governance. This is the part of Internet exceptionalism that was, in a sense, overwhelmed by the debate over architecture and code. Yet there are some choices, primarily relational, that remain largely unaffected by that debate. In this sphere, the question is not access to choice, the ability to choose, or the available universe of choices, but rather what norms apply to the choices being made outside those controls.
Post argues that fundamental normative values could “best be protected by allowing the widest possible scope for uncoordinated and uncoerced individual choice among different values and among different embodiments of those values.” 20 He believes that the imposition of sovereign legal regimes in cyberspace, rather than promoting fundamental values as Lessig argued, would instead deny the digital libertarian culture the opportunity to develop apart from the offline world, with its own set of fundamental values. He argues it is better to serve the private interest (even if powerful and commercially motivated) than the interest of terrestrial sovereigns. Indeed, he sees exceptionalism as requiring self-governance, to the exclusion of external legal norms imposed by sovereign powers, as a precondition to the emergence of a new system of norms.
Section 230 as a Form of Cyber-Libertarian Exceptionalism

Most would say that Barlow and Post lost the battle. However, this particular strain of Internet exceptionalism, envisioned as self-governance and emerging social norms applicable to relationships between individuals (as opposed to data as data), has been preserved in a modified, less demanding form. Ironically, it is because of sovereign law, not in spite of it, that this occurred. The dramatic expansion of Section 230 immunity has effectuated many of the ideals promoted by Post, Barlow, and others, albeit on a limited scale. This expansion has created an environment in which many of the norms and regulatory mechanisms present in the offline world are effectively inapplicable. This is so not because the very nature of cyberspace makes such application impossible, or because sovereign law is necessarily ineffective or invalid, but rather because sovereign law has affirmatively created that condition.

20 David Post, Against “Against Cyberanarchy,” 17 Berkeley Tech. L.J. 1365 (2002).
The torts for which Section 230 provides immunity are, together with contract law, the primary means by which society defines civil wrongs actionable at law. These norms of conduct regulate relationships among individuals: articulating wrongs against the physical and psychic well-being of the person (e.g., assault, battery, emotional distress), wrongs against property (e.g., trespass to land, trespass to chattels, conversion), wrongs against economic interests (e.g., fraud, tortious interference), and wrongs against reputation and privacy (e.g., defamation, misappropriation of publicity, invasion of privacy). Section 230 has been interpreted and applied to provide expansive immunity from tort liability for actions taken on or in conjunction with computer networks, including the Internet. Statutory language defining who may claim the protections of Section 230 immunity, including providers of interactive computer services and the users of such services, has been broadly extended. In contrast, the primary limitation on the range of claimants to Section 230 immunity, which is statutorily unavailable to the allegedly tortious information content provider, has been construed fairly narrowly. Moreover, the immunity provided to this expansive cross-section of online participants now reaches well beyond defamation to include a wide range of other tortious conduct and claims. As such, many of the norms of conduct regulating relationships among individuals in the offline world—those civil wrongs actionable at (tort) law—simply do not apply to many in the online world.
Even where the online entity is alleged to be aware of the illegal acts of its users, and to be either actively facilitating those illegal acts or refusing to stop them, the intermediary retains Section 230 immunity. This is true even where the intermediary has the knowledge, technical ability, and contractual right to take remedial action. In the offline world, such active and knowing facilitation would likely violate social norms established in tort law. In the online world, however, the defendants are immune from liability. Established norms, as expressed through the mechanisms of tort law, are neutralized by Section 230 and its judicial interpretations.
In the near absence of these external legal norms, at least within the range of choices being made outside the data-as-data architectural controls, the online community is free to create its own norms, its own rules of conduct, or none at all. The inhabitants may not have a blank slate—criminal law, intellectual property law, and contract law still apply—but much of what Barlow embraced as central tenets (mind, identity, expression) remains undefined. Section 230 offers a modified version of cyber-libertarian exceptionalism, less demanding of the sovereign and of existing offline social norms, and therefore less satisfying. But it is nonetheless a glimpse of that society, maintained by the sovereign legal regime rather than against it. The immunity now applies to nearly every tort that can be committed in cyberspace. It is nibbling at the edges of intellectual property rights. It protects against the civil liability components of criminal acts. It generally extends to all but the first speaker, who may well be lost in the network and escape liability even without immunity.
A Case for Preserving Section 230 Immunity

As interpreted by the courts, the immunity provisions of Section 230 have been heavily criticized. Many commentators have argued that, by failing to impose indirect liability on intermediaries, the statute allows significant harms to go undeterred or unremedied, and that Section 230 should be reformed to serve the interests of efficiency and cost allocation. This part of the essay addresses these criticisms directly, concluding that substantially reforming the statute is both unnecessary and unwise because the cost of such liability is unreasonable in relation to the harm deterred or remedied. Indeed, given Section 230’s role in facilitating the development of Web 2.0 communities, reforming the statute to narrow the grant of immunity would significantly damage the online environment—both as it exists today and as it could become.
Evaluating Calls for Reform

Early critics of Section 230 tended to focus on the issues of congressional intent and broad interpretation by the courts. More recent commentators have moved beyond these issues to engage the larger implications of providing such sweeping immunity to online intermediaries, suggesting amendments to Section 230 intended to effectuate policies of efficiency and cost allocation. This critique begins with the premise that, in the online environment, individual bad actors are often beyond the reach of domestic legal authorities. This creates a situation in which significant individual harms cannot be legally deterred or remedied, and raises the fear that the Internet’s potential as a marketplace will not be realized. Given these negative conditions, the imposition of indirect liability is desirable where a third party maintains a certain level of control. The failure to impose such liability may create inefficiencies by failing to detect and deter harmful behavior where the cost of doing so is reasonable. Commentators have argued that, in the online environment, intermediaries are in the best position to deter negative behavior, to track down primary wrongdoers, and to mitigate damages. This is particularly true of information-based torts, the damages of which might in many circumstances be mitigated simply by taking down, prohibiting, or blocking the objectionable content.
At the heart of this attack on Section 230 immunity is the idea that, in the absence of indirect intermediary liability, significant harms will go undeterred or unremedied. These fears are either misplaced or overstated. As an initial matter, it is not clear that a significant number of bad actors are beyond the reach of the law. Advances in technology are making it increasingly possible to locate and identify bad actors online, such that online anonymity is difficult to maintain. Likewise, where the bad actor is identified but is found outside the jurisdiction, sovereign governments have developed methods for resolving disputes that permit the direct extraterritorial application of domestic law, such as rules of jurisdiction, conflict of laws, and recognition of judgments. Indeed, anti-exceptionalists have strenuously argued that the application of sovereign authority to online activity originating outside the jurisdiction is legitimate and valid in large part because of these rules.
Moreover, although the immunity provided by Section 230 arguably mitigates the legal incentives for online intermediaries to deter and remedy certain negative behavior, it does not eliminate those incentives. Section 230 expressly states that it has no effect on criminal law, intellectual property law, or communications privacy law. These external norms remain applicable to, and enforceable against, both content providers and intermediaries in the online environment. Perhaps even more significantly, although Section 230 removes legal incentives to enforce the norms expressed in tort law, law is certainly not the only incentive for an intermediary to act. Communal, commercial, and other incentives also play a role. Indeed, Section 230 immunity allows intermediaries the freedom to intervene in a multitude of ways. Thus, individual harms and marketplace security can be addressed through alternate legal regimes and internal incentives.
Furthermore, proponents of indirect intermediary liability concede that, even where harms do exist, intermediaries may rightly be held liable only for failing to detect and deter harmful behavior where the cost of doing so is reasonable. It is unclear, however, that the costs of intermediary regulation are reasonable. In terms of remedies and reforms, critics generally suggest some form of the detect-deter-mitigate model, imposing a duty upon the intermediary with the potential for liability in cases of breach. The two most common models are traditional liability (damages) regimes and notice-and-takedown schemes. Proponents of traditional liability schemes generally find theoretical fault with the exceptionalist view of the Internet, and analytical fault with broad judicial interpretations of the statute that collapse distributor-with-knowledge liability into immunity from publisher liability. Proponents of a notice-and-takedown scheme likewise work from a distributor-with-knowledge model that imposes a limited duty of care on intermediaries, but generally acknowledge some degree of exceptionalism that requires a distinct scheme. Most suggest some variation utilizing elements of the Digital Millennium Copyright Act (DMCA) 21 and the European Union’s E-Commerce Directive, 22 wherein intermediary liability is triggered by actual notice of the objectionable content or by a standard of reasonable care, and remedial action (e.g., taking down the content at issue) is required.
The costs of these indirect intermediary liability schemes could be great. Under traditional liability rules, intermediaries may be forced to adopt a least-common-denominator approach, resulting in overly broad restrictions on expression and behavior. A modified distributor-with-knowledge approach, usually in the form of a takedown scheme similar to that employed by the DMCA, may produce the same type of chilling effect. This is potentially exacerbated by the use of a should-have-known standard that can trigger the need to patrol for harmful content, raising costs and leading to even greater overbreadth in application. Moreover, indirect liability reduces incentives to develop self-help technology, such as location- or identity-tracking software and end-user filters, the development of which was one of Section 230’s primary policy goals. Thus, if the scale of undeterred or unremedied harms is minimal, and the negative impact of a detect-deter-mitigate model is significant, then the cost associated with the imposition of indirect intermediary liability is not reasonable.
Resisting the Urge Toward Homogeneity

The case for preserving Section 230 immunity begins by recasting intermediary immunity in terms of exceptionalism, self-governance, and norms, because it is precisely the gap between the offline social norms expressed in tort law and the broad immunity provided to online participants that has led to the rather strong criticism of Section 230. As a conceptual matter, communal enforcement presents the greatest challenge to effectuating some modified version of the exceptionalist ideal. When external legal norms are excluded, internal enforcement mechanisms facilitate the emergence of new communal norms to take their place. Much of the criticism of Section 230 stems from the lack of legal enforcement that accompanies immunity, and the resulting inability to form new social norms to replace those of the sovereign. It is important to recognize, however, that Web 2.0 communities, such as wikis and social networks, represent a real and significant manifestation of the exceptionalist vision, because they both facilitate a market in norms and values and provide the internal enforcement mechanisms necessary for internal norms to emerge.
Section 230 plays a vital role in the development of these communities by substantially and continually mitigating the primacy of external legal norms within the confines of the community. This permits choice, empowers the intermediary to create a market in social norms, and allows alternate forms and gradations of enforcement. The architecture of the community gives these choices form and substance, backed by an enforcement model, such that communal norms have the opportunity to develop. In this sense, Section 230 and the Web 2.0 model effectuate the emergence of a modified form of exceptionalism. The reforms proposed by most commentators would have a negative impact on these communities, with little benefit beyond the communal norms that are likely to emerge anyway, and should be rejected.

21 Digital Millennium Copyright Act, Pub. L. 105-304, 112 Stat. 2860 (1998).
22 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:EN:NOT.
Exceptionalism, Self-Governance & Social Norms

Exceptionalism does not argue for the absence of social norms. Instead, exceptionalism embraces the idea of cyberspace as an environment in which the authority of external legal regimes is minimal, and where an open market in norms and values works in concert with self-governance to permit the online community to establish its own substantive social norms. Section 230 helps to effectuate a modified form of exceptionalism by moderating the imposition of external legal norms so as to permit a limited range of choices—bounded, at least, by criminal law, intellectual property law, and contract law—in which the online community is free to create its own norms and rules of conduct. However, the development of social norms within this environment requires not only the ability to exercise broad individual choice among different values and embodiments of those values, but also some mechanism of communal enforcement through which to effectuate some form of self-governance.
Early proponents of exceptionalism were able to focus on relational libertarian ideals, viewing the Internet as a unique social space in which norms governing thought, expression, identity, and relationship should be permitted to evolve. This focus developed precisely because the mechanisms of enforcement required for self-governance and the evolving definition of emergent social norms were taken for granted. The architecture of enforcement was primarily controlled by a community involved in the process as adherents to the exceptionalist ideal, who could be trusted both to ensure broad individual choice and to utilize the means of enforcement as a tool of self-governance as norms emerged.
As a means of effectuating exceptionalism, the primary weakness of Section 230 is the lack of an enforcement component. Although the modified exceptionalism enabled by Section 230 permits a range of choices, it does nothing to provide enforcement mechanisms to solidify emerging communal norms. Where immunity exists, legal enforcement mechanisms are never triggered. Likewise, the architecture of enforcement relied upon by early exceptionalists is no longer communal or likely committed to the vision of a distinct cyber-libertarian space, but is instead concentrated in private commercial entities. As a consequence, Section 230 immunity creates a gap: Certain external legal norms are excluded, but internal communal norms are often unable to coalesce to take their place. It is this gap, resulting from the lack of architectural enforcement controls, that fuels criticism of the immunity provision. In application, however, an enforcement model has emerged that mediates the tension between the broad availability of individual value choices and the ability to effectively self-govern so as to permit the development of communal norms.
Communities of Modified Exceptionalism

Web 2.0 communities are structured as a limited commons and are built on an architecture of participation that operates as a platform for user-created content and collaboration. At the core are principles of open communication, decentralized authority, and the freedom to share and re-use, along with an idea of the Internet as a social forum: a market for exchanging opinions and ideas in search of norms, creating a culture based on sharing. Section 230 plays a vital role in the development and maintenance of these architectures by providing intermediaries with limited immunity from liability for the tortious content provided by users. Indeed, in this sense, Section 230 seems to favor the development of Web 2.0 services and the provision of user-based content over the traditional model of providing first-party institutional content.
The parallels between Web 2.0 and Barlow’s vision of a communal social space are evident, albeit in modified form. Barlow embraced the potential of an environment premised upon freedom of choice in individual expression, human behavior, and relationships. To achieve that potential, he and others believed that regulation by existing sovereign powers must be rejected in favor of self-governance, so that new communal social norms might have the opportunity to emerge. At the heart of this ideal was an ethic that values participation in the market of expression, ideas, and action without the constraint of preconceived value judgments. Web 2.0 promises a somewhat limited version of this environment—existing within sovereign authority, narrowed by certain enduring norms, and confined to segmented communities administered by private entities—by facilitating the market in which norms are tested.
Two of the most common models of these Web 2.0 services, wikis and social networks, are indicative of how Section 230 can effectuate the modified form of cyber-libertarian exceptionalism described above. Partly as a result of the immunity from liability provided by Section 230, these services facilitate the market in social norms by creating enclaves in which users may exercise broad (although not unbounded) individual choice among competing values. At the same time, the intermediary retains control over the architecture and thus the means of enforcement. As the market defines social good through the evolution of communal norms, that architecture may be employed as a mechanism of governance. In the absence of legal incentives, the enforcement of communal norms is driven by internal incentives, such as the need for financial support from community donations, a communal desire for information integrity, or the need to build an audience for advertising. In some communities, participants may be incentivized by credibility and stature in the form of temporal seniority, post count, or rank within the community’s governing body.
The online encyclopedia Wikipedia is a specific example of a Web 2.0 community of collective action. Each entry in the Wikipedia database is created and edited by volunteers who are guided by three primary principles: the Neutral Point of View policy, the No Original Research policy, and the Verifiability policy. Registered users can originate new articles, and any user, whether registered or anonymous, can edit an existing article. Between Wikipedia’s inception in 2001 and 2010, this experiment in voluntary collaborative action produced more than ten million articles.
These activities are overseen by two classes of administrators: administrators and bureaucrats. Administrators (historically called sysops, short for system operators) have the power to edit pages, delete or undelete articles and article histories, protect pages, and block or unblock user accounts or IP addresses. Bureaucrats have the further power to create additional sysops with the approval of the community. In February 2006, in response to a series of significant and persistent acts of vandalism, the co-founder of Wikipedia created an additional layer of protection: Administrators can protect any article so that all future changes must be approved by an administrator. 23 Administrators help facilitate dispute resolution and enforcement. Low-level disputes are resolved in talk pages, where moderators guide members to resolution with reference to policies and guidelines developed over the life of the community. Thus, principal values and norms can lead to more specific rules. This approach works in most cases. More serious violations, such as malicious editing of an article (vandalism), are addressed through fast-repair mechanisms executed by community members. Wikipedia administrators are also able to block user accounts or IP addresses.
As described, the Wikipedia community reflects a modified form of the exceptionalist model, initially allowing for individual choice among a range of values, facilitating a market in social norms, and providing a means of enforcement to effectuate norms as they develop. Indeed, recent studies reflect not only that norms have emerged from this market, but that those norms have solidified and expanded. Through this process, the Wikipedia community is moving from an immediate focus on particular articles to more generalized concerns for quality of content and community.

23 See Wikipedia, Wikipedia: Protection Policy, http://en.wikipedia.org/wiki/Wikipedia:Protection_policy (last accessed Dec. 1, 2010).
Not unexpectedly, open source projects such as Wikipedia are not immune to abuse. To protect the health of the community against these abuses, Wikipedia has adopted a code of conduct and principles of etiquette that stress civility and discourage personal attacks. As discussed above, these norms are enforced through an architecture designed to reinforce them with an eye toward the health of the community. At the most basic level, this occurs through routine editing by participants. Over time, more complex mechanisms for dispute resolution and enforcement have developed, such that in the past few years administrative and coordination activities have gained importance.
The relationship between architecture and social norms is fascinatingly apparent<br />
both in Wikipedia’s architectural choice to track and correlate the IP address<br />
of any anonymous user who edits the encyclopedia, and in the development<br />
of a monitoring system that tracks those changes for analysis. This system<br />
serves as a mechanism for enforcing social norms, particularly the norm of<br />
neutrality in more controversial areas. In terms of more formal enforcement,<br />
some edits that might previously have been overlooked are now being<br />
reexamined in light of the organization from which they originated. Less<br />
formally, but perhaps even more effectively, organizations perceived<br />
to have breached the norms of the community have faced, and will face,<br />
recriminations. Moreover, the entire community is aware that enforcement<br />
of those norms is now more effective, presumably creating a deterrent effect.<br />
The Wikipedia example illuminates a constant process, as choices are narrowed<br />
by communal norms that develop and are given life through enforcement<br />
mechanisms, such that principal norms generate a breadth of more particular<br />
rules. Section 230 immunity plays an important role in this process, permitting<br />
the community to evolve and structure itself in the most efficient manner. To a<br />
limited extent, Section 230 immunity permits uncoordinated and uncoerced<br />
individual choice among different values and among different embodiments of<br />
those values. It further allows the intermediary to play an active role in<br />
facilitating the market in social norms and in creating enforcement mechanisms<br />
as a tool of self-governance. Those enforcement mechanisms can then<br />
themselves adapt. This allows not only for the development of distinct<br />
community values, but also for a means of tapping into incentives, adapting to<br />
evolving norms and conditions, and reducing costs associated with disputes.<br />
Within this framework, greater variations in community norms are possible. As<br />
communities grow, niche communities are formed at low cost. It is not the<br />
global vision of early exceptionalism, but rather a more limited and localized<br />
form of modified exceptionalism that functions as a laboratory for testing social<br />
norms and values.<br />
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 207<br />
Conclusion<br />
Critics of Section 230 have both overstated the harms arising from immunity<br />
and understated the costs of alternate schemes for imposing indirect liability on<br />
online intermediaries. At the same time, they have ignored the important role<br />
Section 230 plays in the development of online communities. The immunity<br />
provided by Section 230 helps to create the initial conditions necessary for the<br />
development of a modified form of exceptionalism by mitigating the effect of<br />
external legal norms in the online environment. Web 2.0 communities are then<br />
able to facilitate a market in norms and provide the architectural enforcement<br />
mechanisms that give emerging norms substance. Given Section 230’s crucial<br />
role in this process, and the growing importance of Web 2.0 communities in<br />
which collaborative production is yielding remarkable results, reforming the<br />
statute to substantially narrow the grant of immunity is both unnecessary and<br />
unwise.
Internet Exceptionalism Revisited<br />
By Mark MacCarthy *<br />
Introduction<br />
In the mid-1990s, commentators began debating the best way for governments<br />
to react to the development of the Internet as a global communications<br />
medium. Internet exceptionalists argued that the borderless nature of this new<br />
medium meant that the application of local law to online activities would create<br />
insoluble conflicts of law. The exceptionalists believed that as the Internet<br />
grew, reliance on local governments to set rules for the new online world would<br />
not scale well. Their alternative was the notion of cyberspace as a separate place<br />
that should be ruled by norms developed by self-governing communities of<br />
users. 1<br />
Critics of the exceptionalist view responded with a vision of a bordered Internet<br />
where local governments could apply local law. 2 In this view, cyberspace is not<br />
a separate place. It is simply a communications network that links real people in<br />
real communities with other people in different jurisdictions. Governments can<br />
regulate activity on this new communications network in many different ways,<br />
including by relying on the local operations of global intermediaries. Global<br />
intermediaries are the Internet service providers (ISPs), payment systems, search<br />
engines, auction sites, and other platform and application providers that provide<br />
the infrastructure necessary for Internet activity. Although they are often global<br />
in character, they also have local operations subject to local government control.<br />
According to critics of the exceptionalist view, governments have the right and<br />
the obligation to use this regulatory power over intermediaries to protect their<br />
citizens from harm. 3 Conflicts that might arise from this regulatory activity can<br />
be resolved through the normal mechanisms governments use to resolve<br />
conflict of law questions. 4<br />
* Mark MacCarthy is Adjunct Professor in the Communication, Culture and Technology<br />
Program at Georgetown University. Formerly, he was Senior Vice President for Public<br />
Policy at Visa Inc. Substantial portions of this essay were originally published as Mark<br />
MacCarthy, What Payment Intermediaries are Doing About Online Liability and Why It Matters, 25<br />
BERKELEY TECH. L. J. 1037 (2010), available at http://btlj.org/data/articles/25_2/1037-<br />
1120%20MacCarthy%20WEB.pdf.<br />
1 See, e.g., David R. Johnson & David Post, Law and Borders — The Rise of Law in Cyberspace, 48<br />
STAN. L. REV. 1367, 1387-92 (1996).<br />
2 E.g., Jack L. Goldsmith, Against Cyberanarchy, 65 U. CHI. L. REV. 1199 (1998).<br />
3 See id. at 1238-39.<br />
Governments generally followed the advice of the proponents of regulation, not<br />
the regulatory skeptics. 5 And despite some setbacks in First Amendment<br />
cases, 6 regulators have continued a steady march toward controlling the Internet<br />
by regulating intermediaries. 7 Some legal scholars argue that government<br />
reliance on intermediaries to control unlawful behavior on the Internet is<br />
justified because putting the enforcement burden on intermediaries is the least<br />
expensive way for governments to effectively assert jurisdiction. 8 The key<br />
rationale is that governments cannot easily find wrongdoers on the Internet,<br />
but intermediaries can. They are best positioned to monitor their own systems.<br />
As Mann and Belzley put it, they are the “least-cost avoider.” 9<br />
The defenders of local government jurisdiction over the Internet often rely on<br />
historical analogies to buttress their case that local control is inevitable and<br />
desirable. Debora Spar developed the thesis that society’s reaction to new<br />
technologies follows a predictable sequence of innovation, commercial<br />
exploitation, creative anarchy, and then government rules. 10 In the innovation<br />
stage a new technology is developed; in the second stage it is used in<br />
commercial ventures; in the third stage there is a tension between the anarchic<br />
impulse and the need for commercial order and stability; and in the final stage<br />
society reaches out to regulate the now-mature technology to create and<br />
maintain the needed stability. 11<br />
4 Id. at 1200-01 (arguing that “regulation of cyberspace is feasible and legitimate from the<br />
perspective of jurisdiction and choice of law”).<br />
5 The U.S. exception is § 230 of the Telecommunications Act of 1996, which immunizes many<br />
Internet actors from liability in many contexts for the illegal activity of their users. 47 U.S.C.<br />
§ 230(c) (2006).<br />
6 See, e.g., Reno v. ACLU, 521 U.S. 844, 885 (1997) (“The interest in encouraging freedom of<br />
expression in a democratic society outweighs any theoretical but unproven benefit of<br />
censorship.”); Ctr. for Democracy & Tech. v. Pappert, 337 F. Supp. 2d 606, 665 (E.D. Pa.<br />
2004) (finding that a statute requiring ISPs to block access to websites displaying child<br />
pornography violated the First Amendment).<br />
7 See generally JACK GOLDSMITH & TIM WU, WHO CONTROLS THE INTERNET?: ILLUSIONS OF A<br />
BORDERLESS WORLD (2006) (citing many examples of this trend). This Article documents<br />
further examples in which payment systems were induced by laws, regulations, pressure, and<br />
notions of corporate responsibility to take actions to control the illegal online behavior of<br />
people using their systems.<br />
8 See, e.g., Ronald J. Mann & Seth R. Belzley, The Promise of Internet Intermediary Liability, 47 WM.<br />
& MARY L. REV. 239, 249-50 (2005).<br />
9 Id. at 249.<br />
10 DEBORA L. SPAR, RULING THE WAVES: CYCLES OF DISCOVERY, CHAOS, AND WEALTH FROM<br />
THE COMPASS TO THE INTERNET 11-22 (2001).<br />
The development of radio is the standard<br />
example of this pattern. Radio’s initial pioneers thought its ability to wirelessly<br />
broadcast information from one point to many made government control<br />
difficult and unnecessary. 12 But later commercial enterprises actively sought out<br />
government regulation in order to end the chaos on the airwaves that prevented<br />
broadcasters from reaching their intended audience. 13 Applying Spar’s analysis<br />
here, the Internet is somewhere between stage three and stage four, where we<br />
can expect further regulation of Internet activity under the watchful eye of<br />
government. The historical example demonstrates that although every new<br />
technology is thought to be outside the jurisdiction of government, this belief<br />
usually gives way in time to the realities of government control.<br />
In the case of the Internet, the advent of government control prompted many<br />
observers to think the Internet exceptionalists had been routed. 14 However,<br />
Internet exceptionalism is still a widely held belief, 15 and the notion that<br />
government control of cyberspace is both impossible and illegitimate still<br />
motivates much discussion of Internet policy. 16 Moreover, the initial legislative<br />
expression of Internet exceptionalism—Section 230 of the 1996<br />
Telecommunications Act—is still on the books. This section provides a safe<br />
harbor from indirect liability for what might be called pure Internet<br />
intermediaries—those entities providing Internet access service or online<br />
services. 17<br />
11 Id.; see also Mann & Belzley, supra note 8, at 243-44; GOLDSMITH & WU, supra note 7, at 124<br />
(relying on Spar’s work).<br />
12 See generally SPAR, supra note 10, at 124-90 (describing the history of radio technology<br />
development).<br />
13 Id. at 171-72.<br />
14 See GOLDSMITH & WU, supra note 7, at 14 (asserting that “notions of a self-governing<br />
cyberspace are largely discredited”).<br />
15 See generally DAVID G. POST, IN SEARCH OF JEFFERSON’S MOOSE (David Kairys ed., 2009)<br />
[hereinafter POST, IN SEARCH OF JEFFERSON’S MOOSE] (demonstrating an elegant take on<br />
Internet exceptionalism). The heart of the response to Goldsmith is that scale matters and<br />
that while it is physically possible and permissible under current “settled” law of cross-border<br />
jurisprudence, it is not “workable” to subject all websites to perhaps hundreds of<br />
different and possibly conflicting jurisdictions. See David G. Post, Against “Against<br />
Cyberanarchy”, 17 BERKELEY TECH. L.J. 1365, 1384 (2002) [hereinafter Post, Against “Against<br />
Cyberanarchy”].<br />
16 See H. Brian Holland, supra (adapted from H. Brian Holland, In Defense of Online Intermediary<br />
Immunity: Facilitating Communities of Modified Exceptionalism, 56 U. KAN. L. REV. 369, 397<br />
(2007)). Holland’s version of modified exceptionalism is closely connected with the legal<br />
principle that online intermediaries are not liable for third-party conduct. He asserts that the<br />
immunity from liability created by § 230 of the Communications Decency Act “helps to<br />
effectuate a modified form of exceptionalism by moderating the imposition of external legal<br />
norms so as to permit a limited range of choices—bounded, at least, by criminal law,<br />
intellectual property law and contract law—in which the online community is free to create<br />
its own norms and rules of conduct.” Id. at 397.<br />
Despite a growing call to revisit this immunity, 18 it has been<br />
extended several times. The Internet gambling law, which creates liability for<br />
traditional intermediaries such as payment systems, contains a limitation on<br />
liability for pure Internet intermediaries. 19 Similarly, the recently passed online<br />
pharmacy law exempts pure Internet intermediaries from a general duty to avoid<br />
aiding or abetting unauthorized Internet sales of controlled substances. 20 The<br />
adoption of these provisions in recent laws might be merely § 230 on automatic<br />
pilot, but more likely, some version of Internet exceptionalism is at work in<br />
these legislative distinctions.<br />
A recent speech by the Obama Administration’s senior communications<br />
policymaker, Lawrence Strickling, provides further evidence of the continuing<br />
relevance of the Internet exceptionalist perspective. 21 In defending Section<br />
230’s limitation on liability, Assistant Secretary Strickling argued:<br />
This limitation on liability has enabled the creation of<br />
innovative services such as eBay and YouTube, which host<br />
content provided by others, without requiring that those<br />
services monitor every single piece of content available on their<br />
sites. Absent this protection against liability, it is hard to<br />
imagine that these services would have been as successful as<br />
they turned out to be. 22<br />
Internet exceptionalism is the view that the normal rules that apply to real-world<br />
providers of goods and services should not apply to online entities.<br />
Secretary Strickling argues for this view on policy grounds. Without it, he<br />
asserts, the innovative character of the Internet would come to a halt. The next<br />
YouTube or Google could never emerge because the legal liabilities associated<br />
with running such a new business would strangle it.<br />
17 47 U.S.C. § 230(c)(1) (2006) (“No provider or user of an interactive computer service shall<br />
be treated as the publisher or speaker of any information provided by another information<br />
content provider.”). The interpretation of this provision is quite broad. See, e.g., Zeran v. Am.<br />
Online, Inc., 129 F.3d 327, 330-31 (4th Cir. 1997) (finding that plaintiff’s tort claims of<br />
defamation were preempted by § 230). The immunity does not extend to criminal law,<br />
contract law, or intellectual property law. 47 U.S.C. § 230(e)(1)-(4) (2006).<br />
18 See, e.g., Doug Lichtman & Eric Posner, Holding Internet Service Providers Accountable, 14 U. CHI.<br />
SUP. CT. ECON. REV. 221 (2006); JOHN PALFREY & URS GASSER, BORN DIGITAL 106 (2008);<br />
DANIEL SOLOVE, THE FUTURE OF REPUTATION 125-160 (2007).<br />
19 31 U.S.C. § 5365(c) (2006).<br />
20 Ryan Haight Online Pharmacy Consumer Protection Act of 2008, Pub. L. No. 110-425, §<br />
(h)(3)(A)(iii), 122 Stat. 4829-30.<br />
21 Remarks by Lawrence Strickling, Assistant Secretary of Commerce for Communications and<br />
Information, to Internet Society’s INET Series: Internet 2020: The Next Billion Users (April 29,<br />
2010), available at<br />
http://www.ntia.doc.gov/presentations/2010/InternetSociety_04292010.html.<br />
22 Id.<br />
If the Internet exceptionalists rested their case on the literal impossibility of<br />
extending local law to cyberspace, there would not be much left to their argument.<br />
A “bordered Internet” where intermediaries try to control behavior prohibited<br />
by local law is becoming a reality. Most Internet intermediaries have explicit<br />
policies that prohibit illegal activities. 23 These general policies are supplemented<br />
with specific policies and procedures designed to prevent the use of these<br />
systems for specific illegal activities.<br />
Moreover, it is not just voluntary efforts by Internet intermediaries that show<br />
how Internet activity can be controlled. Governments have been effectively<br />
extending their control over Internet activity by imposing obligations on<br />
intermediaries. It has been estimated that at least 26 countries impose some<br />
kind of filtering obligation on Internet entities. 24 Recent government actions in<br />
France and the United Kingdom impose “graduated response” obligations on<br />
ISPs, requiring them to cut off Internet access for alleged repeat copyright<br />
violators. 25 It is possible to challenge these extensions of government power<br />
over Internet activity as unwise, as a violation of a human right to Internet<br />
access, or as too costly. But it is no longer plausible to maintain that they are<br />
simply impossible.<br />
23 Participants in Google’s advertising programs “shall not, and shall not authorize any party to<br />
… advertise anything illegal or engage in any illegal or fraudulent business practice.” Google<br />
Inc. Advertising Program Terms 4 (Aug. 22, 2006), available at<br />
https://adwords.google.com/select/tsandcsfinder. MasterCard has rules for both<br />
merchants and their acquiring banks: “A Merchant must not submit for payment into<br />
interchange … and an Acquirer must not accept from a Merchant for submission into<br />
interchange, any Transaction that is illegal.” MASTERCARD, MASTERCARD RULES 5.9.7 (2008),<br />
available at http://www.merchantcouncil.org/merchantaccount/downloads/mastercard/MasterCard_Rules_5_08.pdf. MasterCard prohibits<br />
its issuing banks from engaging in illegal transactions. Id. at 3.8.4. Visa has similar rules, for<br />
example: “A Merchant Agreement must specify that a Merchant must not knowingly submit,<br />
and an Acquirer must not knowingly accept from a Merchant, for submission into the Visa<br />
payment system, any Transaction that is illegal or that the Merchant should have known was<br />
illegal.” VISA, VISA INTERNATIONAL OPERATING REGULATIONS § 4.1.B.1.c (2008), available at<br />
http://usa.visa.com/download/merchants/visa-international-operatingregulations.pdf.<br />
Visa’s regulations also specify acquirer penalties for merchants engaging in<br />
illegal cross-border transactions. Id. § 1.6.D.16.<br />
24 RONALD DEIBERT, JOHN PALFREY, RAFAL ROHOZINSKI & JONATHAN ZITTRAIN, ACCESS<br />
DENIED: THE PRACTICE AND POLICY OF GLOBAL INTERNET FILTERING 1 (2008).<br />
25 Eric Pfanner, U.K. Approves Crackdown on Internet Pirates, NEW YORK TIMES, April 8, 2010,<br />
http://www.nytimes.com/2010/04/09/technology/09piracy.html?scp=1&sq=digital%20economy%20bill%20uk&st=cse;<br />
Eric Pfanner, France Approves Wide Crackdown on Net<br />
Piracy, NEW YORK TIMES, October 22, 2009,<br />
http://www.nytimes.com/2009/10/23/technology/23net.html?_r=1. Sometimes the<br />
ISPs cooperate in a graduated response policy to settle legal claims. For a review of<br />
government and private sector efforts to control online copyright violations, see Christina<br />
Angelopoulos, Filtering the Internet for Copyrighted Content in Europe, IRIS PLUS, March 2009,<br />
available at http://www.obs.coe.int/oea_publ/iris/iris_plus/iplus4_2009.pdf.en.<br />
This conclusion is discussed at length in another essay in this collection, which<br />
focuses on traditional payment intermediaries (payment card companies<br />
such as Visa, MasterCard, and American Express) as an instructive category of<br />
intermediary platforms. 26 Developments over the last several years conclusively<br />
demonstrate that these payment intermediaries can control specific illegal<br />
activities on the Internet and that governments can extend their control to these<br />
payment intermediaries.<br />
Thus, the debate over Internet exceptionalism has shifted from the “nature” of<br />
the Internet as something intrinsically beyond the control of governments to a<br />
problem of choice. 27 Intermediaries can control illegal behavior on the Internet,<br />
and governments can control intermediaries, but should they? And if government<br />
should exert control over intermediaries in order to control Internet activities,<br />
how should the global legal order be restructured to accommodate their role?<br />
This essay explores the extent to which the experience of payment systems in<br />
controlling the illegal online behavior of their users illuminates the debate<br />
among the Internet exceptionalists, defenders of the bordered Internet, and the<br />
internationalists. It concludes that exceptionalism, in either its original or<br />
modified forms, is not the right framework for Internet governance because<br />
intermediaries should not defer to the judgments of self-governing communities<br />
of Internet users when the judgments conflict with local law. The<br />
exceptionalists are correct that a “bordered Internet” will not scale up, but the<br />
experience of traditional payment systems points towards international<br />
harmonization. If governments are going to use intermediaries to regulate the<br />
Internet, they need to coordinate their own laws to make that role possible.<br />
The essay addresses each of the three main approaches to Internet governance:<br />
exceptionalism, the bordered Internet, and internationalism. The first section,<br />
on exceptionalism, begins with a discussion of the original Internet<br />
exceptionalist perspective, which viewed government regulation of the Internet<br />
as infeasible and normatively less desirable than government deference to the<br />
rules developed by self-governing Internet communities. This is followed by a<br />
discussion of Brian Holland’s revised version of exceptionalism. Under this<br />
approach, the various immunities from intermediary liability established by local<br />
jurisdictions enable the development of autonomous Internet norms. Both<br />
versions are shown to have significant limitations when viewed in light of<br />
payment system experiences.<br />
26 See MacCarthy, Online Liability for Payment Systems, infra at 230.<br />
27 See Holland, supra note 16, at 376-77 (“In this context, exceptionalism became an objective to<br />
be pursued and protected as a matter of choice, rather than a natural state.”).<br />
The next section explores the “bordered<br />
Internet,” the idea that in certain cases local governments may properly and<br />
unilaterally extend their jurisdiction over Internet activities through<br />
intermediaries. Payment intermediaries use standard measures to resolve<br />
conflicts of law and follow a practical rule that treats a transaction as illegal if it<br />
is illegal in the jurisdiction of either the merchant or the cardholder. This<br />
section then discusses limitations on this method of resolving cross-border<br />
jurisdictional conflicts. The final section concludes with a discussion and<br />
endorsement of the internationalist perspective, according to which local<br />
governments should only exercise control over specific Internet activities in a<br />
coordinated fashion.<br />
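The either-jurisdiction rule just described can be expressed as a simple predicate. The sketch below is purely illustrative and not drawn from the essay: the jurisdiction codes and the table of prohibited transaction categories are hypothetical assumptions.

```python
# Hypothetical table mapping a jurisdiction code to the transaction
# categories it prohibits. The entries are invented for illustration only.
PROHIBITED = {
    "US": {"online_gambling"},
    "FR": {"nazi_memorabilia"},
}

def transaction_blocked(category: str, merchant_juris: str, cardholder_juris: str) -> bool:
    """Treat a transaction as illegal if its category is prohibited in EITHER
    the merchant's or the cardholder's jurisdiction."""
    return any(
        category in PROHIBITED.get(juris, set())
        for juris in (merchant_juris, cardholder_juris)
    )

# A cross-border sale that is legal where the merchant operates is still
# blocked when the cardholder's jurisdiction prohibits it.
print(transaction_blocked("online_gambling", "UK", "US"))  # True
print(transaction_blocked("online_gambling", "UK", "FR"))  # False
```

This "stricter of the two jurisdictions" predicate captures why the rule sidesteps conflict-of-law analysis: the intermediary never has to decide whose law governs, only whether any involved jurisdiction objects.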
Internet Exceptionalism:<br />
The Original Version<br />
In February 1996, John Perry Barlow identified Internet exceptionalism when<br />
he declared cyberspace to be independent of national governments, roughly on<br />
the grounds that cyberspace “does not lie within your borders” and that it “is a<br />
world that is both everywhere and nowhere, but it is not where bodies live.” 28<br />
Conflicts in cyberspace would be resolved not with the territorially-based “legal<br />
concepts of property, expression, identity, movement, and context,” which “do<br />
not apply” to cyberspace because they “are all based on matter, and there is no<br />
matter here.” 29 Rather, in cyberspace “governance will arise according to the<br />
conditions of our world, not yours.” 30 Cyberspace “is different.” 31<br />
Almost concurrently, legal scholars David Johnson and David Post made a<br />
similar case for Internet exceptionalism. 32 In their view, the Internet destroys<br />
“the link between geographical location” and “the power of local governments to<br />
assert control over online behavior; [and] … the legitimacy of a local sovereign’s<br />
efforts to regulate global phenomena ….” 33 The Internet destroys the power of<br />
local governments because they cannot control the flow of electrons across<br />
their physical boundaries. If they attempted to do so, determined users would<br />
just route around the barriers. Moreover, if one jurisdiction could assert control<br />
over Internet transactions, all jurisdictions could, resulting in the impossible<br />
situation in which all “Web-based activity, in this view, must be subject<br />
simultaneously to the laws of all territorial sovereigns.” 34<br />
28 Declaration of John P. Barlow, Cognitive Dissident, Co-Founder, Elec. Frontier Found., A<br />
Declaration of the Independence of Cyberspace (Feb. 8, 1996), available at<br />
http://w2.eff.org/Censorship/Internet_censorship_bills/barlow_0296.declaration.<br />
29 Id.<br />
30 Id.<br />
31 Id.<br />
32 See generally Johnson & Post, supra note 1.<br />
33 Id. at 1370 (emphasis added).<br />
The Internet destroys the legitimacy of local<br />
jurisdiction because legitimacy depends on the consent of the governed and<br />
“[t]here is no geographically localized set of constituents with a stronger and<br />
more legitimate claim to regulate it than any other local group. The strongest<br />
claim to control comes from the participants themselves, and they could be<br />
anywhere.” 35 Since “events on the Net occur everywhere but nowhere in<br />
particular … no physical jurisdiction has a more compelling claim than any<br />
other to subject these events exclusively to its laws.” 36<br />
Behind these arguments seemed to be an appealing political vision. The ideal<br />
envisaged self-organizing groups of people making the rules that applied to their<br />
conduct. These rules would not be imposed from the outside, but would be<br />
freely chosen by the active participation of the community members. The key<br />
was deliberation by free, rational agents in their communities, not imposition of<br />
rules by an arbitrary act of will by a distant sovereign. This ideal of participatory<br />
democracy was intended, in part, to offset the alienating effects of large-scale<br />
modern democracies, which in practice had long failed to provide their<br />
members with the sense of community participation that alone seemed to justify<br />
the imposition of collective rules.<br />
The way this vision would be implemented on the Internet would be through<br />
the development of autonomous communities of Internet users. These Internet<br />
communities were largely isolated from “real world” communities. Since it took<br />
special care and effort to reach out to participate in them, only those people<br />
who really wanted to participate would, and the effects of activities in those<br />
communities would be limited to those who chose to participate. Given the<br />
structure of the Internet as a communications network, which moved almost all<br />
major decisions on content to the edges of the network, a diversity of law could<br />
arise in cyberspace as each community developed its own norms for regulating<br />
the conduct of its members. People would be free to participate in the<br />
communities they wanted, but could easily avoid those they did not like.<br />
Enforcement of the community rules would be accomplished through peer<br />
pressure, reputational systems, informal dispute resolution mechanisms, and,<br />
ultimately, banishment. The system as a whole would evolve through a process<br />
analogous to biological evolution, where diverse and potentially competing rule<br />
sets as embodied in different communities would vie for acceptance in a free<br />
marketplace of rules.<br />
34 Id. at 1374.<br />
35 Id. at 1375.<br />
36 Id. at 1376.
Internet exceptionalism is thus the view that activity on the Internet should be<br />
regulated by Internet community norms, not laws of territorial jurisdictions or<br />
globally harmonized laws. 37 It is hard to avoid the sense that the political vision<br />
pre-dated the Internet—that the feasibility argument masked the underlying<br />
vision and that the arrival of the Internet simply created the possibility of<br />
implementing the vision in a way that the “real” world did not. To see this,<br />
imagine the reaction of Internet exceptionalists to the idea of a world<br />
government that would establish uniform global laws. This would eliminate the<br />
conflict of law problem. But exceptionalists are even more appalled at the idea<br />
of world government control over the Internet than at the idea of nation-state<br />
control over it. This suggests that the issue is not the feasibility of control, but<br />
the value of participative community decision-making and diversity.<br />
This early cyber libertarian vision w<strong>as</strong> immediately attacked by those who<br />
defended the fe<strong>as</strong>ibility and legitimacy of extending local laws to cover Internet<br />
activity. 38 As they note, “[t]he mistake here is the belief that governments<br />
regulate only through direct sanctioning of individuals…. Governments can …<br />
impose liability on intermediaries like Internet service providers or credit card<br />
companies.” 39 Government action against these intermediaries “makes it harder<br />
for local users to obtain content from, or transact with, the law-evading content<br />
providers abroad. In this way, governments affect Internet flows within their<br />
borders even though they originate abroad and cannot e<strong>as</strong>ily be stopped at the<br />
border.” 40 And these efforts to bring order to the Internet through pressure on<br />
intermediaries are often legitimate because they provide “something invisible<br />
but essential: public goods like criminal law, property rights, and contract<br />
enforcement … that can usually be provided only by governments.” 41<br />
The debate took an interesting twist through the work of Larry Lessig. A key<br />
element of the early exceptionalist framework w<strong>as</strong> the idea that the Internet had<br />
a fundamental nature, which governments did not control, could not alter, and
which effectively prevented them from imposing local rules. In his influential
book, Code and Other Laws of Cyberspace, Lessig took aim at this idea. 42 He
pointed out that computer systems, software applications, and communications
networks were human creations and that the choices of the architects of these
systems were embodied in the code that made it possible for these systems to
run. Far from being a natural object, these systems were subject to the
decisions of the parties (usually non-governmental entities) that had the right
and the ability to create, maintain and alter them.
37 Mann and Belzley describe their view as “consciously exceptionalist” because “specific
characteristics of the Internet make intermediary liability relatively more attractive than it has
been in traditional offline contexts because of the ease of identifying intermediaries, the
relative ease of intermediary monitoring of end users, and the relative difficulty of directly
regulating the conduct of end users.” Mann & Belzley, supra note 9, at 250-51. But this is an
odd way of framing the issue. Internet exceptionalism is not simply the view that the
Internet should be treated differently from the offline world. The claim is more specifically
that the Internet should be free of local jurisdictions. Mann and Belzley’s view, which implies
that the Internet should be brought under local jurisdictions through the mechanism of
intermediary liability, is thus the very opposite of exceptionalism. It is one version of
Internet non-exceptionalism.
38 See generally Goldsmith, supra note 4 (challenging the regulation skeptics).
39 Goldsmith, supra note 4, at 1238.
40 GOLDSMITH & WU, supra note 7, at 68.
41 Id. at 140.
218 CHAPTER 3: IS INTERNET EXCEPTIONALISM DEAD?
The initial openness and transparency of the Internet was therefore something
that could not be assumed as a fact of nature, but something that needed to be
maintained against possible opponents. But unlike the early cyber libertarians,
Lessig did not focus on the dangers that local governments might try to control
choices by controlling code. He thought the openness of the Internet had to be
maintained against the interests of non-governmental parties seeking to advance
their own strategic interests. Lessig’s initial private sector targets were the
network carriers who were seeking to alter the “end-to-end” design of the
network in order to pursue their own strategic interests at the expense of
application providers, service providers and end users who relied on the
neutrality of the Internet to conduct their ordinary activities. In this way, the
Internet exceptionalist debate merged with the net neutrality debate and the
original defenders of exceptionalism seemed to be faced with the (to them)
unattractive dilemma of using local governments to promote Internet values of
openness or allowing their Internet choices to be dictated by unaccountable
private entities that controlled the fundamental architecture of the Internet. 43
This attack was so effective that many believe that these notions of a “self-governing
cyberspace are largely discredited.” 44 But modified versions accept
the basic premise that the Internet should be free of local regulation and
governed instead by its users. One version of the revived exceptionalism,
defended by Brian Holland, focuses on Web 2.0 communities. 45 This view
argues that together with the immunity provisions of Section 230 of the
Communications Decency Act, these communities have the potential to allow
internal community norms to take the place of external territorially based laws. 46
42 LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE (1999).
43 See Holland, supra note 16, at 108-119 for a summary of this way of connecting the Internet
exceptionalist debate with the net neutrality debate.
44 Id. at 14.
45 Holland writes:
By mitigating the imposition of certain external legal norms in the online
environment, § 230 helps to create the initial conditions necessary for the
development of a modified form of exceptionalism. With the impact of
external norms diminished, Web 2.0 communities, such as wikis and social
networks, have emerged to facilitate a limited market in norms and values
and to provide internal enforcement mechanisms that allow new communal
norms to emerge.
Critique of Internet Exceptionalism
The experience of global payment intermediaries described in a companion
article in this volume confirms the view that intermediaries can effectively
control illegal activity in cyberspace. This still leaves the question of whether
intermediaries should resist governmental pressure to control the behavior of
their users. As a general matter, they should not defer to the judgments of self-governing
communities of Internet users when these judgments conflict with
local law. As corporate citizens, they have an obligation to obey the laws of the
jurisdictions in which they operate, and they simply have no basis to excuse
themselves from that duty in order to let online communities determine their
own fate. But even when local law does not require them to take action against
illegal behavior, their responsibility to keep their systems free of illegal activity
means that they often should take specific steps to stop these activities.
The fundamental objection, even to Holland’s modified exceptionalism, is that
the “law” of Internet communities is not really the law of that community. It is
a commercial contract enforceable under the rules of some local jurisdiction,
and the terms of the contract are subject to the same kinds of legal and
regulatory oversight that bind contracts between people in local jurisdictions.
Deferring to these contracts does not usually mean democratic community self-government.
Local regulations are needed to fully protect the members of these
communities. 47 Moreover, in some cases, the legal discretion granted to
intermediaries to control the conduct of their members may be too broad and
should be limited by replacing intermediary judgment with public authority
decisions. The remainder of this section develops these points.
Even if Internet communities could substantially exclude a significant portion
of external legal norms, it still does not follow that internal norms will
necessarily emerge from the process of debate and deliberation that Holland
envisages. As Holland notes, “external legal norms are excluded, but internal
communal norms are often unable to coalesce to take their place” because
enforcement is “concentrated in private commercial entities.” 48 The hope of his
modified Internet exceptionalism is that the intermediaries who control the new
wishes of the online communities they create, allowing users to establish norms
for their own communities. 49
46 Holland, supra note 16, at 369.
47 This Section focuses on competition policy, privacy, and consumer protection as examples.
48 Holland, supra note 16, at 398.
But it is not clear that Web 2.0 platforms are likely to grant this kind of
democratic self-governance. For example, intermediaries can be subject to
pressure. Craig Newmark, the operator of Craigslist, has insisted that he made
his decision to remove ads for erotic services as a result of consultation with his
online community. 50 But it is also true that Craigslist was under criminal
investigation by a number of state attorneys general for violation of state laws
against prostitution. 51 One could argue immunity in this case, but Craigslist did
not. 52 It complied with a law enforcement request to remove certain postings,
and the decision to remove these ads will be subject to ongoing oversight by
these law enforcement agencies. 53 However, the question remained whether or
not Craigslist would take the legal risk if the community voted to keep these ads
in place.
These communities are not typically governed by democratic voting procedures
that guarantee the consent of the governed. They are governed by contractual
terms of service. Often prospective members of these communities have a
simple take-it-or-leave-it choice when they decide to join. 54
49 These internal incentives include “the need for financial support from community donations,
a communal desire for information integrity, or the need to build an audience for
advertising.” Id. at 400; see also Matthew Schruers, Note: The History and Economics of ISP
Liability for Third Party Content, 88 VA. L. REV. 205, 261 (“ISPs respond to content-based
complaints as a matter of good business practice for the purpose of maintaining customer
goodwill and satisfaction.”).
50 Craigslist Founder Seeks Larger DC Role, NAT’L J., June 2, 2009, available at
http://techdailydose.nationaljournal.com/2009/06/craigslist-founder-seekslarge.php
(reporting Craig Newmark’s comments to the Computers Freedom and Privacy
Conference).
51 See Brad Stone, Craigslist to Remove ‘Erotic’ Ads, N.Y. TIMES, May 14, 2009, at B1. Craigslist’s
attorneys asserted immunity under § 230, but chose voluntarily to remove the ads to which
various state attorneys general had objected. Id. State attorneys general felt confident that
they could bring a case under state criminal law despite the immunity granted by § 230. Id.
The case was given national attention when a medical student was accused of killing a
masseuse whom he met through Craigslist. Id.
52 Id.
53 Id.
54 See Johnson & Post, supra note 1, at 1380 (describing AOL and Compuserve terms of service
as examples of law in cyberspace). Johnson & Post view the rules for an Internet community
to be “a matter for principled discussion, not an act of will by whoever has control of the
power switch.” Id. But it is hard to see how the terms of service for a typical Internet service or
application is anything other than an act of will by the person who controls the service or
application. It might satisfy certain legal standards for informed consent, but it is not the
product of principled discussion. And this might be the way consumers want it. Online
communities might not offer to determine their online laws through a political process
because the members of the community cannot be bothered. People visit many different
websites and use many different web services. It is hard to believe that they want full
democratic participation rights to set up the rules for each of these services. And it is
implausible that they would actually spend the time, if they were offered the opportunity.
The example of privacy policies makes the point. A recent study concluded that if all U.S.
consumers read all the privacy policies for all the web sites they visited just once a year, the
total amount of time spent on just reading the policies would be 53.8 billion hours per year
and the cost to the economy of the time spent doing this would be $781 billion per year.
Aleecia M. McDonald & Lorrie F. Cranor, The Cost of Reading Privacy Policies, 4 I/S: J.L. &
POL’Y FOR INFO. SOC’Y 543, 565 (2008).
If consumers do not like the terms of service, then protest can be effective, as
in the recent case of users objecting to the change in terms of service
unilaterally offered by Facebook. By threatening the privacy rights of the
community, the platform stirred up substantial community unrest, and
ultimately the new terms of service were withdrawn. 55 But this exit right is not
the same as democratic self-governance, and it is not always effective. What if
Facebook had not responded to community objections? Would people actually
have left, and where would they have gone? Lock-in is a real restriction in social
networks.
The exemption from liability based on Section 230 does not mean that online
entities are exempt from local law. Often, local law is needed to protect
consumers from the actions of Internet intermediaries. Regulation of online
communities by governments seems especially timely and urgent in three areas:
competition policy, privacy, and consumer protection.
With respect to competition, concentration in particular sectors of the online
world should be examined because it can so significantly reduce consumer
choice. The Department of Justice has indicated, for example, that it is going to
take a more active approach in this area. 56 Along with the Federal Trade
Commission (FTC), it has initiated inquiries focused on the search engine
market. 57
55 N.Y. Times, Facebook, Inc.,
http://topics.nytimes.com/top/news/business/companies/facebook_inc/index.html?8qa&scp=1-spot&sq=facebook&st=nyt
(last updated May 27, 2009). In 2007, the
company had created a community backlash when it introduced an advertising service that
allowed a user’s online activities to be distributed to other community members. Epic.org:
Electronic Privacy Information Center, Social Networking Privacy,
http://epic.org/privacy/socialnet/default.html (last visited Feb. 3, 2009). In the face
of this protest, it provided a simple way for users to decline to participate. Id. In February
2009, it proposed new privacy rules according to which users will own and control their own
information, and in April it allowed a vote of its users on these new principles. Over 75%
of those voting endorsed them, and on July 1, 2009 it adopted them. Id.
56 Press Release, U.S. Dep’t of Justice, Justice Department Withdraws Report on Antitrust
Monopoly Law: Antitrust Division to Apply More Rigorous Standard with Focus on the
Impact of Exclusionary Conduct on Consumers (May 11, 2009), available at
http://www.justice.gov/atr/public/press_releases/2009/245710.pdf.
Privacy and security rules need to be defined as well. The FTC has taken major
action in this area, and is stepping up its enforcement. 58 It is also focusing
on the development of a new privacy framework to analyze the basis for the
harms associated with privacy violations. 59 Furthermore, the FTC has focused
on developing rules for online behavioral advertising. 60 In addition, rules
governing privacy for online cloud computing services need to be clarified,
perhaps by additional legislation. 61
Consumer protection rules should be updated to apply more effectively to new
developments in electronic commerce, including the growth of mobile
commerce and user-generated content, the greater availability of digital goods
online, increased numbers of consumers acting as online sellers, and new
developments in accountability and payment protection. A timely development
might be the harmonization of consumer redress and liability rights across
various payment mechanisms. 62
Finally, the discretion given to Internet intermediaries over which transactions
to allow must be subject to public scrutiny. Today, intermediaries exercise
judgment over which transactions are subject to such legal risk that they cannot
be allowed. These decisions are made in the context of the business interests
and technological capabilities of the intermediaries themselves, but they have
important effects on the rights and interests of other parties. Some examples,
explained in a companion essay in this volume, include:
• Payment systems effectively decide which Internet gambling
transactions are illegal. By choosing to block all coded gambling
transactions, the system disadvantages horseracing, state lottery, and
Indian gaming transactions that are arguably legal.
• Payment systems take complaints from third parties, make an
independent legal assessment of the merits of the case, and withdraw
service based on these assessments. In effect, they adjudicate these
copyright cases.
These decisions are sound and sensible ways to balance complex and competing
interests. However, they are private sector judgments, inevitably subjective and
influenced by the particular interests of the parties involved.
57 See, e.g., Miguel Helft, U.S. Inquiry Is Confirmed into Google Books Deal, N.Y. TIMES, July 3, 2009,
at B3; Miguel Helft & Brad Stone, Board Ties at Apple and Google Scrutinized, N.Y. TIMES, May
5, 2009, at B1; Peter Whoriskey, Google Ad Deal Is Under Scrutiny: Yahoo Agreement Subject of
Antitrust Probe, Sources Say, WASH. POST, July 2, 2008, at D1.
58 See Press Release, Fed. Trade Comm’n, Sears Settles FTC Charges Regarding Tracking
Software (June 4, 2009), available at http://www.ftc.gov/opa/2009/06/sears.shtm
(reporting that in the Sears case the FTC obtained a settlement from Sears after charging
that its consent practices in regard to installing an online tracking program on customers’
computers constituted an unfair or deceptive practice).
59 See Stephanie Clifford, Fresh Views at Agency Overseeing Online Ads, N.Y. TIMES, Aug. 5, 2009, at
B1 (stating that David Vladeck, the new head of the FTC’s consumer protection division, is
rethinking privacy). Vladeck said that “[t]he frameworks that we’ve been using historically for
privacy are no longer sufficient.” Id. In his view the FTC will begin to consider not just
whether companies caused monetary harm, but whether they violated consumers’ dignity
because, for example, “[t]here’s a huge dignity interest wrapped up in having somebody
looking at your financial records when they have no business doing that.” Id.
60 See Press Release, Fed. Trade Comm’n, FTC Staff Revises Online Behavioral Advertising
Principles (Feb. 13, 2009), available at http://www.ftc.gov/opa/2009/02/behavad.shtm.
61 See generally ROBERT GELLMAN, WORLD PRIVACY FORUM, PRIVACY IN THE CLOUDS: RISKS TO
PRIVACY AND CONFIDENTIALITY FROM CLOUD COMPUTING (2009) (discussing these cloud
computing issues).
62 Legal payment protections now differ depending on the type of payment product used
(debit or credit) and the nature of the payment provider—traditional payment providers like
Visa face legal requirements while new payment providers such as cell phone companies do
not.
Other intermediaries also have enforcement abilities that they can use at their
own discretion. For instance, in June 2009, it was reported that a British ISP
had agreed to disconnect subscribers who were accused of three instances of
infringement by a copyright owner. 63 Allegations of violations would be made
by a contractor working for the content owner and transmitted to the ISP. 64 At
this point, these decisions are largely up to the payment intermediaries and the
ISPs themselves, although in some jurisdictions they are dictated by government
requirements. 65 Yet their decisions will have profound effects on the shape and
direction of electronic commerce. Deferring to the norms of the Internet
community in this context means deferring to these private judgments by
intermediaries.
63 See, e.g., Danny O’Brien, Irish ISP Agrees to Three Strikes Against Its Customers, DEEPLINKS
BLOG, http://www.eff.org/deeplinks/2009/01/irish-isp-agrees-three-strikes-against-its-users
(Jan. 28, 2009).
64 Under the agreement the music labels, instead of going to court to get an order to have the
ISP shut off a subscriber’s connection, provide evidence of infringement to the ISP directly.
Id. As O’Brien noted,
The difference is that an ISP is not a court; and its customers will never have
a chance to defend themselves against the recording industry’s accusations
and “proof.” To whom, without judicial oversight, has the ISP obligated
itself to provide meaningful due process and to ensure that the standard of
proof has been met?
Id.
65 The movement toward graduated response would replace this discretion with government
processes. Under the recently passed “Haute Autorité pour la Diffusion des Œuvres et la
Protection des Droits sur Internet” (High Authority of Diffusion of the Art Works and
Protection of the (Copy)Rights on Internet) (“HADOPI”) law, French ISPs would be
required to suspend Internet access for subscribers who have been subject to three
allegations of copyright violations. Catherine Saez, French HADOPI Law, Now Complete, Can
Brandish Its Weapons, INTELL. PROP. WATCH, Oct. 23, 2009, http://www.ip-watch.org/weblog/2009/10/23/french-hadopi-law-now-complete-can-brandish-its-weapons/.
A court review would be required before suspension. Id. A similar graduated
response program was adopted in Britain in April 2010. See Eric Pfanner, U.K. Approves
Crackdown on Internet Pirates, N.Y. TIMES, April 8, 2010, available at
http://www.nytimes.com/2010/04/09/technology/09piracy.html. Whether these
graduated response programs are needed is a point of controversy, but they replace ISP
discretion with a system of public accountability.
There is a role for Internet community decision-making. The best
circumstances for deference to law constructed for and by particular Internet
communities is when an Internet community’s norms do not “fundamentally
impinge upon the vital interests of others who never visit this new space.” 66 To
the extent that an Internet community is self-contained or its activities affect
others only on a voluntary basis, then there is a case for deferring. 67
Payment Systems & the Bordered Internet
Goldsmith and Wu attack Internet exceptionalism, but they also construct a
positive vision of a “bordered Internet.” 68 This world would work pretty much
as the world worked before the Internet. New regulations would be crafted to
deal with the new dangers specifically created by the Internet, but there would
be no fundamental need to adjust the basic domestic or international
framework. 69
66 Johnson & Post, supra note 1, at 1389.
67 See POST, IN SEARCH OF JEFFERSON’S MOOSE, supra note 15, at 178-86 (describing “massively
multi-player online games” or MMOGs as good candidates for this effort at online rule
creation). This might be. However, Linden Labs, the creator of Second Life, one of the
most famous MMOGs, found it necessary to rely on external banking regulators when it
decided to ban the offering of interest or any return on investment in-world without proof
of an applicable government registration statement or financial institution charter. Kend
Linden, New Policy Regarding In-World “Banks”, SECOND LIFE BLOGS, Jan. 8, 2008 06:43:56
PM, https://blogs.secondlife.com/community/features/blog/2008/01/08/new-policy-regarding-in-world-banks.
Linden Labs properly concluded that it “isn’t, and can’t
start acting as, a banking regulator.” Id. New rule-making institutions will emerge only if
people think that they are real. For this reason, a policy to defer in certain cases should be
public and stable in order to provide the opportunity for the development of alternative
rules.
Jurisdictional disputes would be one significant problem with the bordered
Internet. The initial Internet exceptionalist argument was that Internet activity
is simultaneously present in multiple overlapping and inconsistent jurisdictions,
and that no one jurisdiction has a better claim to regulate the activity than any
other jurisdiction. It would be better to think of the activity as taking place in a
separate jurisdiction altogether and have the territorial governments of the
world defer to the community norms created there. Goldsmith and Wu’s
response was that Internet activity was real world activity, taking place in
particular jurisdictions, and that local governments could exert control over this
activity by attaching obligations to the local operations of global Internet
intermediaries. 70 This indirect liability for intermediaries would make it easier to
extend local law to the bad actor. 71 Conflict of laws would be handled by the
normal mechanisms for resolving these disputes, and ultimately enforced by
actions taken against local operations of global intermediaries. 72
Jurisdiction in cyberspace is a complex topic with many different approaches to
assigning both the applicable law and the court of jurisdiction. 73 Questions
include determining the location of the transaction, the jurisdiction, and the
interests of the parties. 74 An early attempt to deal with these issues in the
Internet context was the FTC’s approach to consumer protection in the global
marketplace. 75 The simplest cross-border electronic transaction implicates
transnational concerns. Choice of law debates inevitably follow. The FTC
considered arguments for the “country of origin” approach and the “country of
destination” approach. 76 Under the country of origin approach, the law of the
merchant would apply and the courts of the merchant’s country would
adjudicate any disputes. 77 Under the country of destination approach, the law
of the consumer would apply and the courts of the consumer’s country would
adjudicate disputes. 78
70 Id. at 68-72.
71 Mann & Belzley, supra note 9, at 259 (“[On the Internet it is] easier for even solvent
malfeasors engaged in high-volume conduct to avoid responsibility either through anonymity
or through relocation to a jurisdiction outside the influence of concerned policymakers.”).
Mann and Belzley also argue that indirect liability makes sense in “cases in which the retailer
is located in a jurisdiction outside the United States that will not cooperate with the relevant
state regulators.” Id. at 277.
72 GOLDSMITH & WU, supra note 7, at 158-61.
73 See, e.g., Paul S. Berman, Towards a Cosmopolitan Vision of Conflict of Laws: Redefining
Governmental Interests in a Global Era, 153 U. PA. L. REV. 1819, 1822 (2005) (arguing that judges
should adopt a cosmopolitan approach in Internet cases involving choice of law and foreign
judgment issues, grounded in the “idea that governments have an interest not only in helping
in-state litigants win the particular litigation at issue, but a more important long-term interest
in being cooperative members of an international system and sharing in its reciprocal
benefits and burdens”).
74 See generally Goldsmith, supra note 4 (discussing many of these theories); see also Berman, supra
note 73, at 1839-40 (discussing various choice-of-law theories that address these questions).
75 FED. TRADE COMM’N, CONSUMER PROTECTION IN THE GLOBAL ELECTRONIC
MARKETPLACE: LOOKING AHEAD (2000). The FTC’s discussion of applicable law and
jurisdiction is especially relevant. Id. at 4-11.
76 Id.
The defense of the country of origin approach relied on the difficulty of<br />
applying any other legal framework to the electronic marketplace. 79 Only this<br />
country of origin framework seems to allow for the growth of global ecommerce.<br />
The framework considers problems encountered by small<br />
businesses selling in many countries of creating and applying a standard for<br />
some variety of “purposeful” targeting. Creating a default rule of the country of<br />
origin w<strong>as</strong> deemed to better provide needed uniformity and predictability for<br />
online businesses.<br />
This approach has defects. First, it forces consumers to rely on unfamiliar consumer protections. If merchants cannot be expected to know the laws of 180 countries, neither can consumers. Second, it creates a “race to the bottom,” whereby unscrupulous merchants can simply locate in a country with weak consumer protections. Third, consumers cannot reasonably be expected to travel to the country of origin to obtain redress. Fourth, consumers could not rely on their own consumer protection agencies for redress either, since these agencies would also be unable to enforce the consumer’s home jurisdiction protections.
So neither default rule seemed to suffice. As a practical matter, consumer education, self-regulatory efforts, and the development of codes of conduct by multinational organizations were the means chosen to address the cross-border consumer protection issue. 80 For other issues that could not be addressed
77 Id. at 2.
78 Id. The European Union appeared to take the side of the country of origin in its E-Commerce Directive. European Commission, E-Commerce Directive, http://ec.europa.eu/internal_market/e-commerce/directive_en.htm (last visited Feb. 15, 2010). The Directive contains an Internal Market clause “which means that information society services are, in principle, subject to the law of the Member State in which the service provider is established.” Id.
79 FED. TRADE COMM’N, supra note 75, at 4 (discussing the “two fundamental challenges” to a country-of-destination framework, including “the use of physical borders to determine rights in a borderless medium” and compliance costs).
80 In 1999, the OECD issued its Guidelines for Consumer Protection in the Context of Electronic Commerce, which address principles that could be used by electronic commerce merchants in the absence of global consumer protection rules. ORG. FOR ECON. CO-OPERATION & DEV., GUIDELINES FOR CONSUMER PROTECTION IN THE CONTEXT OF ELECTRONIC COMMERCE (1999) [hereinafter OECD GUIDELINES], available at http://www.oecd.org/document/51/0,3343,en_2649_34267_1824435_1_1_1_1,00.html. The FTC and the OECD held a 10th-year anniversary of the release of these guidelines in
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 227
through these means, the traditional tools of international conflict of law resolution would have to suffice. 81
Some commentators, such as Paul Berman, attempted to reach beyond the traditional dispute resolution mechanisms for resolving conflict of law cases with principles that take into account the realities of multiple community affiliations. 82 His “cosmopolitan pluralism” was “cosmopolitan” because it went beyond the laws of any one particular jurisdiction and recognized the legitimacy of norms created by private parties and communities. 83 It was plural because it did not dissolve the multiplicity of community affiliations and their associated norms into a single world-wide standard. Diversity and conflict would endure and would need to be resolved according to a series of principles that recognized the need to balance competing national norms. 84
These approaches to resolving jurisdictional disputes in cyberspace have various advantages and disadvantages. However, payment system intermediaries needed a mechanism to address the jurisdictional question that was easy to apply, effective in resolving the dispute, and minimized legal risk to the system or its members. It could not wait for unpredictable, after-the-fact judgments by courts. The idea they developed, discussed in chapter 6 of this book, was that a
December 2009. OECD, OECD Conference on Empowering E-Consumers, http://www.oecd.org/ict/econsumerconference (last visited Sep. 1, 2010).
81 In an interesting twist, some commentators used the presence of these dispute resolution mechanisms to argue against indirect liability for intermediaries. Why deputize intermediaries to stop illegal activities on the Internet when governments can reach the bad actors and resolve any disputes in the normal way? Responding to the argument that indirect liability is needed because the bad actor is unreachable by law enforcement or aggrieved parties, Holland says:
As an initial matter, it is not clear that a significant number of bad actors are beyond the reach of the law. Advances in technology are making it increasingly possible to locate and identify bad actors online, such that online anonymity is difficult to maintain. Likewise, where the bad actor is identified but is found outside the jurisdiction, sovereign governments have developed methods for resolving disputes to permit the direct extraterritorial application of domestic law, such as rules of jurisdiction, conflicts of laws, and recognition of judgments.
Holland, supra note 16, at 393.
82 Berman, supra note 73, at 1862.
83 Id.
84 Id. Berman’s work has affinities with that of political philosophers working in the area of national sovereignty in a global world. See, e.g., THOMAS W. POGGE, WORLD POVERTY AND HUMAN RIGHTS 168-95 (2002).
transaction is unacceptable in the payment system if it is illegal in the jurisdiction of either the buyer or the seller. 85
The payment card approach provides a simple default rule for intermediaries to apply when determining whether to allow transactions in their systems. It eliminates the heavily fact-based balancing assessments needed to determine, on a case-by-case basis, whose law applies. The default rule also does not simply adopt a country of origin or country of destination perspective, each of which is limited. Nor does it leave the transaction in a legal limbo where no law applies. 86
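The either-jurisdiction default rule can be sketched as a simple decision procedure. This is an illustrative sketch only, not any payment network’s actual implementation; the jurisdictions, transaction types, and the `PROHIBITED` lookup table are invented for the example.

```python
# Hypothetical sketch of the payment-system default rule described above:
# a transaction is unacceptable if it is illegal in the jurisdiction of
# EITHER the buyer or the seller. The prohibition table is illustrative.

# Map of (jurisdiction, transaction_type) pairs deemed illegal there.
PROHIBITED = {
    ("US", "internet_gambling"),   # UIGEA-style prohibition (illustrative)
    ("XX", "alcohol_sales"),       # placeholder jurisdiction
}

def transaction_allowed(buyer_jurisdiction: str,
                        seller_jurisdiction: str,
                        transaction_type: str) -> bool:
    """Return True only if the transaction is legal in BOTH jurisdictions."""
    return not any(
        (jurisdiction, transaction_type) in PROHIBITED
        for jurisdiction in (buyer_jurisdiction, seller_jurisdiction)
    )

# A U.S. cardholder attempting an Internet gambling purchase is blocked,
# even if the merchant sits in a jurisdiction where gambling is legal.
print(transaction_allowed("US", "GB", "internet_gambling"))  # False
print(transaction_allowed("FR", "GB", "book_purchase"))      # True
```

The point of the sketch is that the rule needs no choice-of-law balancing: both jurisdictions are checked symmetrically, so no court ever has to decide whose law “wins.”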
The payment system experience leads to several observations. First, direct conflicts of law are not as frequent as some anticipated. Technology and payment system practices effectively reduce these conflicts to the rare instance
85 Visa’s policy is stated in International Piracy: The Challenges of Protecting Intellectual Property in the 21st Century: Hearing Before the Subcomm. on Courts, the Internet, and Intellectual Property of the H. Comm. on the Judiciary, 110th Cong. 73-82 (2007) at 71 (statement of Mark MacCarthy, Senior Vice President for Global Public Policy, Visa Inc.). Other payment intermediaries have similar procedures, such as eBay’s restriction on selling and shipping illegal goods to the country where they are illegal. eBay, Offensive Material Policy, http://pages.ebay.com/help/policies/offensive.html (last visited Feb. 4, 2010) (“[B]ecause eBay is a worldwide community, many of our users live in countries where the possession or sale of items associated with hate organizations is a criminal offense. We can’t allow the sale or shipping of these items there.”).
86 The internal application of this rule involves system efficiency and the balance of interests among the stakeholders in the system. If the merchant is in violation of its own country’s law, then enforcement is conceptually easy. Merchants discovered in violation of local law either have to stop the transactions or be removed from the system. If the merchant is in violation of the law in a different jurisdiction, things are more complicated. Should the bank of the merchant or the bank of the customer be burdened with the enforcement responsibility? If the merchant has this responsibility, then he must not introduce the illegal transaction into the system, the merchant’s bank must not try to process it, and steps must be taken at the merchant’s end to stop the transaction. These steps could include: a system decision requiring the merchant to stop these transactions entirely; coding and programming modifications by the merchant, the merchant’s processor, or the system operator that would block transactions at the merchant end from entering the system if the customer was from a jurisdiction where the transaction would be illegal; or restricting the transaction to the merchant’s own jurisdiction. Alternatively, the enforcement measures could be put on the cardholder side. Merchants could introduce properly-coded transactions into the system and rely on action on the cardholder’s side to stop the transaction. This seems to fit the case of Internet gambling, where U.S. law makes Internet gambling illegal for U.S. citizens, and the payment networks responded to the Unlawful Internet Gambling Enforcement Act of 2006 (UIGEA) with a coding and blocking system that allowed merchants to continue their services in countries where Internet gambling was legal, as discussed earlier in this Article. Should merchants be responsible for knowing the laws of all the countries of all the customers they deal with? Perhaps not, but if 90% of their sales are from an offshore jurisdiction, they should be responsible for knowing that sales of their product are legal in that jurisdiction. Violations of the policy would largely be dealt with on a complaint basis.
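The cardholder-side enforcement option described in footnote 86 can also be sketched. This is a hypothetical illustration of the coding-and-blocking idea, not the networks’ actual UIGEA implementation; the category code constant and the prohibition table are assumptions made for the example.

```python
# Illustrative sketch of cardholder-side enforcement: the merchant attaches an
# accurate category code to each transaction, and the cardholder's side of the
# system declines coded transactions that are illegal in the cardholder's
# jurisdiction. Codes and the blocking table below are invented for illustration.

GAMBLING = "7995"  # gambling-style merchant category code (illustrative)

# Jurisdictions (hypothetical table) with categories that must be declined.
BLOCKED_CATEGORIES_BY_JURISDICTION = {
    "US": {GAMBLING},  # UIGEA-style prohibition for U.S. cardholders
}

def issuer_decision(cardholder_jurisdiction: str, category_code: str) -> str:
    """Decide on the cardholder side, using only the merchant's category code."""
    blocked = BLOCKED_CATEGORIES_BY_JURISDICTION.get(cardholder_jurisdiction, set())
    return "decline" if category_code in blocked else "approve"

print(issuer_decision("US", GAMBLING))  # decline: illegal for U.S. cardholders
print(issuer_decision("GB", GAMBLING))  # approve: merchant keeps serving GB
```

The design trade-off footnote 86 raises is visible here: the merchant’s only burden is accurate coding, while the jurisdiction-specific legal knowledge lives on the cardholder’s side of the system.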
where the law of one country demands what the law of another country forbids. Directly contradicting laws are more common in “political” areas, where governments are seeking information from intermediaries to enforce local laws against their own citizens. 87
Second, regulating the Internet by focusing on the local affiliates of global payment operations does not require the use of either the traditional or the new “cosmopolitan” conflict resolution methods. By relying on global payment intermediaries, local jurisdictions reach out to the local affiliates that are totally within their jurisdiction. They do not put burdens on entities in foreign jurisdictions at all. There is literally no conflict and thus nothing to which normal mechanisms of conflict resolution may attach. 88
Some commentators have correctly pointed out that when the laws of different jurisdictions apply to a single transaction, the ability of any particular jurisdiction to unilaterally regulate the Internet is limited. 89 But intermediaries can reduce these conflicts. Global payment systems can simplify transactions to events in which only a buyer in one jurisdiction and a seller in another are implicated. By concentrating enforcement on intermediaries instead of individuals or merchants, local jurisdictions can take advantage of the economies that these institutions make possible.
The experience of payment intermediaries reveals that, within limits, the differences among conflicting jurisdictions can be managed. The bordered
87 See, e.g., Press Release, Privacy Int’l, Europe’s Privacy Commissioners Rule Against SWIFT (Nov. 23, 2006), available at http://www.privacyinternational.org/article.shtml?cmd[347]=x-347-546365 (describing the SWIFT case, where SWIFT was required to comply with U.S. demands for access to financial information about European customers in virtue of its operations on U.S. soil, while such compliance put it in violation of the European data protection directive). In addition, passage of the Global Online Freedom Act (GOFA) could put Internet intermediaries in a conflict of law situation with China and other countries. See Global Online Freedom Act of 2007, H.R. 275, 110th Cong. (2007). H.R. 275 was introduced by Representative Chris Smith on January 5, 2007 and would require U.S. intermediaries to resist certain orders from countries in which they are doing business. Id.
88 Antigua brought a complaint against the U.S. for the enforcement of its gambling laws, but its success was based only on (1) the U.S.’s failure to exclude Internet gambling from the list of services that required open treatment and (2) the idiosyncrasies of U.S. gambling law, which appear to allow domestic horse racing to engage in Internet gambling while denying similar opportunities to offshore Internet gambling merchants. But these are technical obstacles created by the interaction of complex U.S. law and international WTO law and are not real conflict of law problems. See Appellate Body Report, United States—Measures Affecting the Cross-Border Supply of Gambling and Betting Services, 358-64, WT/DS285/AB/R (Apr. 7, 2005).
89 See, e.g., H. Brian Holland, The Failure of the Rule of Law in Cyberspace?: Reorienting the Normative Debate on Borders and Territorial Sovereignty, 24 J. MARSHALL J. COMPUTER & INFO. L. 1, 26 (2005).
Internet works on a small scale. The scale is currently small for two reasons: First, the number of cases of governments reaching across borders to inflict their laws on Internet merchants in other jurisdictions is still relatively small. Second, in contrast to the rhetoric about the Internet creating a global marketplace, the scope of cross-border commerce itself is still limited. The reality is that the volume of cross-border transactions is not large enough to create a truly substantial cross-border jurisdictional crisis. Currently, only four percent of the sales for electronic commerce merchants in the U.S. come from abroad. 90 And data from Europe show that cross-border online transactions are not increasing as fast as overall e-commerce transactions, staying relatively stable from 2006 to 2008 at six to seven percent. 91
As David Post has warned, the problem the Internet creates for local jurisdictions is one of scale. 92 The bordered Internet simply does not scale up. Global payment systems cannot accommodate an enforcement burden in which each jurisdiction uses payment system mechanisms to enforce each of its local laws on the Internet.
It is not hard to see how we can get into a kind of tragedy of the commons in this area. Each individual extension of local jurisdiction into cyberspace seems small and costless, but collectively the burden becomes unbearable. Governments might feel free to exploit this enforcement mechanism, in the same way that grazers use the commons, under the impression that it is an unlimited resource. However, one of two outcomes will occur as the cross-border rules pile up: Either cross-border transactions will remain small and the potential for the Internet to be a global channel of commerce will not be realized, or the political costs of each government attempting to regulate the e-commerce activities of other countries will mount. Either development reveals the limitations of the bordered Internet as a long-term framework for Internet governance.
90 This is based on transaction data from the Visa system. See International Piracy Hearing, supra note 85, at 75 (statement of Mark MacCarthy, Senior Vice President for Global Public Policy, Visa Inc.).
91 Comm’n of the European Cmtys., Commission Staff Working Document: Report on Cross-Border E-commerce in the EU 3, SEC (2009) 283 final (Mar. 5, 2009), available at http://ec.europa.eu/consumers/strategy/docs/com_staff_wp2009_en.pdf (“From 2006 to 2008, the share of all EU consumers that have bought at least one item over the Internet increased from 27% to 33% while cross-border e-commerce remained stable (6% to 7%).”).
92 See Post, Against “Against Cyberanarchy”, supra note 15, at 1377 (stating that “scale matters”); see also Holland, supra note 89, at 29. Holland states:
The online actor cannot know, as a practical matter, the many laws applicable to a particular act, nor when one or more sovereigns may decide to attempt regulatory action. This is particularly true in those areas of regulation in which morality, religion and culture are at their most influential, such as speech, race, sex, and even intellectual property. Moreover, it is not simply one actor or a few legal systems. It is an exponential multitude.
Id.
Goldsmith and Wu suggest that enforcement of Internet regulations through intermediaries is necessarily limited in size. 93 They suggest that maybe the system will not be able to scale up, but it won’t have to. 94 Small countries such as Antigua cannot enforce Internet rules because global intermediaries can simply pull up stakes and leave if the rules are too strict. 95 However, there are a sufficiently large number of countries that global intermediaries will not feel capable of abandoning. If all of them use the intermediary enforcement mechanism, the system will be overwhelmed.
Internationalism
The fundamentally correct insight of the Internet exceptionalists is that the unilateral imposition of one nation’s law onto all Internet activities that cross borders won’t scale. 96
Internationalism might be the way out. It is the idea that the Internet will eventually be governed, at least for some services, by global institutions and arrangements, and that this is the right public policy for local governments to follow in their dealings with illegal cross-border Internet transactions. 97 This policy could be implemented through a uniform global standard, or any of a variety of techniques, such as World Trade Organization rules, that bring local laws into harmony. The basic justification for this policy is similar to the justification for establishing a single uniform national policy that prevents the clash of inconsistent rules at the state level: When activities have widespread and significant effects on those outside the local jurisdiction, then uniform principles or some other coordinating mechanism should be adopted at the higher level. 98 This universalism could promise better laws, whereby the
93 GOLDSMITH & WU, supra note 7, at 81-82.
94 Id. at 81.
95 See id. at 160 (suggesting that acting as the Internet police is just a normal cost of doing business for global companies, which they can avoid in a particular case by leaving a country that tried to impose costs exceeding the benefits of continued presence; this creates another objection to the bordered Internet, since it effectively gives larger countries a greater role in Internet governance than smaller ones).
96 See Johnson & Post, supra note 1, at 1390 (“One nation’s legal institutions should not monopolize rule-making for the entire Net.”).
97 GOLDSMITH & WU, supra note 7, at 26.
98 Id. (“If the nations of the world agree to a single global law for questions like libel, pornography, copyright, consumer protection, and the like, the lives of Internet users
“[i]nternational standards could reflect a kind of collection of best practices from around the world — the opposite of the tyranny of the unreasonable.” 99
Goldsmith and Wu make several criticisms of internationalism. First, a system of universal laws would be unattractive; it would leave the world divided and discontent because the universal law would be unpopular in large segments of the world population. Second, the system of local national laws would better reflect differences among people. Diversity is a good thing and cannot be taken into account by a universal code that overrides local differences. Third, it is not needed. The conflicts of laws, extraterritoriality, and other considerations are perfectly manageable within the current international framework. For example, since most Internet users do not have assets in other countries, they are effectively subject only to the laws of the country where they live. Only large multinational companies with assets all over the world face the multijurisdictional problem, and they already have to live with that because they are already global. Compliance with a plurality of international laws is simply a cost of doing business for global companies. There’s nothing new here that would justify a move to a more harmonized global order. There are extra costs to be sure, but nothing so onerous or burdensome that it would require a move to global law. 100
The responses to these criticisms are straightforward. An unpopular global law is not the goal. Neither is suppression of diversity the goal. The idea is to integrate local laws in some fashion when the regular conflicts among them prove to be intolerable. When diversity does not create this difficulty, there is no need for integration. If, for example, local governments value diversity enough to refrain from using intermediaries to enforce local laws against actors in other jurisdictions, then there is no need for harmonization of these enforcement efforts. But to the extent that governments want to take global enforcement steps, they also need to take steps to integrate the laws they want
become much simpler: no conflicting laws, no worries about complying with 175 different legal systems, no race to the bottom.”).
99 Id. at 27. Reidenberg also argues that as jurisdictions increasingly conflict there will need to be an overarching harmonization of international rules:
[O]nline enforcement with electronic blockades and electronic sanctions will cause serious international political conflicts. These conflicts arise because of the impact on territorial integrity. Such conflicts are likely to force negotiations toward international agreements that establish the legal criteria for a state to use technological enforcement mechanisms. This progression leads appropriately to political decisions that will define international legal rules.
Joel R. Reidenberg, States and Internet Enforcement, 1 U. OTTAWA L. & TECH. J. 213, 230 (2003-2004).
to enforce. The reason for this is that global intermediaries’ costs to mediate the conflicts associated with unilateral attempts at local regulation of the Internet will be so onerous and burdensome that they will cause an unwarranted and unnecessary decline in global interaction. 101
Berman also describes how the internationalist hope for global standards avoids the conflict of law problem: “if we constructed one universal ‘world community’ with one set of governing rules, there would never need to be a ‘choice of law’ in the sense that conflict-of-laws scholars use the term.” 102 However, he is critical of this universal world community for two reasons. First, he worries about its potential to dissolve community affiliations that provide important emotional connections and opportunities for normative discussion of those connections. Second, he views this universal community as fundamentally unrealistic given the dominance of current notions of nation-state sovereignty. 103
These objections can be met at the level of generality at which they are cast. We do not need to think of ourselves as primarily world citizens in order to endorse specific global approaches. We can still have deep attachments to local communities and can still debate the relative importance of the overlapping communities we participate in. The global approach endorses the view that self-government “requires a politics that plays itself out in a multiplicity of settings, from neighborhoods to nations to the world as a whole” and “citizens who can abide the ambiguity associated with divided sovereignty, who can think and act as multiply situated selves.” 104 Participation in the global community, and the wisdom to know when the global perspective should take precedence over more local concerns, is essential to this vision of self-government in a global world.
The internationalist proposal is to provide global coordination only when necessary. It is to move to global standards when, as a practical matter, the burdens of allowing diverse local rules are too high. The model of national uniform standards is appropriate: not everything has to be done at the national level, but some things should be done there in order to have an efficient and fair national system. Similarly, there is no need to move from the current system to
101 Interestingly, the earlier Jack Goldsmith seemed more inclined to accept these practical considerations as a rationale for international harmonization: “When in particular contexts the arbitrariness and spillovers become too severe, a uniform international solution remains possible.” Goldsmith, supra note 4, at 1235.
102 Berman, supra note 73, at 1860.
103 Id. at 1860-61.
104 MICHAEL J. SANDEL, PUBLIC PHILOSOPHY: ESSAYS ON MORALITY IN POLITICS 34 (2005).
a world government. But if there are practical ways to improve Internet governance through global harmonization, they should be taken.
If governments are going to use payment intermediaries as enforcers of local law, there are a number of steps that could be taken to coordinate their efforts, including:
• In the Internet gambling context, a move to an internationally-interoperable licensing system that would require each jurisdiction that allows Internet gambling to defer to the licensing decisions of other jurisdictions.
• In the copyright context, the continued evolution of uniform copyright rules.
International agreements are one mechanism to create coordinated action. Although controversial because of the secrecy involved in its development and the sense that affected parties were excluded from participation, the Anti-Counterfeiting Trade Agreement (ACTA) is a reasonable, though flawed, model for action in this area. 105 There are many mechanisms for international coordination. Decisions regarding which mechanisms to use depend on the issue and the fora available for resolution.
Internationalism has its dangers. Why should each jurisdiction have the same regulations on hate speech and the same regulations on alcohol consumption? The answer is that there will be no harmonization where there are such fundamental differences. Intermediaries will be called upon to resolve the issue themselves, or they will be caught between warring governments and forced to choose sides. But efforts should be made to minimize such differences when these differences have global consequences, especially when they are superficial differences that reflect no fundamental divisions. For the same reason that we want uniform global technical standards for information and communications technologies, if possible, we want similar legal frameworks if governments are going to enforce laws on the Internet.
These efforts to ease the friction involved in extending government authority to the Internet through a global framework are in line with other efforts to create global frameworks that promote the growth of the Internet. For example, the thirty-first International Conference of Data Protection and Privacy Commissioners, held in Madrid in November 2009, adopted a set of global
105 See Media Statement, Participants in ACTA Negotiations, Anti-Counterfeiting Trade Agreement (ACTA), June 12, 2009, available at http://www.med.govt.nz/templates/Page____40974.aspx. For a summary of the ACTA process and the content of the agreement, see THE ANTI-COUNTERFEITING TRADE AGREEMENT – SUMMARY OF KEY ELEMENTS UNDER DISCUSSION (2009), available at http://www.med.govt.nz/templates/MultipageDocumentTOC____40563.aspx.
privacy standards. 106 There is also likely to be a renewed push for global<br />
consumer protection on the occ<strong>as</strong>ion of the tenth anniversary of the<br />
Organisation for Economic Co-operation and Development’s Guidelines for<br />
Consumer Protection in the Context of Electronic Commerce. 107<br />
Both these efforts relate to the growth of the Internet <strong>as</strong> a vibrant international<br />
marketplace. They do this by building online trust. Global information security<br />
standards reassure people that their information is safe no matter what the<br />
physical location of the websites they visit. Establishing global privacy<br />
standards means that the collection and use of online information will be<br />
governed by common principles regardless of a website’s jurisdiction and will<br />
make it easier for global business to transfer information from one jurisdiction<br />
to another in a seamless manner. Finally, effective global consumer protection<br />
rules will mean that people will have the information and redress rights they<br />
need to shop confidently online no matter where the website is located.<br />
Conclusion<br />
The initial demand from Internet exceptionalists that the online world be left<br />
alone by governments has morphed into the idea that governments should<br />
create a global framework to protect and spur the growth of the Internet. The<br />
intervening steps in this development are not hard to trace: Internet<br />
exceptionalists confused their ideal of self-governing Internet communities with<br />
the idea that the Internet was ungovernable because it was a global<br />
communications network that crossed borders. This idea of an intrinsically<br />
ungovernable Internet was undermined by the recognition that the coding that<br />
underlies Internet applications and services is a matter of choice, not<br />
106 Artemi R. Lombarte, Dir., Agencia Española de Protección de Datos, Slide Presentation:<br />
International Standards on Data Protection & Privacy (2009), available at<br />
https://www.agpd.es/portalweb/canaldocumentacion/comparecencias/common/I<br />
APP_Privacy_Summit_09.pdf. He describes one of the main criteria of the global privacy<br />
standards project as “To elaborate a set of principles and rights aimed to achieve the<br />
maximum degree of international acceptance, ensuring at once a high level of protection.” Id.<br />
(emphasis in original). For the standards adopted, see THE MADRID PRIVACY DECLARATION<br />
(Nov. 3, 2009), http://thepublicvoice.org/TheMadridPrivacyDeclaration.pdf.<br />
107 OECD GUIDELINES, supra note 80; see also Org. for Econ. Co-operation & Dev., Conference<br />
on Empowering E-Consumers: Strengthening Consumer Protection in the Internet<br />
Economy, Programme (2009), available at<br />
http://www.oecd.org/dataoecd/33/22/44045376.pdf (describing the conference). The<br />
OECD endorsed steps toward global enforcement of some consumer protection rules in a<br />
2003 report on cross-border fraud and a 2007 report on consumer dispute resolution and<br />
redress. See Comm. on Consumer Policy, Org. for Econ. Co-operation & Dev., OECD<br />
Guidelines for Protecting Consumers from Fraudulent and Deceptive Commercial Practices<br />
Across Borders (2003), available at http://www.oecd.org/dataoecd/24/33/2956464.pdf;<br />
Comm. on Consumer Policy, Org. for Econ. Co-operation & Dev., OECD Recommendation<br />
on Consumer Dispute Resolution and Redress (2007), available at<br />
http://www.oecd.org/dataoecd/43/50/38960101.pdf.
236 CHAPTER 3: IS INTERNET EXCEPTIONALISM DEAD?<br />
unchangeable nature. If something about this system created difficulties for<br />
government control, this could be changed. Further, the idea that governments<br />
cannot control the Internet was undermined by the need for the local<br />
operations of global intermediaries to provide essential Internet services and the<br />
practical ability of governments to control these intermediaries.<br />
Internet intermediaries can control the content of the activities on their online<br />
communities, and government can compel or pressure intermediaries to take<br />
these steps. Intermediaries have a general obligation to follow the law, and<br />
except in extreme cases, they have no right to resist these lawfully established<br />
burdens. The establishment of these laws needs to follow all the rules of good<br />
policymaking, including imposing an obligation only when the social benefits<br />
exceed the social costs. However, a bordered Internet in which each country<br />
attempts to use global intermediaries to enforce its local laws will not scale.<br />
This is the fundamentally correct insight of the Internet exceptionalists. If<br />
governments are going to use intermediaries to enforce local laws, they are<br />
going to have to harmonize the local laws they want intermediaries to enforce.
CHAPTER 4<br />
HAS THE INTERNET FUNDAMENTALLY<br />
CHANGED ECONOMICS?<br />
Computer-Mediated Transactions 239<br />
Hal R. Varian<br />
Decentralization, Freedom to Operate<br />
& Human Sociality 257<br />
Yochai Benkler<br />
The Economics of Information:<br />
From Dismal Science to Strange Tales 273<br />
Larry Downes<br />
The Regulation of Reputational Information 293<br />
Eric Goldman
Computer-Mediated Transactions<br />
By Hal R. Varian *<br />
Every now and then a set of technologies becomes available that sets off a<br />
period of “combinatorial innovation.” Think of standardized mechanical parts<br />
in the 1800s, the gasoline engine in the early 1900s, electronics in the 1920s,<br />
integrated circuits in the 1970s, and the Internet in the last few decades.<br />
The component parts of these technologies can be combined and recombined<br />
by innovators to create new devices and applications. Since these innovators<br />
are working in parallel with similar components, it is common to see<br />
simultaneous invention. There are many well-known examples such as the<br />
electric light, the airplane, the automobile, and the telephone. Many scholars<br />
have described such periods of innovation using terms such as “recombinant<br />
growth,” “general purpose technologies,” “cumulative synthesis” and “clusters<br />
of innovation.” 1<br />
The Internet and the Web are wonderful examples of combinatorial innovation.<br />
In the last 15 years we have seen a huge proliferation of Web applications, all<br />
built from a basic set of component technologies.<br />
The Internet itself was a rather unlikely innovation; I like to describe it as a “lab<br />
experiment that got loose.” Since the Internet arose from the research<br />
community rather than the private sector, it had no obvious business model.<br />
Other public computer networks, such as AOL, CompuServe, and Minitel,<br />
generally used subscription models, but were centrally controlled and offered<br />
little scope for innovation at the user level. The Internet won out over these<br />
alternatives, precisely because it offered a flexible set of component<br />
technologies that encouraged combinatorial innovation.<br />
The earlier waves of combinatorial innovation required decades, or more, to<br />
play out. For example, David Hounshell argues that the utopian vision of<br />
* Univ. of Cal., Berkeley and Google. hal@ischool.berkeley.edu.<br />
1 See, e.g., Martin Weitzman, Recombinant Growth, 113 Q. J. OF ECON. 331-360 (1998); Timothy<br />
Bresnahan & M. Trajtenberg, General Purpose Technologies: Engines of Growth?, 65 J. OF<br />
ECONOMETRICS 83-108 (1995), available at<br />
http://ideas.repec.org/a/eee/econom/v65y1995i1p83-108.html; Timothy Bresnahan,<br />
General Purpose Technologies, in HANDBOOK OF THE ECONOMICS OF INNOVATION (Bronwyn<br />
Hall & Nathan Rosenberg, eds., 2010); Nathan Rosenberg, Technological Change in the Machine<br />
Tool Industry, in PERSPECTIVES IN TECHNOLOGY 9-31 (1976); ABBOTT PAYSON USHER, A<br />
HISTORY OF MECHANICAL INVENTION (revised ed., Dover Publ’ns 1998); and Joseph A.<br />
Schumpeter, The Analysis of Economic Change, in ESSAYS ON ENTREPRENEURS, INNOVATIONS,<br />
BUSINESS CYCLES AND THE EVOLUTION OF CAPITALISM 134-149 (Richard V. Clemence, ed.,<br />
2000) (originally published in Review of Economic Statistics, May 1935).
interchangeable parts took more than a century to be realized. 2 The Web was<br />
invented in the early 1990s, but it did not become widely used until the mid-<br />
1990s. Since then, we have seen a huge number of novel applications—from<br />
Web browsers, to search engines, to social networks—to mention a few<br />
examples. As with the Internet, the Web initially had no real business model,<br />
but offered a fertile ground for combinatorial innovation.<br />
Innovation was so rapid on the Internet because the component parts were all<br />
bits. They were programming languages, protocols, standards, software<br />
libraries, productivity tools and the like. There was no time to manufacture, no<br />
inventory management, and no shipping delay. You never run out of HTML,<br />
just like you never run out of email. New tools could be sent around the world<br />
in seconds and innovators could combine and recombine these bits to create<br />
new Web applications.<br />
This parallel invention has led to a burst of global innovation in Web<br />
applications. While the Internet was an American innovation, the Web was<br />
invented by an Englishman living in Switzerland. Linux, the most used<br />
operating system on the Web, came from Finland, 3 as did MySQL, a widely<br />
used database for Web applications. 4 Skype, which uses the Internet for voice<br />
communication, came from Estonia. 5<br />
Of course, there were many other technologies with worldwide innovation,<br />
such as automobiles, airplanes, photography, and incandescent lighting.<br />
However, applications for the Internet, which is inherently a communications<br />
technology, could be developed everywhere in the world in parallel, leading to<br />
the rapid innovation we have observed.<br />
Computer-Mediated Transactions<br />
My interest in this essay is in the economic aspects of these technological<br />
developments. I start with a point so mundane and obvious, it barely seems<br />
worth mentioning: Nowadays, most economic transactions involve a computer.<br />
2 David A. Hounshell, From the American System to Mass Production, 1800-1932: The<br />
Development of Manufacturing Technology in the United States (1985).<br />
3 See Linus Torvalds & David Diamond, Just for Fun: The Story of an Accidental Revolution<br />
(2002).<br />
4 See Oracle Corporation, From Visions to Reality: An Interview with David Axmark, Co-Founder of<br />
MySQL AB, July 2007, http://dev.mysql.com/tech-resources/interviews/davidaxmark.html.<br />
5 See Andreas Thomann, Skype: A Baltic Success Story, CREDIT SUISSE GROUP, June 9, 2006,<br />
http://emagazine.creditsuisse.com/app/article/index.cfm?fuseaction=OpenArticle&aoid=163167&coid=78<br />
05&lang=EN.
Sometimes this computer takes the form of a smart cash register, part of a<br />
sophisticated point of sale system, or a website. In each of these cases, the<br />
computer creates a record of the transaction.<br />
This record-keeping role was the original motivation for having the computer as<br />
part of the transaction. Creating a record of transactions is the first step in<br />
building an accounting system, thereby enabling a firm to understand its<br />
financial status.<br />
Now that computers are in place, they can, however, be used for many other<br />
purposes. In this essay, I explore some of the ways that computer mediation<br />
can affect economic transactions. These computer mediated transactions, I<br />
argue, have enabled significant improvements in the way transactions are carried<br />
out and will continue to impact the economy in the foreseeable future.<br />
I classify the impact of computer-mediated transactions into four main<br />
categories according to the innovation they facilitate:<br />
• New forms of contract;<br />
• Data extraction and analysis;<br />
• Controlled experimentation;<br />
• Personalization and customization.<br />
Enabling New Forms of Contract<br />
Contracts are fundamental to commerce. The simplest commercial contract<br />
says, “I will do X if you do Y,” as in “I will give you $1 if you give me a cup of<br />
coffee.” Of course, this requires that the actions to be taken are verifiable. Just<br />
asking for coffee does not mean that I will get it. As Abraham Lincoln<br />
supposedly remarked, “If this is coffee, please bring me some tea; but if this is<br />
tea, please bring me some coffee.” 6<br />
A computer used in a transaction can observe and verify many <strong>as</strong>pects of that<br />
transaction. The record produced by the computer allows the contracting<br />
parties to condition the contract on terms that were previously unobservable,<br />
thereby allowing for more efficient transactions.<br />
I am not claiming that increased observation will necessarily lead to more<br />
efficient contracts. There are counterexamples to the assertion that “more<br />
6 Susan L. Rattiner, Food and Drink: A Book of Quotations (2002).
information is better” such as the Hirshleifer example. 7 I am merely claiming<br />
that additional information allows for more efficient contracts.<br />
Of course, the study of contracts is a highly developed field in economics. As<br />
such, it is hardly novel to suggest that contractual form depends on what is<br />
observable. What is interesting, however, is the way that progress in<br />
information technology enables new contractual forms.<br />
Consider, for example, a rental-car agency that buys insurance based on<br />
accident rates, where accident rates, in turn, depend on the speed of its vehicles.<br />
All renters would prefer to drive within the speed limit if they are compensated<br />
with a lower rental fee. However, if there is no way to monitor the speed of a<br />
rental car, such a contractual provision is unenforceable. Putting a computer<br />
transmitter in the trunk of the car that records the vehicle’s speed makes the<br />
contract enforceable and potentially makes everyone better off. 8<br />
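The speed-conditioned clause above can be sketched in a few lines; the fee, speed limit, and discount figures below are invented purely for illustration.

```python
# Sketch of a contract term that becomes enforceable once speed is observable.
# All names and numbers here are illustrative, not from the essay.

def rental_fee(base_fee: float, speed_log: list[float],
               speed_limit: float = 65.0, discount: float = 0.15) -> float:
    """Charge a discounted fee if the recorded speeds never exceed the limit.

    Without a recorder, speed_log is unobservable and the clause cannot be
    enforced; with one, the discount can be offered credibly.
    """
    compliant = all(s <= speed_limit for s in speed_log)
    return base_fee * (1 - discount) if compliant else base_fee

# A compliant renter pays the discounted fee; a speeding renter pays the base fee.
print(rental_fee(100.0, [55, 62, 64]))
print(rental_fee(100.0, [55, 72, 64]))
```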
The transportation sector has capitalized on the availability of computerized<br />
transmitters to create more efficient contracts in many areas.<br />
• Car dealers are selling cars with “starter interrupt” devices that inhibit<br />
operations if car payments are missed. 9<br />
• Similar interrupt devices attached to breath analyzers are mandated for<br />
drunk driving offenders in many states.<br />
• Parents can buy a device known as “MyKey” which allows them to<br />
limit auto speed, cap the volume on the radio, require seat belt use and<br />
encourage other safe-driving habits for teenage drivers. 10<br />
• In the relevant economics literature, Hubbard and Baker examine a<br />
variety of ways that vehicular monitoring systems have impacted the<br />
trucking industry. 11<br />
7 Jack Hirshleifer, The Private and Social Value of Information and the Reward to Inventive Activity, 61<br />
THE AM. ECON. REV. 561-74 (Sept. 1971), available at<br />
http://faculty.fuqua.duke.edu/~qc2/BA532/1971%20AER%20Hirshleifer.pdf.<br />
8 This is a particularly simple case. If drivers have heterogeneous preferences, those who<br />
prefer to speed may be made worse off by the availability of such a device.<br />
9 Associated Press, For Some High-risk Auto Buyers, Repo Man is a High-tech Gadget, L.A. TIMES,<br />
June 13, 2006, http://articles.latimes.com/2006/jun/13/business/fi-late13.<br />
10 Nick Bunkley & Bill Vlasic, Ensuring Junior Goes for a Mild Ride, N.Y. TIMES, Oct. 6, 2008,<br />
http://www.nytimes.com/2008/10/07/automobiles/07auto.html.<br />
11 Thomas N. Hubbard, The Demand of Monitoring Technologies: The Case for Trucking, 115 Q. J. OF<br />
ECON. 533-560 (2000),<br />
http://www.mitpressjournals.org/doi/abs/10.1162/003355300554845; George Baker &<br />
Thomas N. Hubbard, Contractibility and Asset Ownership: On-board Computers and Governance in<br />
US Trucking, 119 Q. J. OF ECON. 1443-1479 (2004),<br />
http://www.mitpressjournals.org/doi/abs/10.1162/0033553042476152.
There are many other examples of computer-mediated contracts. The work of<br />
Dana & Spier and of Mortimer describes the efficiency gains resulting from<br />
revenue sharing in the video tape rental industry. 12<br />
Video tapes were originally purchased by retail stores from distributors for<br />
about $65 per tape. Since the videos were so expensive, stores only bought a<br />
few. As a result, the popular videos quickly disappeared from the shelves,<br />
making everyone unhappy.<br />
In 1998, retailers and distributors adopted a new business model: a revenue<br />
sharing arrangement in which stores paid a small upfront fee of $3 to $8, but<br />
split the revenue when the video was rented, with 40% to 60% going to the<br />
retailer. Stores no longer had an incentive to economize on purchases, and all<br />
parties to the transaction—retailers, distributors, and customers—were made<br />
better off.<br />
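The arithmetic behind the shift is easy to sketch. The tape counts, rental price, and revenue split below are invented for the sake of the comparison, not figures from the studies cited.

```python
# Illustrative comparison of the two video-rental business models.
# All prices and demand figures are made up for the sake of the arithmetic.

RENTAL_PRICE = 3.0        # what the customer pays per rental

def retailer_profit_wholesale(tapes: int, rentals_per_tape: int,
                              wholesale_cost: float = 65.0) -> float:
    """Old model: retailer buys tapes outright and keeps all rental revenue."""
    return tapes * rentals_per_tape * RENTAL_PRICE - tapes * wholesale_cost

def retailer_profit_shared(tapes: int, rentals_per_tape: int,
                           upfront: float = 5.0,
                           retailer_share: float = 0.5) -> float:
    """Revenue sharing: small upfront fee, rental revenue split with the distributor."""
    return tapes * rentals_per_tape * RENTAL_PRICE * retailer_share - tapes * upfront

# At $65 a tape, a retailer needs roughly 22 rentals per tape just to break
# even, so it stocks few copies; under revenue sharing it can stock many more.
print(retailer_profit_wholesale(tapes=5, rentals_per_tape=25))
print(retailer_profit_shared(tapes=20, rentals_per_tape=25))
```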
Sharing revenue at point of sale requires that both parties be able to monitor the<br />
transaction. The technological innovations of bar code scanning, the<br />
computerized cash register, and computer networks enabled revenue-sharing<br />
arrangements.<br />
Of course, when a transaction takes place online, revenue-sharing is much<br />
easier. Online advertising is a case in point: revenue from an advertiser<br />
for an ad impression or click may be split among publishers, ad exchanges, ad<br />
networks, affiliates and other parties b<strong>as</strong>ed on contractual arrangements.<br />
Although I have so far discussed only the benefits of computers offering more<br />
information to contracting parties, there are also cases in which<br />
computers can be used to improve contractual performance by hiding<br />
information using cryptographic methods. A picturesque example is the<br />
“cocaine auction protocol,” an auction mechanism designed to<br />
hide as much information as possible. 13<br />
Finally, “algorithmic game theory” is an exciting hybrid of computer science<br />
and economic theory that deserves mention. This subject brings computational<br />
considerations to game theory (how a particular solution can be computed) and<br />
12 James D. Dana & Kathryn E. Spier, Revenue Sharing and Vertical Control in the Video Rental<br />
Industry, XLIX Q. J. OF ECON. 223-245 (2001), available at<br />
http://www3.interscience.wiley.com/journal/118972449/abstract; Julie H. Mortimer,<br />
Vertical Contracts in the Video Rental Industry, 75 REV. OF ECON. STUDIES 165-199 (2008),<br />
http://www3.interscience.wiley.com/journal/119395822/abstract.<br />
13 Frank Stajano & Ross Anderson, The Cocaine Auction Protocol: On the Power of Anonymous<br />
Broadcast, in PROCEEDINGS OF INFORMATION HIDING WORKSHOP, LECTURE NOTES IN<br />
COMPUTER SCIENCE, 1999, http://www.cl.cam.ac.uk/~rja14/Papers/cocaine.pdf.
strategic considerations to algorithm design (whether a particular algorithm is<br />
actually incentive-compatible). 14<br />
Some History of Monitoring Technologies<br />
Though I have emphasized computer-mediated transactions, a computer can be<br />
defined quite broadly. The earliest example of an accounting technology I<br />
know of that enabled new forms of contract involves Mediterranean shipping<br />
circa 3300 B.C.<br />
The challenge was how to write a “bill of lading” for long distance trade in<br />
societies that were pre-literate and pre-numerate. The brilliant solution was to<br />
introduce small clay tokens, known as “bullae,” which were small<br />
representations of the material being transported. As each barrel of olive oil<br />
was loaded onto a ship, a barrel-shaped token was placed in a clay envelope.<br />
After the loading was completed, the envelope was baked in a kiln and given to<br />
the ship’s captain. At the other end of the voyage, the envelope was then<br />
broken open and the tokens were compared to the barrels of oil on the ship as<br />
they were unloaded. If the numbers matched, the contract was verified. Later,<br />
marks were scratched on the outside of the envelope to indicate the number of<br />
tokens inside. Some authors believe that this innovation led to the invention of<br />
writing between 3400 and 3300 B.C. 15<br />
A somewhat more recent example is the invention of the cash register in 1883<br />
by James Ritty. 16 Ritty, a saloon owner, discovered that his employees were<br />
stealing money. In response, he developed a device to record each transaction<br />
on paper tape, an invention that he patented under the name of “the<br />
incorruptible cashier.” 17 Ritty’s machine became the basis of the National Cash<br />
Register (NCR) Company founded in 1884. The NCR device added a cash<br />
drawer and a bell that sounded “ka-ching” whenever the drawer was opened, to<br />
alert the owner of the transaction, thereby discouraging pilfering. This<br />
improved monitoring technology made retailers willing to hire employees<br />
14 See NOAM NISAN, TIM ROUGHGARDEN, EVA TARDOS, AND VIJAY V. VAZIRANI, EDS.,<br />
ALGORITHMIC GAME THEORY (2007) for a comprehensive collection of articles and Hal R.<br />
Varian, Economic Mechanism Design for Computerized Agents, in USENIX WORKSHOP ON<br />
ELECTRONIC COMMERCE 13-21 (1995),<br />
http://www.sims.berkeley.edu/~hal/Papers/mechanism-design.pdf for an early<br />
contribution to this theory.<br />
15 Jean-Jacques Glassner, Zainab Bahrani, & Marc Van De Mieroop, The Invention of Cuneiform:<br />
Writing in Sumer (2005).<br />
16 MIT School of Engineering, Inventor of the Week Archive: James Ritty, Cash Register, April 2002,<br />
http://web.mit.edu/invent/iow/ritty.html.<br />
17 Cash Register and Indicator, U.S. Patent 271,363 (filed Feb 15, 1882), available at<br />
http://www.google.com/patents?hl=en&lr=&vid=USPAT271363.
outside their immediate families, leading to larger and more efficient<br />
establishments. 18<br />
Enabling Online Advertising<br />
Online advertising serves as a poster child for algorithmic mechanism design. A<br />
Pasadena company called GoTo began ranking search results using an auction. 19<br />
Users did not like this particular form of search, so GoTo switched to using an<br />
auction to rank advertisements. In the original auction, ads were ranked by “bid<br />
per click” and advertisers paid the amount they bid. After consultation with<br />
auction theorists, GoTo moved to a second-price auction: An advertiser paid a<br />
price per click determined by the bid of the advertiser in the next lower<br />
position. 20<br />
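This pricing rule can be sketched in a few lines of code. The advertiser names and bids below are invented, and real systems incorporate quality adjustments beyond this bare-bones version.

```python
# Minimal sketch of the per-click pricing rule described above: advertisers are
# ranked by bid, and each winner pays the bid of the advertiser one position
# below. Names and bids are invented for illustration.

def second_price_per_click(bids: dict[str, float],
                           slots: int) -> list[tuple[str, float]]:
    """Rank advertisers by bid per click; each pays the next-lower bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for i, (advertiser, _bid) in enumerate(ranked[:slots]):
        # price = bid of the advertiser one position below (0 if none)
        price = ranked[i + 1][1] if i + 1 < len(ranked) else 0.0
        results.append((advertiser, price))
    return results

# alpha wins slot 1 and pays beta's bid; beta wins slot 2 and pays gamma's bid
print(second_price_per_click({"alpha": 2.00, "beta": 1.50, "gamma": 0.75},
                             slots=2))
```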
There is a fundamental divergence of incentives in advertising. The publisher<br />
(i.e., the content provider) has space on its Web page for an ad and wants to sell<br />
ad impressions to the highest bidders. The advertiser does not care directly<br />
about ad impressions, but does care about visitors to its website, and ultimately,<br />
the sale of its products. Hence, the publisher wants to sell impressions, but the<br />
advertiser wants to buy clicks.<br />
This is similar to an international trade transaction where the buyer wants to pay<br />
in euros and the seller wants to receive dollars. The solution in both cases is the<br />
same: an exchange rate. In the context of online advertising, the exchange rate<br />
is the predicted click-through rate, an estimate of how many clicks a particular<br />
ad impression will receive. This allows one to convert the advertiser’s offered<br />
bid per click to an equivalent bid per impression. The publisher can thus sell<br />
each impression to the highest bidder.<br />
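The exchange-rate idea can be sketched as follows; the bids and predicted click-through rates are invented, and production systems weigh many further quality signals.

```python
# Sketch of the "exchange rate" idea: a predicted click-through rate converts
# a bid per click into an equivalent bid per impression, so the publisher can
# sell each impression to the highest bidder. Figures are invented.

def bid_per_impression(bid_per_click: float, predicted_ctr: float) -> float:
    """Expected revenue per impression = bid per click x predicted CTR."""
    return bid_per_click * predicted_ctr

ads = {
    "ad_a": {"bid_per_click": 2.00, "predicted_ctr": 0.01},
    "ad_b": {"bid_per_click": 0.50, "predicted_ctr": 0.08},
}
winner = max(ads, key=lambda a: bid_per_impression(**ads[a]))
# a low per-click bid with a high CTR can beat a high bid with a low CTR
print(winner)
```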
This mechanism aligns the interests of the buyer and the seller, but creates<br />
unintended consequences. If the advertiser only pays for clicks, he has no<br />
direct incentive to economize on impressions. Excessive impressions, however,<br />
18 See JoAnne Yates, Business Use of Information and Technology from 1880-1950, in A NATION<br />
TRANSFORMED BY INFORMATION: HOW INFORMATION HAS SHAPED THE UNITED STATES<br />
FROM COLONIAL TIMES TO THE PRESENT 107-135 (Alfred D. Chandler and James Cortada,<br />
eds., 2000) (detailing the role of office machinery in the development of commercial<br />
enterprises).<br />
19 GoTo.com Posts Strong Relevancy Ranking in NPD Survey of Search Engines, BUSINESS WIRE, April<br />
11, 2000, available at http://www.highbeam.com/doc/1G1-61423181.html.<br />
20 For accounts of the development of these auctions, see John Battelle, THE SEARCH: HOW<br />
GOOGLE AND ITS RIVALS REWROTE THE RULES OF BUSINESS AND TRANSFORMED OUR<br />
CULTURE (2005); Steve Levy, Secret of Googlenomics: Data-fueled Recipe Brews Profitability, WIRED,<br />
2009, http://www.wired.com/culture/culturereviews/magazine/17-<br />
06/nep_googlenomics?currentPage=all.
impose an attention cost on users, so further attention to ad quality is important<br />
to ensure that ad impressions remain relevant to users.<br />
Nowadays, the major providers of search engine advertising all estimate click-through<br />
rates along with other measures of ad quality and use auctions to sell<br />
these ads. Economists have applied game theory and mechanism design to<br />
analyze the properties of these auctions. 21<br />
Enabling Data Extraction & Analysis<br />
The data from computer-mediated transactions can be analyzed and used to<br />
improve the performance of future transactions.<br />
The Sabre air passenger reservation system offered by American Airlines is an<br />
example of this. The original conception, in 1953, was to automate the creation<br />
of an airline reservation. However, by the time the system was released in 1960,<br />
it was discovered that such a system could also be used to study patterns in the<br />
airline reservation process: The acronym Sabre stands for Semi-Automatic<br />
Business Research Environment. 22<br />
The existence of airline reservation systems enabled sophisticated differential<br />
pricing (also known <strong>as</strong> “yield management”) in the transportation industry. 23<br />
Many firms have built data warehouses based on transaction-level data which<br />
can then be used as input for analytic models of customer behavior. A<br />
prominent example is supermarket scanner data, which has been widely used in<br />
economic analyses. 24 Scanner data has also been useful in constructing price<br />
21 See, e.g., Susan Athey & Glenn Ellison, Position Auctions with Consumer Search (2007),<br />
http://kuznets.fas.harvard.edu/~athey/position.pdf; Benjamin Edelman, Michael<br />
Ostrovsky, & Michael Schwarz, Internet Advertising and the Generalized Second Price Auction, 97<br />
AM. ECON. REV. 242-259 (March 2007); Hal R. Varian, Online Ad Auctions, 99 AM. ECON.<br />
REV. 430-434 (2009).<br />
22 Sabre, History, available at http://web.archive.org/web/20080225161359/<br />
http://www.sabreairlinesolutions.com/about/history.htm.<br />
23 Barry C. Smith, John F. Leimkuhler, & Ross M. Darrow, Yield Management at American Airlines,<br />
22 INTERFACES 8-31 (1992), available at http://www.jstor.org/pss/25061571 (on the<br />
history of yield management in the airline industry). Kalyan T. Talluri & Garrett J. van<br />
Ryzin, THE THEORY AND PRACTICE OF REVENUE MANAGEMENT (Kluwer Academic<br />
Publishers 2004), http://books.google.com/books?id=hogoH5LXmyIC (a textbook<br />
explanation of yield management).<br />
24 Aviv Nevo & Catherine Wolfram, Why Do Manufacturers Issue Coupons? An Empirical Analysis of<br />
Breakfast Cereals, 22 THE RAND J. OF ECON. 319-339 (2002),<br />
http://research.chicagobooth.edu/marketing/databases/dominicks/docs/2002_W<br />
hy_Do_Manufacturers.pdf; Igal Hendel & Aviv Nevo, Measuring the Implications of Sales and<br />
Consumer Inventory Behavior, 74 ECONOMETRICA 1637-1673 (2006),<br />
http://faculty.wcas.northwestern.edu/~ieh758/measuring.pdf.
indexes, 25 since it allows for much more direct and timely access to prices. The<br />
fact that the data is timely is worth emphasizing, since it allows for real-time<br />
analysis and intervention for businesses and at the policy level.<br />
Hyunyoung Choi and I have used real-time publicly-available search engine data<br />
to predict the current level of economic activity for automobile, real estate, retail<br />
trade, travel, and unemployment indicators. 26 There are many other sources of<br />
real-time data such as credit card, package delivery, and financial data. This has<br />
been referred to as “nowcasting”: the use of real-time data in<br />
estimating the current state of the economy. 27 A variety of econometric<br />
techniques are used to deal with the problems of variable selection, gaps, lags,<br />
structural changes and so on. Much of the real-time data is also available at<br />
state and city levels, allowing for regional macroeconomic analysis.<br />
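A toy version of the nowcasting idea is a simple least-squares fit of an indicator on a contemporaneous search index. The numbers below are fabricated purely to illustrate the mechanics; the cited Choi & Varian work uses far richer data and econometrics.

```python
# Toy nowcast: regress an economic indicator on a real-time search index,
# then predict this period's indicator from this period's search index.
# All data here is fabricated for illustration.

search_index = [10.0, 12.0, 15.0, 20.0, 24.0]   # available in real time
indicator    = [100.0, 108.0, 121.0, 140.0, 157.0]  # published with a lag

n = len(search_index)
mean_x = sum(search_index) / n
mean_y = sum(indicator) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(search_index, indicator))
var = sum((x - mean_x) ** 2 for x in search_index)
slope = cov / var
intercept = mean_y - slope * mean_x

# "Nowcast" the indicator from today's search index, before the official
# statistic is released.
current_search = 30.0
nowcast = intercept + slope * current_search
print(round(nowcast, 1))
```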
In the last 20 years, the field of machine learning has made tremendous strides<br />
in “data mining.” This term was once pejorative, at least among<br />
econometricians, but now enjoys a somewhat better reputation due to the<br />
exciting applications developed by computer scientists and statisticians. 28 One<br />
of the main problems with data mining is over-fitting, but various sorts of<br />
cross-validation techniques have been developed to mitigate this problem.<br />
Econometricians have only begun to utilize these techniques. 29<br />
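The way cross-validation guards against over-fitting can be shown with a toy sketch. The fitting procedure and data below are invented stand-ins for any learning method:

```python
import random

# Fit a least-squares slope through the origin: a stand-in for any model.
def fit(train):
    sxy = sum(x * y for x, y in train)
    sxx = sum(x * x for x, _ in train)
    return sxy / sxx

# Mean squared error of a fitted slope on held-out data.
def mse(slope, test):
    return sum((y - slope * x) ** 2 for x, y in test) / len(test)

# k-fold cross-validation: train on k-1 folds, score on the held-out
# fold, and average. The averaged held-out error estimates out-of-sample
# performance, which an over-fit in-sample error would overstate.
def k_fold_cv(data, k=5):
    data = data[:]
    random.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    scores = []
    for i in range(k):
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        scores.append(mse(fit(train), folds[i]))
    return sum(scores) / k

random.seed(1)
data = [(xi, 2.0 * xi + random.gauss(0, 0.5)) for xi in range(1, 41)]
cv_error = k_fold_cv(data)
print(round(cv_error, 2))   # roughly the noise variance
```

The held-out error hovers near the irreducible noise level, whereas judging the model only on the data it was fit to would make it look better than it really is.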
25 ROBERT C. FEENSTRA & MATTHEW SHAPIRO, EDS., SCANNER DATA AND PRICE INDEXES (2003); Farm Foundation, Food CPI, Prices, and Expenditures: A Workshop on the Use of Scanner Data in Policy Analysis, June 2003, http://www.ers.usda.gov/briefing/CPIFoodAndExpenditures/ScannerConference.htm.
26 Hyunyoung Choi & Hal R. Varian, Predicting the Present with Google Trends, GOOGLE RESEARCH BLOG, April 2, 2009, http://googleresearch.blogspot.com/2009/04/predicting-present-with-google-trends.html; Hyunyoung Choi & Hal R. Varian, Predicting Initial Claims for Unemployment Benefits, GOOGLE RESEARCH BLOG, July 22, 2009, http://googleresearch.blogspot.com/2009/07/posted-by-hal-varian-chiefeconomist.html.
27 See M. P. CLEMENTS & DAVID F. HENDRY, GREAT BRITAIN STATISTICS COMMISSION, FORECASTING IN THE NATIONAL ACCOUNTS AT THE OFFICE FOR NATIONAL STATISTICS (2003); Jennifer L. Castle & David Hendry, Nowcasting from Disaggregates in the Face of Location Shifts, June 18, 2009, http://www.economics.ox.ac.uk/members/jennifer.castle/Nowcast09JoF.pdf [hereinafter Castle & Hendry, Nowcasting].
28 For a technical overview, see TREVOR HASTIE, ROBERT TIBSHIRANI & JEROME FRIEDMAN, THE ELEMENTS OF STATISTICAL LEARNING: DATA MINING, INFERENCE, AND PREDICTION (2d ed. 2009).
29 See Castle & Hendry, Nowcasting, supra note 27.
248 CHAPTER 4: HAS THE INTERNET FUNDAMENTALLY CHANGED ECONOMICS?
Enabling Experimentation

As Ronald Coase has said, "If you torture the data long enough it will confess." 30 It is difficult to establish causality from retrospective data analysis. It is thus noteworthy that computer mediation makes it possible both to measure economic activity and to conduct controlled experiments.

In particular, it is relatively easy to implement experiments on Web-based systems. Such experiments can be conducted at the query level, the user level, or the geographic level.
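A user-level experiment of this kind can be sketched as follows. The bucketing scheme, traffic numbers, and click rates are all invented for illustration:

```python
import random
from statistics import mean
from zlib import crc32

# Deterministically assign each user to an experimental arm by hashing
# their id, so a returning user always sees the same variant.
def arm(user_id):
    return "treatment" if crc32(user_id.encode()) % 2 else "control"

# Simulate outcomes: suppose the treatment (say, a new result layout)
# lifts the click-through probability from 10% to 12%.
random.seed(0)
clicks = {"treatment": [], "control": []}
for i in range(20000):
    a = arm(f"user-{i}")
    p = 0.12 if a == "treatment" else 0.10
    clicks[a].append(1 if random.random() < p else 0)

# Compare observed click-through rates between the two arms.
for a in ("control", "treatment"):
    print(a, round(mean(clicks[a]), 3))
```

Because assignment is random with respect to everything else about the user, the difference in click rates can be read causally, which is exactly what retrospective data analysis struggles to deliver.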
In 2008, Google ran 6,000 experiments involving Web search, which resulted in 450-500 changes to the system. 31 Some of these experiments involved the user interface and some were basic changes to the algorithm. 32 The ad team at Google ran a similar number of experiments, tweaking everything from the background color of the ads, to the spacing between the ads and search results, to the underlying ranking algorithm.
In the 1980s, Japanese manufacturers touted their "kaizen" system, which allowed for "continuous improvement" of the production process. 33 In a well-designed Web-based business, there can be continuous improvement of the product itself: the website.
Google and other search engines also offer various experimental platforms to advertisers and publishers, such as "Ad Rotation," which rotates ad creatives (i.e., the wording of the ad) among various alternatives to find the one that performs best, and "Website Optimizer," a system that lets websites try different designs or layouts and determine which performs best.

Building a system that allows for experimentation is critical for future improvement, but this capability is too often left out of initial implementations. This is unfortunate, since it is the early versions of a system that are most in need of improvement.
30 Gordon Tullock, A Comment on Daniel Klein's 'A Plea to Economists Who Favor Liberty', 27 EASTERN ECONOMIC JOURNAL 205 (No. 2, Spring 2001), available at http://college.holycross.edu/RePEc/eej/Archive/Volume27/V27N2P203_207.pdf.
31 Rob Hoff, Google Search Guru Singhal: We Will Try Outlandish Ideas, BUS. WEEK, Oct. 2009, http://www.businessweek.com/the_thread/techbeat/archives/2009/10/google_search_g.html.
32 Id.
33 For more information on the Japanese kaizen philosophy, see MASAAKI IMAI, KAIZEN: THE KEY TO JAPAN'S COMPETITIVE SUCCESS (1986).
Cloud computing, which I will discuss later in this essay, offers a model for "software as a service," in which software is hosted in a remote data center and accessed via a Web interface. There are numerous advantages to this architecture. It allows for controlled experiments, which can, in turn, lead to continuous improvement of the system. Alternatives such as packaged software make experimentation much more difficult.

Ideally, experiments lead to an understanding of causal relations that can then be modeled. In the case of Web applications there are typically two "economic agents": the users and the applications. The applications are already modeled via the source code used to implement them, so all that remains is to model user behavior. The resulting model will often take the form of a computer simulation that can be used to understand how the system works.
Some examples of this are the Bid Simulator and Bid Forecasting tools offered by Google and Yahoo!. 34 These tools estimate the cost and clicks associated with possible bids. The cost per click is determined by the rules of the auction and can be calculated directly; the clicks are part of user behavior and must be estimated with economic forecasting. Putting the two together yields a model of the auction outcomes.
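A toy version of such a tool can be sketched as follows. The auction form here is a simplified generalized second-price auction and all of the numbers are invented; real bid simulators are far more sophisticated:

```python
# Competing bids you must beat to win each ad slot, lowest slot first.
# These follow from the auction rules and can be computed directly.
competing_bids = [0.50, 1.00, 1.75]

# Estimated daily clicks by slot: the part that comes from user behavior
# and must be forecast rather than computed from the rules.
clicks_by_slot = [30, 60, 100]

def simulate(bid):
    """Return (estimated clicks, estimated cost) per day for a bid."""
    slot = sum(1 for b in competing_bids if bid > b) - 1
    if slot < 0:
        return 0, 0.0                     # bid too low to win any slot
    clicks = clicks_by_slot[slot]
    # Second-price rule: pay (roughly) the bid you displaced, per click.
    return clicks, clicks * competing_bids[slot]

for bid in (0.75, 1.25, 2.00):
    clicks, cost = simulate(bid)
    print(f"bid {bid:.2f}: {clicks} clicks, cost {cost:.2f}")
```

The mechanical part (the price rule) and the forecast part (the click curve) combine into exactly the bid-versus-cost-versus-clicks trade-off such tools display.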
How Experiments Change Business

Because computer mediation drastically reduces the cost of experimentation, the role of management has changed. As Kohavi et al. have emphasized, decisions should be based on carefully controlled experiments rather than on "the Highest Paid Person's Opinion (HiPPO)." 35
If experiments are costly, relying on the expert opinions of management is a plausible way to make decisions. When experiments are inexpensive, however, they are likely to provide more reliable answers than opinion, even the opinion of highly paid experts. Furthermore, even when experienced managers have better-than-average opinions, there are likely more productive uses of their time than sitting around a table debating which background colors will appeal to Web users. The right response from managers to such questions is "run an experiment."
34 For more information on Bid Simulator and Bid Forecasting, see Louise Rijk, Bid Simulator Adds More Transparency to Google AdWords Bidding, INTERNET MKTG. & BUS. REV., Aug. 10, 2009, http://www.advmediaproductions.com/newsletter/NL_google-adwords-bid-simulator.html.
35 Ron Kohavi, Roger Longbotham, Dan Sommerfield & Randal M. Henne, Controlled Experiments on the Web: Survey and Practical Guide, 19 DATA MINING & KNOWLEDGE DISCOVERY 140-181 (2008), http://www.springerlink.com/content/r28m75k77u145115.
Businesses have always engaged in experimentation in one form or another. The availability of computer mediated transactions, however, has made these experiments far cheaper and more flexible than in the past.
Enabling Customization & Personalization

Finally, computer mediated transactions allow for customization and personalization of interactions by basing current transactions on earlier transactions or other relevant information.

Instead of a "one size fits all" model, the Web offers a "market of one." Amazon.com, for example, suggests items to purchase based on an individual's previous purchases, or on the purchases of similar consumers. These suggestions can be based on "recommender systems" of various sorts. 36
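One simple flavor of recommender system, item-to-item co-occurrence ("customers who bought X also bought Y"), can be sketched with invented purchase data:

```python
from collections import Counter
from itertools import combinations

# Invented purchase baskets; each set is one customer's order.
baskets = [
    {"book", "lamp"},
    {"book", "lamp", "desk"},
    {"book", "desk"},
    {"lamp", "rug"},
]

# Count how often each ordered pair of items shares a basket.
co = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co[(a, b)] += 1
        co[(b, a)] += 1

def recommend(item, n=2):
    """Items most frequently bought together with `item`."""
    scores = Counter({other: c for (i, other), c in co.items() if i == item})
    return [other for other, _ in scores.most_common(n)]

print(recommend("book"))
```

Production recommenders add ratings, similarity normalization, and scale, but the core idea is the same: mine past transactions to personalize the next one.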
In addition to content, prices may also be personalized, leading to various forms of differential pricing. Such personalized pricing certainly has welfare effects. Acquisti and Varian examine a model in which firms can condition prices on past purchase history. 37 The ability of firms to extract surplus, they find, is quite limited when consumers are sophisticated. In fact, firms have to offer "enhanced services" to justify higher prices.

I have previously suggested that there is a "third welfare theorem" that applies to (admittedly extreme) cases with perfect price discrimination and free entry: Perfect price discrimination results in the optimal amount of output being sold, while free entry pushes profits to zero, conferring all benefits on consumers. 38
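A small numeric illustration of this admittedly extreme case, with invented willingness-to-pay figures:

```python
# Five consumers' willingness to pay for one unit each, and a constant
# marginal cost of production (all numbers invented for illustration).
values = [10, 8, 6, 4, 2]
mc = 3

# Perfect price discrimination: each consumer is charged exactly their
# willingness to pay, so every consumer valued above marginal cost is
# served. That is the socially optimal output level.
served = [v for v in values if v >= mc]
surplus_extracted = sum(v - mc for v in served)
print(len(served), surplus_extracted)   # 4 units sold, surplus of 16

# Free entry then dissipates this surplus: firms keep entering (or
# competing on quality and variety) until profits net of entry costs
# are zero, so the benefits end up with consumers.
```

The discriminating firm sells to everyone worth serving, so no efficient trade is lost; free entry determines who ultimately keeps the surplus.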
The same type of personalization can occur in advertising. Search engine advertising is inherently customized, since ads are shown based on the user's query. Google and Yahoo! offer services that allow users to specify their areas of interest and then view ads related to those interests. It is also relatively common for advertisers to use various forms of "re-targeting," which allow them to show ads based on users' previous responses to related ads.
36 Paul Resnick & Hal R. Varian, Recommender Systems, 3 COMM'CNS OF THE ASSOC. FOR COMPUTER MACH. 56-58 (March 1997), http://cacm.acm.org/magazines/1997/3/8435recommender-systems/pdf.
37 Alessandro Acquisti & Hal R. Varian, Conditioning Prices on Purchase History, 24 MKTG. SCI. 367-381 (2005), http://www.sims.berkeley.edu/hal/Papers/privacy.pdf.
38 Hal R. Varian, Competition and Market Power, in JOSEPH FARRELL, CARL SHAPIRO & HAL R. VARIAN, EDS., THE ECONOMICS OF INFORMATION TECHNOLOGY: AN INTRODUCTION 1-46 (Cambridge Univ. Press 2005). For a theoretical analysis of first-degree price discrimination, see David Ulph & Nir Vulkan, Electronic Commerce, Price Discrimination, and Mass Customisation, Nov. 2007, http://vulkan.worc.ox.ac.uk/wp-content/images/combined-paper.pdf.
Transactions Among Workers

Thus far, the emphasis has been on transactions among buyers, sellers, and advertisers. But computers can also mediate transactions among workers. The resulting improvements in communication and coordination can lead to productivity gains, as documented in the literature on the impact of computers on productivity.
In a series of works, Paul David has drawn an extended analogy between the productivity impact of electricity at the end of the nineteenth century and the productivity impact of computing at the end of the twentieth century. 39 Originally, factories were powered by waterwheels that drove a central shaft, and all of the machines in the factory had to connect to this shaft. The manufacturing process involved moving the piece being assembled from station to station.
The power source evolved from waterwheels to steam engines to electric motors. Eventually electric motors were attached to each machine, which allowed more flexibility in how the machines were arranged within the factory. However, factories stuck to the time-honored arrangement of grouping the same sort of machines in the same location: all the lathes in one place, the saws in another, and the drills in yet another.
In the first decade of the twentieth century, Henry Ford invented the assembly line. Only then did the flexibility offered by electric motors become fully appreciated. 40 As David demonstrates, the productivity impact of the assembly line was significant, and over the last century manufacturing has become far more efficient. 41
39 See Paul David, The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox, 80 AM. ECON. REV. 355-61 (May 1990), http://ideas.repec.org/a/aea/aecrev/v80y1990i2p355-61.html [hereinafter David, Productivity Paradox]; Paul David, General Purpose Engines, Investment, and Productivity Growth: From the Dynamo Revolution to the Computer Revolution, in E. Deiaco, E. Hornel & G. Vickery, eds., TECHNOLOGY AND INVESTMENT: CRUCIAL ISSUES FOR THE 90S (1991); Paul David, Computer and the Dynamo: The Modern Productivity Paradox in the Not-too-distant Mirror, in TECHNOLOGY AND PRODUCTIVITY: THE CHALLENGE FOR ECONOMIC POLICY 315-348 (1991).
40 Ford suggests that the inspiration for the assembly line came from observing the meatpacking plants in Chicago, where animal carcasses were hung on hooks and moved down a line where workers carved off different pieces. If you could use this process to disassemble a cow, Ford figured, you could use it to assemble a car. See HENRY FORD, MY LIFE AND WORK (Doubleday, Page & Co. 1923).
41 I do not mean to imply that the only benefit from electric motors came from improved factory layout. Motors were also more efficient than drive belts, and building construction was simpler. See David, Productivity Paradox, supra note 39.
I want to extend David's assembly line analogy to examine "knowledge worker productivity." 42 Prior to the widespread use of the personal computer, producing office documents was a laborious process. A memo was dictated to a stenographer, who later typed the document, making carbon copies. The typed manuscript was corrected by the author and circulated for comments. As with pre-assembly-line production, the partially produced product was carried from station to station for modification. When the comments all came back, the document was re-typed, re-produced, and re-circulated.
In the latter half of the twentieth century, there were some productivity enhancements to this basic process, such as White-Out, Post-it Notes, and photocopy machines. Nonetheless, the basic production process remained the same for a century.

When the personal computer became widespread, editing became much easier, and collaborative document production came to involve passing floppy disks around. The advent of email eliminated the floppy disk: one could simply mail attachments to individuals.
All of these changes improved the quantity and quality of collaborative document production. However, they all mimicked the same physical process: circulating a document to individuals for comments. Editing, version control, tracking changes, circulating the documents, and other tasks remained difficult.
Nowadays, there is a new model of document production enabled by "cloud computing." 43 In this model, documents live "in the cloud," meaning in some data center on the Internet. The documents can be accessed at any time, from anywhere, on any device, by any authorized user.

Cloud computing dramatically changes the production process for knowledge work. There is now a single master copy that can be viewed and edited by all relevant parties, with version control, checkpoints, and document restore built in. All sorts of collaboration, including collaboration across time and space, have become far easier.
42 See Peter F. Drucker, Knowledge-worker Productivity: The Biggest Challenge, 41 CAL. MGMT. REV. 79-94 (1999).
43 Michael Armbrust, Armando Fox, Rean Griffith, Anthony D. Joseph, Randy H. Katz, Andrew Konwinski, Gunho Lee, David A. Patterson, Ariel Rabkin, Ion Stoica & Matei Zaharia, Above the Clouds: A Berkeley View of Cloud Computing, Feb. 10, 2009, http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-28.html [hereinafter Above the Clouds]. See also Wikipedia, Cloud Computing, http://en.wikipedia.org/wiki/Cloud_computing.
Instead of passing the document among collaborators, a single master copy of the document can be edited by all interested parties, simultaneously if desired. By allowing workflow to be re-organized, cloud computing changes knowledge worker productivity in the same way that electricity changed the productivity of physical labor.
Enabling Deployment of Applications

As previously mentioned, cloud computing offers what is referred to as "software as a service." This architecture reduces support costs and makes it easier to update and improve applications.

Cloud computing, however, does not only offer "software as a service." It also offers "platform as a service," which means that software developers can deploy new applications using the cloud infrastructure.
Nowadays, it is possible for a small company to purchase data storage, hosting services, an application development environment, and Internet connectivity "off the shelf" from vendors such as Amazon.com, Google, IBM, Microsoft, and Sun.

The "platform as a service" model turns what was a fixed cost for small Web applications into a variable cost, dramatically reducing entry costs. Computer engineers can both explore the combinatorial possibilities of generic components to create new inventions and purchase standardized services in the market to deploy those innovations.
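The fixed-versus-variable-cost point can be made concrete with a toy calculation (all figures invented):

```python
# Owning hardware: a large fixed cost, paid regardless of usage.
fixed_cost_own = 50_000.0        # servers and hosting gear, per year

# Renting cloud capacity: a variable cost proportional to usage.
cloud_cost_per_user = 0.50       # per active user, per year

def yearly_cost(users, platform):
    if platform == "own":
        return fixed_cost_own
    return cloud_cost_per_user * users

# A small startup's entry cost collapses under the cloud model; only at
# large scale does owning hardware become competitive.
for users in (1_000, 200_000):
    cheaper = min(("own", "cloud"), key=lambda p: yearly_cost(users, p))
    print(users, cheaper)
```

With these made-up numbers, a 1,000-user startup pays $500 a year instead of $50,000 up front, which is exactly the reduction in entry costs the text describes.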
This development is analogous to the recent history of the book publishing industry. At one time, publishers owned facilities for printing and binding books. Today, due to the strong economies of scale inherent in this process, most publishers have outsourced the actual production process to a few specialized book production facilities.

Similarly, it is likely that in the future a number of cloud computing vendors will offer computing on a utility model. This production model dramatically reduces the entry costs of offering online services, and will likely lead to a significant increase in businesses that provide such specialized services. 44
The hallmarks of modern manufacturing are routinization, modularization, standardization, continuous production, and miniaturization. These practices have had a dramatic impact on manufacturing productivity in the twentieth century. The same practices can be applied to knowledge work in the twenty-first century.
44 Above the Clouds, supra note 43.
Computers, for example, can automate routine tasks such as spell-checking and data retrieval. Communications technology allows tasks to be modularized and routed to the workers best able to perform them. Just as the miniaturization of the electric motor allowed physical production to be rearranged in 1910, the miniaturization of the computer (from the mainframe, to the workstation, to the PC, to the laptop, and to the mobile phone) allows knowledge production to be rearranged on a local and global scale.
Enabling Micro-Multinationals

An interesting implication of computer mediated transactions among knowledge workers is that interactions are no longer constrained by time or distance.

Email and other tools allow for asynchronous communication over any distance, which allows tasks to be optimized on a global basis. Knowledge work can be subdivided into tasks, much like physical work in Adam Smith's hypothetical pin factory. 45 Even more, those tasks can be exported around the world to wherever they can be performed most effectively.
For example, consultants at McKinsey routinely send their PowerPoint slides to Bangalore for beautification. Many other cognitive tasks of this sort can be outsourced, including translation, proofreading, document research, and so on. Amazon.com's Mechanical Turk is an intriguing example of how computers can aid in matching workers with tasks. 46 As of March 2007, there were reportedly more than 100,000 workers from 100 countries providing services via the Mechanical Turk. 47
The dramatic drop in communications costs in the last decade has led to the emergence of what I have termed "micro-multinationals." 48 Nowadays, a 10- or 12-person company can have communications capabilities that only the largest
45 ADAM SMITH, AN INQUIRY INTO THE NATURE AND CAUSES OF THE WEALTH OF NATIONS 18-21 (Edwin Cannan, ed., Methuen & Co., Ltd. 1904) (1776), http://www.econlib.org/library/Smith/smWN.html.
46 Wikipedia, Amazon Mechanical Turk, http://en.wikipedia.org/wiki/Amazon_Mechanical_Turk.
47 Jason Pontin, Artificial Intelligence, with Help From the Humans, N.Y. TIMES, March 25, 2007, www.nytimes.com/2007/03/25/business/yourmoney/25Stream.html?ex=1332475200en=cd1ce5d0bee647d5ei=5088partner=rssnytemc=rss.
48 Hal Varian, Technology Levels the Business Playing Field, N.Y. TIMES, Aug. 25, 2005, http://www.nytimes.com/2005/08/25/business/25scene.html.
multinationals could afford 15 years ago. Using tools like email, websites, wikis, voice over IP, and video conferencing, tiny companies can coordinate workflow on a global basis. By sending work from one time zone to the next, these companies can effectively work around the clock, giving them a potential competitive advantage over firms restricted to a single time zone.
Many micro-multinationals share a common history: A student comes to the United States for graduate school and uses the Internet and the collaborative tools available in scientific workgroups. Some get bitten by the start-up bug. They draw on their friends and colleagues back home, who in turn have contacts living abroad. The collaborative technologies mentioned above allow such loose groups to work together on producing computer code, which may end up as a working product.
As Saxenian has pointed out, "emigration" means something quite different now than it did 30 years ago. 49 As she puts it, the "brain drain" has been replaced by a "brain circulation." We now have a host of collaborative technologies that allow immigrants to maintain ties to their social and professional networks in their home countries.
Conclusion

I began this essay with a discussion of combinatorial innovation and pointed out that innovation has been so rapid in the last decade because innovators around the world can work in parallel, exploring novel combinations of software components. When the innovations are sufficiently mature to be deployed, they can be hosted using cloud computing technology and managed by global teams, even at tiny companies. Ideally, these new services can serve as building blocks for new sorts of combinatorial innovation in business processes that will offer a huge boost to knowledge worker productivity in the future.
49 ANNALEE SAXENIAN, THE NEW ARGONAUTS: REGIONAL ADVANTAGE IN A GLOBAL ECONOMY (2006).
Decentralization, Freedom to Operate & Human Sociality
By Yochai Benkler *
Three Stories of Innovation in the Networked Information Economy
In 1994, two groups of software engineers were working on the next generation of critical software: a Web server, the software that a website runs to respond to requests from users. One group was within Microsoft, which understood that the Web would be the next generation of critical infrastructure and was trying to extend its market from the operating system to the Web server. The other was a group of developers led by Brian Behlendorf, formerly of the academic computing group at the University of Illinois at Urbana-Champaign, who were patching up the server that had been developed there in tandem with Mosaic, the first graphical interface for accessing the Web. They called it "a patchy server," which became the name of the resulting open source project: the Apache server. Consider anyone who might have predicted that this system would win: it was built by a scrappy set of developers who adopted a licensing approach asserting no exclusive rights over their output, and it was developed in an area that the largest company in the field considered strategically critical, in direct competition with that company's own product. Such a person would have been laughed out of the room. And yet, it moves (as Galileo famously said in defense of his theory of the Earth's orbit around the Sun). Over 15 years, through two boom-and-bust cycles, Apache has held 50-60% of the market share in Web servers (about 55% in the summer of 2010), while Microsoft's Web server market share has hovered between 25% and 35% (about 25% in the summer of 2010).
In 1999, two of the most insightful economists studying the new rules of the information economy opened their book with an analysis of how Microsoft's move into the encyclopedia market embodied the new challenges created by the digital economy. In February 2001, the developer of one of several ongoing efforts to build an online encyclopedia half gave up and dumped about 900 stubs onto an open source platform, under a license that let anyone edit them and gave no one the power to veto. This made participation easy, but control relatively hard. And no one was paid to write or edit the encyclopedia. It was probably the ugliest technical system for encyclopedia development being experimented with at the time. Five years later, this ugly duckling would be
* Yochai Benkler is the Berkman Professor of Entrepreneurial Legal Studies at Harvard, and faculty co-director of the Berkman Center for Internet and Society.
identified by a study done by the staff at Nature as having roughly the same error rate as Britannica for science articles. By 2009, Microsoft's Encarta encyclopedia had been discontinued. Wikipedia has come to embody the fundamental changes we have to deal with when trying to understand the networked information economy.
In 2001, a Swedish and a Danish entrepreneur invested in software developed by three Estonian programmers and released a brilliant new solution for peer-to-peer file sharing: Kazaa. Because the firm was based in the Netherlands, where Dutch law provided it greater immunity from suit by record labels, Kazaa quickly became a major platform after the demise of Napster. 1 By 2003, the same group of entrepreneurs and programmers had launched a peer-to-peer voice telephony application built on the same basic architecture as Kazaa: Skype. In theory, Skype should not have worked. For close to two decades, the Internet Protocol's "first-come, first-served," treat-all-packets-on-a-best-efforts-basis approach was thought to prevent serious voice-over-Internet applications from working well. And yet here was this small company providing better-quality, encrypted, end-to-end communications, using the users' own computers and connections as its basic infrastructure. They did not need to control the flow of packets in the network to provide quality-of-service assurances. They simply provided service of a quality that was good enough for the price: free for calls from one Skype user to another, soon followed by very low rates for calls to regular phones. In 2005, eBay bought Skype for over $2.5 billion.
Radical Decentralization of Physical, Human & Social Capital

The three stories above outline the basic transformative elements of the networked information economy. We have seen a radical decentralization of the most important forms of capital in the most advanced sectors of the economy: physical, human, and social capital. For the first time since the Industrial Revolution, the most important inputs into the core economic activities of the most advanced economies are widely distributed in the population. Technologically, the change begins with physical capital: Processing, storage, communications, and sensing hardware have come to be packaged at sufficiently low cost to be put into service by individuals for their own personal use. These devices are capable of mixing consumer use with production activities. The rapid increase in physical capabilities emphasizes continuous rapid innovation as a core dimension of growth and welfare, which in turn emphasizes human capital.
1 Napster itself was a college dorm room experiment, one of many that flourished at that
time, which dramatically and permanently changed the landscape of the music industry.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET
Human capital, too, is by nature widely distributed in the population, and is
extremely sticky and hard to aggregate or transfer effectively from one
individual to another. While we measure education when we try to quantify
human capital, that is far from all of what human capital really entails. Certainly
it does partly entail acquired, codified knowledge of the kind we get in
education; but that is only one part of it. Creativity, insight, experience—all
these go into answering the critical question: Will this individual come up with
an idea, and even more importantly, will this interaction and conversation
among a given set of individuals result in an interesting set of ideas emerging?
Organizationally, the increased emphasis on interactions among human beings
responding to surprising new opportunities has increased the importance of
loosely-coupled interactions beyond slower-moving group boundaries like
firms. These new organizational frameworks and the cooperative dynamics they
require depend on lightweight, flexible mechanisms that we all carry for
interacting with other people. That is, they depend on human sociality. This
basic set of protocols for non-destructive human interaction is also
fundamentally and widely shared in the population. They are not locked up in
the cabinets of smart corporate lawyers’ incorporation forms or major deal
documents. They are the core social and psychological features of human
beings that have co-evolved, physically and culturally, to allow us to be the
kinds of social creatures we in fact are—warts and all.
This distributed network of human beings, possessing the physical, human, and
social capital that they do, is now connected in a global network of
communications and exchange that allows much greater flow and conversation,
so that many new connections are possible on scales never before seen.
Together, these mean that conversations and new ideas—but more importantly,
pilots, experiments, and toy implementations of these new ideas—are cheap and
widespread, and innovation happens everywhere, all the time, at low cost. The
vast majority of ideas go nowhere, just as the vast majority of experiments fail.
But the sheer scale of experimentation has meant that the network has reliably
provided the flow of innovation that we have come to expect and depend on,
and it has largely come from unpredictable corners rather than from yesterday’s
innovators or the previous decades’ large firms.
As a result of these basic dynamics, in the networked information economy,
experimentation, continuous learning and improvement, low-cost prototyping,
deployment, iteration, and adoption are more important than well-behaved
innovation investments. Social behavior plays a much larger productive
economic role than it could when physical capital requirements meant that,
however good an idea someone had, transitioning it to a platform that could
actually be adopted by consumers/users was simply too expensive to do except
through a system of contracts and investment—through a more-or-less formal
corporate model. In the networked information economy, freedom to operate is
CHAPTER 4: HAS THE INTERNET FUNDAMENTALLY CHANGED ECONOMICS?
more important than power to appropriate, and voluntarism and sociality are more important
than formal contract, and play an important role alongside corporate organization.
Freedom to Operate is More Important than Power to Appropriate
The story of open source software is the core story about the importance of
freedom to operate and loosely-coupled association that transcends contract
and corporate structure. What makes a software development project “open
source” is that the output of the development activity, the code, is released
under a copyright license that allows anyone to look at, modify, and redistribute
the code and modifications to it. This means that anyone, anywhere, can come
to the state of the art in the code, adopt it, adapt it, and release it, building on
the innovative contributions of others with complete freedom to operate with
and on it. In a networked environment where human capital resides in many
places, and where it is impossible for any one firm to hire all the smartest people
(or, more to the point, to hire all the people who are likely to have the most
relevant and powerful insights for any new challenge), a system that depends on
open access to the universe of available resources, projects, and collaborators
will outperform a system that only allows participation by people who have already
been identified, recruited, and contracted with based on past projections of
what would be important for working on a new problem.
The licensing aspect of open source software raises another important aspect of
change. Historically, assuring the owner of financial capital of the soundness of
an entrepreneur was the critical factor. To do so, it was necessary to possess
property in core inputs, and a network of contracts for flows of what could not
reliably or efficiently be owned, like supply relations. Today, assuring a steady
and reliable flow of complementary contributions from other developers is as
important to maintaining a high rate of innovation, experimentation, and
adaptation as securing the complementary financial inputs. At the early stages,
complementary contributions from other developers are more important than
financial inputs.
With the rise of peer production, radically distributed collaborative production
on the open source model, adoption of licensing terms like those of free and
open source software, or Creative Commons, becomes an important avenue to
secure those complementary human investments in the project. Where it is
impossible to assure that you will always employ the right people, open source
licensing has become an increasingly common strategy for entrepreneurs and
large firms alike to improve the probability that they will be able to attract the
complementary rapid development contributions they need, in the time frame
they need them, for currently-unpredicted challenges, so as to assure high-velocity
innovation. Formalized freedom to operate, in the form of open source
licensing, is coupled with a strong pre-commitment by the firms that undertake
the limitations it imposes on their power to appropriate, so as to assure
potential collaborators against defection with regard to the fruits of the
common, firm-boundary-crossing enterprise. Increasingly, this is also becoming
a way for firms that compete in some domains, such as software services, to
engage in pre-competitive cooperation on the development of core necessary
tools, like the Linux kernel and operating system, the Apache Web server, etc.
Voluntarism & Sociality Become More Important than Formal Contract
Another important characteristic of the networked information economy is the
critical role of knowledge and creativity. These require uniquely human inputs,
and are persistently uncontractible. That is, you can neither define for explicit
codification, nor characterize for monitoring over time, what it means to be
creative, or insightful, or usably knowledgeable in the context where an innovation
challenge occurs. As a result, tacit knowledge and insight are necessarily and
always imperfectly defined for, or monitored through, contract. To assure the
right motivations and orientation towards finding new solutions to challenges, it
is necessary for an economy at large, as it is for any given organization, to
harness the non-contractible motivations of individuals to the knowledge and
innovation task at hand. This is not new, in the sense that the literature on
high-commitment, high-performance organizations has been around for decades, and
management theory keeps flowing back and forth between periods that
emphasize explicit material rewards and monitoring to keep employees from
shirking their responsibilities, and periods when the limitations of those
approaches become clearer, and the benefits of models that depend on a more
holistic, human view of what is required to create a motivated workforce
prevail.
In the networked information economy, where so much of what needs to be
done is uncontractible, and so many of those who need to be engaged are not
even in a position to have a contractual relationship, the role of sociality and of
cooperative human systems designs that aim to engage, and depend on, social,
moral, and emotional motivations as well as, and often instead of, material
motivations, has become much larger. Wikipedia, in my three stories, stands as
the ultimate example of a system that critically depends on these non-material
motivational vectors, mediated through a technical-social platform that is
optimized to engage these motivations and allow people to cooperate over a
system that provides great freedom to operate, no power to appropriate, and
tremendous room for social organization and interaction (which all have their
own warts and bumps).
Rather than the traditional formal modes of organization—be they a formal
corporation based on contracts, or formal, stable associations on the model of
rotary clubs or unions—the new forms of social networks (not the
Facebook/MySpace-type websites, but the actual social phenomenon) permit
people to have more loosely-coupled social associations, in which they can
participate for some of the time, and combine their investments with many
others who are similarly loosely tied to each other, and may spend their time at
different rates, and in different enterprises, during the course of their day, week,
or year. Together, these new forms of loose association, based on social signals
rather than price signals or a formal corporate managerial hierarchy, form what
I have called peer production. They are not, by any stretch of the imagination,
going to replace all production activities built on more formal, structured
models. Anyone who claims that the argument is one of replacement
misunderstands the claim.
The new models of production do, however, come to play a significant
productive role in an environment that continues, and will continue, to be
occupied by more traditional forms. They create new sources of competition—
as in the case of Wikipedia displacing Encarta—and new forms of
complementary sources of innovation and other inputs—as in the case of open
source software and the software services industry. They do not herald the
death of traditional market/firm-based production. To argue otherwise would
be silly. But it would be equally silly to simply assume away a major new
organizational innovation. Peer production and cooperative human systems are a new way
to harness a latent but massively productive force. They make the line between production and
consumption fuzzy, and offer new pathways to harness the time, insight, experience, wisdom
and creativity of hundreds of millions of people around the world to perform tasks that, until a
decade ago, we only knew how to perform through formal models of employment and contract.
They are an organizational innovation that anyone ignores at their peril. Just
ask the Departments of Defense or State how they feel about WikiLeaks.
Innovation Anywhere & Everywhere Over an Open Network
The story of Skype rounds out the core changes that the networked information
economy presents. In the mid-twentieth century, the epitome of innovation
was Bell Labs. With enough Nobel laureates to make the most ambitious
academic physics departments green with envy, and massive investment from
monopoly profits, Bell Labs is where we got the transistor on which the entire
information economy is built. It is, indeed, where we got information theory
itself. The Bell system also epitomizes the organizational model of the
mid-twentieth century. “One System, One Policy, Universal Service” was how the
company’s legendary President, Theodore Vail, put it 100 years ago. 2 According
to this model of thought, if the Internet was ever to carry that most
delay-sensitive of all services, voice, we would have to change how we manage
packets. Best effort delivery 3 just wouldn’t do it. Someone needed to manage
the network and decide—this packet, which carries voice, is more latency
sensitive, while that packet, which carries email or a Web page, can wait. But, as
it turns out, this persistent prediction was false. And the people who proved it
false were not working for Bell Labs. Indeed, it was probably impossible for
anyone inside one of the current incarnations of the Bell system to have done
so. It was, instead, left to three Estonian developers and a couple of Dutch and
Danish edgy entrepreneurs to do so. They were not the only ones to try.
Others did too—VocalTec in Israel was among the first; but they were too
early.

2 AT&T, Milestones in AT&T History, http://www.corp.att.com/history/milestones.html
(last accessed Aug. 17, 2010).
The point is that in a global networked information environment, innovation
can come from anywhere; insights of various forms can find each other, and
experimentation and implementation are cheap to do from anywhere to
anywhere else. Massive experimentation is followed by massive failures. But
the failures are generally cheap, at least by societal standards. And the successes
can be readily disseminated, adopted, and generalized on a major global scale in
very short time frames. Variation, selection, adaptation, and survival/replication
through user adoption, rather than planning and high investment, have
repeatedly offered the more robust approach in this new complex and chaotic
environment. Rapid, low-cost experimentation and adaptation on a mass scale,
underwritten by the ease of cheap, fast implementation and prototyping, and
cheap widespread failure punctuated by a steady flow of unpredictable successes,
have been more important to innovation and growth in the networked economy
than models of innovation based on higher-cost, more managed innovation
aimed at planning for predictable, well-understood returns.
Implications for Human Systems Design
We live our lives through systems: organizational systems, like corporations,
states, or nonprofits; technical systems, like the interstate highway system or the
Internet; institutional systems, like law, both public and private, or social
conventions; and cultural systems, as in our belief systems for how we know things to
be true, such as religion or science. To a great extent, these systems are too
complex for us to construct deterministic, fully understood interventions that
will clearly lead to desired outcomes, along whatever dimension we think is
important: efficiency, freedom, security, or justice. But we nonetheless apply
ourselves to the task. We try to use management science to design better
organizational strategies; we try to use law to refine and improve our legal
system; we invest enormous amounts in designing better technical systems, and
so forth.

3 “Best effort delivery describes a network service in which the network does not provide any
guarantees that data is delivered or that a user is given a guaranteed quality of service level or
a certain priority.” Wikipedia, Best effort delivery,
http://en.wikipedia.org/wiki/Best_effort_delivery (last accessed Aug. 17, 2010).
The characteristics of the networked information economy require that, in our
efforts at systems design, we emphasize openness and freedom to operate over
control and power to appropriate, and that we emphasize human sociality and
diverse motivations for diverse types over optimizing for material interests and
letting everything else sort itself out. At a practical level, technical open design
has made the largest and most powerful steps. Anchored in the very decision to
separate TCP from IP and make the core Internet protocol as open as it can be,
and continuing through the central role that open standards have played in the
development of the Web, XML, and WiFi, to name just a few, a continuous
emphasis on openness already has substantial support and inertia, although it is
always under pressure from firms that think they can get an edge by owning a
de-facto standard, or controlling a technical choke point that would allow them
to extract rents. In management science, we are seeing, slowly and in some
senses at the periphery, efforts to learn the lessons of open source software and
apply them to collaboration across firm boundaries and strategic management
of the knowledge ecology that a firm occupies.
In law, the most important battleground in the tension between the
control-oriented approach and the freedom-to-operate approach is intellectual property.
Only this year, Amazon received a patent for social networking 4 that reads
more-or-less like a description of Facebook, launched four years before
Amazon had even filed its patent application. But not everything is so silly.
This summer, the Librarian of Congress exempted iPhone jailbreaking from the
Digital Millennium Copyright Act’s anti-circumvention provisions. 5 If there is
any single policy domain in which it is important to apply what we have learned
about the new networked information economy, it is in the area of intellectual
property. It is also the area where there is the largest potential for intellectual
and political programmatic overlap between libertarians and progressives.
4 Stan Schroeder, Amazon Patents Social Networking System, Winks at Facebook, MASHABLE/TECH,
June 17, 2010, http://mashable.com/2010/06/17/amazon-patents-social-networking-system/.

5 Copyright Office, Rulemaking on Exemptions from Prohibition on Circumvention of
Technological Measures that Control Access to Copyrighted Works, July 28, 2010,
http://www.copyright.gov/1201/.
A Common Agenda on Intellectual Property for the Networked Information Economy
From the perspective of economic analysis, information is a public good. Once
someone creates new information or knowledge, anyone can use it without
reducing its availability for anyone else. Its marginal cost is therefore zero, and
that is its efficient price. However, for information to be available at a price of
zero, the person who produced it must find some other mechanism to extract
value from their investment in creating the information. Otherwise, having
information available at its marginal cost today (zero) will lead to less
production tomorrow.
The overwhelming majority of information, knowledge, and culture is produced
without the need to rely on explicit, intellectual-property-based mechanisms to
appropriate its benefits. Firms continuously innovate in their processes so as to
lower their costs and improve their profits; but they generally do not patent
their innovations and license them or exclude competitors from using them.
Individuals innovate and develop experience about their workplace to improve
their own performance; people read news and create commentary for each
other, and appropriate the benefits of what they find socially. Governments
invest in R&D and reap the benefits through higher growth, greater military
might, etc. Nonprofits and academic institutions invest in information,
knowledge, and cultural production, and so forth. All these approaches have
their own advantages and disadvantages; but economic survey after survey over
the past few decades has shown that, even in industrial innovation, a minority of
sectors relies on patents, while the majority relies on a range of supply-side and
demand-side improvements in appropriability that come from developing the
information and either using it without exchanging it, or disseminating it and
relying on first-mover advantages, network effects, marketing and reputational
benefits, etc.
The only industries that are still dependent on intellectual property protections
are the pharmaceutical industry, for patents, and Hollywood, the recording
industry, and much of book publishing, for copyright. Even newspapers and
magazines are not so dependent on IP. They are, rather, advertising-supported
media. They depend on release of the information to capture demand-side
benefits for their paying clients in a two-sided market—the advertisers.
The reason that it is important to remember this quick recap of innovation and
Patents & Copyrights Economics 101 is that it helps us to see that patents and
copyrights represent a government decision to prohibit everyone from using
ideas or information that they can practically use, in order to serve a public
purpose—supporting a subset of business models for the creation of new
information, knowledge, and culture. Now, it is perfectly acceptable for
government to prohibit some actions in order to serve the public good. We
prevent companies from selling food unless it is labeled in certain ways, to serve
public health; we prohibit violence to increase public security, and so forth. But
we try to do so only when there is indeed a good reason.
Sometimes, we combine prohibitions with a market in permissions. Tradeable
emissions permits are a classic example, where we think it is more efficient to
allow firms to trade in their permissions than to simply impose direct regulation.
Patents and copyrights are exactly like tradeable emissions permits. They are a
market-based approach to the regulatory problem of preventing people from
using the existing universe of information and knowledge they possess in ways
that would undermine future knowledge production and innovation. We
prohibit everyone from using certain classes of information and
knowledge, and we create a market in permissions to use that information. We
call these permissions “copyrights” or “patents.” What is important to
remember is that these permissions markets create a drag on current innovation
and knowledge creation by limiting freedom to operate, and they create a drag on
innovation in all industries that, unlike the pharmaceutical or blockbuster movie
markets, do not heavily depend on such permissions markets.
For progressives, the best way to understand patents and copyrights is through
the prism of free speech: These are government regulations on what and how
we can say things, and on how we can use what we know, implemented in
pursuit of legitimate government ends—aiding innovation and creative
expression by some industries—at the expense of that freedom. As with any
limitation on speech and learning, it has to be supported by very good reasons.
It is not at all clear whether our contemporary economic understanding of the
functioning of copyright law in particular, and patent law to a lesser extent,
provides sufficient support for such significant restrictions on free speech.
For libertarians, the best way to understand patents and copyrights is as a
regulatory system that imposes limitations on how individuals can act on
knowledge they possess in pursuit of their own goals. It is a regulatory system
that creates and allocates permissions to generate market-based transfer
mechanisms; but it is a regulatory system in pursuit of a government program,
one which embodies the judgment that certain business models used to sustain
innovation and expression are more effective than others, and supports those
deemed more effective at the expense of the other approaches.
Both approaches should lead to a significant downward revision in the level of
acceptable intellectual property enforcement that the United States pursues. Let
me offer one example, which provides the basic structure of the problem: How
long should a copyright last? If we thought that copyrights were really property,
the answer would be something like forever. The U.S. Constitution, as well as
the laws of practically every other country, instead limits the term of copyright,
understanding that there is a big difference between the need for exclusivity in a
thing that, if one person uses it, another cannot, and exclusivity in an idea or
expression that anyone can use without making it any less available to anyone
else. The former is a proper object of property. The latter is properly an object
of regulation of individual freedom to use the thing, but only to the extent that
such regulation is justified by its purpose.
So how long should copyright terms be? Let’s try this thought experiment:
Imagine that you are someone with an idea for a movie. You walk into a group
of hypothetical investors and you tell them: “Here’s my idea, here’s the audience
for it, and so here is my projection for how much money we will make on it.”
The investors ask you: “What are your assumptions about timing? By when will
we see our return?” Now, imagine that you answered: “We won’t really break
even in the first seventy years, but just you wait until years seventy to
ninety-five: We’ll be making millions!” You would be laughed out of the room. “OK,
let’s try it with not making money the first twenty years, but making a killing in
the years twenty to thirty.” You get the point.
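The intuition behind the thought experiment is ordinary discounted-cash-flow arithmetic, and it can be sketched in a few lines of code. All of the figures below are invented for illustration: a constant $1 million per year of revenue and a 10% annual discount rate stand in for whatever cash flows and rates prevail in a real industry.

```python
# Hypothetical illustration: how little distant copyright revenue is worth today.
# The $1M/year cash flow and the 10% discount rate are invented numbers.

def present_value(cash_per_year, start_year, end_year, rate):
    """Discount a constant annual cash flow back to year zero and sum it."""
    return sum(cash_per_year / (1 + rate) ** t
               for t in range(start_year, end_year + 1))

RATE = 0.10
late = present_value(1_000_000, 70, 95, RATE)   # "years seventy to ninety-five"
early = present_value(1_000_000, 1, 5, RATE)    # the same stream, earned early

print(f"PV of $1M/yr in years 70-95: ${late:,.0f}")
print(f"PV of $1M/yr in years  1-5:  ${early:,.0f}")
```

At a 10% rate, the stream earned in years 70 through 95 is worth on the order of $13,000 today, while the same stream earned in the first five years is worth roughly $3.8 million: a difference of several hundred times. Whatever the exact numbers, revenue that far in the future cannot be doing any work in attracting investment today.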
If copyright is intended to assure that there is enough appropriability to attract
investment in creating a new expression, but it is a regulatory form that restrains
the freedom of others to operate in pursuit of that goal, then its term should be
keyed to the term necessary to attract investors. Given today’s discount rates in
the relevant industries—that is, how quickly investors need to turn a profit
before they will decide to put their money in some other enterprise—that likely
means 18 months; maybe it means three to five years. It is possible that
different industries have different levels of patience. But fundamentally, the
overwhelming majority of the social cost created by the 95-year term of
copyright—let alone the repeated practice of retroactively extending copyright
for works already created in response to the then-existing incentive system—is
incurred without any benefit for investment purposes. No sane investor today
cares about returns that are ten years out on an investment in these kinds of
fields (as opposed to, say, power plants or utilities). For software, maybe the
correct period is 18 months; for novels, maybe 10 years, although even there,
the relevant decision is the publisher’s decision to publish, not the author’s
decision to write—because copyright-based monetization runs through the
publisher’s business decision, not the author’s. In patents, maybe the correct
period is 20 years for pharmaceuticals. Maybe more; maybe less. But the
principle for all of these is the same: The period of copyright or patent protection
should be backed out of reasonable investment assumptions and discount rates,
not pulled out of the lobbying process, which is always skewed in favor of the
small number of firms that possess these rights and against the millions of
potential innovators who do not yet know that this or that piece of regulated
access to information will get in their way five years from now.
The details of what might go into a major intellectual property reform that should<br />
be supported by both libertarians and progressives may differ among<br />
commentators. The core structure of the reasons for change is the same: (a)<br />
Strong patents and copyrights benefit some business models over others, and in<br />
particular place a strong drag on the radically distributed, chaotic,<br />
innovation-everywhere-by-everyone model of the networked information economy in favor<br />
of twentieth-century models of much more stable and controlled markets like<br />
those of Hollywood and the recording industry; (b) There is a big difference<br />
between the level of exclusivity needed to attract investment at the margin, and<br />
the level of exclusivity that maximizes its owner’s ability to extract rents; the size<br />
of the difference between the minimal necessary to attract innovation and the<br />
rent-maximizing level of protection is equal to the amount the incumbents are<br />
willing to spend on lobbying to keep the line at the maximal point, <strong>as</strong> opposed<br />
to the minimally-necessary point; and (c) The lines in fact should be drawn<br />
where the marginal effect is to attract investment, not where rents can be<br />
maximized. The academic community has spent years trying to refine a set of<br />
interventions that could improve access to information, knowledge and culture,<br />
while having minimal impact on incentives to invest. The following represent<br />
some of the most promising of these ideas.<br />
• Copyright term: Copyright terms should be keyed to actual market<br />
requirements and the discount rate in the business. Copyright that is<br />
any longer than necessary to attract the marginal investor who makes the<br />
difference between the project happening or not represents pure rent<br />
extraction and is a drag on innovation and creativity.<br />
• Renewal of existing copyrights: There are mountains of existing<br />
materials (animal shots from documentaries from the 1960s; explosions<br />
and action shots from 1970s B movies; etc.) that could provide the grist<br />
for new models of creative mashup tools and sites, but instead sit<br />
unused and unusable because the rights are excessively tied up.<br />
Existing copyrights should be required to be renewed periodically,<br />
initially for a nominal price, and later on in the life of a copyright for<br />
escalating fees, rising to a level no greater than necessary to make a<br />
copyright owner ask whether there is any real market for the work—without<br />
forcing holders to make fine distinctions about the value of the work, on<br />
one hand, or simply to renew everything automatically, whether or not<br />
it has any market, because it’s cheaper to renew than to review<br />
continued viability, on the other. Those works that continue to be of even small<br />
commercial value will be renewed. Those that continue to be of<br />
emotional significance will be renewed. All others will become freely<br />
usable upon failure to re-register.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 269<br />
• Reinstate the Sony doctrine 6 by legislative reversal of the Supreme<br />
Court’s Grokster decision 7 – In the midst of the panic over peer-to-peer<br />
file sharing, the Supreme Court moved away from its long-standing<br />
precedent that an innovator cannot be forced to foresee and prevent<br />
the potentially-infringing uses of its new product (in the Sony case, the<br />
VCR). As long as there are substantial noninfringing uses, innovators<br />
are immune to suit by copyright owners whose works are being<br />
infringed by users of the innovator’s product. In Grokster the Supreme<br />
Court created a more intention-based, fact-intensive inquiry that<br />
imposes greater litigation risk on entrepreneurs who innovate on the<br />
Net with anything that can possibly be used to infringe existing<br />
copyrights. This is an unnecessary drag on Internet innovation and<br />
entrepreneurship in favor of the movie and recording industries.<br />
• Eliminate business methods patents: Few innovations are as<br />
unnecessary as a law intended to give business people an incentive to<br />
improve their business model. The incentive to develop a new business<br />
model is that it makes more money for its inventor. There is no need<br />
for an additional government-granted monopoly on doing business in<br />
this way. The Federal Circuit, which created this new doctrine 12 years<br />
ago, tried to walk it back in the Bilski case, 8 but the Supreme Court<br />
recently held 9 that the particular way that the Federal Circuit went about<br />
doing so was indefensible. Nonetheless, it appears that a majority of<br />
the Supreme Court would support some other, better-reasoned<br />
reversal.<br />
• Eliminate software patents: There is fairly significant evidence that<br />
software patents are unnecessary, and that software development is<br />
heavily based on service models, time to market, network effects,<br />
customer habits, etc. At the same time, patents get in the way of open<br />
source development, and throw a monkey wrench into the model of<br />
rapid innovation by anyone, anywhere, distributable everywhere. They<br />
create unnecessary barriers to entry that reduce the freedom to operate<br />
and experiment, and thereby harm innovation.<br />
6 Sony Corp. of America v. Universal City Studios, Inc., 464 U.S. 417 (1984), available at<br />
http://en.wikipedia.org/wiki/Sony_Corp._of_America_v._Universal_City_Studios,_<br />
Inc.<br />
7 MGM Studios, Inc. v. Grokster, Ltd., 545 U.S. 913 (2005), available at<br />
http://en.wikipedia.org/wiki/MGM_Studios,_Inc._v._Grokster,_Ltd..<br />
8 In re Bilski, 545 F.3d 943, 88 U.S.P.Q.2d 1385 (Fed. Cir. 2008), available at<br />
http://en.wikipedia.org/wiki/In_re_Bilski.<br />
9 Bilski v. Kappos, 561 U.S. ___, 130 S. Ct. 3218 (2010), available at<br />
http://en.wikipedia.org/wiki/Bilski_v._Kappos.
• Create a “band” of exempt experimentation, both commercial<br />
and non-commercial, whereby use of existing copyrighted or<br />
patented information or knowledge does not trigger liability:<br />
Today, to the extent that there are exemptions, they are spare and<br />
niggardly. Using just three notes from a prior recording and mashing<br />
them up into a completely new song does not, under present copyright<br />
law, count as “de minimis.” 10 Academic experimentation on a patented<br />
drug that does not result in any alternative drug that competes but<br />
merely begins to create the path to one does not come within patent<br />
law’s “research exemption.” 11 These attitudes—none forced by the<br />
language of the statutes—reflect a judicial temperament that seems to<br />
think of copyright and patents in a Blackstonian “sole and despotic<br />
dominion” 12 mindframe, an approach that was never true of real<br />
property under common law, and would, even in terms of pure theory,<br />
be dis<strong>as</strong>trous if applied to knowledge and information. The idea would<br />
be to develop a relatively robust space for experimentation which, if it<br />
led to products and sales, would entitle the owner of the prior, enabling<br />
innovation or creative expression to claim some share of the profits of<br />
the downstream innovator or creator. The critical point of such an<br />
approach would be to allow millions of experiments to run without<br />
liability or its risk, while at the same time assuring that truly enabling<br />
innovations for those experiments that do succeed can share in the<br />
commercial upside of their contributions to downstream innovation.<br />
• Continue to expand the exemptions from the Digital Millennium<br />
Copyright Act 13 wherever that Act’s provisions place a drag on<br />
interoperability and innovation in systems that depend on access<br />
to existing platforms and systems: Federal courts have begun to<br />
reject claims under the DMCA that are efforts by copyright owners to<br />
use digital rights management to throw a monkey wrench into the<br />
works of a competitor. For example, Lexmark tried to make it hard for<br />
competitors who wanted to compete on toner for its printers by<br />
creating a chip and software handshake between the printer and the<br />
toner cartridge. When a competitor reverse-engineered the handshake<br />
so that its microchip-enabled toner cartridge could work with<br />
10 Bridgeport Music, Inc. v. Dimension Films, 410 F.3d 792 (6th Cir. 2005), available at<br />
http://en.wikipedia.org/wiki/Bridgeport_Music,_Inc._v._Dimension_Films.<br />
11 Madey v. Duke University, 307 F.3d 1351 (Fed. Cir. 2002).<br />
12 SIR WILLIAM BLACKSTONE, COMMENTARIES ON THE LAWS OF ENGLAND, Book II, Ch. II,<br />
Clarendon Press (Oxford) 1765-1769, available at<br />
http://en.wikipedia.org/wiki/Commentaries_on_the_Laws_of_England.<br />
13 17 U.S.C. §§ 1201-1205, available at<br />
http://en.wikipedia.org/wiki/Digital_Millennium_Copyright_Act#Anticircumvention_exemptions.
Lexmark printers, Lexmark argued that in order to build a<br />
competing cartridge, the competitor had to make a copy of the<br />
handshake software, which in turn required it to get around the<br />
encryption protecting that piece of copyrighted software. In other<br />
words, the competitor had violated the DMCA by circumventing the<br />
digital rights management encryption that protected Lexmark’s copyrighted<br />
handshake software. The court rejected the argument, emphasizing<br />
that a program copy whose core function was interoperability did not<br />
violate the DMCA. 14<br />
In so short a piece, I neither aim for an exhaustive list nor offer a detailed<br />
analysis of each of the proposals identified. Instead, I offer these as an initial<br />
draft of a range of policies that would increase freedom to operate in the<br />
networked information economy, and reduce the drag of the current system of<br />
copyrights and patents on both commercial and social enterprises that have<br />
played a critical role in the explosive innovation we have experienced on the<br />
Internet in the past decade and a half.<br />
Conclusion<br />
The networked information environment has introduced a period of radically<br />
decentralized capitalization of some of the core economic sectors in the most<br />
advanced economies. As a result, growth is coming to depend increasingly on<br />
innovation from individuals and companies at the edges, operating as the few<br />
successful experiments out of thousands of similar experiments that go<br />
nowhere. Many of these experiments are commercial. Many are non-commercial.<br />
Many combine the two. As a system, this open, chaotic, complex<br />
innovation system requires freedom to operate. It needs to take advantage of its<br />
technical, economic, and social structure more than it needs power to control<br />
uses as in prior models of well-behaved appropriation. Moreover, the diversity<br />
of models of experimentation, and the increasingly fuzzy line between<br />
production and consumption, between the social and the economic, suggest<br />
that for purposes of economic production and growth, formal contract and<br />
corporate structure are playing a less important role than they did in the prior<br />
century relative to the increasingly important role played by loosely-structured<br />
voluntarism and human sociality.<br />
14 Lexmark International, Inc. v. Static Control, 387 F.3d 522 (6th Cir. 2004).
The Economics of Information:<br />
From Dismal Science<br />
to Strange Tales<br />
By Larry Downes *<br />
Heroes<br />
It was a fight over nothing.<br />
In 2008, 12,000 members of the Writers Guild of America staged a withering<br />
strike against the major Hollywood studios. It lasted three months, interrupted<br />
dozens of TV series, and delayed several big-budget films. The two sides<br />
reportedly lost more than $2 billion. Yet the sole issue in the dispute was when<br />
and how revenues from the Internet and other digital distribution of<br />
entertainment would be allocated. 1<br />
So far, no such revenues exist.<br />
Online distribution of movies and especially TV is a recent phenomenon,<br />
powered by ever-faster data transmission speeds, the continued spread of<br />
broadband technologies into the home, and improved protocols for file<br />
compression. It seems certain that profitable models for delivering Hollywood<br />
content to computers, personal digital assistants (PDAs), cell phones, and other<br />
non-TV devices will emerge. But in these early days, as with music before it, it<br />
isn’t clear what those models will be. Will they be supported by advertising?<br />
Will content be pay-per-view or based on all-you-can-eat subscriptions? Will<br />
consumers prefer to own or rent?<br />
As industry ponders these unanswerable questions, consumers are doing much<br />
of the innovating themselves as they did with earlier, less bandwidth-intensive<br />
content such as text and music. Users of YouTube, BitTorrent, and all<br />
variations of video streaming or file-sharing applications, in the interest of speed<br />
* Larry Downes is an Internet analyst and consultant, helping clients develop business<br />
strategies in an age of constant disruption caused by information technology. He is the<br />
author of UNLEASHING THE KILLER APP: DIGITAL STRATEGIES FOR MARKET<br />
DOMINANCE (Harvard Business School Press 1998) and, most recently, of THE LAWS OF<br />
DISRUPTION: HARNESSING THE NEW FORCES THAT GOVERN LIFE AND BUSINESS IN THE<br />
DIGITAL AGE (Basic Books 2009). This essay is adapted from THE LAWS OF DISRUPTION.<br />
1 Michael White & Andy Fixmer, Hollywood Workers Return to Work After Ending Strike,<br />
BLOOMBERG, Feb. 13, 2008, http://www.bloomberg.com/apps/news?<br />
pid=newsarchive&sid=aKdwR9oC54WM.
and experimentation, do not bother with the niceties of obeying the law<br />
(Viacom’s $1 billion lawsuit against YouTube and Google is currently on<br />
appeal). 2<br />
Why did the two sides risk so much fighting over revenue that doesn’t yet exist<br />
from channels that haven’t been invented? The writers say they took a stand in<br />
large part because they did not do so in the early days of videocassette sales and<br />
rentals. When the profitable models finally arrived, the writers believed they got<br />
a much worse deal. Because media sales and rentals now represent the largest<br />
share of entertainment income, missing the boat has been painful for writers.<br />
The studios argued that until it is clear how and when money is to be made<br />
from digital distribution, pre-assigning residual royalties to writers would limit<br />
the studios’ ability to experiment with different distribution and partnership<br />
models. They made the same argument with videotapes.<br />
Ultimately, new rates for residual royalties were agreed upon for categories<br />
including downloaded rentals and sales, ad-supported streaming media, short<br />
clips, and promotional uses. Whether these prove to be favorable rates, or even<br />
the right categories, remains to be seen. Either way, media will continue to<br />
migrate to the Internet at the expense of other forms of distribution.<br />
The Strange Behavior<br />
of Non-Rivalrous Goods<br />
It is hard to say if anyone made the right decisions in the writers’ strike, in part<br />
because the tools for valuing information products and services, even for<br />
present uses, are terrible. You will look in vain at the balance sheets of<br />
companies whose sole assets are information—including much of the<br />
entertainment industry, as well as professional services such as doctors, lawyers,<br />
and consultants—to find any useful measure of the current or future value of<br />
the company’s real assets. While management gurus sing the praises of<br />
developing a company’s intellectual capital, financial reporting systems ignore it.<br />
Accountants refer to all the valuable information in a business—its information<br />
assets—as intangibles. As the name suggests, these are assets that never take a<br />
physical form as do factories and inventory. Unlike physical assets, information<br />
assets are generally not counted in calculating the total worth of an enterprise.<br />
For the most part, a company’s human resources, brands, and good<br />
relationships with customers and suppliers—let alone its copyrights, patents,<br />
trade secrets, and trademarks—are left off its balance sheet. The value of the<br />
2 See Adam Ostrow, Viacom Loses $1 Billion Against YouTube, MASHABLE, June 23, 2010,<br />
http://m<strong>as</strong>hable.com/2010/06/23/youtube-wins-viacom-lawsuit/.
company’s information, at least as far as accounting is concerned, is basically<br />
nothing.<br />
Why? Accountants have argued for years that information and other intangibles<br />
are so different economically from material goods that traditional methods of<br />
valuation just don’t apply. As an asset, the explanation goes, information<br />
behaves precisely the opposite of its tangible counterparts: Capital assets lose<br />
value as they are used, equipment becomes obsolete, and raw materials are<br />
depleted. Brands and reputations, by contrast, become more valuable the more<br />
they are exercised, in theory generating revenue forever. You cannot determine<br />
the price of a logo or a customer relationship with the same tools you use to<br />
depreciate a tractor.<br />
Fair enough. But that doesn’t explain why accountants have done so little to<br />
develop valuation techniques that apply to information assets. A dangerous<br />
result of that failure is that few managers understand information or how it<br />
generates value. Even CEOs of large companies regularly get it wrong when<br />
they talk casually about “trademarking an idea” or “copyrighting a word.” (You<br />
cannot do either.)<br />
As information becomes more central to economic performance, the failure to<br />
account for its value has become dangerous. Executives, especially in public<br />
businesses, are compensated based on the health of their companies’ balance<br />
sheets. To the extent that information value doesn’t appear there, it’s<br />
understandable that many companies don’t put much, if any, effort into<br />
developing or managing those assets.<br />
That’s unfortunate because the strategic cultivation of information assets is<br />
beneficial in many ways. Consider Harrah’s Entertainment, which operates<br />
casinos worldwide in places where gambling is legal.<br />
When former business school professor Gary Loveman joined the company as<br />
chief operating officer in 1998, he decided to look for underutilized assets on<br />
the company’s balance sheet. He found them in Harrah’s data warehouse. Like<br />
most casino chains, Harrah’s had implemented a rewards program that gave<br />
customers special benefits for using their membership cards while playing slot<br />
machines. Harrah’s was collecting vast amounts of information on its “factory<br />
floor,” but had done very little to put that data to use.<br />
A detailed review of the collected information upended several long-standing<br />
myths about where Harrah’s made the most money. Most of the company’s<br />
profits came from a quarter of its customers. Those customers were not,<br />
however, the “cuff-linked, limousine-riding high rollers [Harrah’s] and [their]
competitors had fawned over for many years.” 3 Instead, Harrah’s discovered<br />
that the high-profit customers were regular visitors, many of them recent<br />
retirees. They made frequent trips to the casino and spent steadily, if modestly,<br />
at its gaming tables, restaurants, and hotels.<br />
Harrah’s quickly reconfigured its customer-facing activities, including check-in,<br />
complimentary meals, and special promotions, orienting them toward the actual,<br />
as opposed to the presumed, best customers. The result was a changed<br />
enterprise, one that consistently outperforms its competition.<br />
Even though the balance sheet never reported the value of the diamonds<br />
Loveman found when he looked in his data mine, his information assets were<br />
by no means worthless. In 2006, Harrah’s was sold to a private equity<br />
partnership at a price that valued the information at more than $1 billion,<br />
representing a 30% premium in the total purchase price. 4 Today, Gary<br />
Loveman remains CEO of the company, a position he has held since 2003.<br />
The writers’ strike and the Harrah’s story teach an important lesson about the<br />
economics of information. Just because information value is indeterminate<br />
doesn’t mean it’s worthless. Not by a long shot. The Hollywood writers and<br />
producers clearly did not think so, nor did the buyers of Harrah’s.<br />
Consider another example: Search giant Google has $20 billion in assets, mostly<br />
cash, on its balance sheet. The company, however, even on the lowest day of<br />
the stock market in ten years, was worth nearly $100 billion—more than five<br />
times its book value. Somebody has figured out, at least in part, how to value<br />
the company’s information assets.<br />
Digital life is made up of information. It comes in a wide range of types,<br />
including private data, speech, news and entertainment, business practices, and<br />
information products and services such as films, music, inventions, and<br />
software. But all information operates under a common set of economic<br />
principles. So to thrive in the next digital decade, you must understand the<br />
basic elements of information economics.<br />
In modern economic terminology, goods are categorized as either “private” or<br />
“public” goods. Most goods in our industrial economy are private goods.<br />
Purely private goods are those that can be possessed by only one person at a<br />
3 Gary Loveman, Diamonds in the Data Mine, HARV. BUS. REV., May 2003. See also Julie<br />
Schlosser, Teacher’s Bet, FORTUNE, March 8, 2004, http://money.cnn.com/magazines/<br />
fortune/fortune_archive/2004/03/08/363688/index.htm.<br />
4 Ryan Nak<strong>as</strong>hima, Harrah’s Entertainment Accepts Buyout Bid from Private Equity Group, USA<br />
TODAY, Dec. 19, 2006, http://www.usatoday.com/money/industries/2006-12-19harrahbuyout_x.htm.
time (“rivalrous”) and whose use can be limited to that person or with<br />
whomever she might share it (“excludable”). If I own a barrel of oil, then you<br />
don’t own it, unless I sell it to you, in which case I no longer have it. Once it’s<br />
used, it’s gone forever—no one h<strong>as</strong> it anymore.<br />
Public goods, by contrast, can be used by more than one person at the same<br />
time (“non-rivalrous”), and limiting access to them is difficult, if not impossible<br />
(“non-excludable”). The classic example in economics textbooks is national<br />
defense. Either everyone has this good or nobody does. The military protects<br />
everyone, including those who do not pay taxes. Defensive missiles cannot be<br />
programmed to leave a single house unguarded.<br />
Information is an archetypically non-rivalrous good. As Thomas Jefferson<br />
famously wrote, “He who receives an idea from me, receives instruction himself<br />
without lessening mine; as he who lights his taper at mine, receives light without<br />
darkening me.” 5 Once a composer completes a song, there’s no physical limit<br />
to how many people can perform it simultaneously. There is a cost associated<br />
with its creation, but the composer incurs no additional cost no matter how<br />
many times the work is played. Regardless of how often it is performed, the<br />
composition still exists. In fact, it becomes more valuable the more freely it’s<br />
shared—it becomes more popular, maybe even a “hit.”<br />
So information is non-rivalrous, but is it also non-excludable? Until recently, the<br />
answer in practice was often no. That’s because many information products<br />
that sprang from the creativity of the human mind could not easily be<br />
distributed without first being copied to physical media such as books,<br />
newspapers, or, in the case of music, CDs and records. In that transformation<br />
(“demassification,” in Alvin Toffler’s terminology 6), information lost its<br />
non-excludable property, looking more like the barrel of oil than like national<br />
defense. It’s easy to limit access to the barrel of oil—there’s only one, after all.<br />
The song, once recorded and duplicated, is harder to control, but it’s still<br />
possible to exclude those who didn’t pay for a copy or pay for the right, as in<br />
radio, to broadcast it.<br />
Information, until recently, was a public good in theory but in practice behaved<br />
more like a private good. The need to reduce it to physical media masked its<br />
true nature, and gave rise to seemingly incongruous terminology that includes<br />
“stealing an idea,” “pirating content,” and, most significantly, “intellectual<br />
property.” After more than five hundred years of Gutenberg’s moveable type,<br />
we’re so conditioned to experiencing information through mass-produced<br />
media that we equate the cost of the media with the value of the content. As<br />
5 Letter from Thomas Jefferson to Isaac McPherson (Aug. 13, 1813), available at<br />
http://press-pubs.uchicago.edu/founders/documents/a1_8_8s12.html<br />
6 See generally ALVIN TOFFLER, THE THIRD WAVE (1980).
John Perry Barlow poetically put it, until the advent of electronic distribution<br />
through the Internet, “the bottle was protected, not the wine.” 7<br />
For information products, that protection—the ability to exclude—is almost<br />
entirely a function of law: The law of copyright makes it a crime to “copy”<br />
information without permission. Copyright gives the composer the exclusive<br />
right to make or authorize performances of a song, for example, or to record it<br />
and produce copies. At the same time, copyright outlaws the production of<br />
copies by anyone else, including someone who purch<strong>as</strong>ed a legal copy.<br />
By limiting both the performance and production of a song, copyright<br />
transforms non-rivalrous information into a rivalrous physical good. But the<br />
alchemy of copyright is starting to fail as digital technology makes it easier to<br />
distribute songs electronically. Given the Internet, it’s now much harder to<br />
limit who gets to hear a song and when. A copy no longer requires expensive<br />
recording and pressing equipment, or access to a costly and very visible retail<br />
distribution network.<br />
Although the composer can legally exclude those who do not buy authorized<br />
copies of his work, his ability to police that right is increasingly expensive, often<br />
costing more than it’s worth. You can’t realistically stop people from humming<br />
your tune, even if they do it out loud. And now you can’t really stop them from<br />
sharing copies of a digital recording, either.<br />
It is not only the composer herself, however, who is potentially harmed by the<br />
transformation of information goods back to their non-rivalrous state;<br />
consumers are, too. There were and remain important reasons for legal systems<br />
that treat information as if it were a rivalrous good.<br />
Copyright, for example, is designed to maximize the value of the up-front<br />
investment that information producers must make. If copyright didn’t exist,<br />
you could simply buy a recording of the song, reproduce it, and sell your own<br />
copies. Because your total investment would be only the cost of a single copy,<br />
your version would likely be cheaper than the one marketed by the composer<br />
himself. In theory, the composer would find it difficult to recover his creative<br />
investment, making him less likely to undertake his important work in the first<br />
place. Ultimately, everyone would be worse off.<br />
But copyright’s value comes at a high price. By imposing costs on the exchange<br />
of information that otherwise would not exist, the law neutralizes many of the<br />
valuable features of non-rivalrous goods.<br />
7 John Perry Barlow, The Economy of Ideas, WIRED 2.03, 1994,<br />
http://www.wired.com/wired/archive/2.03/economy.ideas_pr.html.
Fortunately, this special power is limited. Even with copyright, some forms of<br />
sharing are perfectly legal. Libraries can loan out the same copy of a recording<br />
to as many people as want to hear or play it, one at a time. Fans who purchased<br />
their own copy are likewise free to loan it to their colleagues, or even to resell<br />
their copy to a used record store or through online services such as Amazon<br />
Marketplace or eBay.<br />
Copyrights also have an expiration date. In the United States, for example,<br />
copyrights last 70 years beyond the life of the author or 120 years for certain<br />
works. 8 After this period, the work is no one’s property—the public can use it<br />
however they want to. Anyone can perform the work, make copies of it, adapt<br />
it, or incorporate it into new works. It becomes forever after a purely<br />
non-rivalrous good.<br />
Copyright protection is also limited to the producer’s particular expression and<br />
not the underlying ide<strong>as</strong>. The ide<strong>as</strong> in a song (love conquers all, love stinks) are<br />
non-rivalrous from the moment the song is written.<br />
Consider a 1996 court case involving sports statistics. In the days before the Web and wireless data devices, sports fans who were not attending a game could get up-to-the-minute information from a dedicated paging device from Motorola called Sportstrax. Sportstrax employees watched sporting events on TV and entered key information (e.g., who had the ball or who had scored) into a computer system. A few minutes later, Sportstrax customers would be paged with short updates.

Motorola was sued by the National Basketball Association, which claimed the transmission of information by pager violated its copyright in the broadcast of games. 9 The court disagreed. Sporting events are not “authored,” the judges noted, and are therefore not protected by copyright in the first place. Game data, including interim scores, are facts, not a particular expression of an information producer. Facts are non-rivalrous, outside the protection of copyright.

Today, pagers have given way to cell phones that can take photographs and videos and share them via the Internet. Popular television programs such as “American Idol” have armies of fans who watch the show and write blogs about each performance even as they’re watching it. So long as the actual performances aren’t being copied, however, the commentary is perfectly legal.
8 17 U.S.C. §§ 302-03.
9 Nat’l Basketball Assoc. v. Motorola, Inc., 105 F.3d 841 (2d Cir. 1997), http://www.law.cornell.edu/copyright/cases/105_F3d_841.htm.
280 CHAPTER 4: HAS THE INTERNET FUNDAMENTALLY CHANGED ECONOMICS?
As these examples suggest, the challenge for copyright and other laws controlling information has always been to strike the right balance between incentives for creators and the value that the public derives from unfettered use. It’s a balance that is constantly being unsettled by new technologies, a problem that has accelerated with the advent of the digital age.

On one hand, information technology has greatly lowered the cost of creating and distributing information, including books, movies, and recorded music. But the same technologies have also made it easier to make and distribute unauthorized copies, which are, in many cases, perfect replicas of the original.

Should information laws be tightened or relaxed in the next digital decade? Do producers need more protection from copyright laws, or do consumers deserve greater freedom? Are new information uses made possible by software applications such as YouTube, Facebook, and Flickr creating more value than they destroy, and for whom?
The Five Principles of Information Economics

Unfortunately, many of those debating these questions (and there are many, including lawmakers, industry leaders, and consumer groups) don’t understand the economic properties of information any better than do the accountants who refuse to measure it. So it’s worth summarizing the five most important principles of information economics. Better still, memorize them:
Renewability

Information cannot be used up. It can be enhanced or challenged, it can become more or less valuable over time, but once it has been created, it can be used over and over again. In the end it exists as it began. Most new information, moreover, is created from other information, making it a renewable energy source. In electronic form, neither its production nor its use generates waste products that damage the environment. In that sense, information is the ultimate “green” energy.

The online encyclopedia Wikipedia, for example, isn’t written by hired experts. It’s written by volunteers who post articles on subjects they either know or think they know something about. Within certain limits, anyone else can edit, correct, or change entries. Over time, the articles evolve into a useful and reliable form. No money changes hands in either the creation of Wikipedia or its use.
Universality

Everyone has the ability to use the same information simultaneously. Blog entries, news articles, and YouTube videos can be enjoyed at the same moment by an unlimited number of people. The only distribution costs are the photons on a screen. Each consumer, moreover, may have a completely different reason for consuming the same information, and perhaps her own response to it. She may be inspired to respond with information of her own.

Facebook, for example, is composed almost entirely of user-generated content. Users are constantly commenting on status updates, photographs, and other information posted by their friends, and inviting each other to join interest groups or play information games.
Magnetism

Private goods operate under the law of supply and demand: The greater the supply, the lower the price you can charge, and vice versa. The value of information, on the other hand, increases as a function of use. Information value grows exponentially as new users absorb it. The more places my brand or logo appears, the higher the value customers attach to all my goods. Use makes the brand more, not less, valuable.

This increase in value accelerates as the information spreads, creating a kind of magnetic pull that generates network effects. Since no one owns the Internet’s protocols, for example, these standards have spread easily, resulting in the explosive growth that began in the 1990s. The standards are now more valuable than when only a few people used them.
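The network-effect dynamic described above can be made concrete with a small sketch. One common way to illustrate it (an assumption used here for illustration, not a claim made in this essay) is Metcalfe’s law, which values a network in proportion to the number of possible pairwise connections among its users, so value grows much faster than the user count:

```python
# Illustrative sketch of network effects via Metcalfe's law (an assumption
# for illustration): a network's value is proportional to the number of
# distinct pairwise connections its users can form.

def metcalfe_value(users: int) -> int:
    """Number of distinct pairwise connections among `users` participants."""
    return users * (users - 1) // 2

# Doubling the user base roughly quadruples the number of connections:
small = metcalfe_value(1_000)   # 499,500 connections
large = metcalfe_value(2_000)   # 1,999,000 connections
print(f"{large / small:.2f}x")  # prints "4.00x"
```

The superlinear growth is the point: each new user makes the network more valuable to every existing user, which is why open standards that spread freely can end up worth far more than anything their absence of ownership would suggest.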
Friction-Free

The more easily information flows, the more quickly its value increases. In electronic form, information can move in any direction at the speed of light. It experiences no decay along the way, arriving at its destinations in the same form as when it departed. For many kinds of information, including languages, religious doctrines, and advertising, the ease of transfer helps to improve society, or at least the profits of those who disseminate it. The cheaper it is to spread the word, the more likely and quickly it will be spread.

There is, however, an inherent paradox: The frictionless spread of information can undermine the incentives for its production. Content producers, including authors, musicians, news organizations, and movie studios, invest heavily in the production of new information. To recover their investment, information producers must charge for its use. Economically, however, even the simplest payment schemes (subscriptions, for example) slow the natural tendency of information to move freely. Since information flows along the path of least
resistance, markets look for ways to avoid fees. In that sense, new technologies often subvert old business paradigms, even when the inventors of those technologies didn’t intend for them to do so.
Vulnerability

Information’s value does share one property with tangible goods: It’s not indestructible. Value can be destroyed through misuse. If you license your company’s name (and thus its reputation) to an inferior product or a product that does not have a clear connection to your brand, you risk confusing consumers about what your brand stands for. Value can also be destroyed by a third party, perhaps a competitor offering a knockoff product that looks like yours but is of lesser quality. Or an identity thief can appropriate your name and credit history to borrow money from banks or credit card companies. When the thief disappears, not only is the money gone, so is your reputation.

Information can also be a victim of its own success. It is now so easy to produce, distribute, and consume information that users are experiencing overload. Today, websites, e-mails, blogs, text messages, and even “tweets” (brief messages that reflect the thoughts of a user on Twitter) all compete for users’ limited time. As the sources of information and the volume produced expand rapidly, consumers find it increasingly difficult to limit their exposure to information of real value to them.
* * *

It’s easy to see these five principles in action in digital life. Consider Google. One might wonder how a company can be worth anything, let alone $100 billion, when it charges absolutely nothing for its products and services. You can search Google’s databases and use its e-mail service day and night without spending a penny; you can store photos on its Picasa photo service, create documents with its online word processing software, view Google Maps, and share videos on its YouTube service for free. Indeed, the company is determined to have as many people as possible take complete advantage of it.

Even though databases and other services are what consumers want and therefore represent the source of the company’s value, Google does not hoard those assets as if they were barrels of oil to be used only when necessary. It treats them instead as non-rivalrous goods that increase in value as more people use them.

The company isn’t being generous. Google makes nearly all its money by renting out advertising space to companies whose products and services complement the things consumers do when they are using Google. If information “wants to be free,” then let it be as free as possible, and make all the profits from the collateral effects of the network. That’s the company’s
simple business strategy, one that has created remarkable new value even as it disrupts the assumptions of every industry the company touches.

And for social networking sites, just giving away content isn’t good enough. These companies also have to find ways to get users to help them develop their sites in the first place. Companies like Facebook, MySpace, and their professional equivalent, LinkedIn, are constantly adding free tools and gadgets to make their products more compelling. Once a service reaches the tipping point (Facebook now has 500 million users!), the search for ways to make money begins in earnest, including premium services and targeted advertisements. But thanks to the weird economics of information, there remain powerful reasons not to charge the users for the core product, ever.
The Problem of Transaction Costs

There’s one additional aspect of information economics that is essential to understand. The frictionless transfer of information and the problem of information overload suggest that the economy of digital life is a kind of machine. Like the best engines, it can operate with remarkable efficiency, provided its parts are kept lubricated and free of foreign matter. Already, technologies perfected during the last digital decade have ruthlessly eliminated waste in our increasingly efficient online lives. Still, the information economy is not perfect. It suffers, like its physical counterpart, from a kind of inefficiency, what economist Ronald Coase first called “transaction costs.”

Coase came to the United States from England as an economics graduate student in 1931. Only twenty years old, Coase had a revolutionary agenda. Struggling to reconcile the socialism of his youth with the free-market sensibility of his professors, Coase saw big companies as proof that centralizing activities could work on a grand scale. If he could learn how big companies did it, Coase imagined, then perhaps the lessons could be applied to big governments as well. Oddly enough, no one had ever asked why companies existed, and certainly no one had ever thought to ask the people who were running them.

What Coase learned made him swear off socialism forever, and led to the publication of an article that changed economic thinking, one cited as revolutionary sixty years later, when Coase received the Nobel Prize in Economics.

In “The Nature of the Firm,” Coase argued that there is a price not only to what companies buy and sell, but also to the process of buying and selling it. 10 Buyers and sellers have to find each other, negotiate deals, and then
10 Ronald H. Coase, The Nature of the Firm, 4 ECONOMICA 368-405 (Nov. 1937), http://aetds.hnuc.edu.cn/uploadfile/20080316211913444.pdf.
consummate them. This activity was neither especially easy nor without costs. Coase therefore argued that companies were becoming bigger because markets were, relatively speaking, too expensive.

Coase called the price of doing a deal its “transaction cost.” The existence of transaction costs, he believed, explained why companies were internalizing more and more activities, especially repeated functions like buying raw materials and marketing. For these activities, maintaining an inside function such as a purchasing department was cheaper than relying for each individual purchase on whoever might happen to be in the market.
To understand why, let’s take a simple example. Say you work for an average-size company and you’ve run out of paper clips. Almost assuredly, you will get your paper clips not by leaving your office to drive to the office supply store but by going down the hall to the supply cabinet, where your company’s purchasing department maintains an inventory of basic supplies. Your company will, in fact, keep such basic supplies on hand as a matter of course, without giving much thought to the cost of carrying this inventory. This holds true even if buying and distributing office supplies have nothing to do with what your business does. Your company is likely to keep paper clips on hand even if there is no discount for buying in bulk.

Why? Even if you could get paper clips on your own for the same price, you still have to go out and get them. This means finding the stores that carry them and learning how much they charge. Then you have to choose between the closest store and the one with the best price. At the checkout stand, you need to make sure you are really charged what the store advertises. If the clips are somehow defective, you have to take them back and demand replacements or some other remedy.

And that’s just for a simple transaction. Imagine instead that you’re buying raw materials needed to manufacture a jet airplane. There is the additional effort of negotiating a price, writing a contract, inspecting the goods, and, potentially, invoking the legal system to enforce the terms and conditions. It’s better, you say, to own the supplier or at least to buy in bulk and avoid all that trouble. That “trouble” is transaction costs.
Working from Coase’s basic idea, economists have identified six main types of transaction costs:

• Search costs: Buyers and sellers must find each other in increasingly diverse and distributed markets.

• Information costs: For buyers, learning about the products and services of sellers and the basis for their cost, profit margins, and
quality; for sellers, learning about the legitimacy, financial condition, and needs of the buyer, which may lead to a higher or lower price.

• Bargaining costs: Buyers and sellers setting the terms of a sale, or contract for services, which might include meetings, phone calls, letters, faxes, e-mails, exchanges of technical data, brochures, meals and entertainment, and the legal costs of contract negotiations.

• Decision costs: For buyers, comparing the terms of one seller to those of other sellers, and processes such as purchasing approval designed to ensure that purchases meet the policies of the organization; for sellers, evaluating whether to sell to one buyer instead of another, or not at all.

• Policing costs: Buyers and sellers taking steps to ensure that the good or service and the terms under which the sale was made, which may have been ambiguous or even unstated, are translated into the behavior expected by each party. This might include inspecting the goods and any negotiations having to do with late or inadequate delivery or payment.

• Enforcement costs: Buyers and sellers agreeing on remedies for incomplete performance. These include everything from mutual agreements for a discount or other penalties to expensive litigation.
As this list suggests, transaction costs range from the trivial (turning over a box of paper clips to see the price) to amounts greatly in excess of the transaction itself (imagine if you were seriously injured by a defective paper clip flying off the shelf and sticking you in the eye). In fact, economists Douglass North and John Wallis have estimated that up to 45% of total economic activity consists of transaction costs. 11 Eliminating them entirely would translate to a staggering $4.5 trillion in annual savings in the United States alone, eliminating much of the work done by accountants, lawyers, advertisers, and government agencies.
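The $4.5 trillion figure is simply the estimated transaction-cost share applied to total U.S. output. As a back-of-envelope check (the roughly $10 trillion GDP figure below is an assumption for illustration, consistent with the period, not a number given in the text):

```python
# Back-of-envelope check of the savings figure cited above.
# Assumption: U.S. GDP of roughly $10 trillion (approximate for the period);
# the 45% share is the North & Wallis estimate cited in the text.
us_gdp_trillions = 10.0
transaction_cost_share = 0.45

potential_savings = us_gdp_trillions * transaction_cost_share
print(f"${potential_savings:.1f} trillion")  # prints "$4.5 trillion"
```

Of course, no one proposes eliminating all transaction costs; the point is only the scale of what even partial reductions are worth.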
One needn’t go that far to improve economic performance, however. Firms are created, Coase concluded, because the additional cost of organizing and maintaining them is cheaper than the transaction costs involved when individuals conduct business with each other using the market. Firms, while suffering inefficiencies of their own, are more efficient at certain types of activities than the market. Technologies (in 1937, Coase had in mind telephones in particular) improved the performance of one or both, constantly resetting the balance between what was best to internalize and what was best left to the market.
11 John Joseph Wallis & Douglass C. North, Measuring the Transaction Sector in the American Economy, in LONG-TERM FACTORS IN AMERICAN GROWTH 95-162 (Stanley L. Engerman & Robert E. Gallman, eds. 1986), http://www.nber.org/chapters/c9679.
So which functions should a firm perform internally? The deceptively simple answer: only those activities that cannot be performed more cheaply in the market or by another firm. In fact, as Coase says, a firm will tend to expand precisely to the point where “the costs of organizing an extra transaction within the firm become equal to the costs of carrying out the same transaction by means of an exchange on the open market.” 12
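Coase’s expansion rule reads like a stopping condition, and it can be sketched as one. In this toy model (every cost figure and functional form below is an assumption for illustration, not anything from Coase), organizing costs rise with each additional transaction brought in-house while the market’s per-transaction price stays flat, and the firm stops growing where the two meet:

```python
# Toy model of Coase's firm-boundary rule: expand the firm until the marginal
# cost of organizing one more transaction internally equals the cost of doing
# that transaction on the open market. All numbers are illustrative.

MARKET_COST = 10.0  # flat per-transaction cost of using the market

def marginal_internal_cost(n: int) -> float:
    """Cost of organizing the n-th transaction inside the firm.
    Rises with n: bureaucratic overhead makes each extra one dearer."""
    return 2.0 + 0.5 * n

def optimal_firm_size() -> int:
    """Largest n for which internalizing is no dearer than the market."""
    n = 0
    while marginal_internal_cost(n + 1) <= MARKET_COST:
        n += 1
    return n

print(optimal_firm_size())  # prints 16: the 17th transaction is cheaper outside
```

The shapes of the curves, not the particular numbers, carry the argument: anything that flattens the market’s cost curve (as digital technology does) shrinks the range of transactions worth keeping inside the firm.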
For some activities, say plumbing, the open market works relatively well, and the need for plumbers to form large firms to avoid transaction costs has never arisen. For the large-scale operations of integrated manufacturers, such as Boeing and General Motors, which require coordination, heavy capital investment, and complex distribution systems, the firm is the only economically viable solution.

Coase believed economists should turn their attention to the practical problem of uncovering transaction costs wherever they occur and eliminating those that are unnecessary. Doing so, he hoped, would, among other things, help reduce the need for government intervention. A great deal of regulation and liability law, Coase argued, was an unconscious effort to overcome transaction costs for certain types of activities, such as accidents and pollution. But the regulations themselves generate so many transaction costs that in many cases doing nothing at all would have produced a better result. Finding out how much law and regulation is optimal requires a better understanding, once again, of the costs involved.
Coase had hoped his elegant proof would get economists working on the real problem at hand. Ironically, all he did was make economics more esoteric. Instead of lowering themselves to the kind of empirical research that was common in other social sciences, economists simply disposed of Coase in an opening footnote. They “assume a frictionless economy” and then proceed to develop elaborate mathematical models of behavior in a purely theoretical universe. Rather than join his quest, most economists retreated to ever more abstract models of economic behavior, which Coase dismissed as little more than a “vast mopping-up exercise” of loose ends left by Adam Smith’s seminal 18th-century work, The Wealth of Nations.

Increasingly frustrated with his economist colleagues, Coase instead took up residence at the University of Chicago’s law school. Economists, he came to see, avoided information, and misused the few sources, such as government data, that were readily available. Economics had become a shell game. “If you torture the data enough,” he wrote, dismissing much of modern economic
12 Coase, The Nature of the Firm, supra note 10.
analysis, “nature will always confess.” 13 If that was all economists could do, Coase decided he was no economist. Awarded the Nobel Prize in 1991, Coase began his acceptance speech on a note of despair. “In my long life I have known some great economists,” he told the committee, “but I have never counted myself among their number nor walked in their company.” 14

Look at the performance of the economy over the past twenty years and it’s easy to sympathize with Coase’s frustration. The “rational” stock market still booms and busts. Cyclical industries continue to overexpand and then overcontract. Efforts at creating an open global economy without trade barriers are met with rioting mobs. National banking regulators read every tea leaf they can find and still go to bed wondering if they have cut rates too soon or too late, too much or too little, or even if their cuts have made an iota of difference. While most economists fiddle with formulas, the economy is burning. Without a better understanding of the nature of transaction costs, we’ll never be able to predict, let alone improve, what seem to be the most basic elements of economic behavior.
That, in any case, is the real world. In the digital world, the problem is not only less severe, but also solvable. The free flow of information made possible by digital technology is decreasing the friction of transaction costs in a variety of interactions. From global price comparisons to searches of much of the world’s knowledge to auctions for anything, the cost of deal-making is plummeting. The Internet is driving down all six types of transaction costs. That’s what’s made the Internet so disruptive in the last decade, and what will continue to drive dramatic consumer, business, and regulatory changes in the next digital decade.

Consider a few examples:
1. Search costs: Technology connects people across geographical, time, and national borders. Automatic notifications for obscure collectibles on eBay, finding old friends through the “People You May Know” feature on Facebook, or letting your TiVo pick programs for you that it thinks you might like to watch: each of these reduces search costs, sometimes dramatically. Restaurant and other business reviews available directly on cell phones make it easier to find just the right place no matter where you are. There’s even an iPhone application that uses GPS technology to help you find your car in a crowded parking lot!
13 Ronald H. Coase, How Should Economists Choose?, G. Warren Nutter Lecture in Political Economy, American Enterprise Institute (1982).
14 Ronald H. Coase, Nobel Prize Lecture, Dec. 9, 1991, http://nobelprize.org/nobel_prizes/economics/laureates/1991/coase-lecture.html.
2. Information costs: Technology creates standard data structures that can be searched and consolidated over a growing network of computers. The asymmetry of sellers concealing data has eroded, radically changing the way people buy cars, real estate, and investment securities. Free or subscription services including CarFax, Zillow, and Yahoo! Finance give buyers an abundance of valuable information that was previously inaccessible at any price. Online dating services such as Chemistry.com increasingly use sophisticated profiling technology to suggest compatible matches.

3. Bargaining costs: The exchange of information can now take place digitally and is captured in databases for easy reuse in subsequent transactions. Instant publication of classified ads on Craigslist means many local transactions are completed within minutes. Business-to-business transactions increasingly rely on libraries of standard terms. The nonprofit Association for Cooperative Operations Research and Development (ACORD), for example, uses the XML data standard to create standard forms used by insurance and reinsurance agents and brokers offering life, property, and other lines of products.

4. Decision costs: Visibility into expanded online markets gives both buyers and sellers a better picture of minute-to-minute market conditions. Several insurance websites, including Progressive.com, provide instant quotes and comparisons to the prices of their competitors. Cell phone users can compare prices from online merchants while shopping at retail stores, putting added pressure on merchants to match or beat those prices or offer other incentives, including delivery or after-sales support. Online gamers can check the reputation of potential participants to decide whether to allow them to join their teams.

5. Policing costs: Transactions conducted with system-to-system data transfers create a more complete record of the actual performance of the participants, which can then be captured and queried. For goods purchased online, most merchants now provide direct access to detailed shipping and tracking information from expediters such as UPS or FedEx, or even standard delivery from the postal service. Some merchants, including Dell, provide information about the manufacturing process, allowing customers to track their products before they are even shipped. Most software products now collect bug and other failure information in real time, automatically installing updates and repairs. Players of the online game World of Warcraft can “speak” directly to in-game employees or robots whenever they have a problem.
6. Enforcement costs: Electronic records can simplify the process of resolving disputes over what was agreed upon or what did or did not occur. Online payment services such as PayPal offer elaborate dispute resolution
functions that include mediation and arbitration when buyers and sellers cannot resolve their differences, along with insurance and guaranteed satisfaction. These are all supported by the collection of end-to-end transaction data documenting the actual performance of buyers and sellers. Bloggers can quickly whip up electronic mobs to put pressure on companies, politicians, or celebrities whose behavior they feel does not comply with agreed-upon standards.
Conclusion: Conflicts at the Border<br />
Digital technology, as I argued in my 1998 book, “Unleashing the Killer App,” 15 has created a corollary to Coase’s observation about business organizations. As transaction costs in the open market approach zero, so does the size of the firm—if transaction costs are nonexistent, then there is no reason to have large companies. For products constructed entirely or largely out of information, we now stand on the verge of what Don Tapscott and Anthony Williams call “peer production,” where just the right group of people come together to apply the right set of skills to solve complex problems, whether in business or otherwise. 16 I called this phenomenon “The Law of Diminishing Firms.”
Technology is changing the dynamics of firms, making them smaller but more numerous. This, however, is good for the overall economy. Efficiency translates to savings of time and money, and decreased waste. Productivity, customer satisfaction, and the availability of customized products and services have improved dramatically. Keeping in touch across time zones and long distances gets easier, as does organizing diverse groups of people for social, political, or business reasons. The average consumer can now edit an online encyclopedia, post news and photos as a citizen journalist, or operate a home-based business that can produce and distribute just about anything.
Now for the bad news. Our current legal system, forged in the factories of the Industrial Revolution, was designed to maximize the value of rivalrous goods. It cannot be easily modified to deal with the unique economic properties of information. Worse, the crushing overhead of regulations and lawsuits, which may no longer be cost-effective even in the physical world, adds even less value when applied to the lower-transaction-cost environment of digital life. Increasingly, the old rules do little more than hold back innovation for the benefit of those who cannot or do not know how to adapt to the economics of digital life. In many cases, inefficient laws are propped up by failing businesses
15 LARRY DOWNES & CHUNKA MUI, UNLEASHING THE KILLER APP: DIGITAL STRATEGIES FOR MARKET DOMINANCE (1998).
16 DON TAPSCOTT & ANTHONY WILLIAMS, WIKINOMICS: HOW MASS COLLABORATION CHANGES EVERYTHING (2006).
CHAPTER 4: HAS THE INTERNET FUNDAMENTALLY CHANGED ECONOMICS?
that are not eager to see their advantages erased. Sometimes those fighting transformation are powerful business interests, including large media companies, real estate agents, and even some of the technology companies whose products fuel the digital revolution.
Resistance may also come from the users themselves. In digital life, private information can be invaluable in deciding with whom to interact, either for business or for interpersonal transactions. As advances in technology bring more private information online, powerful emotions have been stirred. Citizens in much of the world believe their rights to privacy are being violated, not only by businesses but by their classmates and neighbors.
Perhaps most worrisome, governments are taking advantage of lower transaction costs to improve the technology of surveillance, raising fears of the dystopian world described by George Orwell in his novel “1984.”
Lower transaction costs have also proven useful to criminals and terrorists, who operate freely and anonymously in digital realms. Sometimes their crimes exploit the vulnerability of information, as in the case of identity theft and other forms of Internet fraud. More ominously, virtual gangs are able to attack the infrastructure of the Internet itself, releasing viruses and other harmful software that incapacitate servers, destroy data, or, in the case of spam, simply waste people’s most precious resource: time.
Perhaps the most difficult problems of information economics, however, involve the plasticity of information in electronic form. Technology has made it possible to realize the remarkable potential of information to be shared and even enhanced by as many people as are interested. Inevitably, every new innovation that supports this creative urge runs headlong into laws protecting information as property—laws that treat public goods as if they were private goods. Although such laws may be necessary, they have proven unduly rigid in their current form, sparking some of the most vitriolic fights on the digital frontier.
The explosion of digital technology at home, at work, and in government, coupled with the economics of information, has created a perfect storm. Our industrial-age legal system will not survive this social transformation. After the flood, as in previous technological revolutions, a new legal paradigm will emerge to guide the construction of laws better suited to digital life.

Implementing these new laws will require a great deal of coordination and collaboration. Most of all, it will require considerable courage on the part of those who live in both the physical and digital worlds. The next digital decade, like the last one, will proceed in fits and starts, with surprising changes of cast and characters, allies becoming enemies and enemies finding common ground.
Some winners and losers will prove, in retrospect, to have been easily predicted. Others will come from nowhere.

The only thing certain is the author of the script: the poorly understood but increasingly critical economic properties of information.
The Regulation of Reputational Information

By Eric Goldman*
Introduction

This essay considers the role of reputational information in our marketplace. It explains how well-functioning marketplaces depend on the vibrant flow of accurate reputational information, and how misdirected regulation of reputational information could harm marketplace mechanisms. It then explores some challenges created by the existing regulation of reputational information and identifies some regulatory options for the future.
Reputational Information Defined

Typical definitions of “reputation” focus on third-party cognitive perceptions of a person. 1 For example, Black’s Law Dictionary defines reputation as the “esteem in which a person is held by others.” 2 Bryan Garner’s A Dictionary of Modern Legal Usage defines reputation as “what one is thought by others to be.” 3 The Federal Rules of Evidence also reflect this perception-centric view of “reputation.” 4
* Associate Professor and Director, High Tech Law Institute, Santa Clara University School of Law. Email: egoldman@gmail.com. Website: http://www.ericgoldman.org. In addition to a stint as General Counsel of Epinions.com, a consumer review website now part of the eBay enterprise, I have provided legal or consulting advice to some of the other companies mentioned in this essay. I prepared this essay in connection with a talk at the Third Annual Conference on the Law and Economics of Innovation at George Mason University, May 2009.
1 As one commentator explained:

Through one’s actions, one relates to others and makes impressions on them. These impressions, taken as a whole, constitute an individual’s reputation—that is, what other people think of you, to the extent that their thoughts arise from what they know about you, or think they know about you.

Elizabeth D. De Armond, Frothy Chaos: Modern Data Warehousing and Old-Fashioned Defamation, 41 VAL. U.L. REV. 1061, 1065 (2007).
2 BLACK’S LAW DICTIONARY (8th ed. 2004).
3 BRYAN A. GARNER, A DICTIONARY OF MODERN LEGAL USAGE (1990).
4 See, e.g., FED. R. EVID. 803(19), 803(21).
Although this definition is useful so far as it goes, I am more interested in how information affects prospective decision-making. 5 Accordingly, I define “reputational information” as follows:

information about an actor’s past performance that helps predict the actor’s future ability to perform or to satisfy the decision-maker’s preferences.

This definition contemplates that actors create a pool of data (both subjective and objective) through their conduct. This pool of data—the reputational information—can provide insights into the actor’s likely future behavior.
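This predictive framing can be made concrete with a toy model, in the spirit of the “inputs into Bayesian calculations” described by Cabral (note 5). The sketch below is purely illustrative and not drawn from the essay: it assumes an actor’s past transactions can be coded as simple successes and failures, and uses a Beta-Bernoulli update to turn that track record into a prediction; the function name and all numbers are hypothetical.

```python
# Illustrative sketch (not from the essay): reputational information as
# an input to a Bayesian prediction of future performance. All names
# and numbers here are hypothetical.

def predict_good_outcome(successes: int, failures: int,
                         prior_a: float = 1.0, prior_b: float = 1.0) -> float:
    """Beta-Bernoulli posterior mean: estimated chance the next
    transaction with this actor goes well, given its track record."""
    return (prior_a + successes) / (prior_a + prior_b + successes + failures)

# An actor with no track record is a coin flip under a uniform prior;
# a long, mostly positive record pushes the prediction toward certainty.
print(predict_good_outcome(0, 0))    # 0.5
print(predict_good_outcome(98, 2))   # ~0.97
```

The point of the sketch is only that a “pool of data” from past conduct supports a forward-looking estimate, which is exactly what distinguishes reputational information from reputation as mere esteem.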
Reputation Systems

“Reputation systems” aggregate and disseminate reputational information to consumers of that information. Reputation systems can be mediated or unmediated.

In unmediated reputation systems, the producers and consumers of reputational information communicate directly. Examples of unmediated reputation systems include word of mouth, letters of recommendation and job references.

In mediated reputation systems, a third-party publisher gathers, organizes and publishes reputational information. Examples of mediated reputation systems include the Better Business Bureau’s ratings, credit reports/scores, investment ratings (such as Morningstar mutual fund ratings and Moody’s bond ratings), and consumer review sites.
The Internet has led to a proliferation of mediated reputation systems, and in particular consumer review sites. 6 Consumers can review just about anything online; examples include:

• eBay’s feedback forum, 7 which allows eBay’s buyers and sellers to rate each other.
• Amazon’s product reviews, which allows consumers to rate and review millions of marketplace products.
• Yelp.com, which allows consumers to review local businesses.

5 Luis M.B. Cabral, The Economics of Trust and Reputation: A Primer (June 2005 draft), http://pages.stern.nyu.edu/~lcabral/reputation/Reputation_June05.pdf (treating information about reputation as inputs into Bayesian calculations).
6 Indeed, this has spurred the formation of an industry association, the Rating and Review Professional Association. http://www.rarpa.org.
7 http://pages.ebay.com/services/forum/feedback.html.
• TripAdvisor.com, which allows consumers to review hotels and other travel attractions.
• RealSelf.com, which allows consumers to review cosmetic surgery procedures.
• Avvo.com, which allows consumers to rate and review attorneys.
• Glassdoor.com, which allows employees to share salary information and critique the working conditions at their employers.
• Honestly.com, 8 which allows co-workers to review each other.
• RateMyProfessors.com, which allows students to publicly rate and review their professors.
• DontDateHimGirl.com, which allows people to create and “find profiles of men who are alleged cheaters.” 9
• TheEroticReview.com, which allows johns to rank prostitutes. 10
Why Reputational Information Matters

In theory, the marketplace works through an “invisible hand”: consumers and producers make individual and autonomous decisions that, without any centralized coordination, collectively determine the price and quantity of goods and services. When it works properly, the invisible hand maximizes social welfare by allocating goods and services to those consumers who value them the most.

A properly functioning invisible hand also should reward good producers and punish poor ones. Consumers allocating their scarce dollars in a competitive market will transact with producers who provide the best cost or quality options. Over time, uncompetitive producers should be drummed out of the industry by the aggregate but uncoordinated choices of rational and informed consumers.
However, given the transaction costs inherent in the real world, the invisible hand can be subject to distortions. In particular, to the extent information

8 Honestly.com was previously called Unvarnished. See Evelyn Rusli, Unvarnished: A Clean, Well-Lighted Place For Defamation, TECHCRUNCH, Mar. 30, 2010, http://techcrunch.com/2010/03/30/unvarnished-a-clean-well-lighted-place-for-defamation/.
9 PlayerBlock is a similar service, tracking undesirable dating prospects by their cellphone number. See Leslie Katz, Is Your Date a Player? Send a Text and Find Out, CNET NEWS.COM, Oct. 22, 2007, http://news.cnet.com/8301-10784_3-9802025-7.html.
10 See Matt Richtel, Sex Trade Monitors a Key Figure’s Woes, N.Y. TIMES, June 17, 2008. PunterNet is another website in this category, providing reviews of British sex workers. John Omizek, PunterNet Thanks Harriet for Massive Upswing, THE REGISTER, Oct. 5, 2009, http://www.theregister.co.uk/2009/10/05/punternet_harman/.
about producers is costly to obtain or use, consumers may lack crucial information to make accurate decisions. To that extent, consumers may not be able to easily compare producers or their price/quality offerings, in which case good producers may not be rewarded and bad producers may not be punished.
When information is costly, reputational information can improve the operation of the invisible hand by helping consumers make better decisions about vendors. In this sense, reputational information acts like an invisible hand guiding the invisible hand (an effect I call the “secondary invisible hand”), because reputational information can guide consumers to make marketplace choices that, in aggregate, effectuate the invisible hand. Thus, in an information economy with transaction costs, reputational information can play an essential role in rewarding good producers and punishing poor ones.

Given this crucial role in marketplace mechanisms, any distortions in reputational information may effectively distort the marketplace itself. In effect, distortions may cause the secondary invisible hand to push the invisible hand in the wrong direction, allowing bad producers to escape punishment and failing to reward good producers. To avoid this unwanted consequence, any regulation of reputational information needs to be carefully considered to ensure it is improving, not harming, marketplace mechanisms.
Note that the secondary invisible hand is, itself, subject to transaction costs. It is costly for consumers to find and assess the credibility of reputational information. Therefore, reputation systems themselves typically seek to establish their own reputation. I describe the reputation of reputation systems as a “tertiary” invisible hand—it is the invisible hand that guides reputational information (the secondary invisible hand) to guide the invisible hand of individual uncoordinated decisions by marketplace actors (the primary invisible hand). Thus, the tertiary invisible hand allows the reputation system to earn consumer trust as a credible source (such as the Wall Street Journal, the New York Times or Consumer Reports) or to be drummed out of the market for lack of credibility (such as the now-defunct anonymous gossip website JuicyCampus). 11
Thinking About Reputation Regulation

This part explores some ways that the regulatory system interacts with reputation systems and some issues caused by those interactions.

11 Matt Ivester, A Juicy Shutdown, JUICYCAMPUS BLOG, Feb. 4, 2009, http://juicycampus.blogspot.com/2009/02/juicy-shutdown.html.
Regulatory Heterogeneity

Regulators have taken divergent approaches to reputation systems. For example, consider the three different regulatory schemes governing job references, credit reporting databases and consumer review websites:

• Job references are subject to a mix of statutory (primarily state law) and common law tort regulation.
• Credit reporting databases are statutorily micromanaged through the voluminous and detailed Fair Credit Reporting Act. 12
• Consumer review websites are virtually unregulated, and many potential regulations of consumer review websites (such as defamation) are statutorily preempted.
These different regulatory structures raise some related questions. Are there meaningful distinctions between reputation systems that support heterogeneous regulation? Are there “best practices” we can observe from these heterogeneous regulatory approaches that can be used to improve other regulatory systems? These questions are important because regulatory schemes can significantly affect the efficacy of reputation systems. As an example, consider the differences between the job reference and online consumer review markets.
A former employer giving a job reference can face significant liability whether the reference is positive or negative. 13 Giving unfavorable references of former employees can lead to defamation or related claims; 14 and there may be liability for a former employer giving an incomplete positive reference. 15

Employers may be statutorily required to provide certain objective information about former employees. 16 Otherwise, given the potentially no-win liability regime for communicating job references, most knowledgeable employers
12 15 U.S.C. §§ 1681-81x.
13 See Tresa Baldas, A Rash of Problems over Job References, NAT’L L.J., Mar. 10, 2008 (“Employers are finding that they are being sued no matter what course they take; whether they give a bad reference, a good reference or stay entirely silent.”).
14 1-2 EMPLOYMENT SCREENING § 2.05 (Matthew Bender & Co. 2008) (hereinafter “EMPLOYMENT SCREENING”).
15 Randi W. v. Muroc Joint Unified Sch. Dist., 14 Cal. 4th 1066 (1997).
16 These laws are called “service letter statutes.” See EMPLOYMENT SCREENING, supra note 14. Germany has a mandatory reference law requiring employers to furnish job references, but in response German employers have developed an elaborate system for coding the references. Matthew W. Finkin & Kenneth G. Dau-Schmidt, Solving the Employee Reference Problem, 57 AM. J. COMP. L. 387 (2009).
refuse to provide any subjective recommendations of former employees, positive or negative. 17
To curb employers’ tendency towards silence, many states enacted statutory immunities to protect employers from lawsuits over job references. 18 However, the immunities have not changed employer reticence, which has led to a virtual collapse of the job reference market. 19 As a result, due to miscalibrated regulation, the job reference market fails to provide reliable reputational information.

In contrast, the online consumer review system is one of the most robust reputation systems ever. Millions of consumers freely share their subjective opinions about marketplace goods and services, and consumer review websites keep proliferating.
There are several possible reasons why consumer review websites might succeed where offline reputation systems might fail. My hypothesis, discussed in a companion essay in this collection, is that the difference is partially explained by 47 U.S.C. § 230, passed in 1996—at the height of Internet exceptionalism—to protect online publishers from liability for third-party content. Section 230 lets websites collect and organize individual consumer reviews without worrying about crippling legal liability for those reviews. As a result, consumer review websites can motivate consumers to share their opinions and then publish those opinions widely—as determined by marketplace mechanisms (i.e., the tertiary invisible hand), not concerns about legal liability.
The success of consumer review websites is especially noteworthy given that individual reviewers face the same legal risks that former employers face when providing job references, such as the risk of personal liability for publishing negative reputational information. Indeed, numerous individuals have been sued for posting negative online reviews. 20 As a result, rational actors should find it imprudent to submit negative reviews; yet, millions of such reviews are published online. A number of theories might explain this discrepancy, but one theory is especially intriguing: mediating websites, privileged by their own liability immunity, find innovative ways to get consumers over their fears of legal liability.
17 See Baldas, supra note 13.
18 The immunizations protect employer statements made in good faith. EMPLOYMENT SCREENING, supra note 14.
19 See Finkin & Dau-Schmidt, supra note 16.
20 See, e.g., Wendy Davis, Yelp Reviews Spawn At Least Five Lawsuits, MEDIAPOST ONLINE MEDIA DAILY, Jan. 21, 2009, http://www.mediapost.com/publications/?fa=Articles.printFriendly&art_aid=98778; Agard v. Hill, 2010 U.S. Dist. LEXIS 35014 (E.D. Cal. 2010).
What lessons can we draw from this comparison? One possible lesson is that reputation systems are too important to be left to the market. In other words, the tertiary invisible hand may not ensure accurate and useful information, or the costs of inaccurate information (such as denying a job to a qualified candidate) may be too high. If so, extensive regulatory intervention in reputation systems may improve the marketplace.

An alternative conclusion—and a more convincing one to me—is that the tertiary invisible hand, aided by a powerful statutory immunity like Section 230, works better than regulatory intervention. If so, we may get better results by deregulating reputation systems.
System Configurations

Given the regulatory heterogeneity, I wonder if there is an “ideal” regulatory configuration for reputation systems, especially given the tertiary invisible hand and its salutary effect on publisher behavior. Two brief examples illustrate the choices available to regulators, including the option of letting the marketplace operate unimpeded:
Anti-Gaming. A vendor may have financial incentives to distort the flow of reputational information about it. This reputational gaming can take many forms, including disseminating false positive reports about the vendor, 21 disseminating false negative reports about the vendor’s competitors, or manipulating an intermediary’s sorting or weighting algorithm to get more credit for positive reports or reduce credit for negative reports. Another sort of gaming can occur when users intentionally flood a reputation system with inaccurate negative reports as a form of protest. 22
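One way a reputation system might police flood-style protest gaming is to down-weight ratings that arrive in an unusually dense burst. The sketch below is a hypothetical illustration of that idea only; it is not any actual site’s algorithm, and the threshold and damping weight are invented for the example.

```python
# Hypothetical anti-gaming sketch: ratings posted on a day with an
# abnormal review volume contribute at a reduced weight, so a one-day
# protest flood shifts the average less. Thresholds are invented.
from collections import Counter

def robust_average(ratings, days, burst_threshold=10, burst_weight=0.2):
    """Weighted mean of 1-5 star ratings; suspiciously busy days are damped."""
    per_day = Counter(days)
    total = weight_sum = 0.0
    for stars, day in zip(ratings, days):
        w = burst_weight if per_day[day] > burst_threshold else 1.0
        total += w * stars
        weight_sum += w
    return total / weight_sum if weight_sum else None

# 20 organic 4-star reviews spread over 20 days, then a one-day flood of
# 30 one-star protest reviews (cf. the Spore episode in note 22):
organic = [(4, day) for day in range(20)]
flood = [(1, 99)] * 30
ratings, days = zip(*(organic + flood))
print(round(robust_average(ratings, days), 2))   # damped average
print(round(sum(ratings) / len(ratings), 2))     # naive average
```

In this invented scenario the damped average stays near the organic 4-star consensus while the naive mean collapses toward the protest flood, which is the trade-off such weighting schemes aim at.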
Do regulators need to curb this gaming behavior, or will other forces be adequate? There are several marketplace pressures that curb gaming, including competitors policing each other, 23 just as they do in false advertising cases. 24 In
21 Lifestyle Lift Holding, Inc. v. RealSelf Inc., 2:08-cv-10089-PJD-RSW (answer/counterclaims filed March 3, 2008), http://www.realself.com/files/Answer.pdf (alleging that Lifestyle Lift posted fake positive reviews about its own business to an online review website).
22 For example, consumers protesting the digital rights management (DRM) in EA’s Spore game flooded Amazon’s review site with one-star reviews, even though many of them actually enjoyed the game. See Austin Modine, Amazon Flash Mob Mauls Spore DRM, THE REGISTER, Sept. 10, 2008, http://www.theregister.co.uk/2008/09/10/spore_drm_amazon_effect/. A similar protest hit Intuit’s TurboTax 2008 over its increased prices. See Steven Musil, Amazon Reviewers Slam TurboTax Fee Changes, CNET NEWS.COM, Dec. 7, 2008, http://news.cnet.com/8301-1001_3-10117323-92.html.
23 See Cornelius v. DeLuca, 2010 WL 1709928 (D. Idaho Apr. 26, 2010) (a marketplace vendor sued over alleged shill online reviews posted by competitors).
addition, the tertiary invisible hand may encourage reputation systems to provide adequate “policing” against gaming. However, when the tertiary invisible hand is weak, such as with fake blog posts where search engines are the only mediators, 25 government intervention might be worth considering.
Right of Reply. A vendor may wish to publicly respond to reputational information published about it in an immediately adjacent fashion. Many consumer review websites allow vendors to comment or otherwise reply to user-supplied reviews, but not all do. For example, Yelp initially drew significant criticism from business owners who could not effectively reply to negative Yelp reviews because of Yelp’s architecture, 26 but Yelp eventually relented and voluntarily changed its policy. 27 As another example, Google permitted quoted sources to reply to news articles appearing in Google News as a way to “correct the record.” 28

Regulators could require consumer review websites and other reputation systems to permit an adjacent response from the vendor. 29 But such intervention may not be necessary; the tertiary invisible hand can prompt reputation systems to voluntarily provide a reply option (as Yelp and Google did) when they think the additional information helps consumers.
Undersupply of Reputational Information

There are three primary categories of reasons why reputational information may be undersupplied.
24 See, e.g., Lillian R. BeVier, A Puzzle in the Law of Deception, 78 VA. L. REV. 1 (1992).
25 See Press Release, New York Office of the Attorney General, Attorney General Cuomo Secures Settlement With Plastic Surgery Franchise That Flooded Internet With False Positive Reviews, July 14, 2009, http://www.ag.ny.gov/media_center/2009/july/july14b_09.html.
26 See Claire Cain Miller, The Review Site Yelp Draws Some Outcries of Its Own, N.Y. TIMES, Mar. 3, 2009.
27 See Claire Cain Miller, Yelp Will Let Businesses Respond to Web Reviews, N.Y. TIMES, Apr. 10, 2009.
28 See Dan Meredith & Andy Golding, Perspectives About the News from People in the News, GOOGLE NEWS BLOG, Aug. 7, 2007, http://googlenewsblog.blogspot.com/2007/08/perspectives-about-news-from-people-in.html.
29 See Frank A. Pasquale, Rankings, Reductionism, and Responsibility, 54 CLEV. ST. L. REV. 115 (2006); Frank A. Pasquale, Asterisk Revisited: Debating a Right of Reply on Search Results, 3 J. BUS. & TECH. L. 61 (2008).
Inadequate Production Incentives

Much reputational information starts out as non-public (i.e., “private”) information in the form of a customer’s subjective mental impressions about his/her interactions with the vendor. To the extent this information remains non-public, it does not help other consumers make marketplace decisions. These collective mental impressions represent a vital but potentially underutilized social resource.

The fact that non-public information remains locked in consumers’ heads could represent a marketplace failure. If the social benefit from public reputational information exceeds the private benefit from making it public, then presumptively there will be an undersupply of public reputational information. If so, the government may need to correct this failure by encouraging the disclosure of reputational information—such as by creating a tort immunity for sites that host that disclosure, as Section 230 does, or perhaps by going further. But there already may be market solutions to this problem, as evidenced by the proliferation of online review websites eliciting lots of formerly non-public reputational information.
Further, relatively small amounts of publicly disclosed reputational information might be enough to properly steer the invisible hand. For example, the first consumer review of a product in a reputation system creates a lot of value for subsequent consumers, but the 1,000th consumer review of the same product may add very little incrementally. So even if most consumer impressions remain non-public, perhaps mass-market products and vendors still have enough information produced to keep them honest. At the same time, vendors and products in the “long tail” 30 may have inadequate non-public impressions put into the public discourse, creating a valuable opportunity for comprehensive reputation systems to fix the omission. However, reputation systems will tackle these obscure marketplace options only when they can keep their costs low (given that consumer interest and traffic will, by definition, be low), and reputation system deregulation helps reduce both the costs of litigation and of responding to takedown demands.
30 Chris Anderson, The Long Tail, WIRED, Oct. 2004, http://www.wired.com/wired/archive/12.10/tail.html.
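The intuition that the first review adds far more value than the 1,000th can be illustrated with a toy Bayesian model (my gloss, not the essay’s argument): under a simple Beta-Bernoulli estimate of vendor quality, uncertainty shrinks roughly as one over the square root of the number of reviews. The 90% positive share and the uniform prior below are invented numbers for illustration.

```python
# Toy model of diminishing marginal information from reviews: posterior
# uncertainty about a vendor's quality under a Beta-Bernoulli model
# shrinks roughly as 1/sqrt(n). The 90% positive share is hypothetical.
import math

def posterior_sd(n_reviews, positive_share=0.9, prior=1.0):
    """Posterior standard deviation of estimated quality after n reviews."""
    a = prior + positive_share * n_reviews          # positive evidence
    b = prior + (1 - positive_share) * n_reviews    # negative evidence
    return math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

# Early reviews narrow the estimate sharply; late ones barely move it.
for n in (0, 1, 10, 100, 1000):
    print(n, round(posterior_sd(n), 4))
```

On this toy model the jump from zero reviews to one cuts the uncertainty by more than the jump from 100 reviews to 1,000, which matches the essay’s point that modest public disclosure may suffice for mass-market vendors while long-tail vendors remain information-starved.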
Vendor Suppression of Reputational Information<br />
Vendors are not shy about trying to suppress unwanted consumer reviews ex post, 31 but they may also try to suppress such reviews ex ante. For example,
one café owner grew so tired of negative Yelp reviews that he put a “No<br />
Yelpers” sign in his café’s windows. 32<br />
That sign probably had no legal effect, but Medical Justice offers an ex ante<br />
system to help doctors use preemptive contracts to suppress reviews by their<br />
patients. Medical Justice provides doctors with a form agreement that has patients waive their rights to post online reviews of the doctor. 33 Further, to bypass 47 U.S.C. § 230’s protective immunity for online reputation systems that might republish such patient reviews, the Medical Justice form prospectively takes copyright ownership of any patient-authored reviews. 34 (Section 230 does not immunize against copyright infringement.) This approach effectively allows doctors—or Medical Justice as their designee—to get reputation systems to
remove any unwanted patient reviews simply by sending a DMCA takedown<br />
notice. 35<br />
Ex ante customer gag orders may be illegal. In the early 2000s, the New York<br />
Attorney General challenged software manufacturer Network Associates’ end<br />
user license agreement, which said the “customer will not publish reviews of<br />
this product without prior consent from Network Associates, Inc.” In<br />
response, the New York Supreme Court enjoined Network Associates from<br />
restricting user reviews in its end user license agreement. 36 Medical Justice’s<br />
scheme may be equally legally problematic.<br />
From a policy standpoint, ex ante customer gag orders pose serious threats to<br />
the invisible hand. If they work as intended, they starve reputation systems of
the public information necessary to facilitate the marketplace. Therefore,<br />
31 See Eric Goldman, Online Word of Mouth and Its Implications for Trademark Law, in TRADEMARK<br />
LAW AND THEORY: A HANDBOOK OF CONTEMPORARY RESEARCH 404 (Graeme B.<br />
Dinwoodie and Mark D. Janis eds.) (2008) (discussing lopsided databases where all negative
reviews are removed, leaving only positive reviews).<br />
32 Stefanie Olsen, No Dogs, Yelpers Allowed, CNET NEWS.COM, Aug. 14, 2007,<br />
http://news.cnet.com/8301-10784_3-9759933-7.html.<br />
33 Lindsey Tanner, Doctors Seek Gag Orders to Stop Patients’ Online Reviews, ASSOCIATED PRESS,<br />
Mar. 3, 2009, http://www.usatoday.com/news/health/2009-03-05-doctorreviews_N.htm.<br />
34 Michael E. Carbine, Physicians Use Copyright Infringement Threat to Block Patient Ratings on the Web,<br />
AIS’S HEALTH BUSINESS DAILY, Mar. 30, 2009,<br />
http://www.aishealth.com/Bnow/hbd033009.html.<br />
35 17 U.S.C. § 512(c)(3).<br />
36 People v. Network Associates, Inc., 758 N.Y.S.2d 466 (N.Y. Sup. Ct. 2003).
regulatory efforts might be required to prevent ex ante customer gag orders<br />
from wreaking havoc on marketplace mechanisms.<br />
Distorted Decision-Making<br />
from Reputational Information<br />
Reputational information generally improves decision-making, but not always.<br />
Most obviously, reputational information relies on the accuracy of past
information in predicting future behavior, but this predictive power is not<br />
perfect.<br />
First, marketplace actors are constantly changing and evolving, so past behavior
may not predict future performance. For example, a person with historically<br />
bad credit may obtain a well-paying job that puts him or her on good financial<br />
footing. Or, in the corporate world, a business may be sold to a new owner<br />
with different management practices. In these situations, the predictive<br />
accuracy of past information is reduced. 37
Second, some past behavior may be so distracting that information consumers might overlook other information that has more accurate predictive power. For example, a past crime or bankruptcy can overwhelm the predictive information
in an otherwise-unblemished track record of good performance.<br />
Ultimately, a consumer of information must make smart choices about what<br />
information to consult and how much predictive weight to assign to that
information. Perhaps regulation can improve the marketplace’s operation by<br />
shaping the information that consumers consider. For example, if some<br />
information is so highly prejudicial that it is likely to distort consumer decision-making, the marketplace might work better if we suppress that information
from the decision-maker. 38<br />
At the same time, taking useful information out of the marketplace could create<br />
its own adverse distortions of the invisible hand. Therefore, we should tread<br />
cautiously in suppressing certain categories of information.<br />
37 Cf. Note, Badwill, 116 HARV. L. REV. 1845 (2003) (describing how companies can mask a
track record of bad performance through corporate renaming).<br />
38 Cf. FED. R. EVID. 403 (“Although relevant, evidence may be excluded if its probative value is<br />
substantially outweighed by the danger of unfair prejudice, confusion of the issues, or<br />
misleading the jury…”). This fear underlies a French proposal to enact a “right to forget”<br />
statute. See David Reid, France Ponders Right-to-Forget Law, BBC CLICK, Jan. 8, 2010,<br />
http://news.bbc.co.uk/2/hi/programmes/click_online/8447742.stm.
Conclusion<br />
Although “reputation” has been extensively studied in a variety of social science disciplines, comparatively little attention has been paid to how regulation affects the flow of reputational information in our economy. Understanding these dynamics would be especially valuable in light of the proliferation of Internet-mediated reputation systems and the irresistible temptation to regulate novel and innovative reputation systems based on emotion rather than sound policy considerations.
CHAPTER 5<br />
WHO WILL GOVERN THE NET IN 2020?<br />
Imagining the Future of Global Internet Governance 307<br />
Milton Mueller<br />
Democracy in Cyberspace: Self-Governing Netizens &<br />
a New, Global Form of Civic Virtue, Online 315<br />
David R. Johnson<br />
Who’s Who in Internet Politics: A Taxonomy of<br />
Information Technology Policy & Politics 327<br />
Robert D. Atkinson
Imagining the Future of<br />
Global Internet Governance<br />
By Milton Mueller *<br />
When discussing who (or what) will govern the Internet in 2020, people tend to<br />
want predictions. They want authoritative statements from experts. They want<br />
you to tell them what will happen. But an honest scholar of Internet governance<br />
would never attempt to meet that demand. The problem is not just that the<br />
future of Internet governance is uncertain, subject to the influence of many<br />
complex variables. The problem is that its future is, literally, indeterminate.<br />
While it is correct that there is an ongoing struggle over the governance of the<br />
Internet, we cannot know how it will come out.<br />
Forget about predictions and forecasts. It’s better to have a clear conception of how we want the Internet to be governed. This means that we need to be able to imagine feasible futures and to create strategies to realize them.
Let’s step back. Why is Internet governance an interesting problem in the first<br />
place? Why does contemplating the Internet’s future require imagination and<br />
creativity? Because there is a tension, even a contradiction, between the existing<br />
institutions for regulating communications and information, and the technical<br />
capabilities and processes of open internetworking. Existing institutions are<br />
organized around territorial, hierarchical nation-states; the process of<br />
internetworking, on the other hand, provides globalized and distributed<br />
interoperation amongst all the elements of an increasingly powerful and
ubiquitous system of digital devices and networks.<br />
This technical capability puts pressure on the nation-state in five distinct ways.<br />
1. It globalizes the scope of communication. Its distance-insensitive cost<br />
structure and non-territorial addressing and routing architecture make<br />
borderless communication the default; any attempt to impose a<br />
jurisdictional overlay on its use requires additional (costly)<br />
interventions.<br />
2. It facilitates a quantum jump in the scale of communication. It massively
enlarges our capacity for message generation, duplication, and storage.<br />
As a programmable environment, it industrializes information services,<br />
information collection, and information retrieval. The sheer volume of<br />
* Milton Mueller teaches and does research on the political economy of communication and<br />
information. His new book NETWORKS AND STATES: THE GLOBAL POLITICS OF<br />
INTERNET GOVERNANCE (MIT Press, 2010) provides a comprehensive overview of the<br />
political and economic drivers of a new global politics.
transactions and content on the Internet often overwhelms the capacity<br />
of traditional governmental processes to respond—but that same<br />
scalability can transform governmental processes as well.
3. It distributes control. Combined with liberalization of the<br />
telecommunications sector, the Internet protocols have decentralized<br />
and distributed participation in and authority over networking and<br />
ensured that the decision-making units over network operations are not<br />
necessarily closely aligned with political units, as they were in the days
of post, telephone and telegraph monopolies.<br />
4. It grows new institutions. Decision-making authority over standards and<br />
critical Internet resources rests in the hands of a transnational network<br />
of actors that emerged organically alongside the Internet, outside of the<br />
nation-state system. These relatively young but maturing institutions,<br />
such as the Internet Engineering Task Force (IETF), the Internet
Corporation for Assigned Names and Numbers (ICANN), and<br />
Regional Internet Address Registries (RIRs) provide a new locus of<br />
authority for key decisions about standards and critical resources.<br />
5. It changes the polity. By converging different media forms and facilitating<br />
fully interactive communication, the Internet dramatically alters the cost<br />
and capabilities of group action. As a result, radically new forms of<br />
collaboration, discourse, and organization are emerging. This makes it<br />
possible to mobilize new transnational policy networks and enables<br />
new forms of governance as a solution to some of the problems of
Internet governance itself.<br />
Transnational scope, boundless scale, distributed control, new institutions and<br />
radical changes in collective action capabilities—these factors are transforming<br />
national control and sovereignty over communication and information policy,<br />
setting in motion new institutional forms and new kinds of geopolitical<br />
competition. The governance of global Internetworking is thus a relatively new<br />
problem created by socio-technical change. The future of Internet governance<br />
will be driven by the clash between its raw technical potential and the desire of various incumbent interests—most notably nation-states—to assert control
over that potential.<br />
While the Internet poses novel governance problems, how we solve them<br />
cannot be predicted. It depends vitally on our ability to accurately diagnose the<br />
economic, technical and political forces at work and on our ability to imagine<br />
strategies, mechanisms and techniques that can harness those forces to do what<br />
we want to do. Thus, to repeat, it is better to invest our mental resources in<br />
conceptualizing and enacting feasible visions of how we want the Internet to be
governed than it is to invest in making deterministic predictions.
The Pace of Change<br />
2020 is not very far away. Ten years is the blink of an eye when it comes to<br />
institutional development at the global level. Proof of this can be found by<br />
glancing back ten years from our current vantage point of 2010. Despite the<br />
Internet’s reputation for rapid change, the basic issues and problems of Internet governance have not changed much since 2000. Yes, there has been turbulence in the market economy, with specific firms rising and falling. But the cast of institutional characters that regulate or govern Internetworking was already well in place by 2000. The organically developed Internet institutions such as the
IETF, the Internet Society, and the Internet address registries 1 already existed.<br />
On the other hand, the U.S. government and its rival nation-states were<br />
entering the scene. ICANN, with its unilateral control by the U.S. government,<br />
had already emerged as the uncomfortable compromise between the a-national
“Internet community” and the community of states. The seeds of the tensions<br />
among the U.S., the E.U. and the BRIC 2 nations caused by U.S. pre-eminence<br />
in that regime were already sown. There was already theoretical talk of cyberwar (though this has picked up dramatically in the last few years). There were already efforts to block and filter Internet content, though these have become increasingly refined. Peer-to-peer file sharing was already beginning to drive
copyright holders mad (Napster launched in 1999). Whatever change has taken place since 2000 has been evolutionary rather than revolutionary.
Disruption or Continuity?<br />
It is possible to identify some aspects of the current Internet governance regime
that could disrupt the existing evolutionary trajectory. I divide them into two<br />
distinct categories: the geopolitical and the techno-economic.<br />
Geopolitical Factors<br />
The Root: One of the most important geopolitical factors is still U.S. control of<br />
the root of the name and numbering hierarchies. This control is bound up with<br />
the issue of the singularity of those roots and the universal interoperability of<br />
the Internet. Here the U.S. is pre-eminent, and along with that pre-eminence<br />
come forms of responsibility and danger. A policy misstep or mistake can<br />
disrupt the status quo. Will the U.S. finally fully privatize ICANN, or will it<br />
“internationalize” it? Will the Internet Assigned Numbers Authority (IANA)<br />
contract 3 be competitively bid, or routinely reassigned to ICANN, keeping it
1 Users are assigned IP addresses by Internet service providers (ISPs), who usually obtain
larger blocks of IP addresses from a Regional Internet Registry (RIR).<br />
2 Brazil, Russia, India, and China<br />
3 The IANA contract is a contract between ICANN and the U.S. Commerce Department that authorizes ICANN to perform what is commonly referred to as the IANA function, a bundle of technical operations that includes the registration and allocation of IP addresses, management of the Internet root server system, and maintenance of the authoritative root zone file for the domain name system.
subordinate to the United States government? The governance issues related to<br />
the root and control/coordination hierarchies have intensified as the technical community (usually funded by U.S. government contracts) has moved to
“secure” the Internet by making access to and use of critical identifier resources<br />
reliant upon cryptographic key hierarchies. The harder the U.S. government<br />
tries to rigidify its existing forms of institutional control over Internet resources,<br />
the more likely it becomes that the Internet will fragment.<br />
Cyber-Warfare: Military conflict is always a potent source of institutional<br />
disruption. Geopolitically, the U.S. is pursuing a dangerously contradictory<br />
agenda. On the one hand it insists on retaining pre-eminent control of Internet<br />
standards, protocols and virtual resources and maximizing the dependence of<br />
the rest of the world on them. On the other hand it wants to treat cyberspace as a “national asset” and develop an overwhelming cyber-warfare and cyber-weapons capability based on those very same standards, protocols and resources. But insofar as cyberspace is militarized, its status as a globalized platform for information and communication among business and civil society is undermined. Contradictions abound here, and as they play out, the chances of a structural change increase.
Free Trade in Information Services: The U.S. approach to Internet freedom<br />
is driven as much by economics as by ideology and ethics. Due to its liberal policies, the U.S. leads the world in the supply of Internet-based information services. Of course the rest of the world will gradually catch up, but the natural state of Internet-based information services is to be transnational, accessible anywhere in the world, and so suppliers who would challenge the Googles and Facebooks must be transnational as well. The contradiction between the open Internet and various forms of trade protectionism in the content industries—including the cultural protectionism that is often disguised as support for
“diversity”—could be a key driver of Internet governance. Advocates of civil<br />
liberties and communication rights need to forge common ground with<br />
advocates of free trade and market liberalism for anything important to happen<br />
here.<br />
Techno-Economic Factors<br />
Unlicensed Wireless Broadband: A great deal of the consolidation of control<br />
over the Internet is contingent upon the access bottleneck. The fewer market<br />
players in the Internet service provider space, the easier it becomes for states
and state-favored monopolies to blunt and channel the potential of information<br />
and communication technology. Thus, new access technologies like unlicensed<br />
wireless broadband become critical factors shaping the future. If they can take<br />
root and disrupt current market structures around the supply of Internet access,
the arrangements for governance and control will need to be reconsidered. The greater the choice of access arrangements, the less feasible it becomes for governments to impose onerous regulatory arrangements upon consumers
through intermediaries.<br />
DPI and Net Neutrality: A key characteristic of the Internet so far has been
the “end-to-end” principle, which put the processing intelligence for<br />
applications and services at the end points and made the network a relatively<br />
simple packet-forwarding system. Deep packet inspection (DPI) is a new<br />
technological capability that could lead to a wholesale departure from that<br />
principle. Developed in response to legitimate concerns about efficient<br />
bandwidth management and the detection and interception of malware, it<br />
increases the awareness and control of the network intermediary over the traffic
coursing through its system. This is a fateful shift of control. Needless to say,<br />
there are demands to extend its capabilities to less technical forms of<br />
intervention, such as censorship, copyright protection or national security-oriented surveillance of communications. At the same time, concerns about
privacy, network neutrality, and competition policy have put legal and regulatory<br />
checks upon the usage of DPI. This is an arena that goes to the heart of<br />
Internet governance in the future.<br />
Two Visions<br />
Two visions of possible futures should help to illustrate how these themes might<br />
play out, but more importantly, how I think they ought and ought not to play out.
The Dark Vision<br />
Picture a world in a long-term global recession, one that lasts the better part of the decade we are discussing. There is growing conservatism—by which I mean increasingly nationalist and ethnocentric attitudes, a growing impatience with,
and rejection of, the rigors of market liberalism, and a greater willingness to<br />
trade freedom and innovation for security and stability. In such a scenario of<br />
recession-driven reaction, trade barriers rise. Hostility to immigration and<br />
“offshoring” grows. Internet-based communications become increasingly
confined to national spaces. There is blocking and filtering of content at the<br />
national level; the full linkage of online identity to national identity; the licensing<br />
of content, application and service providers at the national level; the<br />
subordination of information flows to the surveillance needs of national<br />
governments. Infr<strong>as</strong>tructure providers stop expanding and rely on national<br />
broadband subsidy plans. As this happens, the major U.S. Internet/media<br />
corporations succeed in minimizing competition and maximizing rent-seeking in an increasingly mature, stable market. With the number of players winnowed down, these corporations will make disastrous concessions to governments seeking to extend their authority over cyberspace in areas such as online identity
and identification, online surveillance, security practices, protectionist standards<br />
and content regulation. Some variant of a Google-Verizon merger spawns the<br />
AT&T of the 21st century, a dominant private sector entity with its own
commercial interests, but one whose markets and fortunes follow the flag of<br />
U.S. policy worldwide. Reacting to its quasi-sponsorship by the State
Department, other countries erect barriers. In this context, with national<br />
security and a cyber-version of the military-industrial complex becoming the
main driver of international policy, the U.S. government eventually participates<br />
in the strangling of its own progeny.<br />
As the U.S. develops an overwhelming cyber-warfare and cyber-weapons<br />
capability, the rest of the world revolts. The U.S. provokes a cyber-cold war, or<br />
perhaps even a short “hot war” with Russia and China, and uses it to rationalize<br />
and extend many of the controls. The European Commission—but not<br />
European civil society—will side with the U.S., effectively paralyzing and<br />
subordinating Europe’s ability to contribute anything constructive, much less<br />
innovative, to the Internet governance debates. The Internet world fragments<br />
on linguistic grounds, with the English-speaking or English-dominant world<br />
drifting away from the Chinese, Korean, Russian and Japanese societies.<br />
The Bright Vision<br />
It’s easy enough to describe that scenario because it seems to be the road we are already on. It is much harder to imagine a better future, one that is both feasible and consistent with the interests and capabilities of current actors. But let’s give it a try. In another work, I’ve tried to describe the basic nature of what I call a denationalized liberalism as the guide to the future of Internet governance. 4
At its core, a denationalized liberalism favors a universal right to receive and<br />
impart information regardless of frontiers, and sees freedom to communicate<br />
and exchange information as fundamental, primary elements of human choice
and political and social activity. Political institutions should seek to build upon,<br />
not undermine or reverse, the limitless possibilities for forming new social<br />
aggregations around digital communications. In line with its commitment to<br />
freedom, this ideology holds a presumption in favor of networked, associative relations over hierarchical relations as a mode of transnational governance.
Governance should emerge primarily as a byproduct of many unilateral and
bilateral decisions by its members to exchange or negotiate with other members<br />
(or to refuse to do so). This networked liberalism thus moves decisively away<br />
from the dangerous, conflict-prone tendency to build political institutions<br />
around linguistic, religious, and ethnic communities. Instead of rigid, bounded<br />
communities that conceal domination with the pretense of homogeneity and a<br />
4 MILTON MUELLER, NETWORKS AND STATES: THE GLOBAL POLITICS OF INTERNET<br />
GOVERNANCE (MIT Press, 2010).
“collective will,” this liberalism offers governance of communication and<br />
information through more flexible and shifting social aggregations.<br />
Although committed to globalism in the communicative sector, networked<br />
liberalism recognizes that, for the time being, people are deeply situated within<br />
national laws and institutions regarding such basic matters as contracts,
property, crime, education, and welfare. It is characterized not by absolute<br />
hostility to national and subnational governments as such, but rather by an
attempt to contain them to those domains of law and policy suited to localized or<br />
territorialized authority. It seeks to detach the transnational operations of<br />
Internet infrastructure and the governance of services and content from those
limited jurisdictions as much as possible, and to prevent states from ensnaring
global communications in interstate rivalries and politico-military games. This<br />
requires a complete detachment of Internet governance institutions from<br />
nation-state institutions, and the creation of new, direct accountability<br />
relationships for Internet institutions.<br />
Such an ideology needs to answer tough questions about when hierarchical<br />
exercises of power are justified and through which instruments they are<br />
exercised. A realistic denationalized liberalism recognizes that emergent forms<br />
of control will arise from globally networked communities. It recognizes that<br />
authoritative interventions will be needed to secure basic rights against coercive
attacks, and that network externalities or bottlenecks over essential facilities may<br />
create a concentrated power with coercive effect. It should also recognize the<br />
exceptional cases where the governance of shared resources requires binding
collective action. Insofar as collective governance is necessary and unavoidable,
a denationalized liberalism strives to make Internet users and suppliers an<br />
autonomous, global polity, with what might be called neodemocratic rights to<br />
representation and participation in these new global governance institutions.<br />
The concept of democracy is qualified by the realization that the specific form<br />
of democratic governance associated with the territorial nation-state cannot and
should not be directly translated into the global level. However, it does maintain<br />
the basic objectives of traditional democracy: to give all individuals the same
formal rights and representational status within the institutions that govern<br />
them so that they can preserve and protect their rights <strong>as</strong> individuals. Such a<br />
liberalism is not interested, however, in using global governance institutions to<br />
redistribute wealth. That would require an overarching hierarchical power that<br />
would be almost impossible to control democratically; its mere existence would<br />
trigger organized political competition for its levers, which would, in the current<br />
historical context, devolve into competition among preexisting political and<br />
ethnic collectivities—the very opposite of networked liberalism.<br />
In short, we need to find ways to translate cl<strong>as</strong>sical liberal rights and freedoms<br />
into a governance framework suitable for the global Internet. There can be no
cyber-liberty without a political movement to define, defend, and institutionalize<br />
individual rights and freedoms on a transnational scale.
Democracy in Cyberspace: Self-<br />
Governing Netizens & a New, Global<br />
Form of Civic Virtue, Online<br />
By David R. Johnson *<br />
The Internet can be viewed as a set of wires, wireless “pipes” and servers, a set of protocols, or as a vast array of content and applications to which these lower
layers of the stack provide access. None of those tangible and intangible things<br />
can be “governed.” They may or may not be owned or manipulated. But it is<br />
the actions of the people involved in creating and using the Internet that are the<br />
proper subject of a question regarding “governance.” Viewed with respect to<br />
the social and legal relationships among the people who are creating and using<br />
it, and who would be “governed,” the Internet is a complex system—so making<br />
accurate predictions about its future state is impossible.<br />
But it is possible to answer the question: “Who could and who should govern the Internet in 2020?” My answer is, in a word: netizens—the global polity of those who collaborate online, seek to use the new affordances of the Internet to improve the world, and care about protecting an Internet architecture that facilitates new forms of civic virtue.
The Internet Governance Debate
The debate about “Internet Governance” has continued for more than fifteen years and settled into an unsatisfying rut. The established trope is that early visionaries (e.g., John Perry Barlow 1) claimed that cyberspace was a new realm of freedom, poised to escape from regulation by local governments. Then, later “realists” (e.g., Jack Goldsmith and Tim Wu 2) discovered that sovereign governments indeed had ways to regulate online speech and even use the Internet for surveillance and tyranny. Early idealists envisioned the Internet Corporation for Assigned Names and Numbers (charged with setting policy for the domain name system and allocating blocks of IP numbers) 3 as a new democratic global institution constrained by consensus reached in a global community. Later, we observe an expensive bureaucracy imposing complex regulations—as one wag has put it: “recapitulating the FCC and doing it badly.” 4 Internet Governance as an empowering and liberating democratic ideal is a total failure—or so it would seem.

* David Johnson joined New York Law School’s faculty in spring 2004 as a visiting professor of law. He is a faculty member of the Institute for Information Law and Policy.
1 See John Perry Barlow, A Declaration of the Independence of Cyberspace (Feb. 8, 1996), https://projects.eff.org/~barlow/Declaration-Final.html [hereinafter A Declaration of the Independence of Cyberspace]; John Perry Barlow, Declaring Independence, 4.06 WIRED 121-22 (June 1996), available at http://www.wired.com/wired/archive/4.06/independence.html.
2 See JACK GOLDSMITH & TIM WU, WHO CONTROLS THE INTERNET?: ILLUSIONS OF A BORDERLESS WORLD (2006) [hereinafter WHO CONTROLS THE INTERNET?].
This debate has missed a fundamental point by asking the wrong question. The key question is not “Who will govern the Internet?” Instead, it is “Will the global Internet affect the way in which we (the global polity) govern ourselves?” One way or another, we act through governments, NGOs, private corporations and many other types of groups to collectively set the rules by which we live our lives. We all strive for a world in which our own choices determine how we are governed. So the more salient way to put the real question at issue is: “Can the Internet make society more democratic?”
The Vision of the Internet’s Founders
The founders of the Internet (technologists, advocates, policy makers, and visionaries) saw its democratic potential. They were not (mostly) seeking to create a “lawless frontier.” They were instead seizing a moment of flexibility during which new modes of association for community improvement might flourish. They opposed rigid regulation and unaccountable power, of course. But they also favored collaborative decision-making to establish rules, group effort to write empowering and constraining code, and respectful deliberation to forge new norms. They favored civility and civic virtue (good netizenship) as much as individual liberty.
We’re not just talking about the founding technologists. Many individuals, nonprofit organizations and companies came together, in the late 1980s and throughout the 1990s, to create a technology and policy framework to enable a democratic Internet: open access, limitations on intermediaries’ powers and liabilities, privacy protections for communications, limitations on centralized levers of power (like the control over the domain name system) and, at least in the United States, establishment of strong First Amendment rights (in contrast to regulation of the Internet as a form of mass media). To a considerable degree, from that perspective, the Internet’s founders have, up to this point, succeeded.

3 ICANN is generally accepted as the authority for decisions about what top level domains will be placed in the “root zone file.” By establishing contractual conditions in connection with such additions, it can establish rules that flow down onto registries, registrars and registrants. For example, this capability has been used to require registrants in generic Top Level Domains to submit to a “Uniform Dispute Resolution Policy” that decides and takes action on disputes about “cybersquatting.”
4 Harold Feld, quoted in Jonathan Weinberg, ICANN, “Internet Stability,” and New Top Level Domains 1 n.1, http://faculty.law.wayne.edu/Weinberg/icannetc.pdf.
I want to re-emphasize that, while the Internet relies on wires and protocols, it most fundamentally consists of connections among people. Local governments may control who has the right to provide access. Standard-setting bodies may have some say in what protocols become widely adopted. Law plays a role—and the law of local governments does constrain the actions of people over whom they can assert jurisdiction. But the “governance” of the Internet is fundamentally a question about how we all constrain the manner in which we do whatever it is we do in groups online, including establishing new structures of society, new forms of social organization, and new roles and rules that incentivize our efforts and focus our minds. No government could even hope to make the rule set for a global web of relationships involving billions of people interacting in complex and ever evolving ways. The governance of the social layer of the Internet will be, perforce, decentralized.
The Democratic Nature of the Internet
Thus, contrary to Larry Lessig’s suggestion in Code, 5 I submit that the Internet of today has a nature: It is inherently democratic. Not inevitably so, in the sense that any global communications network would necessarily be democratic. And not necessarily so in the future. But historically so and by design—in the sense that this Internet, the one we have and the one that scaled globally in no time, was successful precisely because it was open, decentralized, tolerant of innovation and disagreement, voluntary, and empowering of anyone who cared to use it to join with others to improve the world. Every time we address an email, or establish a blog, or “agree” to some “terms of service,” we are creating the rules for our online society.
From Wikipedia to PatientsLikeMe, 6 we are continuously learning how to use the Internet to come together to share knowledge, improve education, solve health care problems, and provide charitable assistance to those in need around the world. We use the Internet to participate in local politics and explore new ways to make global society energy efficient. In these and countless other ways, the Internet is an engine of democratic civic virtue.

5 See LAWRENCE LESSIG, CODE: AND OTHER LAWS OF CYBERSPACE, VERSION 2.0 (Basic Books 2006).
6 For example, PatientsLikeMe allows users to create profiles and then share, interact, and learn from the experience of other users on health and wellness issues.
Self-Governance Online
As noted, and sometimes in defiance of repressive regulations, we choose many of the rules under which these collaborations occur simply by logging in to one website or platform rather than another. We can rise up together in protest when a site like Facebook changes its “terms of service” and “privacy policy” in ways we don’t like. And we increasingly accept obligations to spend our attention and effort in support of groups we find online with whom we share a persistent purpose or goal.
Even an email listserv involves social duties (not always honored, of course) to fellow participants. Open source efforts have evolved complex, hierarchical, yet open and democratic (or meritocratic) self-governance structures. We are beginning to understand that, at least for those things that can happen online, the choices we make about which groups to join have as much impact on setting the terms of our relationships with others as any governmental laws or regulations.
Goldsmith and Wu and other “anti-exceptionalists” have cited the Yahoo! France case, Australia’s imposition of its libel laws on a New Jersey-based publisher, and Italy’s conviction of Google executives for the proposition that the Net cannot escape from the “real world” governance of sovereign states. 7 Their examples, rather than making a case for a bordered Internet, in fact prove the opposite: A world in which every local sovereign seeks to control the activities of netizens beyond its borders violates the true meaning of self-governance and democratic sovereignty.
Attention Governance & Global Civic Virtue
Even though governments still have the guns and have not yet uniformly agreed to defer to self-governing online groups, “We the Netizens” are still mostly in control of what happens online. The everyday actions of millions of bloggers and tweeters (and re-tweeters) and senders of emails and instant messages draw the attention of the entire world to new and interesting (though not always important) developments every day. The distributed collaboration arising from ratings, rankings and reviews discloses and shames, or shines a flattering light on, every action of every author, seller, politician, organization and anyone else who wields any form of power. We are learning to use the Internet to engage in a decentralized form of democracy that might be called “attention governance.”
Democracy is about decentralization and equalization of power—particularly the power to influence the rules under which we live our lives together. This requires the absence of centralized, unaccountable power—tyranny—whether exercised by government or corporations. It also requires that individuals participate in collective action and adopt a frame of mind that asks how to improve society rather than only how best to achieve their own private goals. That frame of mind is called “civic virtue.” It creates a feedback loop—showing us all ourselves in a mirror, thereby enhancing our ethical standards for both individual actions and the actions we take together in organizations and groups (including via the global corporations in which we invest, serve as employees or participate as customers). Perhaps the single most powerful contribution to the sovereignty of the people made by the Internet is its ability to direct our collective attention. At one time, mass media held that power. Now we all do, in potentially equal measure. While online anonymity may (sometimes unfortunately) give us the power to act without disclosing our identity, the Internet simultaneously makes it virtually impossible for groups of people to act together without being confronted with the consequences of their actions and the views of others regarding the moral and social value (or lack thereof) of those consequences.

7 WHO CONTROLS THE INTERNET?, supra note 2.
Democracy is not just about how we organize political campaigns or governmental institutions. It is about how we self-organize all aspects of society—and what we can do in collaboration with others to improve the world. It is not just about freedom from arbitrary control by government (or corporate tyrants)—it is, rather, about the myriad ways in which we come together to construct society. The Internet has had a profound impact on political campaigns by making it much easier for individuals to contribute small amounts and get involved in local activities. And netizens could and should use their newfound collective voice to instruct their local governments to protect the Internet and its new freedoms, rather than using it to restrict our freedom. But the Internet has done even more to decentralize decision-making—about how we spend our attention and effort and how we organize our collaborative efforts. Today, anyone can form a purposeful group online on Facebook, Yahoo! Groups, Google Groups, Twitter, etc. and attract adherents to a cause or even start a new organization. When we can more easily act together, we become more powerful and more free—we enjoy what Tocqueville would have called “a new equality of condition.” 8
The interesting thing about attention is that, as long as you are awake, you have to spend it! It is a non-renewable resource. You can fritter it away on the entertainments of television or tweeting. You can give it away to an employer whose goals and ethics don’t match your own. Or, after the basic necessities of life have been arranged for, you can apply a higher standard. Through the Internet, you can join with others in groups that try to make the world a better place, and talk together about what that means. 9

8 ALEXIS DE TOCQUEVILLE, DEMOCRACY IN AMERICA (1835).
Good Netizenship
Governing the Internet well fundamentally entails governing ourselves—making sure that more of our time, attention and effort is spent in roles that are defined in relation to social organizations and purposive groups that make society more productive, congruent, ethical, and, yes, interesting, complex and empowering for everyone. It involves defending the new civic religion of the Internet, including preservation of individual choice and deference to the self-governance of online groups. It is no light duty to be a good netizen.
Goldsmith and Wu have shown that open communications via the Internet can indeed be shut down by governments. 10 But they don’t say anything about how to preserve and enhance democracy in the context of a global Internet. They postulate the extension of local and state power onto people who have no opportunity to participate in making these policies. But they fail to take account of the possibility that once We the Netizens understand the threat, we could refuse to allow that to happen.
How could we netizens prevent the tyranny of local governments or of corporate intermediaries with a new kind of power generated by network effects? It may be a struggle at times. But we use our minds online—we direct our attention and support to groups we value. It takes a very seriously repressive governmental regime to regulate minds rather than behavior. And not even governments, much less corporations, can stand forever in opposition to what large segments of the people they regulate think—especially when they are thinking, and talking, together. As Victor Hugo famously remarked, “One resists the invasion of armies; one simply cannot resist the invasion of ideas.” 11
Even those who purport to want to preserve civil civic dialogue and self-governance by the people sometimes suggest that the Internet has broken our existing (U.S.) democratic institutions—polarizing political factions, eliminating the possibility of political compromise, fostering hate speech and inciting the mob. 12 Even those who favor civic virtue suggest that the Internet has led to a retreat into individualism, mindless narcissism, pornography, intellectual distraction and worse. 13

9 As Tocqueville observed about America in the 1830s: “In their political associations the Americans, of all conditions, minds, and ages, daily acquire a general taste for association and grow accustomed to the use of it. There they meet together in large numbers, they converse, they listen to one another, and they are mutually stimulated to all sorts of undertakings. They afterwards transfer to civil life the notions they have thus acquired and make them subservient to a thousand purposes. Thus it is by the enjoyment of a dangerous freedom that the Americans learn the art of rendering the dangers of freedom less formidable.” Id. (emphasis added).
10 WHO CONTROLS THE INTERNET?, supra note 2.
11 VICTOR HUGO, THE HISTORY OF A CRIME (1877).
The democratic potential of the Internet is under threat—as democracy always is. Some local sovereigns attempt to limit its freedoms. People lacking the requisite civic virtue may lapse into self-indulgent individualism. We might all decide to live under the benevolent dictatorship of search engines and online app stores that make our lives a little more convenient and secure. We might all decide it is too much trouble to help our fellow netizens in foreign countries who are fighting repressive local governments. Large corporations with monopolies born of network effects might gain enough power to become a new aristocracy, purporting to benefit the people, but ruling as they please and giving priority to profit. Bad actors might turn this new communications medium into a social nightmare. So any theory of the democratic potential (and actual achievement) of the Internet must include a view regarding how we can all use the Internet itself to preserve and enhance gains achieved up till now in popular sovereignty and civic collaboration.
The Trajectory of Freedom
I have such a theory—one derived from reflecting on Tocqueville’s views regarding the new democracy that he discovered in the America of 1830: The Internet establishes a new equality of condition and enables us to exercise liberty to form associations to pursue new civic, social, and cultural goals. Such a world can produce wonders simply because we become more powerful when we act together in groups. Moreover, to paraphrase Oliver Wendell Holmes, “Man’s mind, once stretched to a new democratic practice, never regains its original dimensions.” 14 The theory, then, is that having discovered and exercised new ways to improve the world, whatever they mean by “improve,” netizens will collaborate in myriad ways to protect their newfound powers.
The actual state of society may, of course, periodically regress. Some groups will adopt definitions of “improvement” that are so intrusive upon and unacceptable to other groups that governmental and corporate powers will be rightly invited in to constrain such non-congruent actions. (For example, almost everyone agrees that the Internet should not provide a safe haven for child pornography or terrorism and governmental powers will need to be used to address these problems.)

12 See, e.g., CASS SUNSTEIN, REPUBLIC.COM 2.0 (2007).
13 See, e.g., NICHOLAS CARR, THE SHALLOWS: WHAT THE INTERNET IS DOING TO OUR BRAINS (2010).
14 The original quotation is as follows: “Man’s mind, once stretched by a new idea, never regains its original dimensions.” Oliver Wendell Holmes, quoted in H. JACKSON BROWN, JR., A FATHER’S BOOK OF WISDOM (1989).
But the trajectory of freedom and even civic virtue has been, in broad terms, over time, constantly upward—because everyone who gets a chance to experience an increased level of democratic self-government—a new “equality of condition”—a new kind of power that comes from the ability to direct and control one’s own attention and combine one’s efforts with those of others—comes to share a desire to have a voice in shaping the world for the better (even when we don’t all agree on what “better” means). And everyone has now tasted an empowering opportunity to join with others, online, to do so.
Acting together, the founding netizens created a global network, and thus, inevitably, a global economy, society and politics. The visionary founders of the Internet did not seek to liberate selfish individualism, frontier justice based on force, or mere wilderness escape. They were civically virtuous themselves and foresaw the creation of great schools and libraries, social services, cultural venues, and everything else a prosperous and democratic global township might want, online. 15 Perhaps they were naïve, a bit too optimistic that everyone else shared their civility. Perhaps they assumed that most online groups would make rules and take actions designed to benefit those who were affected by those rules and actions. Confronted with criminals or tyrants, these Internet optimists would be (and are) as quick as anyone to call for a “rule of law.” But they can now envision a “law” consisting in part of globally applicable rule sets and globally accessible self-governing organizations that exist only because netizens have devoted their time, attention and effort to support or shun new online institutions.
Copyright law won’t disappear, but we now also have Creative Commons. Laws against spam will survive, but we also have software filters. Local content regulation will persist, but we now have proxy servers. Banks will be regulated, but online currencies can also flourish. Governments will still regulate and tax the shipment of physical goods, but most long ago gave up trying to establish custom houses at their virtual borders. Every netizen is still a citizen, subject to local regulation. But, increasingly, we can travel online to virtual places that have rules no local legislature would adopt. Land use in Second Life will not become a subject of any real world government’s zoning laws. Topic moderation in an online discussion group is not likely to become a matter of local regulation.

15 See, e.g., HOWARD RHEINGOLD, THE VIRTUAL COMMUNITY: HOMESTEADING ON THE ELECTRONIC FRONTIER (1993); FRED TURNER, FROM COUNTERCULTURE TO CYBERCULTURE: STEWART BRAND, THE WHOLE EARTH NETWORK, AND THE RISE OF DIGITAL UTOPIANISM (2006); KATIE HAFNER & MATTHEW LYON, WHERE WIZARDS STAY UP LATE: THE ORIGINS OF THE INTERNET (1996).
Online schools will establish their own rules for participation in the classes they offer.
Where we congregate online, most of the relevant rules will originate with our shared support of the manner in which the proprietor (the owner of the server, the writer of constraining code, a moderator) “governs” that online space. Governments will, in general, defer to online spaces that are mostly minding their own business, rather than inflicting harms on outsiders. They have enough to do in securing our physical safety. So congruent rule sets voluntarily “adopted” by willing “users” will become most of the applicable “law” of online life. That new law will be fundamentally democratic in character, not because we elected representatives to a global legislature, but because we all have new powers to decide where to go online and to persuade others to join us.
Because global scale and interconnection make it easier to find each other, valuable online groups can exist even if there are very few who share the group’s goals and interests. By the same token, online sites that no one visits lose social salience. An online source of destructive code or spam may still intrude upon our finite attention, but almost everyone agrees that we can and should get better at constraining the actions of those who use the Internet to inflict that kind of harm on others. ISPs provide centralized filters, but these depend in part on user actions to flag spam and malicious code. This same dependence on our collective attention happens to apply to governments and corporations, the very real “legal fictions” that we call into existence by means of a shared act of imagination. They thrive only insofar as we allocate our own time, effort and attention to facilitate their purposes. If we refuse to play along, governments and corporations are doomed to lose (legitimate) power and cannot ultimately impose their will on an unwilling global polity.
In short, there was never any possibility, or dream, that global online society would be a society without some amount of order. Even the most idealistic Internet Exceptionalists, like John Perry Barlow, never denied that there might be problems in Cyberspace that needed to be solved, only that “We in Cyberspace” would solve them by “forming our own Social Contract.” 16 The questions have always been: How much order? Where would it come from? And, if it came from the online community, how closely would its mechanisms approximate the democratic ideal of giving everyone at least a potentially equal say in what particular kind of order (rules, norms, incentives, roles) was to be established? Who gets to tell who else what they can and cannot do? Will the Internet preserve, or even enhance, the sovereignty of the people? Those were the key questions from the Internet’s very beginning.

16 See A Declaration of the Independence of Cyberspace, supra note 1.
Wu, Goldsmith and others triumphantly declare the death of Internet Exceptionalism based on the reality that governments seek to regulate the Internet—and succeed in doing so to a much greater extent than some Internet Exceptionalists might have imagined possible. But this does not mean the Internet does not present profoundly new opportunities for self-government—a new potential form of democracy. It is up to us to seize that opportunity. If we do so, the Internet will have proved itself as exceptional as its founders hoped.
Cosmopolitan Pluralism
Because people have many different values (ideas about the social good), the global online society will need to be pluralistic, cosmopolitan, and tolerant of diverse coexisting groups. As the Internet’s founding netizens urged, online society could and should be based on the moral norm that all groups should be “conservative in what they send, liberal in what they accept.” 17 Above all, they imagined (correctly, in my view) that such a global online society would become ever more complex—providing increasingly diverse roles for people to play while, simultaneously, preserving connections (causal and communicative) among all its parts and the whole—thereby creating vast new wealth (of all types) for all to share. All wealth ultimately comes from trade. And trade requires two people who value whatever they have to exchange differently—so that the bargain makes them both better off. 18 We need our differences. Democratic Internet governance can preserve them, by allowing us to tolerate diverse groups, rather than seeking to impose a single rule set on everyone.
If you think democracy is about voting, then the apogee of democratization is when everyone has an equal vote. That just doesn’t scale globally. We will “vote” online with our clicks, not with ballots. If you think democracy is about deliberation and discussion, then the ideal would seem to be a continuous global town meeting, with everyone getting an equal turn at the microphone. Please, spare us!
If, however, you think that democracy is about equalization of (potential) power to have an influence on how society is structured, on how we will improve the world, then you have to ask: What is it that we all have in equal measure, the deployment of which can shape our world and its rules? The answer is attention and effort. That is why attention governance is inherently democratic. That is how the Internet can make self-governance real in the context of a new global society of netizens.

17 The Internet Society, RFC 4677, The Tao of IETF: A Novice’s Guide to the Internet Engineering Task Force 6 (Sept. 2006) (quoting Jon Postel), available at http://tools.ietf.org/pdf/rfc4677.pdf.
18 See generally ERIC D. BEINHOCKER, THE ORIGIN OF WEALTH: EVOLUTION, COMPLEXITY, AND THE RADICAL REMAKING OF ECONOMICS (2006); DAVID WARSH, KNOWLEDGE AND THE WEALTH OF NATIONS: A STORY OF ECONOMIC DISCOVERY (2006).
Every social organization with power (including governments and corporations) depends critically on a collective act of imagination. We create our social institutions by going along together with the idea that they exist, adopting roles that constrain our time and attention in the service of their goals and investing our attention and effort in ways that further empower them. The Internet is, centrally, a way for us to deploy our attention and effort together, in a context in which we can see the resulting effects and correct or constrain social organizations that don’t share our values. It makes us all citizens (and, indeed, global netizens) in a new way. It increases our power and, by doing so, our responsibilities.
A New Sovereignty of the People

Internet governance will not be about voting, or complex governmental
regulations, or even treaties among states. It will be more mundane, more
pervasive, and more profoundly important than that. As long as some states
create havens for freedom (whether the First Amendment and Section 230 of
the Communications Decency Act 19 in the United States or the Icelandic
protection of WikiLeaks 20), netizens will find ways to exercise those freedoms.
Insofar as some states create havens for wrongdoers (think Nigerian phishers,
Russian botnets), governments, corporations and individuals will all respond, in
their own ways, to avoid or suppress such evils. It would be folly to suggest that
no organizations (and, therefore, people in organizational roles) will be more
powerful than others, or that no one will use power for evil so widely
condemned that it cannot and should not be “tolerated.” But no collective
action can persist online over the long term if it is not “tolerated” by those who
decide where to direct their attention, what products to buy, what Terms of
Service to accept, what jobs to take, what companies to invest in, and what local
politicians to send packing.
America was founded on the idea of sovereignty of the people. The global
Internet gives the people more tools to exercise that sovereignty and greater
visibility into when and how and where to do so. Good netizenship isn’t
effortless. Civic virtue requires, well, virtue. It is about responsibilities, not
rights. Whether the Internet will realize its democratic potential will ultimately
depend, of course, on the character of the people—the new global polity.

19 47 U.S.C. § 230 (providing liability immunity for providers and users of an “interactive
computer service” who publish third party information).

20 See Robert Mackey, Victory for WikiLeaks in Iceland’s Parliament, The Lede Blog, N.Y. TIMES, June
17, 2010, available at http://thelede.blogs.nytimes.com/2010/06/17/victory-for-wikileaks-in-icelands-parliament/.
326 CHAPTER 5: WHO WILL GOVERN THE NET IN 2020?
I’m optimistic about the prospects for enhanced self-governance by a global
polity, for one simple reason: Most of us, whatever our nationality, want to be
empowered. To communicate. To associate to make the world better for
ourselves, our children and everyone else. We’ve had a taste of the new
democracy of the Internet. We’ll never willingly, or for long, go back. We
certainly shouldn’t turn away from this new opportunity, the Internet founders’
shared dream of a more empowering, democratic, global society. If we
remember the democratic visions of the founders, and commit to continue to
act as good netizens, then we could and should all “govern the Internet,”
together, in 2020—and beyond.
Who’s Who in Internet Politics:
A Taxonomy of Information
Technology Policy & Politics

By Robert D. Atkinson *
Where’s the Internet in the United States going to be in a decade? Given the
important role of public policy in shaping a host of Internet issues, one way to
answer this question is to understand the political constellation that now shapes
U.S. and—to some extent—international Internet policy.

Debates have erupted over myriad information technology (IT) issues such as
copyright protection, privacy, open source software procurement, cybersecurity,
Internet taxation, media ownership, Internet governance, electronic voting,
broadband deployment and adoption, anti-trust, spectrum reform, net
neutrality, Internet censorship, and equality of access. These issues raise familiar
legal and political questions in some unfamiliar contexts, and have given rise to
a lively, increasingly shrill, and important digital politics. Today, interest groups
of all kinds, including a host of single-issue advocacy organizations, routinely
weigh in on a range of Internet and digital economy issues. Vexing policy
conundrums arise constantly, with each new business model and Internet
innovation creating a new wrinkle in the fabric of the debate.
How we resolve these issues will have important implications for what the
Internet of 2020 looks like. The debate over IT policy issues does not take
place in a vacuum or only in the corridors of Congress. From think tanks to
trade associations to single-issue advocacy groups, a proliferation of
organizations fights to shape digital policy debates. This essay is a field guide to
help the reader understand the politics of IT. 1 It describes the major groups of
players in the IT policy debate and discusses how they differ along two key
dimensions shaping policy: individual empowerment vs. societal benefit; and
laissez-faire vs. government regulation. It then uses four timely and important
policy cases (privacy, taxation, copyright protection, and net neutrality) to
illuminate how these politics play out today in the United States. While primarily
focused on American digital politics, this framework is not entirely unique to
the United States.

* Robert D. Atkinson is the founder and president of the Information Technology and
Innovation Foundation, a Washington, DC-based technology policy think tank. He is also
author of the State New Economy Index series and the book, THE PAST AND FUTURE OF
AMERICA’S ECONOMY: LONG WAVES OF INNOVATION THAT POWER CYCLES OF GROWTH
(Edward Elgar, 2005).

1 For other useful attempts at creating Internet policy typologies, see Adam Thierer & Berin
Szoka, Cyber-Libertarianism: The Case for Real Internet Freedom, THE TECHNOLOGY LIBERATION
FRONT, Aug. 12, 2009, http://techliberation.com/2009/08/12/cyber-libertarianism-the-case-for-real-internet-freedom/;
and Adam Thierer, Are You an Internet Optimist or
Pessimist? The Great Debate over Technology’s Impact on Society, THE TECHNOLOGY LIBERATION
FRONT, Jan. 31, 2010, http://techliberation.com/2010/01/31/are-you-an-internet-optimist-or-pessimist-the-great-debate-over-technology%E2%80%99s-impact-on-society.
The Major Players

The primary players in the IT policy debate fall into eight basic groups:
1. Internet Exceptionalists: These “Netizens” believe that they launched the
Internet revolution. Typified by groups such as the Free Software
Foundation and the Electronic Frontier Foundation, and dedicated readers
of Wired magazine, they believe “information wants to be free” 2 and that all
software should be open-source. They think technology itself can solve
many problems that it might create (if users are only smart enough to
program software to protect themselves), and that cyberspace should be
governed by the informally enforced social mores (i.e., “netiquette”) that
evolved among early users. Like John Perry Barlow in his 1996 Declaration
of Independence of Cyberspace, 3 they deplore both government
involvement in the Internet and its widespread commercialization. In their
view, anyone who suggests that society, through its legitimately elected
government leaders, might have a role to play in shaping the Internet,
including defending copyright, “just doesn’t get it.” Internet exceptionalists
believe the Internet should be governed by its users. Afraid your privacy is
being violated? Technologically empowered users are the best solution, as
they set their Web browser to reject cookies, use anonymizer tools and
encrypt their web traffic. Worried about the recording industry losing
money from Internet piracy? Encourage artists to find a new business
model, like selling T-shirts and putting on more concerts. Worried over
lackluster IT industry competitiveness in the U.S.? Don’t make waves;
government intervention generally makes things worse. After all, Silicon
Valley didn’t need Washington to get where it is.
2. Social Engineers: These liberals believe the Internet is empowering, but
they worry that its growth is having unintended and sometimes dire
consequences for society—whether they invoke the so-called “Digital
Divide” (between the “wired” and the “unwired”), the purported loss of
privacy, net neutrality, or concern that corporations are controlling the use
of digital content. They mistrust both government and corporations, the
latter especially—particularly large telecommunications companies and
Internet companies making money from the use of consumer data (to,
ironically, provide free content and services). A large array of groups and
individuals carry this mantle, including the Benton Foundation, Center for
Democracy and Technology (on some issues), Center for Digital
Democracy, Civil Rights Forum on Communication Policy, Consumer
Project on Technology, Electronic Privacy Information Center, Free Press,
Media Access Project, and Public Knowledge, and scholars such as
Columbia’s Tim Wu and most of those hanging their hats at Harvard’s
Berkman Center (among them, Larry Lessig and Yochai Benkler). Social
engineers tend to believe the Internet should serve mainly as an educational
and communications tool. They fear that its empowering capabilities will be
taken away by powerful multinational corporations and statist governments
that will reshape it to serve their own narrow purposes (either to steal our
privacy, limit our freedom on the Internet, spy on us, or all three). As such,
they minimize the role of IT as an economic engine, and focus more on the
impact of IT on social issues, such as privacy, community, access to
information and content, and civil liberties.

2 Stewart Brand, speaking at the first Hacker’s Conference, 1984. Roger Clarke, Information
Wants to be Free, http://www.rogerclarke.com/II/IWtbF.html.

3 John Perry Barlow, A Declaration of the Independence of Cyberspace, Feb. 8, 1996,
https://projects.eff.org/~barlow/Declaration-Final.html.
3. Free Marketers: This group views the digital revolution as the great third
wave of economic innovation in human history (after the agricultural and
industrial revolutions). IT reduces transaction costs and facilitates the
application of markets to many more areas of human activity. Free
marketers envision a dramatically reduced role for government as the
Internet empowers people, liberates entrepreneurs, and enables markets.
Influenced by groups such as the Cato Institute, the Mercatus Center, the
Pacific Research Institute, the Phoenix Center, The Progress & Freedom
Foundation, and the Technology Policy Institute, they consider the
emergence of the Internet as a vehicle for commerce (e.g., exchanging
goods, services, and information in the marketplace) and as a liberating and
progressive force. They are skeptical of the need for government
involvement, even government partnering with industry to more rapidly
digitize the economy.
4. Moderates: This group is staunchly and unabashedly pro-IT, seeing it as
this era’s driving force for both economic growth and social progress.
While they view the Internet as a unique development to which old rules
and laws may not apply, they believe appropriate guidelines must be
developed if it is to reach its full potential. Likewise, they argue that while
rules and regulations should not favor bricks-and-mortar companies (see
#8 below) over Internet ones, neither should they favor Internet companies
over bricks-and-mortars. Moreover, they argue that while government
should “do no harm” to limit IT innovations, it should also “actively do
good” by adopting policies to promote digital transformation in areas such
as broadband, the smart electric grid, health IT, intelligent transportation
systems, mobile payments, digital signatures, and others. Examples of
moderates include the Center for Advanced Studies in Science and
Technology Policy, the Center for Strategic and International Studies, the
Information Technology and Innovation Foundation (ITIF), and the
Stilwell Center.
5. Moral Conservatives: This group sees the Internet as a dangerous place, a
virtual den of iniquity, populated by pornographers, gamblers, child
molesters, terrorists, and other degenerates. Unlike the free marketers, the
moral conservatives have no qualms about enlisting governments to
regulate the Internet. They have been the driving force behind the
Communications Decency Act’s censorship restrictions and the Child Online
Protection Act (both deemed unconstitutional) and Internet filtering in
libraries, and have worked to push legislation to ban online gambling. They
have also joined forces with the liberal social engineers (Group #2) in pushing
for strong “net neutrality” regulations, fearing that Internet Service
Providers (ISPs) will somehow discriminate against Christians online. This
group argues that, because the Internet is a public space, some rules and
laws are necessary to govern anti-social behavior. They do not believe that
technology can solve all social problems—on the contrary, they believe that
the Internet is generally furthering the decline of culture. Yet, in some
instances they embrace the Internet as a tool, as evidenced by former
Secretary of Education William Bennett’s K-12 Internet-based home
schooling project. In general, moral conservatives don’t want individuals
empowered to engage in antisocial behavior, nor do they want corporations
to facilitate such behavior. Examples include groups like the Christian
Coalition and Focus on the Family, and, around the world, religiously
conservative countries like Indonesia, Thailand, and Saudi Arabia that seek
to limit activity on the Internet.
6. Old Economy Regulators: This group believes that there is nothing
inherently unique about the Internet and that it should be regulated in the
same way that government regulates everything else, including past
technologies. There is a certain sense of urgency among some elected
officials, government bureaucrats, and “public interest” advocates who
believe that cyberspace is in a state of near-anarchy—a haven for criminals,
con artists, and rapacious corporations. Exemplars of this group include
law enforcement officials seeking to limit use of encryption and other
innovative technologies, veterans of the telecom regulatory wars that
preceded the breakup of Ma Bell, legal analysts working for social
engineering think tanks, as well as government officials seeking to impose
restrictive regulatory frameworks on the broadband Internet. As far as old
economy regulators are concerned, the 1934 Communications Act (or
perhaps its 1996 update) answered all the questions that will ever arise
regarding the Internet. Moreover, European, Chinese and other old
economy regulators overseas fear that, absent more regulation, their nations
will be bypassed by the American Internet leviathan.
7. Tech Companies & Trade Associations: This group encompasses a
range of organizations, from the politically savvy hardware, software and
communications giants to Internet start-ups. These businesses, from old
stalwarts like IBM, AT&T, and Hewlett Packard to “teenagers” like Cisco
Systems and Microsoft, and “youngsters” like Google and Facebook,
understand that trade, tax, regulatory, and other public policy issues
increasingly affect their bottom line and competitive position. While the
players in this group (and in Bricks-and-Mortars) don’t have the same level
of ideological cohesion as the above groups, they share a certain set of
interests which justifies their grouping. They realize that getting one’s way
in politics takes more than being right: It requires playing the game and
making one’s case persuasively. From time to time, some tech businesses
may take the Internet exceptionalist position that the Internet should be left
free from government intervention. Generally, they do so only to avoid
regulation that might put them at a competitive disadvantage. On the
whole, tech companies tend to believe that regulation can be both
advantageous and detrimental; they do not fight against all regulations and
are in favor of the right ones for them (and increasingly support the
“wrong” ones for their competitors). 4 To some extent, they also advocate
policies that are good for the technology industry or the economy as a
whole. While communication companies, being in a traditionally regulated
industry, have long recognized the importance of government, most IT
companies have ignored government and policy issues, being too busy
creating the technologies that drive the digital world. But as these
companies have matured and become aware, often through painful
experience, of how issues in Washington can affect their bottom line, many
have evolved into political sophisticates. And while individual tech
companies can have different views on different issues, these differences
are largely rooted in business model interests, rather than ideological views
about the market or government.
8. Bricks-and-Mortars: This group includes the companies, professional
groups, and unions that gain their livelihood from old-economy, face-to-face
business transactions. These include both producers (such as
automobile manufacturers, record companies, and airlines) and distributors
and middlemen (such as retailers, car dealers, wine wholesalers, pharmacies,
optometrists, real estate agents, or unions representing workers in these
industries). Many of them fear, often correctly, that the Internet is making
them obsolete, while others have worked to transform their business
models to take advantage of e-commerce. In recent years, there has been a
widening rift between the bricks-and-mortar producers and the distributors
and middlemen (and the unions that represent their workers). Producers
have begun to realize that they can use the Internet to go directly to their
consumers, bypassing (or at least minimizing) the role of bricks-and-mortar
middlemen. The middlemen and unions, working actively to keep this from
happening or at least to forestall the day of reckoning, are not shy about
enlisting the aid of government to “level the playing field.” Certainly, the
long-running battle over taxing Internet sales represented a fight between
bricks-and-mortars and tech companies. Likewise, the grocery store
workers’ union in California has recently worked to pass legislation making
it more difficult for stores to use self-service checkout systems. 5

4 For a discussion of how technology companies view public policy see ACT’s Understanding the
IT Lobby: An Insider’s Guide, 2008,
http://actonline.org/publications/2008/08/05/understanding-the-it-lobby-an-insiders-guide/.
The Dividing Lines

The above groups’ attitudes about Internet policy can be placed along two axes:

Individual Empowerment vs. Societal Benefit

This line separates groups on the basis of beliefs about the Internet’s overriding
purpose. In some ways this is a variant on the classic tension between liberty
and equality. However, it goes beyond this to represent the tension between
individualism and communitarianism, with the former being a focus on
individual rights, and the latter invoking community benefits like economic
growth, security, and improved quality of life.
Those in the individual empowerment category believe that IT’s chief function
is to liberate individuals from control by, or dependence on, big organizations.
For them the Internet is a vast, open global communications medium designed
principally to enable individuals to freely communicate and access information.
When debating any issue, they examine it principally through the lens of how it
affects individuals, not society as a whole. Thus, the issue of net neutrality is
seen in terms of its effect on individual freedom to act in any way desired on
broadband networks. Such groups want to put the little guy on the same playing
field as the big boys, whether this means supporting small ISPs, small media
outlets, or individual open source coders.
Those belonging in the societal benefit camp believe IT and the Internet’s main
job is to increase economic productivity, promote government responsiveness
and efficiency, and enable the development of new and better services for
consumers as a societal whole. They tend to examine individual IT policy issues
through the lens of how they affect the communitarian interest and are willing
to accept tradeoffs to individual liberty or freedom if they boost overall
economic or societal well-being. For example, they see the actions of ISPs to
manage their broadband networks as being necessary to help the majority of
users, even if it means that a few “bandwidth hogs” have to wait a minute or
two longer to download their pirated copy of Lord of the Rings. They also believe
that both government and corporations can serve as proxies for community
interests, and that what’s good for, say, Cisco, AT&T, Microsoft or Google or
the federal government can be good for America as a whole. Some groups fall in
between the two extremes and argue that tradeoffs between a particular
individual’s benefit (or harm) and community interests are inevitable.

5 Robert D. Atkinson, Innovation and Its Army of Opponents, BUSINESSWEEK, Sept. 23, 2010,
http://search.businessweek.com/Search?searchTerm=innovation+and+its+army+of+opponents&resultsPerPage=20.
Internet exceptionalists and social engineers generally believe the Internet is all
about individual empowerment. The former resent its commercialization and
view empowerment as inevitable. The latter, as stated earlier, believe the
Internet should mainly be an educational and social networking tool and fear its
empowering capabilities will be taken away by powerful multinational
corporations and statist governments that will reshape the Internet to serve
their own narrow purposes (profit in the former, control in the latter). Both see
hackers and pirates as lone champions standing tall against greedy corporate and
inept government leviathans.
Bricks-and-mortars and old economy regulators see IT in instrumental terms as
designed for commerce and, by extension, for the community benefit. They just
don’t like how the Internet has evolved, whether it’s competition from
Dot-Coms or the spread of strong encryption that frustrates government
surveillance, censorship, and other control. Tech companies also see IT in more
instrumental terms, arguing that its rules should facilitate robust commerce.
Moral conservatives don’t want individuals empowered, since this will just
enable even more antisocial behavior, and they also don’t want corporations to
facilitate such behavior.
Moderates and free marketers occupy the middle ground. They believe that the
digitization of the economy holds great promise for boosting productivity and
improving society. At the same time, they see the Internet as creating
communities, boosting education, and giving people more control over their
lives. Free marketers don’t believe that individual interests should necessarily
trump corporate interests—they see corporations as persons under the law.
Laissez-Faire vs. Government Regulation

The groups divide along this line over the degree to which the government
should impose formal rules on IT and the Internet.

Internet exceptionalists, and to a lesser degree free marketers, believe the
Internet should be governed by its users. These groups lie on the laissez-faire side
of the dividing line. They consider the Internet unique and capable of creating
spontaneous order, a model for how the rest of society should be organized.
Free marketers believe the Internet is what allows Coase’s vision of a society
with low transaction costs and ubiquitous markets to become a reality. 6
At the other extreme are groups on the government regulation side of the line,
who see the Internet as a new “Wild West” calling for a man with a badge to
protect vulnerable citizens against intrusive governments and profit-hungry
corporations. Moral conservatives, social engineers, and old economy regulators
tend to hold this view, arguing for an array of government actions to limit what
companies can do. So do bricks-and-mortars, although less as a matter of
principle than as a way of clinging to their ever-weakening economic position.
Moderates and tech companies occupy the middle ground. They believe the
Internet is unique and generally requires a light regulatory touch if IT
innovation is to thrive. But in some key areas such as cybersecurity and
copyright protection, they believe that the Internet needs stronger rules,
especially to enable law enforcement to go after bad actors. In still other areas,
such as the privacy of non-sensitive data and net neutrality, they believe that
self-regulating government/business partnerships are the best way to protect
consumers while giving companies needed flexibility.
ITIF was formed to advance a set of pragmatic solutions to the growing
number of technology-related policy problems. We believe the growth of the
digital economy and society depends on a synthesis of these views: the correct
position will tend to lie at the intersection of the two axes. The dichotomy
between individual empowerment and institutional efficiency is not a zero-sum
game. Individuals benefit both socially and economically when governments
and corporations work more efficiently and effectively, and institutions benefit
when individuals are informed and able to make choices. A light touch on
regulation is important to maintain the flexibility required to operate in this
high-speed economy, but government action is also necessary to give businesses
and consumers confidence that the Internet is not a den of thieves or a market
tilted against fair competition, and to help speed digital transformation (e.g., the
ubiquitous use of IT throughout the economy and society).

6 Economist Ronald Coase postulated that high transaction costs engendered large
organizations. See, e.g., RONALD COASE, THE FIRM, THE MARKET AND THE LAW (1988).
Ongoing Policy Debates

Of course, the above typology is imperfect—with many individuals and
organizations falling into more than one group or no group at all. But as one
looks at the central political fights about the future of information technology,
the influence of these competing factions is clear. As case studies, we consider
the recent debates over four key issues: privacy, taxation, copyright protection,
and net neutrality.
Privacy

While the recent flaps over Facebook and Google Street View are the most
visible examples, the collection and use of personal information about Internet
users by corporations and government is the source of many heated and
emotional debates. Old economy regulators and social engineers want to impose
sweeping regulations that would give individuals control over “their” personal
data. And while they grudgingly tolerate advertising as the one true business
model for Internet content and services (they oppose ISPs allowing content or
application companies to voluntarily pay for prioritized service), they want to
limit the effectiveness of online advertising, and the revenue it can raise,
because of privacy fears.
Many tech companies want complete freedom to collect personal data, provided
they comply with privacy policies they write themselves. And while some tech
companies have supported moderate “notice and choice” legislation, most
companies remain wary of any federal regulation of privacy, even as they
recognize the need for federal laws to preempt increasingly antsy state
legislators from passing a patchwork of different Internet privacy bills.
Internet exceptionalists expect technology to solve the problem. As far as
they’re concerned, users should take responsibility for their own privacy and
apply the tools available to protect their personal data.
Free marketers reject the need for privacy legislation, asserting that the harms
from regulation would far outweigh the benefits, and that government
regulation is likely to be an imposition on individual liberty and choice,
including basic rights of free speech. While moderates worry that overly strict
privacy laws would stifle innovation and increase costs for consumers, they also
believe that, absent any rules, users will not develop the trust needed for the
digital economy and society to flourish.
The recent furor over Facebook is a perfect example of how these issues play out. The social network company announced two new features in 2010: instant personalization, which allows users to share data from their Facebook profile with partner websites, and social plug-ins for third-party websites, which allow users to more easily share web pages they like with their social network outside of Facebook. 7

336 CHAPTER 5: WHO WILL GOVERN THE NET IN 2020?
Social engineers howled in protest, demanding restrictive government regulations to bar such practices. Some, like Danah Boyd, a fellow at Harvard’s Berkman Center for Internet and Society, went so far as to claim that Facebook functioned as a public utility and should be regulated like one. 8

Facebook was slow to react, initially focusing more on highlighting the benefits of its innovative new tools. However, it quickly responded more appropriately, rolling out a much more user-friendly and transparent system of user privacy controls.

ITIF and other moderates, as well as free marketers, argue that government control over the privacy policies of social networks is not necessary to protect consumers and, moreover, would be harmful to future innovation. In the heated political environment of the privacy debate, government intervention would probably become regulatory overkill. At the same time, moderates argue that legitimate privacy concerns about personally identifiable data and sensitive data (financial or medical information, for example) need to be addressed through comprehensive industry-wide codes of self-regulation, enforceable by government action (e.g., FTC action, under its unfair and deceptive trade practices authority, against companies that do not live up to their own privacy policies).

When it comes to the collection and use of data by government, the coalitions reconfigure. Here the Internet exceptionalists, social engineers, and free marketers make common cause in their crusade against “Big Brother.” It largely does not matter whether the goal is to crack down on deadbeat dads, catch red light runners, or prevent terrorist attacks: If it involves the government collecting more information or using existing information for new purposes, these groups will generally oppose it. In protesting against the growing practice of cities installing red light cameras, former Republican House majority leader Dick Armey railed: “This is a full-scale surveillance system. Do we really want a society where one cannot walk down the street without Big Brother tracking our every move?” 9
7 For more see: Daniel Castro, Information Technology and Innovation Foundation, The Right to Privacy is Not a Right to Facebook, April 2010, http://itif.org/publications/facebook-not-right; and Daniel Castro, Information Technology and Innovation Foundation, Facebook is Not the Enemy, 2010, http://itif.org/publications/facebook-not-enemy.
8 Danah Boyd, Facebook is a utility; utilities get regulated, APOPHENIA, May 15, 2010, http://www.zephoria.org/thoughts/archives/2010/05/15/facebook-is-a-utility-utilities-get-regulated.html.
9 Thomas C. Greene, Cops Using High-Tech Surveillance in Florida, THE REGISTER, July 2, 2001, http://www.theregister.co.uk/2001/07/02/cops_using_hightech_surveillance/.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 337

High-tech companies have engaged in the debate over government use of and access to data based in large part on their business interests. Technology companies with direct business interests in providing government technologies to collect information (e.g., smart card and biometrics companies) have been strong supporters of particular initiatives. Other technology companies, worrying that government access to data can restrict commerce or reduce consumer trust in the Internet (e.g., in cloud computing applications where consumer data is remotely stored), have called for limitations or procedural safeguards on government access to data.

Whether a middle position in the debate can be found remains an open question. Moderates support the adoption of new technologies by government if it is clearly demonstrated that they fulfill an important public mission and if potential privacy problems are effectively addressed, especially by designing privacy protections into systems. At the same time, they support putting into place adequate rules and protections governing government access to that data.
Internet Sales Taxes

Tax policy is controversial in any setting, but perhaps particularly so with regard to the Internet. The collection of state and local sales taxes on Internet transactions is so controversial that, 15 years after it was first raised, the issue continues to be debated. Old economy regulators want sales taxes to be collected on Internet purchases and want high taxes on telecommunications services to maintain their revenue. The state of Colorado has gone so far as to require Internet retailers to share the names and purchase information of Colorado residents with the state government (so the state can collect a “use” tax from Internet shoppers). Bricks-and-mortar companies want sales taxes imposed to maintain their competitive position against pure-play Internet retailers. Some social engineers favor not only sales tax collection, but also special taxes on broadband use to subsidize access for low-income and rural households.

By contrast, the tech companies involved in selling over the Internet do not want the burden of collecting taxes across thousands of jurisdictions, and they do not want to lose their price advantage. Likewise, they do not want broadband or telephone service unfairly taxed at higher rates. Others—like many free marketers and Internet exceptionalists—oppose Internet sales taxes on principle. They believe “the fewer taxes the better,” especially when it comes to promoting the new digital economy.
Internet exceptionalists, tech companies, and free marketers will likely continue to oppose giving states the right to tax Internet sales to their residents from companies outside their borders. State governments will press hard for the right, citing their large budget shortfalls. And pragmatists will likely favor state sales taxes, particularly if they are tied to a quid pro quo deal forcing states to rescind laws and regulations that discriminate against e-commerce sellers, and if taxation is administered in ways that minimize administrative burden. For now, however, the debate continues, with states legally unable to collect sales taxes and most states imposing high, discriminatory taxes on telecommunications services.
Copyright Protection

As virtually all media have become digital, protecting copyrights has become a nightmare. The controversy over the file-sharing system Napster almost a decade ago was just the beginning. The ubiquity of file-sharing technologies, coupled with computers that can rip digital files from CDs or DVDs, and high-speed broadband networks that can quickly transfer large files, has meant that “digital piracy” has spread like wildfire. Internet exceptionalists argue that the Internet Age marks the end of intellectual property rights because enforcing copyright protections on digital media is too difficult (hence their mantra, “information wants to be free”). These advocates claim that non-commercial file “sharing” of copyrighted media is a form of fair use, which they assert is legal under copyright law. For example, the Electronic Frontier Foundation’s “Let the Music Play” campaign protests the music and film industries’ prosecution of file copiers. In their ideal world, some rich dot-com entrepreneur would establish a separate country on a desert island, linked to the rest of the world by high-speed fiber-optic cable and hosting a massive computer with a cornucopia of pirated digital content, all beyond the reach of national copyright laws.

Many social engineers side with the Internet exceptionalists, though for very different reasons. They fear that technology will let copyright holders exert such strict control over content that traditional notions of fair use will become obsolete. And they fear that digital rights management (DRM) technologies will become so stringent that activities consumers have long enjoyed (like the ability to play music files on more than one device) will be prohibited. Both argue strongly against any efforts to better control digital copyright theft that may impinge on individual liberty or individual rights like free speech (e.g., permitting ISPs to filter for illegal content or crafting international treaties like ACTA to strengthen and harmonize anti-piracy efforts). And both would love to see the Digital Millennium Copyright Act (DMCA) enter the dust bin of IT policy history, particularly the academics and engineers who feel the DMCA restricts their ability to hack DRM technology in the name of research. 10
Because of their emphasis on property rights, most free marketers tend to strongly support efforts to limit digital copyright theft. But with their focus on freedom, a few come all the way around to the left, arguing that because liberty trumps property, the grant of intellectual property rights by government amounts to the provision of a state-sanctioned monopoly. 11 In their view, individuals should be free to use digital content in the ways they want, and content owners—not others such as digital intermediaries—should be responsible for policing the use of their content.

10 Michele Boldrin, Against Intellectual Monopoly, Nov. 10, 2008, http://www.cato.org/event.php?eventid=5362.
Moderates also support efforts to limit digital copyright theft, believing that such theft is wrong and that a robust digital ecosystem requires economic incentives to produce often expensive digital content. At the same time, however, they are not absolutists, and in particular seek to balance the costs and benefits of copyright defense, especially through fair use.
The bricks-and-mortar companies—including the Recording Industry Association of America—initially worked to block the development of new technologies that facilitate playing downloaded and possibly pirated music. But more than a decade later, the content industries are not so much fighting against such technologies as they are working to develop and use technologies that can counter copyright theft, and going after organizations that enable widespread digital content theft (e.g., the Swedish Pirate Bay). 12 And even as they have struggled to cope with music and movie piracy, content producers have largely come to terms with the realities of the digital era: They have begun providing legal, affordable, and consumer-friendly means for consumers to buy or view copyright-protected digital content, with Apple’s iTunes music store and Hulu being the most prominent examples.

Although generally sympathetic to the content providers’ copyright concerns, many high-tech companies (e.g., ISPs, search engines, social networks) fear that the federal government will require them to adjust their businesses to become copyright enforcers, either by having to take action against their customers or by building in expensive content protection technologies. Once again, the question is whether a compromise can be found, ensuring that content holders have the legal protections and economic incentives they need to continue producing copyrighted materials without imposing overly large burdens on technology companies, and by extension their customers.

11 See, e.g., Cato Institute, Against Intellectual Monopoly, 2008, http://www.cato.org/event.php?eventid=5362.
12 Eric Pfanner, Music Industry Counts the Cost of Piracy, THE NEW YORK TIMES, Jan. 21, 2010, http://www.nytimes.com/2010/01/22/business/global/22music.html?ref=recording_industry_association_of_america; Eamonn Forde, From Peer to Eternity: Will Jumping the Legal Divide Solve Anything?, THE MUSIC NETWORK, Dec. 6, 2010, http://www.themusicnetwork.com/music-features/industry/2010/12/06/from-peer-to-eternity-will-jumping-the-legal-divide-solve-anything/.
Net Neutrality

Net neutrality, now a highly contentious issue, refers to the idea that the individual networks collectively forming the Internet should be controlled by users rather than by their owners and operators. While network operators are in a unique position to manage their resources, proponents of net neutrality believe they cannot be trusted to use their knowledge for the good of the Internet user community.

Social engineers are the most passionate about net neutrality, but they make common cause with the veterans of the old economy regulator group and Internet exceptionalists. Indeed, social engineer Tim Wu coined the still-mystifying term “net neutrality.” 13 These groups fear that the Internet’s unique nature is under threat from the forces of incumbent telecommunications and cable companies providing broadband service. If “Big Broadband” gets its way, neutralists fear the Internet will go the way of cable TV, the “vast wasteland” 14 where elitist programming such as The Wire competes with advertising-supported, populist programming such as American Idol.

Free marketers see net neutrality as one more attack by big government regulators on the Internet, the last bastion of freedom from regulation. They argue that market forces and consumer choice will always discipline any anti-consumer violations of net neutrality, while antitrust or tort law will serve as a handy tool to remedy any anti-business violations.

Tech companies are split on the issue, largely according to which side of the network they are on. Those tech companies providing network services (e.g., ISPs and major equipment makers) are generally against strong regulations in support of network neutrality (at least with regard to the network itself), while companies whose business model depends on using the network to gain access to customers (e.g., content and service providers like Google) are either neutral or in favor of a stronger regulatory regime (at least with regard to the infrastructure layers, as opposed to other parts of the Internet “stack,” such as applications).

13 Tim Wu, Network Neutrality, Broadband Discrimination, 2 JOURNAL OF TELECOMMUNICATIONS AND HIGH TECHNOLOGY LAW 141 (2003), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863.
14 On May 9, 1961, in a speech to the National Association of Broadcasters, newly appointed FCC chairman Newton N. Minow referred to television as a “vast wasteland.” Newton N. Minow, “Television and the Public Interest,” address to the National Association of Broadcasters, Washington, D.C., May 9, 1961.
However, these differences have begun to blur somewhat, as evidenced by the October 2009 joint statement on net neutrality issued by Google and Verizon. 15

Moderates generally see the Internet as a work in progress. Moderates believe it is good that network equipment producers are improving the Internet, and they recognize operators as possessing the highly specialized knowledge needed to provide equitable access to the Internet’s pool of resources. But moderates realize that competition doesn’t operate as efficiently in some network markets as it does in the markets for general-purpose consumer goods and services. In other words, some network markets are under-competitive (because network effects create market power), so markets alone aren’t sufficient to guarantee an open Internet for everyone. 16 The role of government in Internet regulation is to ensure that all consumers enjoy the fruits of investment and innovation, but only in ways that don’t limit continued investment and innovation.

As these and other issues continue to be fought out in legislatures and communities around the country, government officials should seek solutions that balance the needs of individuals with those of society, and that offer the security of codified laws when necessary and the flexibility of informal rules when appropriate. As the technology policy debates go on and the various factions push for the solutions that fit their ideologies and interests, the policies that promote the growth and vitality of the digital economy will not be found at the extremes, but instead in the vital center.
The Future of Digital Politics

Some might argue that these issues are transitory and will recede in importance as the digital economy matures. But there is good reason to believe otherwise: The debates that pit online consumers against resistant middlemen are likely to continue as new forms of online distribution evolve. The emergence of much faster and more ubiquitous wired and wireless broadband networks will mean more Americans using these networks and more business models developing to take advantage of them. Data generated by emerging new technologies such as wireless location systems, digital signature systems, intelligent transportation systems, the smart electric grid, health IT, and radio frequency identification (RFID) devices—some used by government, others by the private sector—will drive new privacy concerns among social engineers and their fellow travelers. In some ways, the digital revolution has been so successful that many previously analog political issues have become digital issues; on the other hand, the political issues of the future remain unformed, precisely because the technologies are changing so quickly.

The public policy issues surrounding the IT revolution are no longer sideshows or mere theoretical discussions for a handful of technologically savvy people, nor are they the royal road to a utopia of untold wealth and perfect freedom. The battle lines have been drawn, and the issues are both serious and complex. Digital politics, if not the great issue of our age, will be central to the life of our nation in the decade ahead—and well beyond.

15 Lowell McAdam, CEO, Verizon Wireless & Eric Schmidt, CEO, Google, Finding Common Ground on an Open Internet, Verizon PolicyBlog, Oct. 21, 2009, http://policyblog.verizon.com/BlogPost/675/FindingCommonGroundonanOpenInternet.aspx.
16 Richard Bennett, Information Technology and Innovation Foundation, ITIF Comments on FCC Broadband Reclassifying, August 10, 2010, http://www.itif.org/publications/itif-comments-fcc-broadband-reclassifying.
PART II
ISSUES & APPLICATIONS
CHAPTER 6
SHOULD ONLINE INTERMEDIARIES BE REQUIRED TO POLICE MORE?

Trusting (and Verifying) Online Intermediaries’ Policing 347
Frank Pasquale

Online Liability for Payment Systems 365
Mark MacCarthy

Fuzzy Boundaries: The Potential Impact of Vague Secondary Liability Doctrines on Technology Innovation 393
Paul Szynol
Trusting (and Verifying) Online Intermediaries’ Policing
By Frank Pasquale *

Introduction

Internet Service Providers (ISPs) and search engines have mapped the Web, accelerated e-commerce, and empowered new communities. They can also enable intellectual property infringement, harassment, stealth marketing, and frightening levels of surveillance. As a result, individuals are rapidly losing the ability to control their own image on the web, or even to know what data others are presented with regarding them. When Web users attempt to find information or entertainment, they have little assurance that a carrier or search engine is not subtly biasing the presentation of results in accordance with its own commercial interests. 1
None of these problems is readily susceptible to swift legal intervention. Instead, intermediaries themselves have begun policing their own virtual premises. eBay makes it easy for intellectual property owners to report infringing merchandise. A carrier like Comcast has the technical power to slow or block traffic to and from BitTorrent, a file-sharing protocol often accused of enabling infringement. 2 Google’s StopBadware program tries to alert searchers to malware-ridden websites, 3 and YouTube employs an indeterminate number of people to police copyright infringement, illegal obscenity, and even many grotesque or humiliating videos. 4 Reputable social networks do the same for their own content.

Yet all is not well in the land of online self-regulation. However competently intermediaries police their sites, nagging questions will remain about their fairness and objectivity in doing so. Is Comcast blocking BitTorrent to stop infringement, or to decrease access to content that competes with its own for viewers? How much digital due process does Google need to give a site it accuses of harboring malware? If Facebook eliminates a video of war carnage, is that a token of respect for the wounded or one more reflexive effort of a major company to ingratiate itself with a Washington establishment currently committed to indefinite military engagement in the Middle East?

Questions like these will persist, and erode the legitimacy of intermediary self-policing, as long as key operations of leading companies are shrouded in secrecy. Administrators must develop an institutional competence for continually monitoring rapidly changing business practices. A trusted advisory council charged with assisting the Federal Trade Commission (FTC) and Federal Communications Commission (FCC) could help courts and agencies adjudicate controversies concerning intermediary practices. An Internet Intermediary Regulatory Council (IIRC) would spur the development of what Christopher Kelty calls a “recursive public”—one that is “vitally concerned with the material and practical maintenance and modification of the technical, legal, practical, and conceptual means of its own existence as a public.” 5 Questioning the power of a dominant intermediary is not just a preoccupation of the anxious. Rather, monitoring is a prerequisite for assuring a level playing field online.

Understanding Intermediaries’ Power

Internet intermediaries govern online life. 6 ISPs and search engines are particularly central to the web’s ecology. Users rely on search services to map the web for them and use ISPs to connect to one another. Economic sociologist David Stark has observed that “search is the watchword of the information age.” 7 ISPs are often called “carriers” to reflect the parallel between their own services in the new economy and transportation infrastructure. Online intermediaries organize and control access to an extraordinary variety of digitized content. Content providers aim to be at the top of Google Search or Google News results. 8 Services like iTunes, Hulu, and YouTube offer audio and video content. Social networks are extending their reach into each of these areas. Cable-based ISPs like Comcast have their own relationships with content providers. 9

When an Internet connection is dropped, or a search engine fails to produce a result the searcher knows exists somewhere on the web, such failures are obvious. However, most web experiences do not unfold in such a binary, pass–fail manner. An ISP or search engine can slow down the speed or reduce the ranking of a website in ways that are very hard for users to detect. Moreover, there are many points of control, or layers, of the Web. 10 Even when users’ experience with one layer arouses suspicion, that layer can blame the others for the problem.

The new power of intermediaries over reputation and visibility implicates several traditional concerns of the American legal system. 11 Unfortunately, Internet intermediaries are presently bound only by weak and inadequate enforcement of consumer protection and false advertising statutes, which were designed for very different digital infrastructures.

* Professor of Law, Seton Hall Law School; Visiting Fellow, Princeton Center for Information Technology Policy.
1 Benjamin Edelman, Hard-Coding Bias in Google “Algorithmic” Search Results, Nov. 15, 2010, available at http://www.benedelman.org/hardcoding/ (“I present categories of searches for which available evidence indicates Google has “hard-coded” its own links to appear at the top of algorithmic search results, and I offer a methodology for detecting certain kinds of tampering by comparing Google results for similar searches. I compare Google’s hard-coded results with Google’s public statements and promises, including a dozen denials but at least one admission.”).
2 See, e.g., Comcast Corp. v. FCC, No. 08-1291, 2010 U.S. App. LEXIS 7039 (D.C. Cir. April 6, 2010).
3 For more information, visit http://stopbadware.org/.
4 YouTube, YouTube Community Guidelines, http://www.youtube.com/t/community_guidelines (“YouTube staff review flagged videos 24 hours a day, seven days a week to determine whether they violate our Community Guidelines. When they do, we remove them.”).
5 CHRISTOPHER M. KELTY, TWO BITS: THE CULTURAL SIGNIFICANCE OF FREE SOFTWARE 3 (Duke Univ. Press 2007).
6 For a definition of intermediary, see Thomas F. Cotter, Some Observations on the Law and Economics of Intermediaries, 2006 MICH. ST. L. REV. 67, 68–71 (“[A]n ‘intermediary’ can be any entity that enables the communication of information from one party to another. On the basis of this definition, any provider of communications services (including telephone companies, cable companies, and Internet service providers) qualify as intermediaries.”).
7 DAVID STARK, THE SENSE OF DISSONANCE: ACCOUNTS OF WORTH IN ECONOMIC LIFE 1 (Princeton Univ. Press 2009) (“Among the many new information technologies that are reshaping work and daily life, perhaps none are more empowering than the new technologies of search.”).
8 See Deborah Fallows & Lee Rainie, Pew Internet & Am. Life Project, Data Memo: The Popularity and Importance of Search Engines 2 (Aug. 2004), http://www.pewinternet.org/pdfs/PIP_Data_Memo_Searchengines.pdf (“The average visitor scrolled through 1.8 result pages during a typical search.”); Leslie Marable, False Oracles: Consumer Reaction to Learning the Truth About How Search Engines Work: Results of an Ethnographic Study, CONSUMER WEBWATCH, June 30, 2003, at 5, available at http://www.consumerwebwatch.org/pdfs/false-oracles.pdf (“The majority of participants never clicked beyond the first page of search results. They trusted search engines to present only the best or most accurate, unbiased results on the first page.”).
9 ROBERT W. MCCHESNEY, RICH MEDIA, POOR DEMOCRACY: COMMUNICATION POLITICS IN DUBIOUS TIMES 123 (2000) (describing how convergence of digital technology “eliminates the traditional distinctions between media and communications sectors”).
10 JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET—AND HOW TO STOP IT 67 (2008) (describing a physical layer, the “actual wires or airwaves over which data will flow;” an application layer, “representing the tasks people might want to perform on the network;” a content layer, “containing actual information exchanged among the network’s users;” and a social layer, “where new behaviors and interactions among people are enabled by the technologies underneath”).
11 Yochai Benkler, Communications Infrastructure Regulation and the Distribution of Control over Content, 22 TELECOMM. POL’Y 183, 185–86 (1998) (describing the power of intermediaries over information flow: “technology, institutional framework, and organizational adaptation … determine … who can produce information, and who may or must consume, what type of information, under what conditions, and to what effect”); Cotter, supra note 6, at 69–71 (discussing some of the functions of technological intermediaries, including their control of information flow from suppliers to consumers).
350 CHAPTER 6: SHOULD ONLINE INTERMEDIARIES BE REQUIRED TO POLICE MORE?<br />
In the space of a brief essay, I cannot survey the entire range of intermediary<br />
policing practices. But it is worthwhile to drill down a bit into the tough<br />
questions raised by one intermediary—the dominant search engine, Google—as<br />
it decides what is and is not an acceptable practice for search engine optimizers<br />
who want their clients’ sites to appear higher in the rankings for given queries.<br />
Search engineers tend to divide the search engine optimization (SEO) business<br />
into “good guys” and “bad guys,” often calling the former “white hat SEO” and<br />
the latter “black hat SEO.” 12 Some degree of transparency regarding the search<br />
engine’s algorithm is required to permit white hat SEO. These rules are<br />
generally agreed upon as practices that “make the web better;” i.e., have fresh<br />
content, don’t sell links, don’t “stuff metatags” with extraneous information just<br />
to get attention. However, if there were complete transparency, “black hat”<br />
SEOs could unfairly elevate the visibility of their clients’ sites—and even if this<br />
were only done temporarily, the resulting churn and chaos could severely reduce<br />
the utility of search results. Moreover, a search engine’s competitors could use<br />
the trade secrets to enhance their own services.<br />
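The norm against “stuffing metatags” lends itself to mechanical checks. The sketch below is purely illustrative: the scoring rule and the 0.5 threshold are invented for exposition, not Google’s actual (secret) criteria.

```python
# Illustrative only: a naive check for "stuffed" meta keywords.
# The scoring rule and threshold are invented for exposition; they are
# not any search engine's disclosed criteria.
from collections import Counter

def stuffing_score(meta_keywords: str) -> float:
    """Fraction of keyword slots occupied by the single most-repeated term."""
    terms = [t.strip().lower() for t in meta_keywords.split(",") if t.strip()]
    if not terms:
        return 0.0
    top_count = Counter(terms).most_common(1)[0][1]
    return top_count / len(terms)

def looks_stuffed(meta_keywords: str, threshold: float = 0.5) -> bool:
    """Flag a tag when one term fills more than half the keyword slots."""
    return stuffing_score(meta_keywords) > threshold
```

A real indexer would weigh many more signals (visible text, link context, site history), which is one reason the gray zone described above is so wide.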
This secrecy has led to a growing gray zone of Internet practices with uncertain<br />
effect on sites’ rankings. Consider some of the distinctions below, based on<br />
search engine optimization literature:<br />
White Hat (acceptable) 13: Asking blogs you like to link to you, or engaging in reciprocal<br />
linking between your site and other sites in a legitimate dialogue. 15<br />
Gray Area (unclear how these are treated) 14: Paying a blogger or site to link to your blog<br />
in order to boost search results and not just to increase traffic.<br />
Black Hat (unacceptable; can lead to down-ranking in Google results or even the “Google<br />
Death Penalty” of De-Indexing): Creating a “link farm” of spam blogs (splogs) to link to<br />
you, or linking between multiple sites you created (known as link farms) to boost search<br />
results. 16<br />
12 Elizabeth van Couvering, Is Relevance Relevant?,<br />
http://jcmc.indiana.edu/vol12/issue3/vancouvering.html (search engineers’<br />
“animosity towards the … guerilla fighters of spamming and hacking, is more direct” than<br />
their hostility toward direct business competitors); Aaron Wall, Google Thinks YOU Are a<br />
Black Hat SEO. Should You Trust Them?, SEOBOOK, Apr. 17, 2008,<br />
http://www.seobook.com/to-google-you-are-a-spammer (claiming that Google<br />
discriminates against self-identified SEOs).<br />
13 Phil Craven, ‘Ethical’ Search Engine Optimization Exposed!, WebWorkshop,<br />
http://www.webworkshop.net/ethical-search-engine-optimization.html (last visited<br />
Jun. 8, 2009).<br />
14 Grey Hat SEO, http://greyhatseo.com/ (last visited Jun. 5, 2006) (claiming a Grey Hat<br />
SEO is someone who uses black hat techniques in an ethical way).<br />
15 Link Schemes, GOOGLE WEBMASTER CENTRAL,<br />
http://www.google.com/support/webmasters/bin/answer.py?answer=66356 (“The<br />
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 351<br />
White Hat: Running human-conducted tests of search inquiries with permission from the<br />
search engine.<br />
Gray Area: Doing a few queries to do elementary reverse engineering. (This may not be<br />
permitted under the Terms of Service.)<br />
Black Hat: Using computer programs to send automated search queries to gauge the page<br />
rank generated from various search terms (Terms of Service prohibit this). 17<br />
White Hat: Creating non-intentional duplicate content (through printer-friendly versions,<br />
pages aimed at mobile devices, etc.). 18<br />
Gray Area: Intentionally creating permitted duplicate content to boost search results.<br />
Black Hat: Intentionally creating unnecessary duplicate content on many pages and<br />
domains to boost results.<br />
White Hat: Generating a coherent site with original and informative material aimed at the<br />
user.<br />
Gray Area: Creating content or additional pages that walk the line between useful<br />
information and “doorway pages.”<br />
Black Hat: Creating “doorway pages” that are geared towards popular keywords but that<br />
redirect to a largely unrelated main site. 19<br />
best way to get other sites to create relevant links to yours is to create unique, relevant<br />
content that can quickly gain popularity in the Internet community. The more useful content<br />
you have, the greater the chances someone else will find that content valuable to their<br />
readers and link to it.”).<br />
16 Duncan Riley, Google Declares Jihad On Blog Link Farms, TECHCRUNCH, Oct. 24, 2007,<br />
http://www.techcrunch.com/2007/10/24/google-declares-jihad-on-blog-link-farms/.<br />
17 Automated Queries, GOOGLE WEBMASTER CENTRAL,<br />
http://www.google.com/support/webmasters/bin/answer.py?answer=66357<br />
(“Google’s Terms of Service do not allow the sending of automated queries of any sort to<br />
our system without express permission in advance from Google.”); Google Terms of<br />
Service: Use of the Services by you, http://www.google.com/accounts/TOS (last visited<br />
Jun. 4, 2009) (“You agree not to access (or attempt to access) any of the Services by any<br />
means other than through the interface that is provided by Google, unless you have been<br />
specifically allowed to do so in a separate agreement with Google.”).<br />
18 Duplicate Content, GOOGLE WEBMASTER CENTRAL,<br />
http://www.google.com/support/webmasters/bin/answer.py?answer=66359<br />
(“Examples of non-malicious duplicate content could include: Discussion forums that can<br />
generate both regular and stripped-down pages targeted at mobile devices, Store items<br />
shown or linked via multiple distinct URLs, Printer-only versions of web pages”).<br />
19 Google Blogoscoped, German BMW Banned From Google, Feb. 4, 2006,<br />
http://blogoscoped.com/archive/2006-02-04-n60.html; Matt Cutts, Ramping up on<br />
International Webspam, MATT CUTTS: GADGETS, GOOGLE, AND SEO, Feb. 4, 2006,<br />
http://www.mattcutts.com/blog/ramping-up-on-international-webspam/ (Google<br />
employee confirming BMW’s ban).
White Hat: Targeting an appreciative audience. 20<br />
Gray Area: Putting random references to salacious or celebrity topics on a blog primarily<br />
devoted to discussing current affairs. 21<br />
Black Hat: Distracting an involuntary audience with completely misleading indexed<br />
content (akin to “initial interest confusion” in Internet trademark law). 22<br />
White Hat: Influencing search engines by making pages easier to scan by automated<br />
bots. 23<br />
Gray Area: Creating “hidden pages” when there may be a logical reason to show one page<br />
to search engine bots and another page to users who type in the page’s URL.<br />
Black Hat: Using “hidden pages” to show a misleading page to search engine bots, and<br />
another page to users who type in the page’s URL.<br />
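The “hidden pages” entries above describe what practitioners call cloaking: serving one document to a search engine’s crawler and a different one to human visitors. A minimal sketch of how such divergence might be detected follows; the User-Agent comparison and the 0.9 similarity cutoff are illustrative assumptions, not any engine’s disclosed method.

```python
# Sketch of cloaking detection: fetch a URL as a crawler and as a browser,
# then compare the two responses. The similarity cutoff is an invented
# illustration, not a real engine's policy.
import difflib
import urllib.request

def fetch_as(url: str, user_agent: str) -> str:
    """Retrieve a page while presenting the given User-Agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def cloaking_suspected(page_for_bot: str, page_for_user: str,
                       min_similarity: float = 0.9) -> bool:
    """Flag the page when the crawler and visitor versions diverge sharply."""
    ratio = difflib.SequenceMatcher(None, page_for_bot, page_for_user).ratio()
    return ratio < min_similarity
```

Legitimate differences (printer-friendly or mobile versions, as in the duplicate-content rows) would also trip a naive comparison, which is precisely why these calls are hard to automate fairly.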
As these practices show, search engines are referees in the millions of contests<br />
for attention that take place on the web each day. There are hundreds of<br />
entities that want to be the top result in response to a query like “sneakers,”<br />
“restaurant in New York City,” or “best employer to work for.” Any academic<br />
who writes on an obscure subject wants to be the “go-to” authority when it is<br />
Googled—and for consultants, a top or tenth-ranked result could be the<br />
20 Webmaster Guidelines: Design and content guidelines, GOOGLE WEBMASTER CENTRAL,<br />
http://www.google.com/support/webmasters/bin/answer.py?answer=35769 (last<br />
visited Jun. 4, 2009) (“Think about the words users would type to find your pages, and make<br />
sure that your site actually includes those words within it.”).<br />
21 Daniel Solove, Thanks, Jennifer Aniston (or the Manifold Ways to Do the Same Search),<br />
CONCURRING OPINIONS,<br />
http://www.concurringopinions.com/archives/2006/01/thanks_jennifer.html (“One<br />
of my more popular posts is one entitled Jennifer Aniston Nude Photos and the Anti-<br />
Paparazzi Act. It seems to be getting a lot of readers interested in learning about the<br />
workings of the Anti-Paparazzi Act and the law of information privacy. It sure is surprising<br />
that so many readers are eager to understand this rather technical statute. Anyway, for the<br />
small part that Jennifer Aniston plays in this, we thank her for the traffic.”); Dan Filler, Coffee<br />
Or Nude Celebrity Photos: A Tale Of Two Evergreen Posts, THE FACULTY LOUNGE,<br />
http://www.thefacultylounge.org/2008/04/coffee-or-nude.html (“significant amounts<br />
of traffic arrived in the form of web surfers seeking out pictures of Jennifer Aniston”).<br />
22 Jason Preston, Google punishes Squidoo for having too much Spam, BLOG BUSINESS SUMMIT, Jul. 11,<br />
2007, http://blogbusinesssummit.com/2007/07/google-punishes-squidoo-for-having-too-much-spam.htm.<br />
23 Webmaster Guidelines: Design and Content Guidelines, GOOGLE WEBMASTER CENTRAL,<br />
http://www.google.com/support/webmasters/bin/answer.py?answer=35769 (last<br />
visited Jun. 4, 2009) (“Create a useful, information-rich site, and write pages that clearly and<br />
accurately describe your content.”); Id. (“Try to use text instead of images to display<br />
important names, content, or links. The Google crawler doesn’t recognize text contained in<br />
images.”).
difference between lucrative gigs and obscurity. The top and right hand sides of<br />
many search engine pages are open for paid placement; but even there the<br />
highest bidder may not get a prime spot because a good search engine strives to<br />
keep even these sections very relevant to searchers. 24 The organic results are<br />
determined by search engines’ proprietary algorithms, and preliminary evidence<br />
indicates that searchers (and particularly educated searchers) concentrate<br />
attention there. Businesses can grow reliant on good Google rankings <strong>as</strong> a way<br />
of attracting and keeping customers.<br />
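Why link farms pay off, and why engines punish them, can be seen in the published 1998 PageRank recurrence, sketched below as a toy power iteration. Modern rankings layer many proprietary signals on top of this, so the example is only a schematic of the incentive.

```python
# Toy power-iteration PageRank (after Brin & Page, 1998), illustrating how
# a "link farm" lifts its target above otherwise equal peers. Not Google's
# current, proprietary ranking.
def pagerank(links: dict, damping: float = 0.85, iters: int = 50) -> dict:
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[page] / n
            else:
                for q in outs:
                    new[q] += damping * rank[page] / len(outs)
        rank = new
    return rank

# Three pages linking in a loop: perfectly symmetric, so all ranks are equal.
honest = pagerank({"a": ["b"], "b": ["c"], "c": ["a"]})

# The same loop plus three "splogs" whose only purpose is to link to "a":
# "a" now outranks its former peers.
farmed = pagerank({"a": ["b"], "b": ["c"], "c": ["a"],
                   "f1": ["a"], "f2": ["a"], "f3": ["a"]})
```

De-indexing the farm pages (the “Google Death Penalty” described above) removes the inflated inbound votes at a stroke.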
For example, John Battelle tells the story of the owner of 2bigfeet.com (a seller<br />
of large-sized men’s shoes), whose site was knocked off the first page of<br />
Google’s results for terms like “big shoes” by a sudden algorithm shift in<br />
November 2003, right before the Christmas shopping season. The owner<br />
attempted to contact Google several times, but said he “never got a response.”<br />
Google claimed the owner may have hired a search engine optimizer who ran<br />
afoul of its rules—but it would not say precisely what those rules were. 25 Like<br />
the IRS’s unwillingness to disclose all of its “audit flags,” the company did not<br />
24 Steven Levy, Secret of Googlenomics: Data-Fueled Recipe Brews Profitability, WIRED, May 2, 2009,<br />
http://www.wired.com/culture/culturereviews/magazine/17-<br />
06/nep_googlenomics (in Google’s AdWords program, “The bids themselves are only a<br />
part of what ultimately determines the auction winners. The other major determinant is<br />
something called the quality score. This metric strives to ensure that the ads Google shows<br />
on its results page are true, high-caliber matches for what users are querying. If they aren’t,<br />
the whole system suffers and Google makes less money.”); see also Google, What is the Quality<br />
Score and How is it Calculated,<br />
http://adwords.google.com/support/aw/bin/answer.py?hl=en&answer=10215 (last<br />
visited Sept. 1, 2009) (“The AdWords system works best for everybody—advertisers, users,<br />
publishers, and Google too—when the ads we display match our users’ needs <strong>as</strong> closely <strong>as</strong><br />
possible.”).<br />
25 JOHN BATTELLE, THE SEARCH: HOW GOOGLE AND ITS RIVALS REWROTE THE RULES OF<br />
BUSINESS AND TRANSFORMED OUR CULTURE (Portfolio Trade 2005). See also Joe Nocera,<br />
Stuck in Google’s Doghouse, N.Y. TIMES, Sept. 13, 2008,<br />
http://www.nytimes.com/2008/09/13/technology/13nocera.html (“In the summer of<br />
2006 … Google pulled the rug out from under [web business owner Dan Savage, who had<br />
come to rely on its referrals to his page, Sourcetool]… . When Mr. Savage asked Google<br />
executives what the problem was, he was told that Sourcetool’s “landing page quality” was<br />
low. Google had recently changed the algorithm for choosing advertisements for prominent<br />
positions on Google search pages, and Mr. Savage’s site had been identified as one that<br />
didn’t meet the algorithm’s new standards… . Although the company never told Mr. Savage<br />
what, precisely, was wrong with his landing page quality, it offered some suggestions for<br />
improvement, including running fewer AdSense ads and manually typing in the addresses<br />
and phone numbers of the 600,000 companies in his directory, even though their Web sites<br />
were just a click away. At a cost of several hundred thousand dollars, he made some of the<br />
changes Google suggested. No improvement.”). Savage filed suit against Google on an<br />
antitrust theory, which was dismissed in March 2010. See TradeComet, LLC v. Google, Inc.,<br />
2010 U.S. Dist. LEXIS 20154 (S.D. N.Y. March 5, 2010),<br />
http://www.courthousenews.com/2010/03/08/Google%20opinion.pdf.
want to permit manipulators to gain too great an understanding of how it<br />
detected their tactics.<br />
So far, claims like 2bigfeet.com’s have not been fully examined in the judicial<br />
system, largely because Google has successfully deflected them by claiming that<br />
its search results embody opinions protected by the First Amendment. Several<br />
articles have questioned whether blanket First Amendment protection covers all<br />
search engine actions, and that conclusion has not yet been embraced at the<br />
appellate level in the United States. 26 The FTC’s guidance to search engines,<br />
promoting the clear separation of organic and paid results, suggests that search<br />
engines’ First Amendment shield is not insurmountable. 27 While a creative or<br />
opportunistic litigant could conceivably advance a First Amendment right to<br />
promote products or positions without indicating that the promotion has been<br />
paid for, such a challenge has not yet eliminated false advertising law, and even<br />
political speakers have been required to reveal their funding sources. 28<br />
Qualified Transparency for Carrier<br />
& Search Engine Practices<br />
Both search engines’ ranking practices and carriers’ network management<br />
should be transparent to some entity capable of detecting biased policing by<br />
these intermediaries. 29 There are some institutional precedents for the kind of<br />
monitoring that would be necessary to accomplish these goals. For example,<br />
the French Commission Nationale de l’Informatique et des Libertés (CNIL)<br />
has several prerogatives designed to protect the privacy and reputation of<br />
26 Frank Pasquale, Rankings, Reductionism, and Responsibility, CLEVELAND ST. L. REV. (2006);<br />
Frank Pasquale & Oren Bracha, Federal Search Commission, 93 CORNELL L. REV. 1149 (2008);<br />
Jennifer A. Chandler, A Right to Reach an Audience: An Approach to Intermediary Bias on the<br />
Internet, 35 HOFSTRA L. REV. 1095, 1109 (2007).<br />
27 See Bracha & Pasquale, Federal Search Commission, supra note 26 (discussing the implications of<br />
Ellen Goodman’s work on “stealth marketing” for search engines, and how the Hippsley<br />
Letter of 2002 inadequately addressed such concerns in the industry).<br />
28 In early cases alleging an array of unfair competition and business torts claims against search<br />
engines, the First Amendment has proven a formidable shield against liability. Search<br />
engines characterize their results as opinion, and lower courts have been reluctant to penalize<br />
them for these forms of expression. In other work, I have described why this First<br />
Amendment barrier to accountability should not be insurmountable. Search engines take<br />
advantage of a web of governmental immunities that they would be loath to surrender.<br />
FAIR v. Rumsfeld, 547 U.S. 47 (2006) and cognate cases stand for the proposition that such<br />
immunities can be conditioned on agreement to certain conditions on an entity’s speech.<br />
Whatever the federal government’s will, it is within its power to regulate ranking and rating<br />
entities in some way when they are so deeply dependent on governmental action. Frank<br />
Pasquale, Asterisk Revisited, 3 J. BUS. & TECH. LAW 61 (2008).<br />
29 I mean partial in two senses of the word—unduly self-interested, or only partly solving<br />
problems they claim to be solving.
French citizens, and to enforce standards of fair data practices. 30 CNIL<br />
“ensure[s] that citizens are in a position to exercise their rights through<br />
information” by requiring data controllers to “ensure data security and<br />
confidentiality,” to “accept on-site inspections by the CNIL,” and to “reply to<br />
any request for information.” 31 CNIL also grants individual persons the right to<br />
obtain information about the digital dossiers kept on them and the use of this<br />
information. For example, CNIL explains that French law provides that:<br />
Every person may, on simple request addressed to the<br />
organisation in question, have free access to all the information<br />
concerning him in clear language.<br />
Every person may directly require from an organisation<br />
holding information about him that the data be corrected (if<br />
they are wrong), completed or clarified (if they are incomplete<br />
or equivocal), or erased (if this information could not legally be<br />
collected).<br />
30 Law No. 78-17 of January 6, 1978, J.C.P. 1978, III, No. 44692. English translation of law as<br />
amended by law of August 6, 2004, and by Law of May 12, 2009,<br />
http://www.cnil.fr/fileadmin/documents/en/Act78-17VA.pdf; French language text<br />
modified through Law No. 2009-526 of May 12, 2009, J.O., May 13, 2009,<br />
http://www.cnil.fr/la-cnil/qui-sommes-nous/; French language consolidated version as<br />
of May 14, 2009,<br />
http://www.legifrance.gouv.fr/affichTexte.do?cidTexte=JORFTEXT000000886460<br />
&fastPos=1&fastReqId=826368234&categorieLien=cid&oldAction=rechTexte.<br />
Commission Nationale de l’Informatique et des Libertés (CNIL), founded by Law No. 78-17<br />
of January 6, 1978, supra, is an independent administrative French authority protecting<br />
privacy and personal data held by government agencies and private entities. Specifically,<br />
CNIL’s general mission consists of ensuring that the development of information<br />
technology remains at the service of citizens and does not breach human identity, human<br />
rights, privacy, or personal or public liberties.<br />
31 CNIL, Rights and Obligations, http://www.cnil.fr/english/the-cnil/rights-and-obligations/<br />
(last visited Mar. 12, 2010). Specifically, Chapter 6, Article 44, of the CNIL-creating<br />
Act provides:<br />
The members of the “Commission nationale de l’informatique et des<br />
libertés” as well as those officers of the Commission’s operational services<br />
accredited in accordance with the conditions defined by the last paragraph of<br />
Article 19 (accreditation by the commission), have access, from 6 a.m. to 9<br />
p.m., for the exercise of their functions, to the places, premises, surroundings,<br />
equipment or buildings used for the processing of personal data for<br />
professional purposes, with the exception of the parts of the places,<br />
premises, surroundings, equipment or buildings used for private purposes.<br />
Law No. 78-17 of January 6, 1978, J.C.P. 1978, III, No. 44692, ch. 6, art. 44, at 30,<br />
http://www.cnil.fr/fileadmin/documents/en/Act78-17VA.pdf.
Every person may oppose that information about him is used<br />
for advertising purposes or for commercial purposes. 32<br />
While the United States does not have the same tradition of protecting privacy<br />
prevalent in Europe, 33 CNIL’s aims and commitments could prove worthwhile<br />
models for U.S. agencies.<br />
U.S. policymakers may also continue to experiment with public–private<br />
partnerships to monitor problematic behavior at search engines and carriers.<br />
For instance, the National Advertising Division (NAD) of the Council of Better<br />
Business Bureaus is a “voluntary, self-regulating body” that fields complaints<br />
about allegedly untruthful advertising. 34 The vast majority of companies<br />
investigated by NAD comply with its recommendations, but an advertiser can also<br />
resist its authority and have the dispute resolved before the FTC. 35 Rather than overwhelming<br />
the agency with adjudications, the NAD process provides an initial forum for<br />
advertisers and their critics to contest the validity of statements. 36 NAD is part<br />
of a larger <strong>as</strong>sociation called the National Advertising Review Council (NARC),<br />
which promulgates procedures for NAD, the Children’s Advertising Review<br />
Unit (CARU), and the National Advertising Review Board (NARB). 37<br />
32 CNIL, Rights and Obligations, supra note 31.<br />
33 James Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty, 113 YALE L.J. 1151,<br />
1155 (2004) (comparing U.S. and European privacy law).<br />
34 Seth Stevenson, How New Is New? How Improved Is Improved? The People Who Keep Advertisers<br />
Honest, SLATE, July 13, 2009, http://www.slate.com/id/2221968.<br />
35 Id. (“When an ad is brought to their attention, the NAD’s lawyers review the specific claims<br />
at issue. The rule is that the advertiser must have substantiated any claims before the ad was<br />
put on the air, so the NAD will first ask for any substantiating materials the advertiser can<br />
provide. If the NAD lawyers determine that the claims aren’t valid, they’ll recommend that<br />
the ad be altered. The compliance rate on this is more than 95 percent. But if the advertiser<br />
refuses to modify the ad (this is a voluntary, self-regulating body, not a court of law), the<br />
NAD will refer the matter to the Federal Trade Commission. One such FTC referral<br />
resulted in an $83 million judgment against a weight-loss company.”).<br />
36 Id.<br />
37 NATIONAL ADVERTISING REVIEW COUNCIL, THE ADVERTISING INDUSTRY’S PROCESS OF<br />
VOLUNTARY SELF-REGULATION: POLICIES AND PROCEDURES § 2.1(a) (July 27, 2009) (“The<br />
National Advertising Division of the Council of Better Business Bureaus (hereinafter NAD),<br />
and the Children’s Advertising Review Unit (CARU), shall be responsible for receiving or<br />
initiating, evaluating, investigating, analyzing (in conjunction with outside experts, if<br />
warranted, and upon notice to the parties), and holding negotiations with an advertiser, and<br />
resolving complaints or questions from any source involving the truth or accuracy of<br />
national advertising.”). Though billed as “self-regulation,” it is difficult to see how the policy<br />
would have teeth were it not self-regulation in the shadow of an FTC empowered by the<br />
Lanham Act to aggressively police false advertising. The FTC h<strong>as</strong> several mechanisms by<br />
which to regulate unfair business practices in commerce. See, e.g., 15 U.S.C. § 45(b) (2006)
Instead of an “Innovation Environment Protection Agency (iEPA)” (the agency<br />
Lawrence Lessig proposed to supplant the FCC), I would recommend the<br />
formation of an Internet Intermediary Regulatory Council (IIRC), which would<br />
assist both the FCC and FTC in carrying out their present missions. 38 Like the<br />
NARC, the IIRC would follow up on complaints made by competitors or the<br />
public, or act when it determines that a practice deserves investigation. If the<br />
self-regulatory council failed to reconcile conflicting claims, it could refer complaints<br />
to the FTC (in the case of search engines, which implicate the FTC’s extant<br />
expertise in both privacy and advertising) or the FCC (in the case of carriers).<br />
In either context, an IIRC would need not only lawyers, but also engineers and<br />
programmers who could fully understand the technology affecting data, ranking,<br />
and traffic management practices.<br />
An IIRC would research and issue reports on suspect practices by Internet<br />
intermediaries, while respecting the intellectual property of the companies it<br />
investigated. An IIRC could generate official and even public understanding of<br />
intermediary practices, while keeping crucial proprietary information under the<br />
control of the companies it monitors. An IIRC could develop a detailed<br />
description of safeguards for trade secrets, which would prevent anyone outside<br />
its offices from accessing the information. 39 Another option would be to allow<br />
IIRC agents to inspect such information without actually obtaining it. An IIRC<br />
could create “reading rooms” for use by its experts, just as some courts allow<br />
restrictive protective orders to govern discovery in disputes involving trade<br />
secrets. The experts would review the information in a group setting (possibly<br />
during a period of days) to determine whether a given intermediary had engaged<br />
in practices that could constitute a violation of privacy or consumer protection<br />
laws. Such review would not require any outside access to sensitive<br />
information.<br />
I prefer not to specify at this time whether an IIRC would be a private or public<br />
entity. Either approach would have distinct costs and benefits explored (in<br />
part) by a well-developed literature on the role of private entities in Internet<br />
(giving the commission the authority to register an official complaint against an entity<br />
engaged in unfair business methods).<br />
38 It could include a search engine division, an ISP division focusing on carriers, and eventually<br />
divisions related to social networks or auction sites if their practices begin to raise<br />
commensurate concerns.<br />
39 This is the way that the NAD proceeds. It provides specific procedures under which the<br />
participants can request that certain sensitive information be protected. See NAT’L<br />
ADVERTISING REVIEW COUNCIL, THE ADVERTISING INDUSTRY’S PROCESS OF VOLUNTARY<br />
SELF-REGULATION § 2.4(d)–(e), at 4–5 (2009),<br />
http://www.nadreview.org/07_Procedures.pdf (discussing procedure for confidential<br />
submission of trade secrets).
governance. 40 Regardless of whether monitoring is done by a governmental<br />
entity (like CNIL) or an NGO (like NARC), we must begin developing the<br />
institutional capacity to permit a more rapid understanding of intermediary<br />
actions than traditional litigation permits. 41<br />
It is not merely markets and antitrust enforcement that are insufficient to<br />
constrain problematic intermediary behavior—the common law is also likely to<br />
fall short. It is hard to imagine any but the wealthiest and most sophisticated<br />
plaintiffs’ attorneys attempting to understand the tweaks to the Google<br />
algorithm that might have unfairly diminished their clients’ sites’ salience. Trade<br />
secrets have been deployed in the context of other litigation to frustrate<br />
investigations of black box algorithms. 42 Examination of Google’s algorithms<br />
subject to very restrictive protective orders would amount to a similar barrier to<br />
accountability. Given its recent string of litigation victories, it is hard to imagine<br />
rational litigants continuing to take on that risk. Moreover, it makes little sense<br />
for a court to start from scratch in understanding the complex practices of<br />
intermediaries when an entity like the IIRC could develop lasting expertise in<br />
interpreting their actions.<br />
A status quo of unmonitored intermediary operations is a veritable “ring of<br />
Gyges,” 43 tempting them to push the envelope with policing practices which<br />
40 See, e.g., Philip J. Weiser, Internet Governance, Standard Setting, and Self-Regulation, 28 N. KY. L.<br />
REV. 822, 822 (2001) (examining “in particular the nature and limits of a key private<br />
regulator of the Internet: standard-setting organizations and their institution of open,<br />
interoperable standards”).<br />
41 Google has already recognized the need for some kind of due process in response to<br />
complaints about its labeling of certain websites as “harmful” (due to the presence of viruses<br />
or other security threats at the sites) via the StopBadware program. See ZITTRAIN, FUTURE<br />
OF THE INTERNET, supra note 10, at 171 (“Requests for review—which included pleas for<br />
help in understanding the problem to begin with—inundated StopBadware researchers, who<br />
found themselves overwhelmed in a matter of days by appeals from thousands of Web sites<br />
listed. Until StopBadware could check each site and verify it had been cleaned of bad code,<br />
the warning page stayed up.”). Google’s cooperation with the Harvard Berkman Center for<br />
Internet Research to run the StopBadware program could prefigure future intermediary<br />
cooperation with NGOs to provide “rough justice” to those disadvantaged by certain<br />
intermediary practices.<br />
42 See Jessica Ring Amunson & Sam Hirsch, The C<strong>as</strong>e of the Disappearing Votes: Lessons from the<br />
Jennings v. Buchanan Congressional Election Contest, 17 WM. & MARY BILL RTS. J. 397, 397–98<br />
(2008) (“[T]he litigation ultimately w<strong>as</strong> utterly inconclusive <strong>as</strong> to the re<strong>as</strong>on for the 18,000<br />
electronic undervotes because discovery targeting the defective voting system w<strong>as</strong> thwarted<br />
when the voting machines’ manufacturer successfully invoked the trade-secret privilege to<br />
block any investigation of the machines or their software by the litigants.”).<br />
43 “The Ring of Gyges is a mythical magical artifact mentioned by the philosopher Plato in<br />
book 2 of his Republic (2.359a–2.360d). It granted its owner the power to become invisible<br />
at will. Through the story of the ring, Republic discusses whether a typical person would be<br />
moral if he did not have to fear the consequences of his actions.” Wikipedia, Ring of Gyges,<br />
http://en.wikipedia.org/wiki/Ring_of_Gyges (l<strong>as</strong>t accessed Dec. 1, 2010).
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 359
cannot be scrutinized or challenged. Distortions of the public sphere are also likely. While a commercially influenced “fast-tracking” or “up-ranking” of some content past others might raise suspicions among its direct (but dispersed) victims, the real issues it raises are far broader. If an online ecology of information that purports to be based on one mode of ordering is actually based on another, it sets up an unfair playing field whose biases are largely undetectable by lay observers. Stealth marketing generates serious negative externalities that menace personal autonomy and cultural authenticity. Moreover, the degree of expertise necessary to recognize these externalities in the new online environment is likely to be possessed by only the most committed observers.

This potent combination of expertise and externalities is a classic rationale for regulation. As Danny Weitzner’s proposal for “extreme factfinding” (in the context of the Google–DoubleClick merger review) recognized, only a dedicated group of engineers, social scientists, attorneys, and computer scientists is likely to be adept enough at understanding search engine decisions as a whole to understand particular complaints about them. 44 Someone needs to be able to examine the finer details of the publicly undisclosed operation of culturally significant automated ranking systems—that is, to watch those who watch and influence us. 45
44 See generally Danny Weitzner, What to Do About Google and Doubleclick? Hold Google to Its Word With Some Extreme Factfinding About Privacy Practices, GOOGLE OPEN INTERNET POLICY BLOG, Oct. 8, 2007, http://dig.csail.mit.edu/breadcrumbs/node/203:

In the 1990s, the FTC under Christine Varney’s leadership pushed operators of commercial websites to post policies stating how they handle personal information. That was an innovative idea at the time, but the power of personal information processing has swamped the ability of a static statement to capture the privacy impact of sophisticated services, and the level of generality at which these policies tend to be written often obscure the real privacy impact of the practices described. It’s time for regulators to take the next step and assure that both individuals and policy makers have information they need.

Weitzner proposes that “[r]egulators should appoint an independent panel of technical, legal and business experts to help them review, on an ongoing basis, the privacy practices of Google.” Id. The panel would be “made up of those with technical, legal and business expertise from around the world.” Id. It would hold “public hearings at which Google technical experts are available to answer questions about operational details of personal data handling.” Id. There would be “staff support for the panel from participating regulatory agencies,” “real-time publication of questions and answers,” and “[a]n annual report summarizing what the panel has learned.” Id.

45 In the meantime, Google has been developing a tool that would help consumers detect if their Internet service provider was “running afoul of Net neutrality principles.” Stephanie Condon, Google-Backed Tool Detects Net Filtering, Blocking, CNET NEWS, Jan. 28, 2009, http://news.cnet.com/8301-13578_3-10152117-38.html (“[The tool, M-Lab,] is running three diagnostic tools for consumers: one to determine whether BitTorrent is being blocked or throttled, one to diagnose problems that affect last-mile broadband networks, and one to
360 CHAPTER 6: SHOULD ONLINE INTERMEDIARIES BE REQUIRED TO POLICE MORE?
Why Dominant Search Engines & Carriers Deserve More Scrutiny than Dominant Auction Sites & Social Networks

Those skeptical of the administrative state may find this proposal to “watch the watchers” problematic. They think of intermediaries as primarily market actors, to be disciplined by market constraints. However, the development of dominant Web 2.0 intermediaries was itself a product of particular legal choices about the extent of intellectual property rights and the responsibilities of intermediaries made in legislative and judicial decisions in the 1990s. As intermediaries gained power, various entities tried to bring them to heel—including content providers, search engine optimizers, trademark owners, and consumer advocates. In traditional information law, claims under trademark, defamation, and copyright law might have posed serious worries for intermediaries. However, revisions of communications and intellectual property law in the late 1990s provided safe harbors that can trump legal claims sounding in each of these other areas. 46 Some basic reporting responsibilities are a small price to pay for continuing enjoyment of such immunities.

An argument for treating Internet intermediaries more like regulated entities owes much to the trail-blazing work of legal realists. Among these, Robert Hale’s work on utilities remains especially inspirational. 47 Hale developed many of the theoretical foundations of the New Deal, focusing on the ways in which the common law became inadequate as large business entities began ordering
diagnose problems limiting speeds.”). It remains to be seen whether Google itself would submit to a similar inspection to determine whether it was engaging in stealth marketing or other problematic practices.

46 17 U.S.C. § 512(d) (2000) (Digital Millennium Copyright Act of 1998 safe harbor); 47 U.S.C. § 230(c)(1) (2000) (Communications Decency Act of 1996 safe harbor for intermediaries). For critical commentary on the latter, see Michael L. Rustad & Thomas H. Koenig, Rebooting Cybertort Law, 80 WASH. L. REV. 335, 371 (2005) (“An activist judiciary, however, has radically expanded § 230 by conferring immunity on distributors. Section 230(c)(1) has been interpreted to preclude all tort lawsuits against ISPs, websites, and search engines. Courts have … haphazardly lump[ed] together web hosts, websites, search engines, and content creators into this amorphous category.”).

47 Ilana Waxman, Note, Hale’s Legacy: Why Private Property Is Not a Synonym for Liberty, 57 HASTINGS L.J. 1009, 1019 (“Hale’s most fundamental insight was that the coercive power exerted by private property owners is itself a creature of state power… . By protecting the owner’s property right … ‘the government’s function of protecting property serves to delegate power to the owners’ over non-owners, so that ‘when the owners are in a position to require non-owners to accept conditions as the price of obtaining permission to use the property in question, it is the state that is enforcing compliance, by threatening to forbid the use of the property unless the owner’s terms are met.’ … [A]ll property essentially constitutes a delegation of state power to the property owner… .”). For a powerful application of these ideas to Internet law, see Julie Cohen, Lochner in Cyberspace: The New Economic Orthodoxy of ‘Rights Management,’ 97 MICH. L. REV. 462 (1998).
increasing proportions of the national economy. 48 Hale’s crucial insight was that many of the leading businesses of his day were not extraordinary innovators that “deserved” all the profits they made; rather, their success was dependent on a network of laws and regulation that could easily shift favor from one corporate player to another. 49 Hale focused his theoretical work on the utilities of his time, expounding an economic and philosophical justification for imposing public service obligations on them. Regulatory bodies like state utility commissions and the FCC all learned from his work, which showed the inadequacy of private law for handling disputes over infrastructural utilities.

Market advocates may worry that monitoring of search engines and carriers will lead to more extensive surveillance of the affairs of other intermediaries, like social networks and auction platforms. They may feel that competition is working in each of those areas, and should be the foundation of all intermediary policy. However, competition is only one of many tools we can use to encourage responsible and useful intermediaries. We should rely on competition-promotion via markets and antitrust only to the extent that (a) the intermediary in question is an economic (as opposed to cultural or political) force; (b) the “voice” of the intermediary’s user community is strong; 50 and (c) competition is likely to be genuine and not contrived. These criteria help us map older debates about platforms onto newer entities.
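The three criteria above can be read as a rough decision rule. The sketch below is purely illustrative: the class, the field names, and the majority threshold are my own assumptions rather than anything the essay specifies, but the scoring mirrors how the essay later weighs eBay, search engines, and social networks against these factors.

```python
from dataclasses import dataclass

@dataclass
class Intermediary:
    """One intermediary, scored on the essay's three criteria."""
    name: str
    economic_not_cultural: bool  # (a) primarily an economic, not cultural/political, force
    strong_user_voice: bool      # (b) the user community's "voice" is strong
    genuine_competition: bool    # (c) competition is genuine, not contrived

def rely_on_competition(i: Intermediary) -> bool:
    """Assumed weighing: a majority of the three criteria suffices to prefer
    competition policy and antitrust over direct regulatory scrutiny."""
    met = sum([i.economic_not_cultural, i.strong_user_voice, i.genuine_competition])
    return met >= 2

# Rough scorings drawn from the essay's own discussion:
ebay = Intermediary("eBay", True, True, False)            # (a) and (b) outweigh weak (c)
search = Intermediary("dominant search engine", False, False, False)
```

On this toy reading, eBay’s strong showing on factors (a) and (b) overwhelms its weak (c), while a dominant search engine fails all three, which is exactly the asymmetry the following paragraphs defend.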
For search engines and carriers, each of these factors strongly militates in favor of regulatory intervention. Broadband competition has failed to materialize beyond duopoly service for most Americans. There are several reasons to suspect that Google’s dominance of the general purpose search market will continue to grow. 51 Just as past policymakers recognized the need for common
48 Duncan Kennedy, The Stakes of Law, or, Hale and Foucault, 15 LEGAL STUDIES FORUM (4) (1991).

49 BARBARA FRIED, THE PROGRESSIVE ASSAULT ON LAISSEZ FAIRE: ROBERT HALE AND THE FIRST LAW AND ECONOMICS MOVEMENT (1998), available at http://www.hup.harvard.edu/catalog/FRIPRA.html.

50 Competition is designed to provide users an “exit” option; regulation is designed to give them more of a “voice” in an intermediary’s governance. ALBERT O. HIRSCHMAN, Exit and Voice: An Expanding Sphere of Influence, in RIVAL VIEWS OF MARKET SOCIETY AND OTHER RECENT ESSAYS 78–80 (1986) (describing “exit” and “voice” as two classic options of reform or protest). To the extent exit is available, voice (influence) within the relevant intermediary becomes less necessary; to the extent voice is available, exit becomes less necessary.

51 Bracha & Pasquale, Federal Search Commission, supra note 26, at 1179. Section III of the article, “Why Can’t Non-Regulatory Alternatives Solve the Problem?,” addresses the many factors impeding competition in the search market. Present dominance entrenches future dominance as the leading search engine’s expertise on user habits grows to the extent that no competitor can match its understanding of how to target ads well. Id. Since that article was published, Harvard Business School Professor Ben Edelman has investigated another self-reinforcing aspect of Google’s market power: the non-portability of AdSense data, which
carrier obligations for concentrated communications industries, present ones will need to recognize carriers’ and search engines’ status as increasingly essential facilities for researchers, advertisers, and media outlets. 52

The parallel is apt because, to use the three dimensions discussed above, carriers and dominant general-purpose search engines (a) are just as important to culture and politics as they are to economic life, (b) conceal key aspects of their operations, and are essentially credence goods, vitiating user community influence, and (c) do not presently face many strong competitors, and are unlikely to do so in the immediate future. The first point—regarding cultural power—should lead scholars away from merely considering economies of scale and scope and network effects in evaluating search engines. We need to consider all dimensions of network power—the full range of cultural, political, and social obstacles to competition that a dominant standard can generate. 53 Moreover, policymakers must acknowledge that competition itself can drive practices with many negative externalities. The bottom line here is that someone needs to be able to “look under the hood” of culturally significant automated ranking systems.

What about auction platforms, another important online intermediary? 54 Here, a purely economic, antitrust-driven approach to possible problems is more appropriate. To use the criteria mentioned above: (a) a site like eBay is a very important online marketplace, but has little cultural or political impact, and (b) the user community at eBay understands its reputation rankings very well, and has shown remarkable capacities for cohesion and self-organization to protest
makes it difficult for Google customers to apply what they have learned about their Internet customers to ad campaigns designed for other search engines. Ben Edelman, PPC Platform Competition and Google’s ‘May Not Copy’ Restriction, June 27, 2008, http://www.benedelman.org/news/062708-1.html. As Edelman shows, Google has tried to make the data it gathers for companies “sticky,” inextricable from its own proprietary data structures.

52 TIM WU, THE MASTER SWITCH (Knopf, 2010) (promoting a “separations principle” for the digital landscape).

53 DAVID GREWAL, NETWORK POWER: THE SOCIAL DYNAMICS OF GLOBALIZATION 45 (Yale Univ. Press 2008) (“[T]he network power of English isn’t the result of any intrinsic features of English (for example, ‘it’s easy to learn’): it’s purely a result of the number of other people and other networks you can use it to reach… . The idea of network power … explains how the convergence on a set of common global standards is driven by the accretion of individual choices that are free and forced at the same time.”).

54 David S. Evans, Antitrust Issues Raised by the Emerging Global Internet Economy, 102 NW. U. L. REV. COLLOQUY 285, 291 (2008) (“European Community law and decisional practice … impose special obligations and significant scrutiny on firms that have market shares as low as 40 percent.”). Evans compiles data demonstrating that some leading auction platforms (such as eBay) are well above this market share in Europe and the U.S. Id. (citing comScore, MyMetrix qSearch 2.0 Key Measures Report, Dec. 2007, http://www.comscore.com/method/method.asp).
(and occasionally overturn) policies it dislikes. These factors overwhelm the possibility that (c) competition in the general auction market (as opposed to niche auctions) may be unlikely to develop. If real competitors fail to materialize due to illicit monopolization, antitrust judgments against Microsoft (and parallel requirements of some forms of “operating system neutrality”) can guide future litigants seeking online auction platform neutrality. While eBay’s user community successfully pressured Disney to end its 2000 special-preference deal with eBay, in the future antitrust judgments or settlements might require the full disclosure of (and perhaps put conditions on) such deals. 55

In social networks, another area where tipping can quickly lead to one or a few players’ dominance, 56 the situation is more mixed. While Rebecca MacKinnon and danah boyd have compared Facebook to a utility, the famously market-oriented Economist magazine has compared it to a country, possibly in need of a constitution and formal input from users. Social networks are closer to search engines than auction sites with respect to factor (a): they are becoming crucial hubs of social interactions, cultural distribution and promotion, and political organizing. 57

On the other hand, social networks provide some leverage to their members to police bad behavior, opening up “voice” options, with respect to factor (b), far more potent than those available to the scattered searchers of Google. A group named “Facebook: Stop Invading My Privacy” became very popular within Facebook itself, catalyzing opposition to some proposed features of its Beacon program in 2008. 58 Facebook’s privacy snafus in early 2009 led the company to organize formal user community input on future alterations to the company’s terms of service. On the final factor, competitive dynamics, it appears that competition is more likely to develop in the social network space than in the broadband, search engine, or auction platform industries. There is a more
55 In 2000, eBay granted special perks to Disney on a platform within its auction site. After protest from “the eBay community,” the perks ceased. eBay CEO Meg Whitman said of the special Disney deal: “We’ve concluded that eBay has to be a level playing field. That is a core part of our DNA, and it has to be going forward.” ADAM COHEN, THE PERFECT STORE: INSIDE EBAY 292 (Back Bay Books 2006).

56 In early 2008, 98% of Brazilian social networkers used Google’s Orkut; 97% of South Korean social networkers used CyWorld, and 83% of American social networkers used MySpace or Facebook. Evans, supra note 54, at 292.

57 James Grimmelmann, Saving Facebook, 94 IOWA L. REV. 1137, 1152–59 (2009), http://works.bepress.com/james_grimmelmann/20/.

58 William McGeveran, Disclosure, Endorsement, and Identity in Social Marketing, 2009 U. ILL. L. REV. 1105, 1120, http://www.law.uiuc.edu/lrev/publications/2000s/2009/2009_4/McGeveran.pdf.
diverse playing field here than in the carrier or search space, with more than 4,000 social networks in the United States. 59

Any policy analysis of dominant intermediaries should recognize the sensitive cultural and political issues raised by them. The cultural, communal, and competitive dynamics surrounding dominant search engines and carriers defy easy or stereotyped responses. Qualified transparency will assist policymakers and courts that seek to address the cultural, reputational, and political impact of dominant intermediaries.

Conclusion

As David Brin predicted in The Transparent Society, further disclosure from corporate entities needs to accompany the scrutiny we all increasingly suffer as individuals. 60 While the FTC and the FCC have articulated principles for protecting privacy, they have not engaged in the monitoring necessary to enforce these guidelines. This essay promotes institutions designed to develop better agency understanding of privacy-eroding practices. Whether public or private, such institutions would respect legitimate needs for business confidentiality while promoting individuals’ capacity to understand how their reputations are shaped by dominant intermediaries.
59 Evans, supra note 54, at 290.

60 DAVID BRIN, THE TRANSPARENT SOCIETY: WILL TECHNOLOGY FORCE US TO CHOOSE BETWEEN PRIVACY AND FREEDOM? (Basic Books 1999).
Online Liability for Payment Systems

By Mark MacCarthy *

Introduction

U.S. policy toward the liability of Internet intermediaries for online harms was set in the late 1990s. It consisted of two parts. The first part was Section 230 of the 1996 Telecommunications Act, providing a safe harbor from indirect liability for online service providers. 1 This safe harbor is an exception from a range of normal liabilities that would apply to traditional providers of media content such as broadcasters and newspapers. It does not apply to all intermediaries or platform providers, but to what might be called “pure” Internet intermediaries. That is, it covers intermediaries to the extent they are providing services that are somehow intrinsic to the Internet. Under its terms, except for requirements of contract law, criminal law, and intellectual property law, online entities are not responsible for the content of the material that is found on their systems as long as it has been provided by another information content provider.
The second part of the U.S. policy toward Internet intermediary liability was set out in 1998 with the Digital Millennium Copyright Act. 2 The DMCA allows a complete exemption from copyright liability for entities involved in pure transmission activities. It also creates a notice-and-takedown regime for web hosts and other online service providers, and allows recipients of these notices to challenge them. Upon receipt of a response, the online service providers are required to reinstate the allegedly infringing material unless the rights holder has filed a legal infringement action. Online service providers are exempt from liability for good faith removal of material following a notice. The statute also provides for penalties if a rights holder files a notification that knowingly
* Mark MacCarthy is Adjunct Professor in the Communications, Culture and Technology Program at Georgetown University. Formerly, he was Senior Vice President for Public Policy at Visa Inc. Substantial portions of this essay were originally published as Mark MacCarthy, What Payment Intermediaries Are Doing About Online Liability and Why It Matters, 25 BERKELEY TECHNOLOGY LAW JOURNAL 1039 (2010).

1 47 U.S.C. § 230(c)(1) (2006) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). The interpretation of this provision is quite broad. See, e.g., Zeran v. Am. Online, Inc., 129 F.3d 327, 330-31 (4th Cir. 1997) (finding that plaintiff’s tort claims of defamation were preempted by § 230). The immunity does not extend to criminal law, contract law, or intellectual property law. 47 U.S.C. § 230(e)(1)-(4) (2006).

2 17 U.S.C. § 512, available at http://www4.law.cornell.edu/uscode/17/512.html.
misrepresents that the material is infringing. The DMCA also requires online service providers to have in place some procedures to respond to “repeat infringers,” including termination of accounts in appropriate circumstances. 3
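The notice, counter-notice, and reinstatement sequence just described can be sketched as a small state machine. This is a toy illustration of the procedural flow only, under assumed simplifications (actual § 512 practice involves deadlines, identification requirements, and good-faith attestations); the class and method names are my own, not statutory terms of art.

```python
from enum import Enum, auto

class Status(Enum):
    LIVE = auto()        # material is up
    TAKEN_DOWN = auto()  # removed after a takedown notice
    REINSTATED = auto()  # restored after a counter-notice with no suit filed

class HostedMaterial:
    """Toy model of the DMCA notice-and-takedown flow described above."""
    def __init__(self):
        self.status = Status.LIVE

    def takedown_notice(self):
        # The host removes the material; good-faith removal is itself immunized.
        self.status = Status.TAKEN_DOWN

    def counter_notice(self, infringement_suit_filed: bool):
        # On a counter-notice, the host must reinstate the material
        # unless the rights holder has filed an infringement action.
        if self.status is Status.TAKEN_DOWN and not infringement_suit_filed:
            self.status = Status.REINSTATED

contested = HostedMaterial()
contested.takedown_notice()
contested.counter_notice(infringement_suit_filed=False)  # ends REINSTATED

litigated = HostedMaterial()
litigated.takedown_notice()
litigated.counter_notice(infringement_suit_filed=True)   # stays TAKEN_DOWN
```

The asymmetry the essay’s critics highlight is visible even in this sketch: takedown happens on a bare notice, while reinstatement requires an affirmative counter-notice and the absence of a lawsuit.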
Many commentators think that the DMCA represents a balanced compromise. 4 However, controversy persists. Content providers have successfully lobbied for laws imposing more robust responsibilities for stopping copyright infringement on Internet service providers (ISPs). France and the United Kingdom, for example, have adopted “graduated response” mechanisms. 5 These liability regimes require ISPs to forward copyright infringement notices to alleged infringers, and to disconnect alleged repeat infringers.

On the other hand, defenders of civil liberties and the First Amendment think the DMCA notice-and-takedown requirements are too strong, arguing that a large proportion of the complaints filed under the law are improper, 6 and that the regime contains an inherent imbalance toward takedown, even when First Amendment values are implicated. 7

In another essay in this collection, Brian Holland strongly defends Section 230 as a modified version of Internet exceptionalism, 8 and as providing the basis for the development of innovation on the Internet. 9 However, it, too, has been
3 17 U.S.C. § 512(i) conditions the eligibility of the safe harbor. It applies only if the service provider “has adopted and reasonably implemented, and informs subscribers and account holders of the service provider’s system or network of, a policy that provides for the termination in appropriate circumstances of subscribers and account holders of the service provider’s system or network who are repeat infringers.” Intermediaries such as Google, YouTube and AT&T appear to have established termination policies for copyright infringement.

4 See, for example, “United States and Canada Overview,” in RONALD DEIBERT, JOHN PALFREY, RAFAL ROHOZINSKI & JONATHAN ZITTRAIN, ACCESS CONTROLLED 378 (2010) and JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET AND HOW TO STOP IT 119 (2008).

5 Eric Pfanner, U.K. Approves Crackdown on Internet Pirates, NEW YORK TIMES, April 8, 2010, http://www.nytimes.com/2010/04/09/technology/09piracy.html?scp=1&sq=digital%20economy%20bill%20uk&st=cse; Eric Pfanner, France Approves Wide Crackdown on Net Piracy, NEW YORK TIMES, Oct. 22, 2009, http://www.nytimes.com/2009/10/23/technology/23net.html?_r=1.

6 Jennifer M. Urban & Laura Quilter, Efficient Process or ‘Chilling Effects’? Take-down Notices Under Section 512 of the Digital Millennium Copyright Act, 22 SANTA CLARA HIGH TECH. L.J. 621 (2006).

7 Wendy Seltzer, Free Speech Unmoored in Copyright’s Safe Harbor: Chilling Effects of the DMCA on the First Amendment, BERKMAN CENTER RESEARCH PUBLICATION NO. 2010-3, at 16, March 2010, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1577785.

8 See chapter 3; see also H. Brian Holland, In Defense of Online Intermediary Immunity: Facilitating Communities of Modified Exceptionalism, 56 U. KAN. L. REV. 369, 397 (2007).

9 Remarks by Lawrence Strickling, Assistant Secretary of Commerce for Communications and Information, to Internet Society’s INET Series: Internet 2020: The Next Billion Users, April 29,
controversial. Some argue that it allows ISPs to avoid making socially desirable investments necessary to provide security on their networks. 10 Others think it allows hosting sites to escape their responsibility for defamation and other harms caused by people who use their sites to spread false and damaging information. 11

Commentary on the controversies involved in these two pillars of the U.S. policy toward online liability is growing. 12 Work on whether to revise the consensus position on intermediary liability is underway. 13

This essay attempts to contribute to this debate by looking at what payment systems have been doing about online liability. This will provide an illuminating perspective on the debate for a very straightforward reason: Payment systems have operated outside this framework for online liability. They are not covered by Section 230 and they are not subject to the notice-and-takedown provisions of the DMCA. How have they handled issues relating to the use of payment systems for illegal activity online? This essay explores this question through an examination of two cases in which they have been called upon to take steps to control illegal activity involving their payment systems: Internet gambling and copyright infringement.

Some have argued that payment systems should have legal responsibility for keeping their systems free of illegal online activity. 14 Payment systems can keep track of those who use their system online – both merchants and cardholders have contracts with financial institutions. The online transactions using payment systems can be tracked electronically by type. Governments and aggrieved parties might not be able to find wrong-doers who use payment systems for illegal online activity, but the payment system providers can. They are the “least-cost”
2010, available at http://www.ntia.doc.gov/presentations/2010/InternetSociety_04292010.html.

10 Doug Lichtman & Eric Posner, Holding Internet Service Providers Accountable, 14 U. CHI. SUP. CT. ECON. REV. 221 (2006).

11 JOHN PALFREY AND URS GASSER, BORN DIGITAL 106 (2008), and DANIEL SOLOVE, THE FUTURE OF REPUTATION 125-160 (2007).

12 See Adam Thierer, Dialogue: The Future of Online Obscenity and Social Networks, ARS TECHNICA, March 5, 2009, http://arstechnica.com/tech-policy/news/2009/03/a-friendlyexchange-about-the-future-of-online-liability.ars.

13 See, for instance, Organization for Economic Cooperation and Development, The Economic and Social Role of Internet Intermediaries, April 2010, http://www.oecd.org/dataoecd/49/4/44949023.pdf.

14 Ronald J. Mann & Seth R. Belzley, The Promise of Internet Intermediary Liability, 47 WM. & MARY L. REV. 239, 249-50 (2005), available at http://scholarship.law.wm.edu/cgi/viewcontent.cgi?article=1225&context=wmlr.
368 CHAPTER 6: SHOULD ONLINE INTERMEDIARIES BE REQUIRED TO POLICE MORE?<br />
avoider of the damage done by this illegal online activity and so should bear the<br />
burden of controlling it.<br />
This perspective seems wrong to me. Still, payment system practices toward<br />
illegal online activity on their systems suggest several lessons. First, regardless of<br />
the precise legal liabilities, intermediaries have a general responsibility to keep<br />
their systems free of illegal transactions and they are taking steps to satisfy that<br />
obligation. Second, the decision to impose legal responsibilities on<br />
intermediaries should not be based on the least-cost avoider principle.<br />
Assessments of intermediary liability must take into account market failures, as<br />
well as an analysis of costs, benefits, and equities. Third, if intermediaries are<br />
shouldered with responsibilities to control illegal online activity, these<br />
responsibilities need to be clearly spelled out. Fourth, if governments are<br />
going to use intermediaries to enforce local laws, they must harmonize those<br />
local laws.<br />
Part II of this essay outlines a framework for the analysis of intermediary<br />
liability. This framework calls for a thorough analysis, including an assessment<br />
of market failure and an analysis of the costs, benefits, and equities involved in<br />
imposing intermediary liability. Part III applies this framework to the policies<br />
and practices of payment intermediaries in the areas of Internet gambling and<br />
online copyright infringement. Part IV draws some conclusions from these<br />
experiences.<br />
Indirect Intermediary<br />
Liability Regimes<br />
Most legal regimes hold parties liable for their own misconduct. In contrast, an<br />
indirect liability regime holds a person responsible for the wrongs committed by<br />
another. There are usually several parties involved in an indirect liability regime:<br />
the bad actor, the wronged party, and a third party. The bad actor is the person<br />
directly involved in causing the harm to the wronged party. A third party,<br />
neither the bad actor nor the wronged party, is assigned responsibility in an<br />
attempt to prevent the harmful conduct of the bad actor or to compensate the<br />
wronged party for the harm. In the case of copyright infringement, for example,<br />
the bad actor would be the infringer, the wronged party would be the record<br />
company that owned the music copyrights, and the third party would be an ISP<br />
or a payment system that facilitates the infringement. Indirect liability can be<br />
imposed through a variety of legal mechanisms. 15<br />
15 See, e.g., Douglas Lichtman, Holding Internet Service Providers Accountable, 27 REG. 54, 59 (2004)<br />
(proposing that ISP liability for cyber security issues could be established in a regime of<br />
“negligence or strict liability, whether it is best implemented by statute or via gradual<br />
common law development”); Mann & Belzley, supra note 14, at 269-72 (suggesting three<br />
possible regimes: a traditional tort regime, a takedown requirement, and a hot list).<br />
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 369<br />
A Framework for Analysis<br />
Indirect liability holds a party responsible for wrongs committed by another<br />
person. Why should there be any such rule? Why not simply hold the bad actor<br />
responsible? The economic analysis of indirect liability attempts to answer this<br />
question using some standard economic tools and concepts. 16 A standard<br />
economic framework considers issues of market failure, costs and benefits, and<br />
equity to <strong>as</strong>sess the need for an indirect liability regime in specific c<strong>as</strong>es. 17<br />
Market Failure Analysis<br />
Before imposing an indirect liability regime, economic analysis <strong>as</strong>ks whether<br />
there is really any market failure. If there is no market failure, there is no need<br />
for an indirect liability rule. In particular, there need not be an indirect liability<br />
rule when the law or the wronged party can effectively reach the bad actor<br />
directly 18 and transaction costs are not significant.<br />
Even if the wronged party cannot e<strong>as</strong>ily reach a bad actor that a third party can<br />
reach, it is still not necessary to impose liability on the third party. When the<br />
wronged party and the intermediary can e<strong>as</strong>ily negotiate an arrangement,<br />
efficiency will guide the third party to undertake enforcement efforts on behalf<br />
of the wronged party. This is a key <strong>as</strong>pect of a market failure analysis. Unless<br />
transaction costs interfere with contracting, affected parties can allocate liability<br />
efficiently through contractual design. 19<br />
16 See generally Lichtman & Posner, supra note 10 (summarizing this perspective); Douglas<br />
Lichtman & William Landes, Indirect Liability for Copyright Infringement: An Economic Perspective,<br />
16 HARV. J.L. & TECH. 395, 396-99 (2003).<br />
17 See Lichtman & Posner, supra note 10, at 228-33.<br />
18 The effective reach condition is evaluated prior to an assessment of the ability of a third<br />
party to effectively control the bad activity. See id. at 230-31. If the law or the wronged party<br />
can easily reach the bad actor, then why even consider whether to impose a duty on a third<br />
party? Of course, the bad actors are never totally out of reach of the law or wronged parties.<br />
With some finite expenditure of resources, perhaps very large, the direct bad actors could be<br />
brought to justice or harms could be prevented. The real economic question is whether<br />
those costs are larger than the costs of assigning that enforcement role to a third party. And<br />
this means that the effective reach condition collapses into the control factor discussed<br />
infra. Landes and Lichtman put the comparative point accurately, applied to the specific case<br />
of contributory copyright liability: “Holding all else equal, contributory liability is more<br />
attractive … the greater the extent to which indirect liability reduces the costs of copyright<br />
enforcement as compared to a system that allows only direct liability.” Lichtman &<br />
Landes, supra note 16, at 398.<br />
19 Lichtman & Posner, supra note 10, at 235. Lichtman and Posner also focus on what the<br />
parties might do: “The right thought experiment is to imagine that all the relevant entities<br />
and all the victims and all the bad actors can efficiently contract one to another and then to<br />
ask how the parties would in that situation allocate responsibility for detecting and deterring<br />
bad acts.” Id. at 257. But there is no need to conduct this thought experiment in the abstract.<br />
Free, equal, and rational parties can bargain to allocate responsibility, and so we can answer<br />
the question of what the parties would do in this thought experiment by looking at what they<br />
actually do. The relevant inquiry is whether the bargaining situation is free of significant<br />
transaction costs or other obstacles to reaching an agreement.<br />
Cost Benefit Analysis<br />
Some arguments for indirect liability follow a least-cost analysis. A “least cost”<br />
perspective puts the burden of enforcing the law on the party that can stop the<br />
illegal transactions at the lowest cost. Focusing on costs is desirable in order to<br />
create an efficient enforcement regime. In a “least cost” framework, the costs to<br />
the intermediary itself and to the direct customers of the intermediary must be<br />
taken into account. If ISPs or payment systems have to incur costs to monitor<br />
their systems for illegal content, those costs will be passed down to their direct<br />
customers. With the price increase, some customers stop using the service or<br />
reduce their usage of it. If the service provided is a network service, then the<br />
external network effects on other users of the service from an overall reduction<br />
in use also have to be counted. 20 According to the least-cost idea, when these<br />
costs are less than the cost of enforcement activity by the wronged party or by<br />
enforcement officials, then liability rests with the intermediary.<br />
This least-cost analysis is limited. It ignores the size of the harms that can be<br />
avoided by intermediary action. The mistake is to think that if efforts by third<br />
parties provide more enforcement than efforts by the wronged parties, then it<br />
must be worthwhile for the third parties to take these enforcement steps.<br />
Similarly, it is sometimes thought that if third parties can more easily reach bad<br />
actors than the wronged parties, then they should be required to do so. But this<br />
is wrong. It is almost always possible to spend more on enforcement and obtain<br />
some return. From an economic point of view, the question is whether that<br />
extra spending provides commensurate reductions in damages. Therefore, the<br />
least-cost rule is not the right decision rule, even in a strictly economic analysis.<br />
Instead, a full cost-benefit analysis is more appropriate. 21<br />
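The contrast between the least-cost rule and a full cost-benefit test can be made concrete with a toy calculation. The numbers below are hypothetical illustrations, not figures from this essay or any study:

```python
# Hypothetical numbers illustrating why the least-cost rule can misfire:
# it picks the cheapest enforcer without asking whether the enforcement
# effort is worth undertaking at all.

def least_cost_avoider(enforcement_costs):
    """Least-cost rule: assign the burden to the cheapest enforcer."""
    return min(enforcement_costs, key=enforcement_costs.get)

def passes_cost_benefit(cost, harm_avoided):
    """Full cost-benefit test: enforce only if the harm avoided
    exceeds the cost of the enforcement effort."""
    return harm_avoided > cost

enforcement_costs = {"wronged party": 500_000, "intermediary": 200_000}
harm_avoided = 150_000  # damages the enforcement effort would prevent

bearer = least_cost_avoider(enforcement_costs)
print(bearer)  # intermediary
print(passes_cost_benefit(enforcement_costs[bearer], harm_avoided))  # False
```

The least-cost rule assigns liability to the intermediary, yet spending $200,000 to prevent $150,000 in harm is a net social loss that a full cost-benefit analysis would reject.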
20 If there are fewer Internet subscribers, then the service is less valuable to e-commerce<br />
merchants as well, since there are fewer potential customers. See Matthew Schruers, The<br />
History and Economics of ISP Liability for Third Party Content, 88 VA. L. REV. 205, 250-52 (2002);<br />
see also Lichtman & Posner, supra note 10, at 241-43 (seeming to minimize the importance of<br />
these external network effects in assessing liability regimes: “Immunizing ISPs from liability<br />
is not the correct mechanism for encouraging them to provide positive externalities.” Id. at<br />
243). However, the loss of the ISP-generated external benefits is a potential cost of assigning<br />
liability that has to be taken into account when assessing whether to assign liability. Mann<br />
and Belzley’s article gets the overall point right, noting: “To the extent the regulation affects<br />
conduct with positive social value, as is likely in at least some of the contexts this essay<br />
discusses, the direct and indirect effects on that conduct must be counted as costs of any<br />
regulatory initiative.” Mann & Belzley, supra note 14, at 274.<br />
21 The least-cost analysis seems to function like a cost-effectiveness analysis, where a given<br />
level of enforcement is assumed and the question is how that goal can be reached at the<br />
lowest cost. See Mann & Belzley, supra note 14, at 250 (adopting that perspective as “a<br />
mature scheme of regulation that limits the social costs of illegal Internet conduct in the<br />
most cost-effective manner”). But a full cost-benefit analysis gives up the assumption of a<br />
fixed benefit goal and takes the value of benefits into account as well.<br />
There is a difference between the costs and benefits to the private parties involved<br />
and the costs and benefits to society. The costs and benefits of third-party<br />
enforcement efforts fall on different parties. A wronged party benefits from<br />
third-party enforcement efforts, and the third party pays the costs. The wronged<br />
party has a natural incentive to have the third party do as much as possible in<br />
the way of enforcement—even past the point where there is a corresponding<br />
reduction in damages—because the wronged party appropriates the damage<br />
reduction but pays no costs. From an economic efficiency point of view,<br />
enforcement efforts that do not yield a commensurate reduction in damages are<br />
wasted. Private benefits may not be worth it from a social point of view when<br />
balanced against the costs to other parties.<br />
Equity Analysis<br />
The cost-benefit framework just described lacks a normative dimension. It does<br />
not take into account questions of fairness, rights, and justice. And it does not<br />
consider who deserves the benefit of protection from harm or who is at fault or<br />
blameworthy for failing to take preventive measures.<br />
The view that an economic efficiency standard, by itself, is sufficient to create<br />
indirect liability is too strong. The focus on parties who had no part in creating<br />
the problem and who are not responsible for the illegal activity puts a burden<br />
on people who are innocent of any wrong-doing. Burdening innocent people<br />
seems unfair, and arguments that justify this approach on grounds that it is<br />
good for society as a whole violate widely accepted moral principles and are<br />
unlikely to withstand public scrutiny. 22<br />
We should require a person to right the wrongs committed by others only if we<br />
think that person is somehow responsible for those wrongs. Determining who<br />
is responsible for righting wrongs committed by others is controversial in both<br />
moral and political philosophy. 23 Libertarians generally maintain that people<br />
need to fix only the problems that they themselves directly created. 24 Without<br />
this limitation, it is hard not to slide into a doctrine that requires all actors to<br />
stop misconduct whenever they can. 25 Others think that one has a duty to<br />
correct injustices to the extent that one participates in an institutional<br />
framework that produces injustice. 26 Still others believe in general positive<br />
duties to eliminate harms even when one has no direct role in causing them. 27<br />
Ultimately, the analysis of indirect liability cannot avoid considerations of<br />
fairness, rights, and justice. The key factors in this assessment will be those that<br />
have been used traditionally: directness of the involvement by third parties in<br />
activities that lead to harm to another person, an assessment of the degree of<br />
harm involved, the knowledge that third parties have or should have about the<br />
specific harm involved, what their intentions are, whether they are consciously<br />
acting in furtherance of a crime or other illegal act, and other similar<br />
considerations. 28 These complicated normative and empirical questions cannot<br />
be avoided by a single principle that purports to look at costs and benefits<br />
alone. 29<br />
22 See, e.g., JONATHAN WOLFF, AN INTRODUCTION TO POLITICAL PHILOSOPHY 57 (1996)<br />
(stating that “utilitarianism will permit enormous injustice in the pursuit of the general<br />
happiness”). A more sophisticated indirect or rule utilitarian approach can attempt to meet<br />
this difficulty, but that approach is subject to difficulties of its own. See generally JOHN RAWLS,<br />
A THEORY OF JUSTICE (1971) (critiquing utilitarianism). The underlying intuition behind this<br />
alternative account of social justice is that “[e]ach person possesses an inviolability founded<br />
on justice that even the welfare of society as a whole cannot override.” Id. at 3.<br />
23 See infra notes 24-27 and accompanying text.<br />
24 See Jim Harper, Against ISP Liability, 28 REG. 30, 30-31 (2005) (arguing that ISPs should be<br />
liable for harms to third parties only if they have a duty to these parties and that “efficiency”<br />
considerations do not override the lack of such a duty founded on justice). Libertarians<br />
generally reject the idea that we have positive duties to ameliorate harms we did not cause.<br />
E.g., id.<br />
25 Mann & Belzley, supra note 14, at 272 (noting that the principle that liability should be<br />
assigned regardless of blameworthiness “easily could shade into judicial doctrines that would<br />
obligate all actors to stop all misconduct whenever possible” and arguing that this<br />
“unbounded principle” is “unduly disruptive”). But it is hard to see how their proposal to<br />
implement indirect liability through regulation whenever it would be less expensive than<br />
leaving liability with the wronged party would be less disruptive.<br />
26 See, e.g., THOMAS W. POGGE, WORLD POVERTY AND HUMAN RIGHTS 172 (2002) (arguing<br />
that those involved in an institutional order that authorizes and upholds slavery have a duty<br />
to protect slaves or to promote institutional reform, even if they do not own slaves<br />
themselves).<br />
27 See, e.g., David Luban, Just War and Human Rights, in INTERNATIONAL ETHICS 195, 209<br />
(Charles R. Beitz et al. eds., 1985) (stating that “all humans in a position to effect” a human<br />
right have an obligation to do so).<br />
28 Mann and Belzley criticize the “myopic focus on the idea that the inherent passivity of<br />
Internet intermediaries makes it normatively inappropriate to impose responsibility on them<br />
for the conduct of primary malfeasors.” Mann & Belzley, supra note 14, at 261-62. But<br />
passivity is relevant to the knowledge and control factors needed to assess liability from an<br />
equity point of view. Lichtman and Landes seem to criticize the focus of current law on<br />
“knowledge, control, the extent of any non-infringing uses, and other factors” because they<br />
are not “particularly clear as to why those issues are central.” Lichtman & Landes, supra note<br />
16, at 405. But these factors are crucial because they relate to the way in which the equity<br />
issues can be resolved.<br />
29 These equity considerations can interact with the cost analysis. Consider the following:<br />
suppose transaction costs make it impossible for the wronged parties to negotiate<br />
enforcement deals with a third party–they are too numerous or lack the resources to<br />
compensate the third party. Suppose further that the cost savings involved in<br />
assigning liability to a third party are substantial. And finally, stipulate that the third party’s<br />
involvement in the harm is so remote that assigning blame is a mistake. We might in that<br />
circumstance nevertheless assign liability to the third party. The gains to the rest of us are<br />
just too great. However, should we not compensate the third party for taking the<br />
enforcement steps he is required to take? Assigning indirect liability when there is not the<br />
level of control or fault needed to justify blameworthiness might be so efficient under a cost<br />
analysis that it is worth considering, but in that case the use of compensation mechanisms<br />
should also be considered.<br />
An economic framework, broadly construed and supplemented with suitable<br />
considerations of equity, can be a useful way to assess the need for indirect<br />
liability for intermediaries in specific cases. The elements of the framework are<br />
as follows:<br />
• Market Failure Analysis: Are there substantial transaction costs? Can<br />
enforcement be achieved without an indirect liability rule? Can private<br />
parties work out enforcement arrangements among themselves? Can<br />
third parties effectively work with law enforcement without an indirect<br />
liability mandate?<br />
• Cost-Benefit Analysis: Does the burden on the wronged party or on<br />
law enforcement to take enforcement steps exceed the burden on the<br />
third parties? Are the costs of enforcement efforts reasonable in light<br />
of the reduction in harm? Are there longer-term or dynamic<br />
considerations to take into account?<br />
• Equity Analysis: Do third parties exercise such close control over the<br />
harm that they should be held responsible for its mitigation or<br />
elimination? Are they blameworthy for not taking steps against it? Is<br />
the harm particularly egregious?<br />
Applying the Framework<br />
to Payment Intermediaries<br />
Payment intermediaries have developed and refined policies and practices to<br />
deal with illegal Internet transactions in their payment networks. Two general<br />
conclusions can be drawn from an analysis of these policies and practices.<br />
The first is that payment intermediary action has been effective. As the<br />
following discussions demonstrate, Internet gambling websites have been<br />
denied access to the U.S. market, and their current and projected revenues are<br />
in decline. As a result of the payment system action in the Allofmp3.com<br />
copyright infringement case, Allofmp3.com was confined to a domestic market<br />
and experienced a dramatic reduction in the volume of activity at its website.<br />
The second conclusion is that the widespread assumption that payment system<br />
action in this area is simple and almost cost-free deserves more careful<br />
consideration. 30 The discussion of payment intermediaries’ activities to control<br />
illegal activity on their systems reveals substantial costs that should give policy<br />
makers pause before moving ahead with the imposition of an indirect liability<br />
scheme for payment providers. These include:<br />
• The cost to maintain and enforce an Internet gambling coding and<br />
blocking scheme that is entirely manual and cannot be automated;<br />
• The cost from over-blocking legal transactions;<br />
• The cost to screen and check the business activity of merchants<br />
participating in the payment systems;<br />
• The cost to monitor the use of payment systems for specific illegal<br />
activity, where the payment systems are in no better position than<br />
anyone else to conduct this monitoring activity;<br />
• The cost to assess complaints of illegality, where the intermediary has<br />
no special expertise and is often less familiar with the legal and factual<br />
issues than the wronged party and the allegedly bad actor;<br />
• The cost to defend against legal challenges to enforcement actions,<br />
where the challenge typically comes in an off-shore jurisdiction; and<br />
• Longer-term costs to the United States from taking unilateral action in<br />
this area, including the encouragement of copycat regimes in other<br />
areas of law and in other jurisdictions.<br />
The reasonableness of these costs in light of the benefits achieved has not yet<br />
been seriously studied. Instead, it seems to be assumed that small compliance<br />
costs are justified by large enforcement benefits. Although precision in the<br />
estimates of costs and benefits is unlikely in this area, a more disciplined<br />
qualitative analysis is required.<br />
Internet Gambling Legislation<br />
The development of the Internet <strong>as</strong> a commercial medium presented a<br />
challenge to local gambling laws. With access to the Internet, individuals could<br />
reach gambling services from their homes, without the need to travel to a<br />
gambling merchant’s physical operation. The Internet provided a way for<br />
gambling merchants who were legal in their own jurisdictions to provide service<br />
to customers in different jurisdictions where gambling was not allowed.<br />
The United States Congress began its consideration of how to react to illegal<br />
Internet gambling in the late 1990s. 31 One early proposal was to put an<br />
31 See General Accounting Office, Internet Gambling: An Overview of the Issues, Dec. 2002,<br />
http://www.gao.gov/new.items/d0389.pdf. Many state laws made Internet gambling<br />
illegal, and federal law also appeared to outlaw at least some forms of it in interstate<br />
commerce. But the legal situation was ambiguous with respect to some forms of Internet<br />
gambling. The Interstate Wire Act of 1961 applied to Internet gambling and appeared to<br />
prohibit the use of the Internet for the “placing of bets or wagers on any sporting event or<br />
contest.” See The Interstate Wire Act (18 U.S.C. § 1084) at<br />
http://www.law.cornell.edu/uscode/18/usc_sec_18_00001084----000-.html. The U.S.<br />
Fifth Circuit Court of Appeals ruled in 2002 that the Wire Act applied only to sports betting<br />
and not to other types of online gambling. See In re MasterCard, 313 F.3d 257 (5th Cir.<br />
2002). The status of horseracing was similarly unclear. The Interstate Horse Racing Act<br />
appeared to allow the electronic transmission of interstate bets. It was amended in<br />
December 2000 to explicitly include wagers placed through the telephone or other electronic<br />
media. See the Interstate Horse Racing Act (15 U.S.C. §§ 3001-3007) at<br />
http://www.law.cornell.edu/uscode/15/usc_sup_01_15_10_57.html. These statutes<br />
appeared to allow the Internet to be used for both non-sports gambling and for gambling on<br />
horse races. The U.S. Department of Justice, however, thought, and still thinks, that existing<br />
statutes bar all forms of Internet gambling. See Letter from William E. Moschella, Assistant<br />
Attorney General, to Rep. John Conyers Jr., July 14, 2003, at<br />
http://www.igamingnews.com/articles/files/DOJ_letter-031714.pdf (“The<br />
Department of Justice believes that current federal law, including 18 U.S.C. §§ 1084, 1952,<br />
and 1955, prohibits all types of gambling over the Internet.”).<br />
enforcement burden on ISPs. It would have required ISPs to terminate<br />
domestic Internet gambling merchants and to block foreign Internet gambling<br />
merchants upon request of law enforcement. 32 This initial effort failed to pass,<br />
in part because of concerns about the effectiveness and appropriateness of<br />
putting an enforcement burden on ISPs. 33<br />
In 2006, Congress passed the Unlawful Internet Gambling Enforcement Act<br />
(UIGEA), which imposed a system of indirect liability on financial institutions<br />
for the purpose of preventing illegal Internet gambling transactions. 34 Prior to<br />
the passage of UIGEA, payment card networks devised a coding and blocking<br />
system in order to manage the risks of Internet gambling. 35 Each merchant in<br />
the payment system is normally required to identify its major line of business<br />
and to include a four-digit “merchant category code” in each authorization<br />
message. 36 For gambling, this merchant category code was 7995. 37 In addition,<br />
merchants were required to use an electronic commerce indicator when an<br />
Internet transaction was involved. 38 Together, these two pieces of information<br />
in the authorization message allowed payment networks or issuing banks to<br />
identify transactions involving Internet gambling merchants. 39<br />
32 H.R. 3125 at http://thomas.loc.gov/cgi-bin/query/D?c106:2:./temp/~c106mktqmw<br />
33 See the floor debate on H.R. 3125, CR H6057-6068, July 17, 2000,<br />
http://thomas.loc.gov/cgi-bin/query/R?r106:FLD001:H56058.<br />
34 Unlawful Internet Gambling Enforcement Act of 2006, Pub. L. No. 109–347, 120 Stat. 1884<br />
(codified at 31 U.S.C. §§ 5361–5367 (2006)).<br />
35 Financial Aspects of Internet Gaming: Good Gamble or Bad Bet?: Hearing Before the<br />
Subcomm. on Oversight and Investigations of the H. Comm. on Financial Servs., 107th<br />
Cong. 25-27, 34-35 (2001) [hereinafter Financial Aspects of Internet Gaming Hearing]<br />
(statement and testimony of Mark MacCarthy, Senior Vice President, Public Policy, Visa,<br />
U.S.A., Inc.) (describing this system of coding and blocking Internet gambling transactions);<br />
U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 20-25.<br />
36 U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 22.<br />
37 VISA MERCHANT CATEGORY CLASSIFICATION (MCC) CODES DIRECTORY, available at<br />
http://www.da.usda.gov/procurement/card/card_x/mcc.pdf.<br />
38 U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 22.<br />
Given this system, it was entirely feasible for the issuing bank or the payment<br />
network to block Internet gambling transactions. The system could<br />
accommodate conflicting laws in different jurisdictions in the following way: If<br />
it was illegal in one country, such as the United States, for cardholders to engage<br />
in Internet gambling, then the issuing banks based in that country could decline<br />
authorization requests for all properly coded Internet gambling transactions.<br />
This would effectively block these transactions. However, banks in other<br />
countries that permit Internet gambling, such as the United Kingdom, could<br />
allow the use of their cards for Internet gambling by not declining properly<br />
coded Internet gambling transactions.<br />
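A rough sketch of how this issuer-side decision could work under the coding scheme. The dictionary-based "authorization message" and its field names are illustrative assumptions (real card networks exchange structured ISO 8583 messages); only the 7995 gambling code comes from the discussion above:

```python
# Illustrative sketch of issuer-side blocking of coded Internet gambling
# transactions. The dict message format and field names are assumptions
# for illustration; real networks use ISO 8583 authorization messages.

GAMBLING_MCC = "7995"  # merchant category code for gambling

def authorize(message, issuer_bans_internet_gambling):
    """Decline when the transaction is coded as Internet gambling and the
    issuing bank's jurisdiction prohibits it; otherwise approve."""
    is_gambling = message.get("merchant_category_code") == GAMBLING_MCC
    is_internet = message.get("ecommerce_indicator", False)
    if is_gambling and is_internet and issuer_bans_internet_gambling:
        return "DECLINE"
    return "APPROVE"

msg = {"merchant_category_code": "7995", "ecommerce_indicator": True}
us = authorize(msg, issuer_bans_internet_gambling=True)   # U.S.-style issuer
uk = authorize(msg, issuer_bans_internet_gambling=False)  # U.K.-style issuer
print(us, uk)  # DECLINE APPROVE
```

The same properly coded authorization message is declined by an issuer in a prohibiting jurisdiction and approved by one in a permitting jurisdiction, which is how the scheme accommodated conflicting national laws.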
The system was limited in its ability to distinguish legal from illegal Internet<br />
gambling. If a jurisdiction recognized some Internet gambling transactions as<br />
legal and others as illegal, the system would not detect the difference. 40 The<br />
merchant category code described a type of business, not the legal status of the<br />
transaction involved. 41 If a particular jurisdiction allowed casino gambling, but<br />
not sports betting, both transactions would nevertheless be labeled 7995. And if<br />
the system was set up to block these coded transactions, then both transactions,<br />
legal and illegal, would be blocked. 42<br />
Another weakness in the system was enforcement. If an Internet gambling<br />
merchant realized that its transactions would be blocked in a large jurisdiction<br />
such as the United States, then it would have every incentive to hide. 43 Instead<br />
of describing itself as a gambling merchant, it would just code itself as a T-shirt<br />
sales site or some other legal merchant. Without the proper merchant category<br />
code, the system was blind and could not effectively block the merchant’s<br />
transactions. 44<br />
The payment networks addressed this enforcement issue with a special program to verify that Internet gambling merchants coded their transactions correctly. 45
39 Id.
40 U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 22.
41 See VISA MERCHANT CATEGORY CLASSIFICATION (MCC) CODES DIRECTORY, supra note 37 (listing all the MCC codes by "merchant type").
42 U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 22.
43 Id. at 26.
44 Id.
45 Id. at 31-32. The fines for incorrectly identifying authorization requests for online gambling transactions are set out at page 557 of the Visa International Operating Regulations. VISA, VISA INTERNATIONAL OPERATING REGULATIONS (April 2010),
Payment network personnel would test transactions at popular Internet gambling sites. They would enter a transaction at the website and track it through the payment system. Once they identified the transaction in the system, they could tell whether it had been coded properly. If a transaction was not properly coded, the network would contact the merchant's bank and tell it that its merchant was out of compliance with the coding rule. The payment network would ask the bank to take steps to bring the merchant into compliance. Finally, the network would retest the site for proper coding. 46
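The test-and-retest cycle just described can be sketched as a small loop. Everything here is a hypothetical simplification: the `Merchant` type, the bank names, and the assumption that notifying the bank immediately brings the merchant's coding into line.

```python
# Illustrative sketch of the network's compliance-testing cycle for
# merchant category coding; all names are hypothetical.
from dataclasses import dataclass

MCC_GAMBLING = "7995"

@dataclass
class Merchant:
    name: str
    acquiring_bank: str
    reported_mcc: str  # the code the merchant attaches to authorizations

def correctly_coded(merchant: Merchant) -> bool:
    """Enter a test transaction at the site and check how it shows up
    in the system: is it coded as a gambling transaction?"""
    return merchant.reported_mcc == MCC_GAMBLING

def enforce_coding(merchant: Merchant) -> list:
    """Test the site; on a miscode, notify the acquiring bank, then retest."""
    log = []
    if not correctly_coded(merchant):
        log.append(f"notify {merchant.acquiring_bank}: {merchant.name} miscoded")
        # Modeled as an immediate fix; in practice the bank brings the
        # merchant into compliance under threat of fines.
        merchant.reported_mcc = MCC_GAMBLING
        log.append("retest")
    log.append("pass" if correctly_coded(merchant) else "fail")
    return log
```

As the essay notes later, this cycle is inherently manual: the test transaction and the tracing of it through the system are done by network personnel, not by software.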
The UIGEA required payment systems to have policies and procedures reasonably designed to stop illegal Internet gambling transactions. 47 The statute creates a safe harbor for payment systems that adopt a coding and blocking scheme. 48 The Federal Reserve Board and the Department of the Treasury implemented this safe harbor with a non-exclusive description of one way in which a payment system can demonstrate that its policies and practices are reasonably designed to stop illegal Internet gambling transactions. 49 This non-exclusive description tracked existing industry practices.
http://usa.visa.com/download/merchants/visa-international-operating-regulationsmain.pdf#557. In addition, Visa requires online gambling merchants to post certain notices: "a Website for an Online Gambling Merchant must contain … [t]he statement 'Internet Gambling may be illegal in the jurisdiction in which you are located; if so, you are not authorized to use your payment card to complete this transaction.'" Id. at 594.
46 U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 32.
47 Unlawful Internet Gambling Enforcement Act of 2006, Pub. L. No. 109-347, 120 Stat. 1884 (codified at 31 U.S.C. §§ 5361-5367 (2006)).
48 12 C.F.R. § 233.6(d)(1)(ii) (2009).
49 The code's relevant section reads:
(ii) Implementation of a code system, such as transaction codes and merchant/business category codes, that are required to accompany the authorization request for a transaction, including—
(A) The operational functionality to enable the card system operator or the card issuer to reasonably identify and deny authorization for a transaction that the coding procedure indicates may be a restricted transaction; and
(B) Procedures for ongoing monitoring or testing by the card system operator to detect potential restricted transactions, including—
(1) Conducting testing to ascertain whether transaction authorization requests are coded correctly; and
(2) Monitoring and analyzing payment patterns to detect suspicious payment volumes from a merchant customer … .
Id.
Implementation Challenges with the Internet Gambling Act
UIGEA defines illegal Internet gambling as whatever is illegal under current U.S. state and federal law. It therefore perpetuates the uncertainty regarding the illegality of some Internet gambling activities. 50 Financial intermediaries have the discretion to block or not block these transactions based upon their own judgment and the strength of the legal arguments presented to them. UIGEA also protects them from liability if they over-block Internet gambling sites that turn out to be legal. The current law thereby allows substantial over-blocking and puts substantial discretion in the hands of the payment companies.
Impact of UIGEA
A large percentage of non-U.S. companies that derived extensive revenues from their operations in the United States left the market after the passage of UIGEA; all European companies that had been active in the U.S. market withdrew. 51 By December 2008, all the publicly traded online gambling firms had left the U.S. market, though most of the private firms remained. 52
Three major European online gambling merchants lost $3 billion in 2006 from this withdrawal from the U.S. market. 53 Measured traffic at particular sites declined as well. In September 2006, Party Poker, for example, which derived much of its traffic from the United States, had an average of about 12,000 active players. By November 2006, that number had dropped to about 4,000. 54
50 These uncertainties affect several types of gambling, including horse racing, state lotteries, Indian gaming, and games of skill.
51 European Commission Directorate-General for Trade, Examination Procedure Concerning an Obstacle to Trade, Within the Meaning of Council Regulation (EC) No 3286/94, Consisting of Measures Adopted by the United States of America Affecting Trade in Remote Gambling Services Complaint, Report to the Trade Barriers Regulation Committee (Commission Staff Working Paper) 59, June 10, 2009, available at http://trade.ec.europa.eu/doclib/docs/2009/june/tradoc_143405.pdf [hereinafter EC Gambling Report].
52 Casino City, Online Gambling in the United States Jurisdiction, 2009, http://online.casinocity.com/jurisdictions/united-states/.
53 See EC Gambling Report, supra note 51, at 79 ("the direct losses in revenue due to the loss of the US market for just these three companies were above $3 billion in 2006.").
54 See WhichPoker.com, UIGEA Effects, http://www.whichpoker.com/stats/UIGEAEffects (last accessed Oct. 18, 2010). WhichPoker attributes the departure of the biggest publicly traded online poker sites from the US market to stock market rules.
Shortly after UIGEA was signed into law in October 2006, analysts estimated that the value of British Internet gambling stocks had declined by $7.6 billion. 55 In the ten months between January 1, 2006 and November 1, 2006, just after the passage of UIGEA, three major European online gambling firms lost an estimated 75% of their value, totaling approximately 8.3 billion euros. 56
An estimate by the European Commission of the likely evolution of the U.S. market in the absence of the specific restrictions imposed in 2006, based on an assumption of 3% yearly growth, shows U.S. Internet gambling accounting for about $5.8 billion per year in gross revenue in 2006 and reaching almost $14.5 billion in 2012. Following the passage of UIGEA, the annual figure declined to about $4.0 billion in 2006 and was estimated at only $4.6 billion by 2012. 57 UIGEA thus reduced the size of the U.S. market well below what it would otherwise have been.
Internet Gambling Assessment
On equity grounds, the payment system's connection to Internet gambling seems too passive to justify imposing legal responsibility for blocking illegal Internet gambling. Payment intermediaries are not to blame when others use their system for Internet gambling, because these intermediaries have no specific connection to the activity other than operating a general-purpose payment system. They do not reap extra profits through special arrangements with Internet gambling merchants. Internet gambling transactions are no different from any other payment card transaction. On pure equity grounds alone, then, there is no reason to single out these transactions and impose special legal responsibilities.
A market analysis indicates that there were still some feasible enforcement arrangements that had not been established prior to the passage of UIGEA. Although intermediaries may not be responsible for their customers' gambling, many of them are concerned about the social ills connected with the activity and want to reduce its prevalence. 58 U.S. financial intermediaries had already refused to sign up domestic Internet merchants because these merchants were not
55 Eric Pfanner & Heather Timmons, U.K. Seeks Global Rules for Online Gambling, INTERNATIONAL HERALD TRIBUNE, Nov. 2, 2006, at 14, available at http://www.nytimes.com/iht/2006/11/02/technology/IHT-02gamble.html. The basis for this decline in share value was the withdrawal of these firms from the lucrative US market and the perception that they would not be able to recover the revenue lost from non-U.S. customers.
56 EC Gambling Report, supra note 51, at 83.
57 Id. at 19.
58 See Financial Aspects of Internet Gaming Hearing, supra note 39, at 25-26 (statement of Mark MacCarthy, Senior Vice President, Public Policy, Visa U.S.A., Inc.).
authorized to act legally in the United States. 59 Some state attorneys general asked the intermediaries to block offshore gambling activities, and many cooperated. 60 These agreements did not extend to all financial institutions and did not cover all states, but they could have been extended without imposing a legislative requirement.
A cost-benefit analysis of UIGEA starts with an estimate of its effect on the amount of illegal Internet gambling activity. As we have seen, the legislation did not eliminate Internet gambling in the United States, but it did reduce it substantially below what it would otherwise have been.
The costs associated with the payment systems' compliance with the legislation include the costs of maintaining and enforcing an Internet gambling coding and blocking scheme, which is entirely manual and cannot be automated, as noted above.
Another cost is the over-blocking problem created by the way in which payment intermediaries comply with UIGEA. Perfectly legal transactions will likely be blocked because payment intermediaries cannot distinguish them from illegal transactions. This illustrates that intermediaries are usually better than others at monitoring their own systems for business activity of a certain type, but not at detecting the illegality of activity on their systems. 61 The point arises in Internet gambling because the codes used by financial institutions reflect the business activity of gambling, not its status as legal or illegal. As a result, the payment systems' policies and procedures, which were adopted to comply with the Act and which have been accepted by the implementing regulations, over-block and prevent perfectly legal activity from taking place. 62
59 Id. at 26; U.S. GEN. ACCOUNTING OFFICE, supra note 31, at 20.
60 See, e.g., JACK GOLDSMITH & TIM WU, WHO CONTROLS THE INTERNET?: ILLUSIONS OF A BORDERLESS WORLD 82 (2006) (discussing Spitzer's efforts "to convince every major American credit card provider and online payment system to stop honoring web gambling transactions.").
61 See Mann & Belzley, supra note 14, at 278 ("Surely eBay is more adept at searching and monitoring its marketplace than Tiffany & Co., while eBay probably is not as effective as Tiffany & Co. in distinguishing bona fide Tiffany products from counterfeits."); see also Schruers, supra note 20, at 252 ("[T]he ISP is not the least-cost avoider when it comes to discovering [illegal] content; it is only well suited for cost avoidance after it is apprized of the problem."). Schruers adds that in this case, the wronged party may be better suited to the task of locating the offending content. Id. at 252.
62 Mann & Belzley, supra note 14, at 294. Mann and Belzley have a useful discussion of this over-blocking issue:
[A] risk always exists that imposing additional burdens on intermediaries will chill the provision of valuable goods and services. That will be especially problematic in cases where considerable risk of chilling legal conduct that is
Alternatives to UIGEA
In light of this difficulty, there might be more effective ways of assigning liability. The new law creates unnecessary confusion by failing to define the term "unlawful Internet gambling." Congressman Barney Frank has introduced legislation to license and regulate Internet gambling merchants. 63 The lack of clarity about which merchants are legal would be resolved through a licensing process. At best, the system would rely on a list of approved gambling entities that the payment networks could check before approving gambling transactions from particular Internet merchants. 64
The new licensing regime proposed in Congressman Frank's legislation would be an improvement over the existing system in the short term. But over time, the only way payment systems can operate is through a reduction in the diversity of the laws they must accommodate. The U.S. government must either find other ways to enforce its laws abroad or begin harmonizing its laws with those of other countries. One solution is an international agreement that would recognize licensing arrangements in different countries as long as they satisfied certain agreed-upon minimum standards.
adjacent to the targeted conduct exists. As discussed below, that might tend to make the use of intermediaries less plausible in file-sharing contexts where determining whether any particular act of file-sharing is illegal is difficult, and much more plausible in the gambling context where in many cases substantially all traffic to a particular site likely involves illegal conduct. Requiring intermediaries to make those kind [sic] of subjective decisions imposes costs not only on the intermediaries (that must make those decisions), but also on the underlying actors whose conduct might be filtered incorrectly.
Id. at 274. The Internet gambling case illustrates that determining when a website is engaged in illegal gambling is not a simple task. It is fraught with the kind of "subjective decisions" that Mann and Belzley are properly concerned about. Payment systems faced with this difficulty do not make these subjective decisions, instead blocking all gambling activity, including legal gambling transactions.
63 Internet Gambling Regulation, Consumer Protection, and Enforcement Act, H.R. 2267, 111th Cong. (2009).
64 See text of H.R. 2267 and discussion at http://financialservices.house.gov/press/PRArticle.aspx?NewsID=495. The House Financial Services Committee approved the measure on July 29, 2010. See Sewell Chan, Congress Rethinks Its Ban on Internet Gambling, NEW YORK TIMES, July 29, 2010, available at http://www.nytimes.com/2010/07/29/us/politics/29gamble.html. The revised legislation contains a ban on the use of credit cards for any Internet gambling, even at the newly legalized merchants, but debit cards can be used at the licensed sites. The text of the revised legislation is available at http://financialservices.house.gov/Media/file/markups/7_28_2010/Amendments--HR%202267/Frank12.pdf.
Online Copyright Infringement
The ideal copyright enforcement mechanism would be for content owners to sue direct infringers. But often, direct infringers are too ubiquitous, too small, and too difficult to find. The result is a set of well-developed notions of secondary liability for copyright infringement that involve intermediaries, as Paul Szynol discusses in another essay in this collection. These doctrines of secondary liability have evolved substantially over the past decades.
Legal Context for Intermediary Liability in Copyright Infringement
Court cases and federal statutes define some indirect responsibilities of intermediaries regarding copyright. The Supreme Court's 1984 decision in Sony Corp. of America v. Universal City Studios, Inc. 65 established a standard for assessing third-party liability: providers of a technology that can be used for infringing activities are not liable when there are "substantial non-infringing uses" of the technology. 66 The Digital Millennium Copyright Act of 1998 enabled copyright owners to enforce their existing rights in the Internet context by enlisting the help of Internet intermediaries. 67 The key mechanism for gaining the cooperation of intermediaries is a safe harbor from secondary liability. ISPs are given an exemption from secondary liability so long as they act as a pure conduit, providing only transitory communications and system caching. 68 Web hosts and search engines also receive a safe harbor, provided they comply with a specific notice-and-takedown procedure. 69 Upon receiving notification of claimed infringement, the provider must expeditiously take down or block access to the material. 70
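The notice-and-takedown sequence just described can be sketched as a minimal handler. The `Host` type and its fields are hypothetical stand-ins for whatever systems a real provider uses; the sketch only shows the shape of the obligation (on notice, expeditiously disable access), not any statutory detail such as counter-notices.

```python
# Illustrative sketch of a host's notice-and-takedown handling.
from dataclasses import dataclass, field

@dataclass
class Host:
    content: dict = field(default_factory=dict)   # content id -> accessible?
    takedown_log: list = field(default_factory=list)

    def receive_notice(self, content_id: str) -> bool:
        """On notification of claimed infringement, expeditiously
        disable access to the identified material (the condition for
        staying in the safe harbor). Returns True if access was disabled."""
        if self.content.get(content_id):
            self.content[content_id] = False  # block access
            self.takedown_log.append(content_id)
            return True
        return False
```

The point of the sketch is that the host never adjudicates infringement: it acts on the notice and keeps a record, leaving the legal question to the parties.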
Successful litigation against peer-to-peer networks in the digital music area also increased the ability of copyright owners to use third parties to combat copyright infringement where the third party is affirmatively involved in fostering the infringement. In an early file-sharing case, the Ninth Circuit found that the peer-to-peer service Napster was liable for secondary infringement based on its control and facilitation of its users' infringement of music copyrights; 71 the company subsequently went out of business in its original
65 464 U.S. 417 (1984).
66 Id. at 442.
67 17 U.S.C. § 512 (2006).
68 17 U.S.C. § 512(a)-(b).
69 17 U.S.C. § 512(c)-(d).
70 Id.
71 A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
form. 72 More recently, the Supreme Court found that another peer-to-peer service, Grokster, violated federal copyright law through "affirmative steps taken to foster infringement … by third parties," such as advertising an infringing use or instructing how to engage in an infringing use. 73
Against this background arose a question regarding payment systems: Are they liable for secondary infringement when they are used for direct infringement? In Perfect 10 v. Visa International Service Ass'n, 74 a subscription-based adult content website alleged that numerous websites based in several countries had stolen its proprietary images, altered them, and illegally offered them for sale online. 75 In response to complaints, Visa did not deny payment services to the allegedly infringing sites, and Perfect 10 brought a contributory and vicarious infringement action against Visa. The Ninth Circuit affirmed the district court's rejection of liability for Visa. 76
In Perfect 10, the Ninth Circuit dismissed the charge of contributory infringement by focusing on whether the credit card companies "materially contributed" to the infringement. 77 The court said the credit card companies did not materially contribute because they had no "direct connection" to the infringement. 78 To have a direct connection to the infringement, they would have had to reproduce, display, or distribute the allegedly infringing works, which they did not do. 79 Payment services might make it more profitable to infringe, but they are too far removed in the causal
72 Benny Evangelista, Napster Runs Out of Lives – Judge Rules Against Sale, S.F. CHRONICLE, Sept. 4, 2002, at B1.
73 MGM Studios, Inc. v. Grokster, Ltd., 545 U.S. 913, 919 (2005).
74 494 F.3d 788 (9th Cir. 2007); see Jonathan Band, The Perfect 10 Trilogy, 5 COMPUTER L. REV. INT'L 142 (2007) (discussing Perfect 10 v. Visa International Service Ass'n and its relationship to similar secondary liability cases). Band summarizes the Visa case:
Here the Ninth Circuit rejected what would have represented a significant expansion of secondary liability to actors far removed from the infringing activity. However, unlike the other cases, this case provoked a strong dissent by respected jurist Alex Kozinski. This dissent suggests that the outer edges of secondary liability remain to be defined.
Id. at 14. Judge Kozinski's dissent is indeed stinging, but it also underestimates the burden that secondary liability would place on intermediaries. Id.
75 Perfect 10, 494 F.3d at 793.
76 Id.
77 Id. at 796.
78 Id.
79 Id.
chain that leads to the actual infringing acts for them to be described as making a material contribution. 80
The court made a similar point about vicarious liability, finding that the card companies had no practical ability or right to prevent the infringing activity. 81 While credit card services can exert financial pressure on the infringing websites, they cannot stop the actual reproduction or distribution of the infringing images. 82
In his dissent, Judge Kozinski rejected both arguments. 83 According to Judge Kozinski, the card companies were directly connected to the infringement because they provided payment services. 84 Without these payment services there would be no infringement. 85 The card companies had the contractual right to terminate illegal activity on their systems, as well as the practical ability to exert financial pressure to stop or limit the infringing activity. 86
This dissent apparently played a role in a more recent case in which a district court found payment processors liable for trademark infringement for failing to take down allegedly infringing content. A key element in this case was the knowledge imputed to the payment processor of infringing activity that should have been apparent from an analysis of chargeback claims. 87
Payment System Complaint Program
Even though payment intermediaries may not be required to take steps against online copyright infringement, they have chosen to do so. 88 Payment systems cannot monitor their networks for copyright law violations. They do not have the factual basis to conclude that a particular sale of a product is a violation of
83 Id. at 810-11 (Kozinski, J., dissenting).
84 Id. at 811-12.
85 Id.
86 Id. at 816-17.
87 Gucci America, Inc. v. Frontline Processing Corp., No. 09 Civ. 6925 (HB) (S.D.N.Y. June 23, 2010).
88 See generally International Piracy: The Challenges of Protecting Intellectual Property in the 21st Century: Hearing Before the Subcomm. on Courts, the Internet, and Intellectual Property of the H. Comm. on the Judiciary, 110th Cong. 73-82 (2007) [hereinafter International Piracy Hearing] (statement of Mark MacCarthy, Senior Vice President for Global Public Policy, Visa Inc.) (providing this account of payment intermediaries and intellectual property).
someone's copyright. 89 Many music downloads are perfectly legal transactions, but some are not. Distinguishing the two is often a complex factual and legal question that payment intermediaries do not have the expertise or ability to resolve.
The payment systems have no way of knowing whether a transaction involves copyright infringement without a complaint. The payment networks have thus developed policies and procedures to handle these complaints. 90
The complaint process starts when a business entity approaches a payment system with clear, documented evidence of illegal activity and adequately identifies the infringing Internet merchant. 91 The business entity must provide substantiation that the activity is illegal and documentation that payment cards are actually being used for this illegal activity. 92
The next step is to assess legality, which can be complex in cross-border situations. 93 After wrestling with these issues, the payment networks developed a policy for cross-border transactions: if a transaction would be illegal in either the jurisdiction of the merchant or the jurisdiction of the cardholder, the transaction should not be in the payment system. 94 In cases like copyright infringement, this means that merchants are responsible for making sure that the transactions they submit to the payment system are legal both in their operating jurisdiction and in the jurisdiction in which their customer is located.
This assessment of legality requires the payment network to determine whether the type of transaction would be illegal in either jurisdiction. 95 Since the facts and law involved are often complex, the payment networks are willing to take on only the clearest cases of copyright violation. Once they determine illegality, the payment providers do what they reasonably can to assist the complaining party. Since payment networks do not work directly with merchants, they typically try to locate the bank that has the merchant account and provide the complaint to that bank, which usually resolves the issue. 96 In most cases, either the bank does not want the business and terminates the merchant
89 Id. at 76.
90 Id. at 77.
91 This Section describes the process at Visa, but other payment networks use a similar process. See id. at 85.
92 Id.
93 Id. at 77-78.
94 Id. at 78.
95 Id.
96 Id.
or takes other action to bring the merchant into compliance. 97 If the bank does not take action, the payment networks can take further enforcement action against the bank. 98
Allofmp3.com
In some instances, the merchant resists the enforcement efforts of payment systems, insists on the legality of the underlying activity, and goes to a local court to vindicate its perceived rights under local law. This is what occurred in the Allofmp3.com case.
In 2005, Visa received a documented complaint from the International Federation of the Phonographic Industry (IFPI), which represents copyright owners based in more than seventy countries. 99 The complaint alleged that Allofmp3.com, a website located in Russia, was infringing the copyrights of IFPI's members by allowing unauthorized downloads of music. 100 Visa assessed the legal situation, in part by obtaining a review by outside counsel, and concluded that the transactions were illegal under local Russian law. 101 They were also illegal under the laws of the vast majority of the merchant's customers, who were located primarily in the United Kingdom and the United States. 102 In October 2005, the Italian authorities shut down a localized version of Allofmp3.com, allofmp3.it, and began a criminal investigation of the Italian site. 103 In addition,
97 Id.
98 Id. Payment systems have a similar voluntary program in place for counterfeiting complaints. This program includes having a process in place to respond to complaints of the use of a payment brand for sales of counterfeit goods. Trademark owners would provide information such as a description of the allegedly counterfeit transaction and evidence that the payment system brand was involved, and the payment system would look into the allegation and take action in accordance with a publicly stated policy, which could include suspension of the merchant involved. Trademark owners would agree to indemnify payment systems for steps taken and for legal risk. This system is described by INTA in "Addressing the Sale of Counterfeits on the Internet," September 2009, available as attachment 3 in the INTA Submission on the Request for Public Comment Regarding the Joint Strategic Plan for IP Enforcement, for the Office of the Intellectual Property Enforcement Coordinator (IPEC) through the Office of Management and Budget, March 24, 2010, available at http://www.whitehouse.gov/omb/IPEC/frn_comments/InternationalTrademarkAssociation.pdf.
99 Id.
100 Id. (discussing IFPI’s role); Nate Anderson, Music Industry Encouraged Visa to Pull the Plug on AllofMP3.com, ARS TECHNICA, Oct. 19, 2006, http://arstechnica.com/business/news/2006/10/8029.ars.
101 International Piracy Hearing, supra note 88, at 79 (statement of Mark MacCarthy, Senior Vice President for Global Public Policy, Visa Inc.).
102 Id.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 387<br />
the United States Trade Representative intervened with the Russian government to urge it to shut down Allofmp3.com. 104

At the beginning of September 2006, after appropriate notice, the Russian bank working with Allofmp3.com stopped processing Visa transactions for Allofmp3.com. 105 At the end of September 2006, the bank also stopped processing transactions from an affiliated site called allTunes. 106 After these Visa transactions ended, further confirmation of the site’s illegality was forthcoming: a Danish court ordered the Internet provider Tele2 to block its subscribers’ access to allofmp3.com, thereby making it harder for potential customers in Denmark to access the site. 107 MasterCard also cut off payment services to allofmp3.com. 108 By May of 2007, the site’s popularity had plummeted. 109

The company was all but out of business, but the legal process was just starting. The owner of allTunes sued the bank that had stopped processing its Visa transactions in a Russian court. 110 Visa was a party to that litigation on the side of the bank. 111 In June 2007, the owner won a judgment that the bank had violated its contract with the merchant, and the judgment required the bank to continue to provide processing services. 112 In response to the bank’s claim that the merchant was acting illegally, the court determined that there were no rulings in Russia establishing that allTunes was making illegal use of exclusive rights belonging to rights holders. 113
103 Press Release, IFPI, Allofmp3.com: Setting the Record Straight, June 2, 2006, http://www.ifpi.org/content/section_news/20060601.html.
104 See International Piracy Hearing, supra note 88, at 26 (testimony of Victoria A. Espinel, Assistant U.S. Rep. for Intellectual Property and Innovation, Office of the U.S. Trade Rep.) (“We will continue to press Russia to shut down and prosecute the operators of illegal Web sites operating in Russia, including the successors to the infamous AllOfMP3.com.”).
105 Id. at 79 (statement of Mark MacCarthy, Senior Vice President for Global Public Policy, Visa Inc.).
106 Id.
107 Press Release, IFPI, New Court Setback for Allofmp3.com, Oct. 26, 2006, http://www.ifpi.org/content/section_news/20061026.html.
108 BBC News, MP3 Site’s Voucher System Closes, May 21, 2007, http://news.bbc.co.uk/2/hi/entertainment/6677265.stm.
109 IFPI reported in May 2007 that Allofmp3 “rated outside the top 2000 websites.” Press Release, IFPI, Police Dawn Raid Stops Allofmp3.com Pirate Vouchers Scheme, May 21, 2007, http://www.ifpi.org/content/section_news/20070521.html.
110 Arbitration Court of Moscow 2007, A40-70411/06-67-500.
111 Id. at 1.
112 Id. at 5.
113 Id. The court stated:
388 CHAPTER 6: SHOULD ONLINE INTERMEDIARIES BE REQUIRED TO POLICE MORE?<br />
In August 2007, another Russian court issued a ruling in a different case, a criminal copyright infringement proceeding initiated by IFPI against the owner of Allofmp3.com. 114 This ruling stated that there had not been sufficient confirmation of any illegal activity by the site’s owner. 115 Even though the copyright owners had not given permission to distribute their recorded material, a Russian collective rights society (the Russian Multimedia and Internet Society, or ROMS by its initials in Russian) was deemed to be operating legitimately under Russian law. 116 The court implied that Allofmp3.com and similar sites would be in compliance with Russian law to the extent that they paid for rights from this Russian collective rights society. 117

These court cases created a challenge for Visa because the payment system had responded to a documented complaint of copyright infringement. 118 Despite an outside review that seemed to establish illegality in the local jurisdiction, a local court ordered a local bank to continue to provide payment services. 119 Yet these transactions would still be illegal in virtually every other country in the world. To preserve its cross-border policy, Visa decided to allow the local bank to provide only domestic service to the site involved in the court case. 120 Transactions from customers in other countries would not be allowed. 121
According to Article 49 of the Russian Federation Law “On Copyright and Allied Rights,” it is only the Court that can execute actions in connection with illegal use of copyrights and allied rights, if there is a lawsuit filed by exclusive right holders, which the Defendants, VISA and IFPI, are not, while in this case there are no court rulings with the force of res judicata establishing the Plaintiff’s illegal use of exclusive rights belonging to some right holders.

Id. The Defendant was Rosbank, the Russian financial institution licensed by Visa to authorize merchants in Russia to accept Visa. Id.
114 Cheremushkinsky [District Court of Moscow], 2007, No. 1-151-07.
115 Id. at 4.
116 Id. at 5.
117 Id.; see also International Piracy Hearing, supra note 88, at 99 (testimony of Victoria A. Espinel, Assistant U.S. Rep. for Intellectual Property and Innovation, Office of the U.S. Trade Rep.) (“My understanding of the case is that Media Services, the company that operated allTunes, was able to successfully argue in Russian court that it was not acting illegally because it was paying royalties to collecting societies, collecting societies that were not authorized by the rights holders.”).
118 Id. at 80 (statement of Mark MacCarthy, Senior Vice President for Global Public Policy, Visa Inc.).
119 Id. at 80-81.
120 Id.
121 Id. at 81.
Assessment of Payment System Actions on Online Copyright Infringement
Are payment systems doing enough on their own to respond to online copyright infringement? Does there need to be a system of legal liability to compel them to control copyright infringement that uses their payment systems?

First, Perfect 10 properly rejected indirect liability for payment intermediaries. 122 The involvement of payment networks in copyright violations is attenuated and entirely passive. On control grounds, there is simply no way to draw a line between payment network involvement in allegedly infringing transactions and involvement in a wide range of other potentially illegal activities. If they are liable in this case, why wouldn’t they be liable for all cases of illegal activity on their payment systems? Unintentionally, Judge Kozinski’s dissent brought out this implication. 123

But the actual experience of payment intermediaries reveals that things are never as simple as removing infringing material. At best, there is a well-documented assertion of infringement under the laws of a particular jurisdiction. Judge Kozinski appears to favor a notice-and-takedown approach, so that payment intermediaries are not responsible for illegal conduct of which they are unaware. 124 But as Visa found in Allofmp3.com, payment card services and their associated financial service partners can be liable for wrongful termination of services in those jurisdictions if they react to an allegation of infringement by “kick[ing] the pirates off their payment networks.” 125
122 Perfect 10, Inc. v. Visa Int’l Serv. Ass’n, 494 F.3d 788, 798 (9th Cir. 2007). For analysis, see Band, supra note 74.
123 See id. at 824 (Kozinski, J., dissenting) (“Credit cards already have the tools to police the activities of their merchants, which is why we don’t see credit card sales of illegal drugs or child pornography.”). Of course, card companies use different tools in the case of illegal drugs and child pornography, namely, proactive monitoring, but it is hard to see on Kozinski’s analysis why card companies shouldn’t use whatever tools they can to stop illegal activity in all cases. See id. (“Plaintiff is not asking for a huge change in the way credit cards do business; they ask only that defendants abide by their own rules and stop doing business with crooks. Granting plaintiff the relief it seeks would not … be the end of Capitalism as we know it.”). But it might be the end of payment systems as we know them if indirect liability for them means an obligation to stop doing business with everyone who might be involved with illegality anywhere. Kozinski attempts to limit his analysis to those cases where there are special arrangements between bad actors and the payment system, id. at 819-20, but nothing in his analysis turns on these special arrangements. These special arrangements turn out to be risk-based pricing for adult content websites. Would he really have voted with the majority if the price that adult content merchants face for accepting cards were the same as the price set for less risky merchants?
124 Perfect 10, 494 F.3d at 824 (Kozinski, J., dissenting).
125 Id. at 817.
Second, there is no market failure in this situation that would justify imposing intermediary liability on payment systems. There are available arrangements between payment intermediaries and copyright owners that can reduce the amount of copyright infringement on the Internet. These arrangements are informal, but expanding. They rely on complaints by copyright owners, followed by investigation and action by intermediaries. They seem to strike a cost-based balance by putting the burden of discovering infringement on the copyright owner and triggering action by the third party only after notification. The arrangements may involve compensating payment intermediaries for performing enforcement services, but if this enables copyright owners to reduce the harm of copyright infringement, they might very well pay. If there are extra efforts, above and beyond standard practices, that a particular copyright owner would like payment intermediaries to make, those efforts should be open to negotiation. There do not seem to be any transaction costs that would prevent the parties from negotiating adjustments to these arrangements over time. And there appears to be no market failure that would justify not relying on private-sector enforcement arrangements.
Third, given the legal risks involved, copyright owners should be willing to indemnify payment intermediaries for damages resulting from enforcement actions against alleged infringers. Allofmp3.com indicates that these legal risks are not hypothetical. If a copyright owner believes in the legal soundness of his case, he should be prepared to assume the risk. Indemnification might be one way to ensure that only strong complaints are brought to the attention of the payment intermediary. An additional mechanism might be to require a ruling by a court or governmental agency that the activity involved is infringing. A statute could help provide legal immunity to payment intermediaries when they take good-faith action against alleged infringers. But U.S. law cannot provide immunity in other jurisdictions, which is where the aid of global payment intermediaries is needed.

Fourth, this case illustrates the need for greater clarity in the legal environment in which intermediaries operate. Intermediaries cannot be in the position of creating new global law through their own interpretation of current statutes. Again, Allofmp3.com suggests the need for even greater harmonization of the local laws that intermediaries are expected to enforce.
In sum, the experience of payment intermediaries indicates that some efforts on their part to respond to legitimate complaints would be justified. It is not appropriate to do nothing in response to allegations of copyright infringement. The current complaint procedure and case-by-case response is reasonable. It could be improved through further discussions among the parties, further recourse to court judgments of infringement, and harmonization of current international standards.
Conclusion

The question remains: Should the government place an enforcement burden on payment intermediaries? The standard least-cost analysis suggests that the advantages of government intervention sometimes appear to be substantial, but nothing in the analysis suggests that Internet intermediaries are always the best vehicle for government control. The costs, benefits, and equities involved in specific cases have not been adequately assessed. Intermediaries are often in a position to voluntarily police their own communities and have taken steps to do this without explicit government requirements. The equities set out in current law establish a regime that works tolerably well. Even when government requirements are explicit, as in the case of Internet gambling, they are often crafted to fit the architecture and structure of the intermediaries themselves. While some adjustments would improve these legal regimes, nothing suggests that more liability imposed unilaterally by local governments would be an improvement.

Greater government coordination on the rules that intermediaries must follow on the Internet would be an improvement. To avoid legal liability and to comply with local laws, payment intermediaries are moving toward accepting the laws of all jurisdictions. They also have wide discretion over what activities to allow on their systems. But this situation is problematic. Intermediaries are not the best situated to decide which rules to follow. Also, no laws are self-interpreting. They often apply to particular situations in obscure and heavily fact-dependent ways. Intermediaries’ flexibility in adjudication leaves room for private, strategic, and unaccountable decisions that affect the shape and direction of online activity. Coordinated government rules are best for an additional reason: The intermediary role does not scale well in a world of multiple, overlapping, and conflicting rules. If governments are going to use intermediaries to regulate the Internet, they need to coordinate their own laws to make that role possible.
Fuzzy Boundaries: The Potential Impact of Vague Secondary Liability Doctrines on Technology Innovation

By Paul Szynol*
Last year, Ninth Circuit Judge Alex Kozinski, with Josh Goldfoot of the Department of Justice’s Criminal Division, published an article in the Columbia Journal of Law and the Arts entitled “A Declaration of the Dependence of Cyberspace.” Its title is a play on John Perry Barlow’s 1996 “A Declaration of the Independence of Cyberspace”; its content, as the title suggests, is something of an attack on Barlow’s philosophy and, more generally, on the idea that the Internet is a unique entity that requires custom legal treatment. The authors make several key claims about the law’s relationship to the Internet, but the central argument focuses on secondary liability—the copyright doctrine that makes makers of multi-use technologies legally liable for other people’s infringing uses of their technology.

Broadly stated, the rationale at the heart of the secondary liability doctrine is this: An entity that knowingly helps to facilitate the commission of an illegal act (such as copyright infringement) should be penalized for its contribution to the illegal activity. 1 If a technology company induces its customers to use its product for infringing purposes, for instance, both the users and the company should be liable for such infringement—the users for direct infringement and the company for contributory infringement, which is a species of secondary liability.
The doctrine is appealing as a practical solution to widespread infringement because it targets the entities that enable illegal behavior—e.g., the Napsters and Groksters of the world—and thus eradicates the distribution mechanism that enables infringement in the first place. Judge Kozinski and Mr. Goldfoot (I’ll generally refer to them as “the authors” from here on), like the movie and music industries, certainly believe that the doctrine of secondary liability should be readily used as a handy and effective tool for weeding out copyright
* Paul Szynol graduated from Columbia University, where he studied history and philosophy, and Yale University, where he studied intellectual property law.
1 The specific theories of secondary liability have more nuanced elements, such as the requirements of materiality for contributory infringement and direct financial benefit for vicarious infringement. Since these elements are not critical to the essay’s main thesis, I’ve avoided spelling them out in detail.
infringement. According to the authors, people “who provide powerful tools that can be used for good or evil have some responsibility to make sure that those tools are used responsibly.” Put more bluntly, however, if you outlaw the tool, you needn’t chase after the users, so in practice it’s less a question of ethics and more a question of convenience and efficiency.
One of the principal problems with this approach, however, is that the boundaries of secondary liability are not precisely set, and, short of extreme cases, it is not at all clear under what circumstances a product manufacturer will be liable for secondary infringement. Such wholesale endorsement of secondary liability doctrines should therefore give us some pause. For example, at what point does a software company that develops a peer-to-peer application used by end users to exchange copyrighted materials begin to “contribute” to the infringement and become secondarily liable? Does the company contribute simply by writing software that is merely capable of infringing uses? 2 Or does the company contribute only if the software’s primary use is, by design, infringing? Or, further yet, does the company contribute only if a substantial portion of the end users utilize the technology for infringing purposes? If so, how much of the user base must engage in infringing activity for it to be a substantial portion? 3 Or, as yet another option, does the company “contribute” only if it promotes infringing uses of its software? And, if that’s the case, how much promotion is too much promotion? For example, is the advertising slogan “Rip. Mix. Burn.” too much of an inducement to make infringing copies of music? 4

These are fundamental, starting-point questions about the secondary liability doctrine, and one would expect that case law or legislation provides a clear answer to each. Yet the law is ambiguous (and the authors are altogether silent) on these points. Outside of extreme cases, no one knows with certainty—including lawyers, judges, company officers, engineers, and academics—when secondary liability might attach to a product that facilitates the transmission of copyrighted materials. The legal system’s failure to provide clear guidelines is
2 An argument that the Supreme Court famously rejected in its 1984 “Betamax” decision. Sony Corp. of America v. Universal City Studios, Inc., 464 U.S. 417 (1984).
3 See, for example, the Napster litigation. According to the District Court’s opinion, 87% of the content on Napster was copyrighted, and “virtually all Napster users” transferred copyrighted content. A & M Records, Inc. v. Napster, Inc., 114 F. Supp. 2d 896 (N.D. Cal. 2000). A decade later, a critical question remains essentially unanswered: How much lower would those percentages have to be for a manufacturer to be safe from secondary liability?
4 The standard introduced in Grokster is “clear expression,” which is not much of a lodestar for someone seeking to gauge risk with any degree of precision. Metro-Goldwyn-Mayer Studios Inc. v. Grokster, Ltd., 545 U.S. 913, 914 (2005). One could persuasively argue that Apple’s very large, very prominent and very ubiquitous “Rip. Mix. Burn.” billboards amounted to “clear expression.”
the equivalent of posting a sign on a freeway that says “obey the speed limit” without giving an actual speed.

The effect is potentially detrimental to the entire technology sector. A clear rule is a predictable rule, and a predictable rule is one on which innovators can rely when developing a product. Without clear guidance from the legal system, tech companies are forced to engage in a “fingers crossed” product design process and subsequently face a market that can be a minefield of infringement liability. The potential economic damage to a company found secondarily liable can be substantial, to say the least. Since statutory damages for copyright infringement range from $750 to $150,000 per infringement, a maker of a multi-use technology may confront liabilities on a scale that can threaten the viability of even the wealthiest corporations. The risk is further exacerbated by the recent trend of unpredictable and often very bloated damage awards granted to copyright plaintiffs. Such risk can dissuade even the most resolute investors from marketing their invention—and it can literally bankrupt the braver among them. The loss of a robust distribution tool harms the content sector, too, since a powerful method for distributing content to end users will not be brought to market.
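The scale of that statutory exposure is easy to work out. As a rough illustration (the catalog sizes below are hypothetical inputs chosen for the sketch, not figures from this essay), the arithmetic looks like this:

```python
# Statutory damages for copyright infringement (17 U.S.C. § 504(c)) are
# awarded per infringed work. The per-work bounds come from the essay;
# the example catalog size below is purely hypothetical.
STATUTORY_MIN = 750      # minimum award per infringed work
STATUTORY_MAX = 150_000  # maximum award per work (willful infringement)

def exposure_range(num_works: int) -> tuple[int, int]:
    """Return the (lowest, highest) possible statutory damages for num_works."""
    return num_works * STATUTORY_MIN, num_works * STATUTORY_MAX

# Even a modest hypothetical catalog of 10,000 infringed works yields:
low, high = exposure_range(10_000)
print(f"${low:,} to ${high:,}")  # $7,500,000 to $1,500,000,000
```

The spread between the two bounds, three orders of magnitude before any multiplier for the number of works, is exactly why a company facing an ambiguous doctrine cannot price the risk in advance.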
Judge Kozinski and Mr. Goldfoot are not concerned with the chilling effect that the legal system’s ambiguity can have on technology innovation. In fact, they reject the proposition and confidently point to the pharmaceutical and auto industries as counter-examples: Both industries have to comply with legal regulation, yet manufacturers in both industries nevertheless innovate.

It’s not a very persuasive comparison. First, the auto industry is hardly a hotbed of innovation. We might really like power windows and power steering, but, as advancements over prior art, these innovations are an order of magnitude smaller than the innovation we’ve seen on the Internet. Second, the players in the auto and pharmaceutical industries are frequently different from the players in the technology sector. It is rare, after all, if not unheard of, for a single person to invent a valuable medicine—the medical R&D process takes place in the laboratories of some of the wealthiest companies under the sun. In addition, medical innovation is subject to review and approval by government regulatory agencies, so by the time a medicine reaches the market, it has already been approved by the government. Innovation in information technology, in contrast, is often the result of the proverbial garage inventor who releases the technology entirely on his or her own. Think of eBay, Napster, Apple, Google, and Microsoft, each of which had a modest start in someone’s home or garage at the hands of one or two people (and many of which subsequently acquired similarly independent garage innovations). The distinction between a multinational company and a garage inventor is critical. First, there is no government imprimatur for multi-use technologies. Second, in contrast to wealthy companies that can afford sophisticated legal teams, garage inventors typically
lack the economic resources necessary to pay for a comprehensive legal review of product design prior to the product’s release. That inability increases the likelihood that the garage inventor will—unwittingly—design the product in a way that leads to legal liability, or that, after releasing the product and receiving angry threats of litigation, the garage inventor will have to backtrack and redesign the product in order to avoid liability. These are very expensive measures. If the inventor can afford them, the inventor will have spent money that would have been saved had the law simply been clearer in the first place; if the inventor cannot afford them, the outcome is even worse: the start-up will simply fold, thus wasting its investment costs, while consumers will miss out on the product altogether.
That outcome is bad enough, but it’s the third reason for the comparison’s inadequacy that should give all of us some pause: Because the legal landscape around copyright secondary liability is so unclear, even if the would-be inventor did have the resources to hire outside counsel, lawyers—no matter how high their hourly rates—won’t be able to confidently provide the inventor with a legal imprimatur unless the product falls clearly on one side of the line. In other words, no matter how much a company tries, the lack of clear standards means that its lawyers might “get it wrong,” and the company may face infringement liability if it releases the product, or incur the costs of post-release redesign, or both. That is a very expensive proposition, and its corollary is clear: Faced with potential liability exposure and potential redesign costs, each of which could figure in the millions or even billions of dollars, 5 some would-be inventors and investors will, as rational economic actors, forgo the whole enterprise—not because they analyzed the risk and found it potentially too costly, but because the law’s ambiguity meant they simply couldn’t properly analyze the risk in the first place. Notably, this outcome applies to garage inventors and big companies alike. The garage inventor whose coffers won’t be able to withstand the potential cost will retreat to the sound of a distant death knell; the big company will retreat because it knows that its deep pockets make it an attractive target for a lawsuit, and it may well decide that the potential litigation and licensing costs, even if not fatal, just aren’t worth it. Again, consumers will miss out on a new product.
An ambiguous secondary liability doctrine also disadvantages American products in a global market: U.S. companies will have to worry about drowning in the unpredictable and poorly charted quicksand of secondary liability, while their international competitors will have clear legal rules to guide them. The

5 It’s worth emphasizing that the billion-dollar figure is not hyperbole—just ask SAP, which recently lost its legal dispute with Oracle and was ordered to pay $1.3 billion in damages. See Sam Diaz, Jury: SAP Owes Oracle $1.3 Billion for Copyright Infringement, ZDNET, Nov. 23, 2010. The facts of that case are quite different from the examples given here, of course, but the award is a very conspicuous reminder that such astronomical damage awards are a startling reality of present-day copyright litigation.
domestic market suffers as well: By creating barriers to entry (high and unreliable due diligence costs as well as post-release redesign costs), the ambiguity favors entrenched entities over newcomers. Advocating secondary liability without removing the ambiguity also contradicts the authors’ claim that the same set of laws should apply to the offline and online worlds: The fuzzy secondary liability doctrine which they so strongly espouse in connection with technology wouldn’t fly in the physical world. For example, should a car company be held liable for drivers who speed? After all, it would be easy enough to add a “speed limit compliance chip.” Yet auto manufacturers are not forced to pay any portion of a speeding driver’s ticket. Offline, in other words, bad actors—the users of technology—are punished for their own transgressions. Online, however, the law chases the manufacturers—and applies ad hoc, ambiguous standards to their products. It would seem that the authors want Internet-specific laws after all.
None of this sounds like wise intellectual property policy. The legal system has
a constitutional imperative to incentivize inventors, after all, and it achieves this
objective in part by providing both content producers 6 and innovators with a
stable and predictable legal climate, such as the “bright line” rule devised by the
Supreme Court in its 1984 Sony ruling. 7 In its current state, the law threatens to
punish rather than reward those who have the courage to release an innovative
technology if that technology may be misused by its adopters and if that
technology has yet to be contemplated and cleared by the judiciary or
legislature. That is not an environment that encourages innovation. If the
intent of the judiciary and the Department of Justice is indeed to mightily wield
the secondary liability sword across the technology sector, the doctrine must be
clearly defined, so that the rules of engagement are clearly stated and U.S.
innovators can design their products with confidence—not in fear.
6 In Community for Creative Non-Violence v. Reid, the Supreme Court acknowledged “Congress’
paramount goal in revising the 1976 Act of enhancing predictability and certainty of
copyright ownership.” 490 U.S. 730, 749 (1989).
7 Sony v. Universal City Studios, supra note 2.
398 CHAPTER 6: SHOULD ONLINE INTERMEDIARIES BE REQUIRED TO POLICE MORE?
CHAPTER 7
IS SEARCH NOW AN “ESSENTIAL FACILITY?”
Dominant Search Engines:
An Essential Cultural & Political Facility 401
Frank Pasquale
The Problem of Search Engines as Essential Facilities:
An Economic & Legal Assessment 419
Geoffrey A. Manne
Some Skepticism About Search Neutrality 435
James Grimmelmann
Search Engine Bias & the Demise
of Search Engine Utopianism 461
Eric Goldman
Dominant Search Engines:
An Essential Cultural & Political Facility
By Frank Pasquale *
Many worry about search engines’ growing power. How are worldviews being
biased by them? Do search engines have an interest in getting certain
information prioritized or occluded? 1 Dominant search engines (“DSEs”) 2 are a
key hub of Internet traffic. They provide an ever-expanding array of services.
Google, for instance, just announced its intention to go into travel shopping. As
they amass information about their users, calls for regulation have focused on
the threats to privacy they generate. Some of these efforts have been successful;
others look more doubtful. One thing is certain: They are only the beginning of
a struggle over the rights and responsibilities of key intermediaries. Some hope
that competition law—and particularly the doctrine of “essential facilities”—will
lead policymakers to scrutinize search engines’ actions.
When American lawyers talk about “essential facilities,” they are referring to
antitrust doctrine that has tried, at various points, to make certain “bottlenecks”
in the economy provide access on fair and nondiscriminatory terms to all
comers. As robust American competition law fades into a secluded corner of
legal history, 3 “essential facilities” doctrine still remains, for some scholars, a ray
of hope for intermediary responsibility. 4 Oren Bracha and I helped fuel this
* Professor of Law, Seton Hall Law School; Visiting Fellow, Princeton Center for Information
Technology Policy.
1 ALEX HALAVAIS, SEARCH ENGINE SOCIETY 85 (Polity 2008) (“In the process of ranking
results, search engines effectively create winners and losers on the web as a whole. Now that
search engines are moving into other realms, this often opaque technology of ranking
becomes kingmaker in new venues.”); Chi-Chu Tschang, The Squeeze at China’s Baidu,
BUSINESSWEEK, Dec. 31, 2008, at
www.businessweek.com/magazine/content/09_02/b4115021710265.htm (“Salespeople
working for Baidu drop sites from results to bully companies into buying sponsored links [a
form of paid advertising], say some who have been approached.”).
2 We can provisionally define a dominant search engine (“DSE”) as one with more than 40
percent market share. Google clearly satisfies this criterion in the United States and Europe.
See David S. Evans, Antitrust Issues Raised by the Emerging Global Internet Economy, 102 NW. U. L.
REV. COLLOQUY 285 (2008) (reporting market shares for leading internet intermediaries).
3 BARRY LYNN, CORNERED: THE NEW MONOPOLY CAPITALISM AND THE ECONOMICS OF
DESTRUCTION (John Wiley & Sons, Inc. 2010) (describing the declining impact of American
antitrust law).
4 Brett Frischmann & Spencer Weber Waller, Revitalizing Essential Facilities, 75 ANTITRUST L.J. 1,
2 (2008) (“infrastructure subject to substantial access and nondiscrimination norms [has] …
been heavily regulated.”).
hope in our 2008 article Federal Search Commission, which compared dominant
search engines to railroads and common carriers in the hope that they would be
recognized as infrastructural foundations of the information economy. 5 But I
now see that Federal Search Commission, like many other parts of the search engine
accountability literature, tried too hard to shoehorn a wide variety of social
concerns about search engines into the economic language of antitrust policy. 6
It is now time for scholars and activists to move beyond the crabbed vocabulary
of competition law to develop a richer normative critique of search engine
dominance.
This will not be an easy sell in cyberlaw, which tends to uncritically promote
competition and innovation as the highest aims of Internet policy. If a
dominant search engine is abusing its position, market-oriented scholars say,
market forces will usually solve the problem, and antitrust law can step in when
they fail to do so. Even those who favor net neutrality rules for carriers are
wary of applying them to other intermediaries, like search engines. All tend to
assume that the more “innovation” happens on the Internet, the more choices
users will have and the more efficient the market will become. Yet these
scholars have not paid enough attention to the kind of innovation that is best
for society, and whether the uncoordinated preferences of millions of web users
for low-cost convenience are likely to address the cultural and political concerns
that dominant search engines raise.
In this article, I hope to demonstrate two points. First, antitrust law terms (like
“essential facility”) cannot hope to capture the complexity of concerns raised by
an information landscape where one company serves as the predominant map
of the web, and simultaneously attempts to exploit that dominance by endlessly
expanding into adjoining fields. Second, I hope to point the way toward a new
concept of “essential cultural and political facility,” which can help policymakers
realize the situations where a bottleneck has become important enough that
special scrutiny is warranted. This scrutiny may not always lead to regulation—
which the First Amendment renders a dicey enterprise in any corner of the
information economy. However, it could lead us to recognize the importance
of publicly funded alternatives to the concentrated conduits and content
providers colonizing the web.
5 Oren Bracha & Frank Pasquale, Federal Search Commission: Fairness, Access, and Accountability in
the Law of Search, 93 CORNELL L. REV. 1193 (2008).
6 RICHARD POSNER, ANTITRUST LAW, at ix (2d ed. 2001) (“Almost everyone professionally
involved in antitrust today—whether as litigator, prosecutor, judge, academic, or informed
observer—not only agrees that the only goal of the antitrust laws should be to promote
economic welfare, but also agrees on the essential tenets of economic theory that should be
used to determine the consistency of specific business practices with that goal.”).
The Limits of Antitrust as Search Policy
Antitrust cases tend to consume a great deal of time, in part because economic
conduct is subject to many different interpretations. 7 One person’s
anticompetitive conduct is another’s effective business strategy. The same
unending (and indeterminate) arguments threaten to stall discourse on search
policy. For example, the Federal Trade Commission’s (FTC) review of the
Google–DoubleClick merger focused almost entirely on the economic effects
of the proposed combination, rather than the threats to privacy it posed. 8
Search engines are among the most innovative services in the global economy.
They provide extraordinary efficiencies for advertisers and consumers by
targeting messages to viewers who are most likely to want to receive them. In
order to attract more users, search engines use revenues from advertising to
organize and index a great deal of content on the Internet. Like the major
broadcast networks they are now beginning to displace, search engines provide
opportunities to view content (organic search results) in order to sell advertising
(paid search results). 9 Search engines have provoked antitrust scrutiny because
proposed deals between major search engines (and between search engines and
content providers) suggest undue coordination of competitors in an already
concentrated industry. 10
7 See Jonathan Zittrain, The Un-Microsoft Un-Remedy: Law Can Prevent the Problem that It Can’t
Patch Later, 31 CONN. L. REV. 1361, 1361–62 (1999) (“The main concern in finding a remedy
for [‘bad monopolist behaviors’] may be time: The technology environment moves at a
lightning pace, and by the time a federal case has been made out of a problem, the problem
is proven, a remedy fashioned, and appeals exhausted, the damage may already be
irreversible.”).
8 News Release, FTC, Federal Trade Commission Closes Google/DoubleClick Investigation
(Dec. 20, 2007), available at www.ftc.gov/opa/2007/12/googledc.shtm (“The Commissioners
... wrote that ‘as the sole purpose of federal antitrust review of mergers and acquisitions is
to identify and remedy transactions that harm competition,’ the FTC lacks the legal authority
to block the transaction on grounds, or require conditions to this transaction, that do not
relate to antitrust. Adding, however, that it takes consumer privacy issues very seriously, the
Commission cross-referenced its release of a set of proposed behavioral marketing
principles that were also announced today.”).
9 According to the Google corporate home page, “[W]e distinguish ads from search results or
other content on a page by labeling them as ‘sponsored links’ or ‘Ads by Google.’ We don’t
sell ad placement in our search results, nor do we allow people to pay for a higher ranking
there.” Google, Inc., Corporate Information: Company Overview,
www.google.com/corporate/ (last visited Mar. 12, 2010).
10 For example, the deal reached between Microsoft and Yahoo! that would have Microsoft’s
Bing search engine deliver results for searches on Yahoo! has provoked antitrust concerns
both domestically and internationally. See Christopher S. Rugaber, Microsoft–Yahoo Deal to Face
Tough Antitrust Probe, ABCNEWS, July 29, 2009,
http://seattletimes.nwsource.com/html/localnews/2009563654_apusmicrosoftyahooantitrust.html.
Those opposed to regulation often claim that antitrust law offers a more
targeted and efficient response to abuses. As Justice Breyer explained in his
classic work Regulation and Its Reform:
[T]he antitrust laws differ from classical regulation both in their
aims and in their methods … . [T]hey act negatively, through a
few highly general provisions prohibiting certain forms of private
conduct. They do not affirmatively order firms to behave in
specified ways; for the most part, they tell private firms what
not to do … . Only rarely do the antitrust enforcement
agencies create the detailed web of affirmative legal obligations
that characterizes classical regulation. 11
Given the lack of search engine regulation in the U.S., actual and threatened
antitrust investigations have been a primary government influence on Google’s
business practices as its dominance in search grows. Many believe that the
Department of Justice’s (DOJ) suspicion of the company’s proposed joint
venture with Yahoo! in the search advertising field effectively scuttled the deal
by late 2008. 12 However, antitrust enforcement appears less promising in other
aspects of search. 13 This section discusses the limits of antitrust in addressing
the cultural and political dilemmas raised by Google’s proposed Book Search
deal with publishers, 14 and its dominance of online advertising.
11 STEPHEN BREYER, REGULATION AND ITS REFORM 156–57 (Harvard Univ. Press 1982). But
see A. Douglas Melamed, Antitrust: The New Regulation, 10 ANTITRUST 13, 13 (1995)
(describing “two paradigms,” the law enforcement model and the regulatory model, and the
shift of antitrust law from the former to the latter).
12 Nicholas Thompson & Fred Vogelstein, The Plot to Kill Google, WIRED, Jan. 19, 2009, at 88,
available at www.wired.com/techbiz/it/magazine/17-02/ff_killgoogle (noting that antitrust
scrutiny culminated in a hearing in which the DOJ threatened to bring an antitrust case
against Google and that one prominent DOJ attorney expressed the view that Google
already is a monopoly).
13 Daniel Rubinfeld, Foundations of Antitrust Law and Economics, in HOW THE CHICAGO SCHOOL
OVERSHOT THE MARK: THE EFFECT OF CONSERVATIVE ECONOMIC ANALYSIS ON U.S.
ANTITRUST 51, 57 (Robert Pitofsky ed., 2008) (describing how “conservative economics has
fostered a tendency to downplay enforcement in dynamic technological industries in which
innovation issues play a significant role”).
14 Despite the DOJ’s intervention to affect the terms of the proposed settlement, many leading
antitrust experts have argued that the settlement would not violate the antitrust laws. See, e.g.,
Einer Elhauge, Why the Google Books Settlement Is Pro-Competitive 58 (Harvard Law Sch., Law &
Econ. Discussion Paper No. 646, Harvard Law Sch., Pub. Law & Theory Research Paper
No. 09-45, 2009), available at
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1459028 (“The settlement does
not raise rival barriers to offering [many] books, but to the contrary lowers them. The
output expansion is particularly dramatic for out-of-print books, for which there is currently
no new output at all.”).
Privacy concerns are nearly impossible to address within the economic models
of contemporary competition law. Antitrust scrutiny did little to address the
privacy concerns raised when Google proposed to merge with the web
advertising firm DoubleClick. 15 The proposed deal provoked a complaint from
the Electronic Privacy Information Center (EPIC). EPIC claimed that Google’s
modus operandi amounts to a “deceptive trade practice”:
Upon arriving at the Google homepage, a Google user is not
informed of Google’s data collection practices until he or she
clicks through four links. Most users will not reach this page
… . Google collects user search terms in connection with his
or her IP address without adequate notice to the user.
Therefore, Google’s representations concerning its data
retention practices were, and are, deceptive practices. 16
One key question raised by the proposed merger was whether privacy and
consumer protection concerns like these can be addressed by traditional
antitrust analysis. 17 Privacy law expert Peter Swire argued that they can, because
“privacy harms reduce consumer welfare … [and] lead to a reduction in the
quality of a good or service.” 18 Swire believed that consumers would be worse
off after the merger because of the unparalleled digital dossiers the combined
entity could generate:
Google often has “deep” information about an individual’s
actions, such as detailed information about search terms.
Currently, DoubleClick sets one or more cookies on an
individual’s computers, and receives detailed information about
which sites the person visits while surfing. DoubleClick has
15 Dawn Kawamoto & Anne Broache, FTC Allows Google–DoubleClick Merger to Proceed, CNET
NEWS, Dec. 20, 2007, http://news.cnet.com/FTC-allows-Google-DoubleClick-merger-to-proceed/2100-1024_3-6223631.html
(describing U.S. authorities’ blessing of the proposed deal).
16 See Complaint and Request for Injunction, Request for Investigation and for Other Relief, In
re Google Inc. and DoubleClick, Inc., No. 071-0170 (FTC Apr. 20, 2007), available at
http://epic.org/privacy/ftc/google/epic_complaint.pdf at 9 [hereinafter Google, Inc.
and DoubleClick Complaint].
17 See Siva Vaidhyanathan, The Googlization of Everything, Google and DoubleClick: A
Bigger Antitrust Problem than I Had Imagined,
www.googlizationofeverything.com/2007/10/google_and_doubleclick_a_bigge.php (Oct.
21, 2007, 16:05 EST).
18 Peter Swire, Protecting Consumers: Privacy Matters in Antitrust Analysis, CTR. FOR AM. PROGRESS,
Oct. 19, 2007, www.americanprogress.org/issues/2007/10/privacy.html (italics omitted).
“broad” information about an individual’s actions, with its
leading ability to pinpoint where a person surfs. 19
Initial points of contention include (a) the definition of the products at issue,
and (b) how to weigh the costs and benefits of a merger. The combined
company would have different segments of “customers” in a two-sided
market: 20 (1) searchers trying to find sites, and (2) ad buyers trying to reach
searchers. Swire contends that many people care about privacy, and “[i]t would
be illogical to count the harms to consumers from higher prices while excluding
the harms from privacy invasions—both sorts of harms reduce consumer
surplus and consumer welfare in the relevant market.” 21
However, the web searcher category not only consists of consumers who care
about privacy, but also includes many people who do not highly value it or who
actively seek to expose their information in order to receive more targeted
solicitations. According to Eric Goldman’s work on personalized search, some
may even consider the gathering of data about them to be a service. 22 The more
information is gathered about them, the better intermediaries are able to serve
them relevant ads. Many economic models of web publication assume that
users “pay” for content by viewing ads; 23 they may effectively pay less if the
advertisements they view bear some relation to things they want to buy. So
while Swire models advertising and data collection as a cost to be endured,
19 Id. According to Swire, “[i]f the merger is approved, then individuals using the market
leader in search may face a search product that has both ‘deep’ and ‘broad’ collection of
information. For the many millions of individuals with high privacy preferences, this may be
a significant reduction in the quality of the search product—search previously was
conducted without the combined deep and broad tracking, and now the combination will
exist.” Id.
20 For a definition of two-sided market, see Nicholas Economides & Joacim Tåg, Net Neutrality
on the Internet: A Two-Sided Market Analysis 1 (NET Inst., Working Paper No. 07-45, N.Y. Univ.
Law and Econ., Research Paper No. 07-40, 2007),
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1019121 (“[P]latforms sell
Internet access services to consumers and may set fees to content and applications providers
‘on the other side’ of the Internet.”). In the search engine context, consumers “pay” by
attending to ads, and ad-purchasers pay Google for the chance to get ad viewers’ attention.
21 Swire, supra note 18.
22 Eric Goldman, A Coasean Analysis of Marketing, 2006 WIS. L. REV. 1151, 1162–64 (“Three
components determine an individual consumer’s utility from a marketing exposure: (1) the
consumer’s substantive interest in the marketing, (2) the consumer’s nonsubstantive reactions
to the marketing exposure, and (3) the attention consumed by evaluating and sorting the
marketing. … [A] consumer may derive utility from the rote act of being contacted by
marketers or exposed to the marketing, regardless of the marketing content.”).
23 David S. Evans, The Economics of the Online Advertising Industry, 7 REV. NETWORK ECON. 359,
359 (2008), available at www.bepress.com/rne/vol7/iss3/2 (describing how many of the top
websites have adopted the “free-tv” model, where the publisher generates traffic by not
charging readers and then sells that traffic to advertisers).
Google and DoubleClick argue that the resulting personalized ads serve
customers. Their arguments prevailed, and Google officially acquired
DoubleClick in 2008. 24
Antitrust law is ill prepared to handle a “market” where some percentage of
consumers consider loss of privacy a gain and others consider it a loss.
Economic reasoning in general falters in the face of externalities, but usually we
can all agree that, say, pollution is a harm (or negative externality) and flowers
are a boon (or positive externality). Privacy preferences are much more
idiosyncratic.
Critics of the merger do have a response to this problem of diverse
preferences—they can shift from characterizing lost privacy as a cost of web
searching to describing it as a reduction in the quality of the services offered by
the merging entities. 25 Douglas Kysar’s work on the product–process
distinction is encouraging here. Kysar has claimed that consumers should have
a right to make choices of products based on how the products are made, not
just how well they work. 26 Kysar argues “in favor of acknowledging and
accommodating [consumer] process preferences within policy analysis, given
the potential significance that such preferences may serve in the future as
outlets for public-minded behavior.” 27 Nevertheless, the valuation problems
here are daunting. How are we to determine how much consumers are willing
to pay to avoid privacy-eroding companies? 28
Perhaps, as Lisa Heinzerling and Frank Ackerman suggest in their book Priceless,
we should stop even trying to pretend that these decisions can be made on
24 See Press Release, Google Inc., Google Closes Acquisition of DoubleClick (Mar. 11, 2008),
available at www.google.com/intl/en/press/pressrel/20080311_doubleclick.html.
25 Both Supreme Court precedent and DOJ guidelines support this approach. See Nat’l Soc’y
of Prof’l Eng’rs v. United States, 435 U.S. 679, 695 (1978) (“The assumption that
competition is the best method of allocating resources in a free market recognizes that all
elements of a bargain—quality, service, safety, and durability—and not just the immediate
cost, are favorably affected by the free opportunity to select among alternative offers.”); U.S.
DEP’T OF JUSTICE, HORIZONTAL MERGER GUIDELINES § 4, at 30–32 (1997) (efficient market
behavior is indicated by lower prices, new products, and “improved quality”).
26 Douglas A. Kysar, Preferences for Processes: The Process/Product Distinction and the Regulation of
Consumer Choice, 118 HARV. L. REV. 526, 529 (2004) (“[C]onsumer preferences may be heavily
influenced by information regarding the manner in which goods are produced.”).
27 Id. at 534.
28 Christopher Yoo has demanded this kind of accounting in the context of net neutrality. See
Christopher Yoo, Beyond Network Neutrality, 19 HARV. J.L. & TECH. 1, 54 (2005) (“There is
nothing incoherent about imposing regulation to promote values other than economic
welfare. … [but] such a theory must provide a basis for quantifying the noneconomic
benefits and for determining when those benefits justify the economic costs.”).
anything approaching a purely economic basis. 29 Engaging in a cost–benefit
analysis diminishes privacy’s status as a right. Though many scholars have
compellingly argued for broader foundations for competition law, the
mainstream of contemporary antitrust policy in the United States cannot
accommodate such concerns. Antitrust’s summum bonum is the maximization of
“consumer welfare,” and this measure of efficiency is notoriously narrow. 30 For
example, the DOJ was hard pressed to adequately factor in a basic democratic
commitment to diverse communicative channels during many media mergers. 31
Given antitrust doctrine’s pronounced tendency to suppress or elide the cultural
and political consequences of concentrated corporate power, the Bureau of
Competition and the Bureau of Economics within the FTC are ill-equipped to
respond to the most compelling issues raised by search engines. 32 The Google–
DoubleClick merger proceedings ultimately ended with an overwhelming win
for Google at the FTC. 33 This outcome was all but inevitable given the
foundations of contemporary antitrust doctrine, 34 and is the logical outgrowth
of overreliance on legal economic theory that uncritically privileges market
29 FRANK ACKERMAN & LISA HEINZERLING, PRICELESS: ON KNOWING THE PRICE OF
EVERYTHING AND THE VALUE OF NOTHING 8–9 (The New Press 2004).
30 See Maurice E. Stucke, Better Competition Advocacy, 82 ST. JOHN’S L. REV. 951, 1001 (2008)
(observing the primacy of allocative efficiency in antitrust analysis). Stucke notes that
“[b]ehind allocative efficiency’s façade of positivism lie [many] moral questions … .” Id. See
also Julie E. Cohen, Network Stories, 70 LAW & CONTEMP. PROBS. 91, 92 (2007) (“What makes
the network good can only be defined by generating richly detailed ethnographies of the
experiences the network enables and the activities it supports, and articulating a normative
theory to explain what is good, and worth preserving, about those experiences and
activities.”).
31 See C. Edwin Baker, Media Concentration: Giving Up on Democracy, 54 FLA. L. REV. 839, 857
(2002) (“[T]he dominant antitrust focus on power over pricing can be distinguished from power
over the content available for consumer choice. In the currently dominant paradigm, a merger that
dramatically reduced the number of independent suppliers of a particular category of
content—say, news or local news or Black activist news—creates no antitrust problem if, as
likely, it does not lead to power to raise prices.”).
32 See STATEMENT OF THE FEDERAL TRADE COMMISSION CONCERNING
GOOGLE/DOUBLECLICK, FTC File No. 071-0170 (FTC Dec. 20, 2007), available at
http://www.ftc.gov/os/caselist/0710170/071220statement.pdf [hereinafter STATEMENT
OF FTC CONCERNING GOOGLE/DOUBLECLICK] (“Although [privacy concerns] may present
important policy questions for the Nation, the sole purpose of federal antitrust review of
mergers and acquisitions is to identify and remedy transactions that harm competition.”).
33 Id.
34 Maurice Stucke describes and critiques this bias in some detail. See Stucke, supra note 30, at
1031 (describing a “mishmash of neoclassical economic theory, vignettes of zero-sum
competition, and normative weighing of the anticompetitive ethereal—deadweight welfare
loss—against the conjectures of procompetitive efficiencies” at the core of too much
antitrust law and theory). Among his many important contributions to the literature, Stucke
makes it clear that competition policy includes far more goals and tactics than antitrust
enforcement alone. Id. at 987–1008.
outcomes. 35 As long as contemporary doctrine holds that antitrust is singularly
focused on the “consumer welfare” a proposed transaction will generate, 36
antitrust policymakers will be unable to address the cultural and political
consequences of consolidation in the search industry.
Antitrust challenges to the proposed settlement of a copyright lawsuit by
authors and publishers against Google’s Book Search program are likely to be
similarly constrained. 37 As in the Google–DoubleClick merger, the privacy
implications of Google’s proposed deal with publishers are profound. 38 Anyone
who cares about public input into the future of access to knowledge should
approach the potential deal here warily, even if the prospect of constructing a
digital Library of Alexandria tempts scholars. 39 As Harvard librarian Robert
Darnton has argued, only a naive optimist could ignore the perils of having one
profit-driven company effectively entrusted with a comprehensive collection of
the world’s books. 40
When publishers challenged Google’s book scanning in 2007, many hoped that public interest groups could leverage copyright challenges to Google’s book
35 Reza Dibadj, Beyond Facile Assumptions and Radical Assertions: A Case for “Critical Legal Economics,” 2003 UTAH L. REV. 1155, 1161 (“[T]hree of the most basic assumptions to the popular [law & economics] enterprise—that people are rational, that ability to pay determines value, and that the common law is efficient—while couched in the metaphors of science, remain unsubstantiated.”). But see JAMES R. HACKNEY, JR., UNDER COVER OF SCIENCE: AMERICAN LEGAL–ECONOMIC THEORY AND THE QUEST FOR OBJECTIVITY 164–66 (Duke Univ. Press 2007) (describing the “notable movement to broaden the scope of legal–economic theory under the rubric of socioeconomics”).
36 See Leegin Creative Leather Prods., Inc. v. PSKS, Inc., 551 U.S. 877, 906 (2007) (acknowledging the economic foundations of U.S. antitrust law).
37 Motoko Rich, Google and Authors Win Extension for Book Settlement, N.Y. TIMES, Nov. 9, 2009, at B3, available at www.nytimes.com/2009/11/10/technology/companies/10gbooks.html?_r=1. The DOJ expressed dissatisfaction with the parties’ most recent proposed settlement, as well. See Cecilia Kang, Judge Puts Off Ruling on Google’s Proposed Digital Book Settlement, WASH. POST, Feb. 19, 2010, available at www.washingtonpost.com/wp-dyn/content/article/2010/02/18/AR2010021800944.html?hpid=moreheadlines.
38 Electronic Frontier Foundation, Google Book Search Settlement and Reader Privacy, available at www.eff.org/issues/privacy/google-book-search-settlement (last visited July 11, 2010). As author Michael Chabon argues, “if there is no privacy of thought — which includes implicitly the right to read what one wants, without the approval, consent or knowledge of others — then there is no privacy, period.” Id.
39 See, e.g., Diane Leenheer Zimmerman, Can Our Culture Be Saved? The Future of Digital Archiving, 91 MINN. L. REV. 989, 990–91 (2007) (looking at the Google Book Search project as a means of saving culture and “explor[ing] whether saving culture and saving copyright can be made compatible goals”).
40 Robert Darnton, The Library in the New Age, 55 N.Y. REV. BOOKS, June 12, 2008, at 39, available at www.nybooks.com/articles/21514.
search program to promote the public interest. Courts could condition a pro-Google fair use finding on universal access to the contents of the resulting database. Landmark cases like Sony v. Universal 41 set a precedent for taking such broad public interests into account in the course of copyright litigation. 42 Those who opt out of the settlement may be able to fight for such concessions, but for now the battle centers on challenges to the settlement itself.
Both James Grimmelmann and Pamela Samuelson have suggested several principles and recommendations to guide judicial deliberations on the proposed settlement. 43 Grimmelmann’s work has focused primarily on antitrust issues, 44 while Samuelson has concentrated on the concerns of academic authors. 45 Grimmelmann has succinctly summarized the settlement’s potential threats to innovation and competition in the market for book indices, and books themselves:

The antitrust danger here is that the settlement puts Google in a highly privileged position for book search and book sales. … The authors and publishers settled voluntarily with Google, but there’s no guarantee they’ll offer similar terms, or any terms at all, to anyone else. … [They] could unilaterally decide only to talk to Google. 46
41 Sony Corp. of Am. v. Universal City Studios, Inc., 464 U.S. 417, 442 (1984).
42 Frank Pasquale, Breaking the Vicious Circularity: Sony’s Contribution to the Fair Use Doctrine, 55 CASE W. RES. L. REV. 777, 790 (2005).
43 See Pamela Samuelson, Google Book Search and the Future of Books in Cyberspace, 94 MINN. L. REV. (forthcoming, 2010), available at http://digital-scholarship.org/digitalkoans/2010/01/13/google-book-search-and-the-future-of-books-in-cyberspace/ (discussing the “six categories of serious reservations that have emerged about the settlement … reflected in the hundreds of objections and numerous amicus curiae briefs filed with the court responsible for determining whether to approve the settlement.”).
44 See generally James Grimmelmann, How to Fix the Google Book Search Settlement, 12 J. INTERNET L., Apr. 2009, at 1 (arguing that the Google Book Search antitrust case settlement should be approved with additional measures designed to promote competition and protect consumers) [hereinafter Grimmelmann, Google Book Search Settlement].
45 Letter from Pamela Samuelson, Richard M. Sherman Distinguished Professor of Law, University of California, Berkeley School of Law, to Hon. Denny Chin, Judge, S.D.N.Y. (Sept. 3, 2009), available at www.scribd.com/doc/19409346/Academic-Author-Letter-090309 (urging the judge to condition “approval of the Settlement Agreement on modification of various terms identified herein so that the Agreement will be fairer and more adequate toward academic authors.”).
46 James Grimmelmann, In Google We Antitrust, TPMCAFÉ BOOK CLUB, Jan. 15, 2009, http://tpmcafe.talkingpointsmemo.com/2009/01/15/in_google_we_antitrust.
Grimmelmann proposes several methods of assuring that the publishers will deal with other book search services. 47 He suggests an “[a]ntitrust consent decree” and “[n]ondiscrimination among copyright owners” as potential responses to the issues raised by the settlement. 48 Most of his proposal reflects a policy consensus that presumes competition is the ideal solution to abuses of power online. 49

Yet there are many reasons why competition is unlikely to arise in book search services, even if the settlement is altered in order to promote it. 50 Licensing costs are likely to be a substantial barrier to entry. A key to competition in the search market is having a comprehensive database of searchable materials; the more these materials need to be licensed, the less likely it is that a second comer can set up its own book archive. As scholars have demonstrated, deals like Google’s proposed settlement help entrench copyright holders’ claims for licensing revenue. 51 Moreover, innovation in search is heavily dependent on having an installed base of users that effectively “train” the search engine to be responsive. 52 The more search queries an engine gets, the better able it is to sharpen and perfect its algorithm. 53 Each additional user tends to decrease the cost of a better quality service for all subsequent users by contributing activity that helps the search engine differentiate between high and low quality organizational strategies. 54 Thus, incumbents with large numbers of users enjoy
47 Id.
48 Grimmelmann, Google Book Search Settlement, supra note 44, at 15.
49 Grimmelmann does also propose some revised terms that would not be primarily designed to incentivize the development of new alternatives to Google Book Search; for example, he proposes “[l]ibrary and reader representation at the [Book Rights R]egistry” that would administer many aspects of the settlement. Id.
50 See Bracha & Pasquale, supra note 5, at 1152 (“Though the market choices of users and technological developments constrain search engine abuse to some extent, they are unlikely to vindicate [certain social] values … .”); Frank Pasquale, Seven Reasons to Doubt Competition in the General Search Engine Market, MADISONIAN, Mar. 18, 2009, http://madisonian.net/2009/03/18/seven-reasons-to-doubt-competition-in-the-general-search-engine-market.
51 See James Gibson, Risk Aversion and Rights Accretion in Intellectual Property Law, 116 YALE L.J. 882, 884 (2007) (describing how the decision as to whether to fight for fair use or license a copyrighted work can be difficult “because the penalties for infringement typically include supracompensatory damages and injunctive relief”).
52 James Pitkow et al., Personalized Search, 45 COMMS. ACM, Sept. 2002, at 50 (discussing methods of personalizing search systems).
53 For example, if 100 people search for “alternatives to Microsoft Word software” on a search engine on a given day and all pick the third-ranked result, the search algorithm may adjust itself and put the third-ranked result as the first result the next day. The most-used search engine will have more data to tweak its algorithms than its less-used rivals.
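The feedback loop this footnote describes can be sketched in a few lines of code. This is a purely illustrative toy, not any actual engine’s ranking algorithm; the `rerank` function and the click counts are hypothetical:

```python
# Toy illustration of click-feedback re-ranking: results that users
# consistently select are promoted in subsequent rankings. Hypothetical
# example only; real ranking systems weigh many more signals.

def rerank(results, clicks):
    """Reorder `results` by accumulated click counts, highest first.

    `results` is the current ranked list of result IDs; `clicks` maps a
    result ID to how many users selected it. Ties preserve the prior
    order, because Python's sort is stable.
    """
    return sorted(results, key=lambda r: -clicks.get(r, 0))

# Day 1: 100 users all pick the third-ranked result "c".
ranking = ["a", "b", "c"]
clicks = {"c": 100}
# Day 2: the formerly third-ranked result now appears first.
print(rerank(ranking, clicks))  # ['c', 'a', 'b']
```

The sketch also shows why data scale matters: the more click observations an engine accumulates, the more reliably it can distinguish good rankings from bad ones, which is the advantage the text attributes to incumbents.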
54 Oren Bracha & Frank Pasquale, Federal Search Commission: Fairness, Access, and Accountability in the Law of Search, 93 CORNELL L. REV. 1141, 1181 (2008); David A. Vise & Mark Malseed,
substantial advantages over smaller entrants. Restrictive terms of service also deter competitors who aspire to reverse engineer and develop better versions of such services. 55 In general purpose search, users cannot reproduce, copy, or resell any Google service for any reason, even if the behavior is manual and nondisruptive. 56 Another section proscribes “creat[ing] a derivative work of … the Software.” 57 Advertisers face other restrictions, as Google’s AdWords Application Programming Interface (API) Terms & Conditions “impede advertisers’ efforts to efficiently copy their ad campaigns to other providers.” 58 All of these factors militate against robust competition in the comprehensive book search field.
Quantum leaps in technology capable of overcoming these brute disadvantages are unlikely, particularly because search is as much about personalized service as it is about technical principles of information organization and retrieval. 59 Current advantage in search is likely to be self-reinforcing, especially given that so many more people are using the services now than when Google overtook other search engines in the early 2000s. 60
What does an online world featuring an entrenched Google Book Search as gatekeeper look like? Initially, it will prove a vast improvement on the status
THE GOOGLE STORY 215 (2005) (noting that the most-used search engine will have more data to tweak its algorithms than its less-used rivals).
55 Though the precise terms of service of Google Book Search have not been finalized, Google’s more general terms of service are not promising. Google’s terms of service prohibit any action that “interferes with or disrupts” Google’s services, networks, or computers. Google Inc., Terms of Service § 5.4 (Apr. 16, 2007), www.google.com/accounts/TOS. Repeated queries to the service necessary to gather data on its operations may well violate these terms.
56 Id. § 5.5.
57 Id. § 10.2. Section 5.3 would proscribe both the automatic data collection and the use of a nonapproved “interface” for accessing Google’s database, regardless of the exact means. Id. § 5.3.
58 Ben Edelman, PPC Platform Competition and Google’s ‘May Not Copy’ Restriction, June 27, 2008, http://www.benedelman.org/news/062708-1.html (arguing that “Google’s restrictions on export and copying of advertisers’ campaigns … hinder competition in Internet advertising”). Though the hearing at which Professor Edelman was to testify was cancelled, he has documented these problems in some detail at his website, www.benedelman.org.
59 JOHN BATTELLE, THE SEARCH: HOW GOOGLE AND ITS RIVALS REWROTE THE RULES OF BUSINESS AND TRANSFORMED OUR CULTURE 8 (2005) (describing how personalized search enhances the value of search engines to both users and advertisers). Due to trade secrecy, it is impossible for policymakers to discover how much of the intermediary’s success is due to its employees’ inventive genius, and how much is due to the collective contributions of millions of users to the training of the intermediary’s computers.
60 See RANDALL STROSS, PLANET GOOGLE: ONE COMPANY’S AUDACIOUS PLAN TO ORGANIZE EVERYTHING WE KNOW 98 (Free Press 2008) (describing success of YouTube, a subsidiary of Google).
quo of bulky, hard-to-acquire, physical copies of books. But when we consider the ways in which knowledge can be rationed for profit, or structured to promote political ends, some worries arise. Google plans to monetize the book search corpus, and one predictable way of increasing its value is to make parts of it unavailable to those unwilling to pay high licensing fees. If the settlement allowed Google to charge such fees in an unconstrained manner, unmoored from the underlying costs of operating the project, the company would essentially be exploiting a public easement (to copy books) for unlimited private gain. 61 The Open Content Alliance has questioned the restrictive terms of the contracts that Google strikes when it agrees to scan and create a digital database of a library’s books. 62 Those restrictive terms foreshadow potential future restrictions on book search services. The proposed deal raises fundamental questions about the proper scope of private initiative in organizing and rationing access to knowledge.

Well-funded libraries may pay a premium to gain access to all sources; lesser institutions may be granted inferior access. If permitted to become prevalent, such tiered access to information could rigidify and reinforce existing inequalities in access to knowledge. 63 Information tiering inequitably disadvantages many groups, promoting the leveraging of wealth into status, educational, or other occupational advantage. Information is not only intrinsically valuable, but also can be a positional good, useful for scoring advantages over others. 64
61 Writers’ Reps and Richard A. Epstein Objection filed with the Southern District of New York in re Google Book Search, available at http://www.writersreps.com/feature.aspx?FeatureID=172 (arguing that the Google Book Search Settlement “would accomplish[] orphan legislation—but just for Google. … If [Google] is to be handed exclusive possession after stealing the scans to begin with, then it should be required to share those scans.”).
62 See Open Content Alliance, Let’s Not Settle for This Settlement, www.opencontentalliance.org/2008/11/05/lets-not-settle-for-this-settlement (last visited Mar. 12, 2010) (“At its heart, the settlement agreement grants Google an effective monopoly on an entirely new commercial model for accessing books. It re-conceives reading as a billable event. This reading event is therefore controllable and trackable. It also forces libraries into financing a vending service that requires they perpetually buy back what they have already paid for over many years of careful collection.”).
63 Frank Pasquale, Technology, Competition, and Values, 8 MINN. J. L. SCI. & TECH. 607, 608 (2007) (explaining how “much technology is used not just simply to improve its user’s life, but also to help its user gain advantage over others”). For example, “[t]est-preparation technologies … creat[e] inequalities; students able to afford test-preparation courses, such as those offered by Kaplan, have a definite advantage over those who do not have access to such courses.” Id. at 615 (internal citation omitted).
64 Harry Brighouse & Adam Swift, Equality, Priority, and Positional Goods, 116 ETHICS 471, 472 (2006) (“[Positional goods] are goods with the property that one’s relative place in the distribution of the good affects one’s absolute position with respect to its value. The very fact that one is worse off than others with respect to a positional good means that one is
Admittedly, Google Book Search has so far proven a great resource for scholars. It has made “book learning accessible on a new, worldwide scale, despite the great digital divide that separates the poor from the computerized.” 65 Current access to knowledge is stratified in many troubling ways; the works of John Willinsky 66 and Peter Suber 67 identify many current forms of tiering that pale before the present impact of Google Book Search. 68 Given the aggressive pricing strategies of many publishers and content owners, Google Book Search is a vital alternative for scholars.

Nevertheless, there is no guarantee in the current version of the settlement that Google Book Search will preserve its public-regarding features. 69 It may well end up like the powerful “group purchasing organizations” in the American health care system that started promisingly, but have evolved to exploit their intermediary role in troubling ways. 70 Google is more than just one among many online service providers jostling for a competitive edge on the web. It is likely to be the key private entity capable of competing or cooperating with academic publishers and other content providers. Dedicated monitoring and regulation of the settlement terms now could help ensure that book digitization
worse off, in some respect, than one would be if that good were distributed equally. So while it might indeed be perverse to advocate leveling down all things considered, leveling down with respect to positional goods benefits absolutely, in some respect, those who would otherwise have less than others.”).
65 Darnton, supra note 40, at 76.
66 JOHN WILLINSKY, THE ACCESS PRINCIPLE: THE CASE FOR OPEN ACCESS TO RESEARCH AND SCHOLARSHIP 5 (The MIT Press 2005) (describing extreme “digital divide” between those most connected to information resources and those cut off from them).
67 See generally Peter Suber, Open Access News, www.earlham.edu/~peters/fos/fosblog.html. Suber is a leader of the open access movement, which aims to “[p]ut[] peer-reviewed scientific and scholarly literature on the internet[,] [m]ak[e] it available free of charge and free of most copyright and licensing restrictions[,] [and] remov[e] the barriers to serious research.” Id.
68 See id. (chronicling on a daily basis news and controversies related to open access to scholarly materials on the Internet).
69 Siva Vaidhyanathan, Baidu.com Accused of Rigging Search, The Googlization of Everything, Global Google, Jan. 2009, http://www.googlizationofeverything.com/2009/01/baiducom_accused_of_rigging_se.php (Feb. 19, 2009, 14:20 EST) (“‘Public failure’ [is a] phenomenon in which a private firm steps into a vacuum created by incompetent or gutted public institutions. A firm does this not for immediate rent seeking or even revenue generation. It does so to enhance presence, reputation, or to build a platform on which to generate revenue later or elsewhere. It’s the opposite of ‘market failure.’ And it explains a lot of what Google does.”).
70 For background on group purchasing organizations, see S. PRAKASH SETHI, GROUP PURCHASING ORGANIZATIONS: AN UNDISCLOSED SCANDAL IN THE U.S. HEALTH CARE INDUSTRY 122 (Palgrave MacMillan 2009) (“The benefits of combined purchases would be greatly reduced in conditions where the middlemen … control the entire process through restrictive arrangements with suppliers and customers.”).
protects privacy, diverse stakeholder interests, and fair pricing of access to knowledge. Alliances between Google Book Search and publishers deserve public scrutiny because they permit private parties to take on what have often been public functions of determining access to and pricing of information. Where “regulatory copyright” 71 has answered such questions with compulsory licenses, 72 the new alliances aspire to put into place a regime of cross-subsidization resistant to public scrutiny or input. 73 Given the vital public interests at stake in the development of this information infrastructure, monitoring is essential. 74 Extant law provides little assurance that it will actually occur.
A Public Alternative?
In other work, I have proposed a number of regulations that would permit either government or public accountability groups to monitor search engines to detect abuses of their dominant position. To conclude this piece, I would like to raise one other alternative: a publicly funded search engine.

To the extent that search engines resist monitoring and accountability, governments should consider establishing public alternatives to them. Here, lessons from recent debates over health insurance may be instructive. There are structural parallels between the intermediary role of private health insurers (which stand as gatekeepers between patients and providers of health products and services) and that of search engines (which stand between searchers and
71 See Joseph P. Liu, Regulatory Copyright, 83 N.C. L. REV. 87, 91 (2004) (describing the growth and scope of compulsory licensing statutes that provide for compensation for copyright holders while denying them the right to veto particular uses of their work).
72 Marybeth Peters, the U.S. Register of Copyrights, has objected to the proposed Google Books Settlement on the grounds that it would violate traditional norms of separation of powers in copyright policy. See Hearing on Competition and Commerce in Digital Books: The Proposed Google Book Settlement Before the House Comm. on the Judiciary, 111th Cong. (2009) (statement of Marybeth Peters, Register of Copyrights), available at http://judiciary.house.gov/hearings/pdf/Peters090910.pdf, at 2 (“In the view of the Copyright Office, the settlement proposed by the parties would encroach on responsibility for copyright policy that traditionally has been the domain of Congress. … We are greatly concerned by the parties’ end run around legislative process and prerogatives, and we submit that this Committee should be equally concerned.”).
73 Google considers its pricing and ranking decisions a closely held trade secret—an assertion that would seem very strange if it came from a public library. See Pamela Samuelson, Google Books Is Not a Library, THE HUFFINGTON POST, Oct. 13, 2009, www.huffingtonpost.com/pamela-samuelson/google-books-is-not-a-lib_b_317518.html (“Libraries everywhere are terrified that Google will engage in price-gouging when setting prices for institutional subscriptions to [Google Book Search] contents.”).
74 Frank Pasquale, Beyond Competition and Innovation: The Need for Qualified Transparency in Internet Intermediaries, 104 NW. U. L. REV. 105 (2010) (offering proposals for monitoring internet intermediaries).
providers of information). The 1965 decision to establish Medicare as a public option for an elderly population ill-served by private providers and insurers may prove a model for an information economy plagued by persistent digital divides.

As the United States debated health reform from 2009 to 2010, there was a tension between regulation-focused approaches (which would require revelation and alteration of private insurers’ unfair practices) and a public option that would compete with existing insurers. Democrats ultimately gave up on pushing the public option, but the debate exposed the many positive aspects a state-sponsored alternative can provide in certain markets. A public option could play a role in search parallel to the role that Medicare plays in the health system: guaranteeing some baseline of transparency in pricing and evaluation. 75

The recent Google Book Search settlement negotiations have led Siva Vaidhyanathan to characterize Google’s archive project as evidence of a “public failure.” 76 Whereas government intervention is often necessary in cases of “market failure,” Vaidhyanathan argues that the reverse can occur: market actors can step into a vacuum where government should have been. In the case of digitized books, the problem is presented starkly: Why has the Library of Congress failed to require digital deposit of books, instead of merely accepting paper copies? We can debate when such a requirement became plausible; however, had the government required such deposit as soon as it became feasible, the possibility of a Google monopoly here would be much less troubling. If digital deposit ever is adopted, the government could license its corpus to alternative search services. There is no good reason why the company that is best capable of reproducing books (and settling lawsuits based on that reproduction) should have a monopoly on search technologies used to organize and distribute them.
More ambitiously, an NGO or quasi-administrative NGO could undertake to index and archive the web, licensing opportunities to search and organize it to various entities that promise to maintain open standards for ranking and rating websites and other Internet presences. 77 Wikipedia, Slashdot, and eBay all
75 For more on the role of public options like Medicare in the modern medical sector, see Frank Pasquale, Making the Case for the Public Plan, Part II: Public Option as Private Benchmark, July 15, 2009, available at http://balkin.blogspot.com/2009/06/making-case-for-public-plan-part-ii.html.
76 Vaidhyanathan, supra note 69 (“‘Public failure’ [is a] phenomenon in which a private firm steps into a vacuum created by incompetent or gutted public institutions. A firm does this not for immediate rent seeking or even revenue generation. It does so to enhance presence, reputation, or to build a platform on which to generate revenue later or elsewhere. It’s the opposite of ‘market failure.’ And it explains a lot of what Google does.”).
77 For a cultural case for government intervention here, see Mário J. Silva, The Case for a Portuguese Web Search Engine, http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.106.5334&rep=rep1&type=pdf
suggest methods of evaluating relevance and authority that could be employed by public, open search engines. If such a search engine became at least somewhat popular (or popular within a given niche), it could provide an important alternative source of information and metadata on ranking processes.

The need for a public option in search becomes even more apparent when we consider the waste and inefficiency caused by opaque intermediaries in other fields. Like private health insurers, Google is a middleman, standing between consumers and producers of knowledge. In programs like Book Search, it will effectively collaborate with copyright owners to determine what access people get, how much they have to pay, and on what terms. In the health field, providers and private insurers are both very concentrated in the U.S., and consumers (i.e., the businesses and individuals who buy insurance plans) are not. Insurers and providers also jealously guard the secrecy of many pricing decisions. 78 That is one key reason why the U.S. spends so much more on health care than other industrialized nations, without getting consistently better results, access, or quality.

Health care reformers often split into two camps: those who believe that regulation of middlemen like insurers can bring about fair results, and those who believe that only a public option can serve as a benchmark for judging the behavior of private insurers. The Patient Protection and Affordable Care Act (PPACA) of 2010 decisively opted for the regulatory option, and the early stages of its implementation have been rocky. The constitutional challenges to search engine regulation would likely prove more serious than the many lawsuits now attacking PPACA. Therefore, even if the public option in health care is off the table now, it should inspire future proposals in information policy, where regulation of intermediaries may be even more difficult than it has proven to be in health care. If search engines consistently block or frustrate measures to increase their accountability, public alternatives could prove to be an indispensable foundation of a fair, just, and open information environment.
5334%26rep%3Drep1%26type%3Dpdf&ei=IWZYTJbaCoKC8gapvY2xCw&usg=AFQ<br />
jCNHdTPpTBUuNHZhTOZtGaRiVKP6C4g&sig2=9aoaKLXiXOOUuYHMewopV<br />
Q (describing the value of a Portuguese-oriented search engine); JEAN-NOËL JEANNENEY,<br />
GOOGLE AND THE MYTH OF UNIVERSAL KNOWLEDGE: A VIEW FROM EUROPE (Univ. of<br />
Chicago Press 2007). Whereas these authors believe that English-language bias is a<br />
particularly problematic aspect of Google’s hegemony in the field, I argue that the possibility<br />
of many kinds of hidden bias counsels in favor of at least one robust, publicly funded<br />
alternative.<br />
78 See, e.g., Uwe E. Reinhardt, The Pricing of U.S. Hospital Services: Chaos Behind a Veil of Secrecy, at<br />
http://healthaff.highwire.org/cgi/content/abstract/25/1/57; Annemarie Bridy, Trade<br />
Secret Prices and High-Tech Devices: How Medical Device Manufacturers are Seeking to Sustain Profits by<br />
Propertizing Prices, 17 TEX. INTELL. PROP. L.J. 187 (2009) (discussing “recent claims by the<br />
medical device manufacturer Guidant that the actual prices its hospital customers pay for<br />
implantable devices, including cardiac pacemakers and defibrillators, are protectable as trade<br />
secrets under the Uniform Trade Secrets Act.”).
418 CHAPTER 7: IS SEARCH NOW AN “ESSENTIAL FACILITY?”
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 419<br />
The Problem of Search Engines<br />
as Essential Facilities: An<br />
Economic & Legal Assessment<br />
By Geoffrey A. Manne *<br />
What is wrong with calls for search neutrality, especially those rooted in the<br />
notion of Internet search (or, more accurately, Google, the policy scolds’ bête<br />
noire of the day) as an “essential facility,” and necessitating government-mandated<br />
access? As others have noted, the basic concept of neutrality in<br />
search is, at root, farcical. 1 The idea that a search engine, which offers its users<br />
edited access to the most relevant websites based on the search engine’s<br />
assessment of the user’s intent, 2 should do so “neutrally” implies that the search<br />
engine’s efforts to ensure relevance should be cabined by an almost-limitless<br />
range of ancillary concerns. 3<br />
Nevertheless, proponents of this view have begun to adduce increasingly detail-laden<br />
and complex arguments in favor of their positions, and the European<br />
Commission has even opened a formal investigation into Google’s practices,<br />
based largely on various claims that it has systematically denied access to its top<br />
search results (in some cases paid results, in others organic results) by<br />
competing services, 4 especially vertical search engines. 5 To my knowledge, no<br />
* Executive Director, International Center for Law & Economics and Lecturer in Law, Lewis<br />
& Clark Law School. www.laweconcenter.org;<br />
www.lclark.edu/law/faculty/geoffrey_manne.<br />
1 See, e.g., Danny Sullivan, The Incredible Stupidity of Investigating Google for Acting Like a Search<br />
Engine, SEARCH ENGINE LAND, http://searchengineland.com/the-incredible-stupidity-of-investigating-google-for-acting-like-a-search-engine-57268<br />
(“A search engine’s job is<br />
to point you to destination sites that have the information you are seeking, not to send you<br />
to other search engines. Getting upset that Google doesn’t point to other search engines is<br />
like getting upset that the New York Times doesn’t simply have headlines followed by a<br />
single paragraph of text that says ‘read about this story in the Wall Street Journal.’”).<br />
2 A remarkable feat, given that this intent must be inferred from simple, context-less search<br />
terms.<br />
3 Perfectly demonstrated by Frank Pasquale’s call, elsewhere in this volume, for identifying<br />
search engines as “essential cultural and political facilities,” thereby mandating incorporation<br />
into their structure whatever “cultural” and “political” preferences any sufficiently-influential<br />
politician (or law professor) happens to deem appropriate.<br />
4 Competing services include, for example, MapQuest (www.mapquest.com) (competing<br />
with Google Maps), Veoh (www.veoh.com) (competing with YouTube) and Bing Shopping<br />
(www.bing.com/shopping) (competing with Google Products).<br />
5 Vertical search engines are search engines that focus on a particular category of products, or<br />
on a particular type of search. Examples include Kayak (www.kayak.com) (travel search),
one has yet claimed that Google should offer up links to competing general<br />
search engines as a remedy for its perceived market foreclosure, but Microsoft’s<br />
experience with the “Browser Choice Screen” it has now agreed to offer as a<br />
consequence of the European Commission’s successful competition case<br />
against the company is not encouraging. 6 These more superficially sophisticated<br />
claims are rooted in the notion of Internet search as an “essential facility”—a<br />
bottleneck limiting effective competition.<br />
These claims, as well as the more fundamental harm-to-competitor claims, are<br />
difficult to sustain on any economically-reasonable grounds. To understand this<br />
requires some basic understanding of the economics of essential facilities, of<br />
Internet search, and of the relevant product markets in which Internet search<br />
operates.<br />
The Basic Law & Economics<br />
of Essential Facilities<br />
There are two ways to deal with a problematic bottleneck: Remove the<br />
bottleneck or regulate access to it. The latter is the more common course<br />
adopted in the U.S. and elsewhere. Complex, Byzantine and often counterproductive<br />
regulatory apparatuses are required to set and monitor the terms of<br />
access. Among other things, this paves the way for either intensely-problematic<br />
judicial oversight of court-imposed remedies or else the creation of sector-specific<br />
regulatory agencies subject to capture, political influence, bureaucratic<br />
inefficiency, and inefficient longevity. The Interstate Commerce Commission<br />
(and its successor agencies within the Department of Transportation) and the<br />
Federal Communications Commission (and its implementation beginning in<br />
1996 of the monstrous Telecommunications Act) in the U.S. are paradigmatic<br />
examples of these costly effects, and it is certainly questionable whether the<br />
disease is worse than the cure. 7<br />
Obviously, an essential facility must be essential. Efforts over the years to<br />
shoehorn various markets into this category have sometimes strained credulity,<br />
as it has variously been claimed that Aspen, Colorado ski hills, 8 local voice mail<br />
SourceTool (www.sourcetool.com) (business input sourcing), and Foundem<br />
(www.foundem.com) (retail product search and price comparison).<br />
6 See European Commission, Web browser choice for European consumers,<br />
http://ec.europa.eu/competition/consumers/web_browsers_choice_en.html (last<br />
accessed Dec. 8, 2010).<br />
7 Oren Bracha and Frank Pasquale’s call for a “Federal Search Commission” modeled on the<br />
Federal Trade Commission is in fact an embrace of the need for a bureaucratic apparatus to<br />
regulate the forced access called for by search neutrality proponents. See Oren Bracha and<br />
Frank P<strong>as</strong>quale, Federal Search Commission: Fairness, Access, and Accountability in the Law of Search,<br />
93 CORNELL L. REV. 1193 (2008).<br />
8 Aspen Highlands Skiing Corp. v. Aspen Skiing Co., 738 F.2d 1509 (10th Cir. 1984), aff ’d on<br />
other grounds, 472 U.S. 585 (1985).
services, 9 soft drinks 10 and direct freight flights between New York and San<br />
Juan 11 (among many other things) were essential facilities necessitating<br />
mandated access under the antitrust laws. 12 In these and many other cases,<br />
myriad alternatives to the allegedly-monopolized market exist and it is arguable<br />
that there was nothing whatsoever “essential” about these markets.<br />
In antitrust literature and jurisprudence, a plaintiff would need to prove the<br />
following to prevail in a monopolization case rooted in the essential facilities<br />
doctrine:<br />
1. Control of the essential facility by a monopolist;<br />
2. A competitor’s inability practically or reasonably to duplicate the<br />
essential facility;<br />
3. The denial of the use of the facility to a competitor; and<br />
4. The feasibility of providing the facility to competitors. 13<br />
Arguably, since the Supreme Court’s 2004 Trinko decision, 14 a plaintiff would<br />
also need to demonstrate the absence of federal regulation governing access.<br />
The Trinko decision significantly circumscribed the area subject to essential<br />
facilities arguments, limiting such claims to instances where, as in the Aspen<br />
Skiing case, a competitor refuses to deal on reasonable terms with another<br />
competitor with whom it has, in fact, dealt in the past. 15<br />
A key problem with many essential facilities cases is the non-essentiality of the<br />
relevant facility. While there can be no doubt that to particular competitors,<br />
particularly those constrained to only one avenue of access to consumers by<br />
geography or natural monopoly, a facility may indeed seem essential, the<br />
touchstone of U.S. antitrust law has long been consumer, not competitor,<br />
welfare. So while, indeed, Aspen Highlands may have had difficulty competing<br />
with the Aspen Ski Company for consumers who had already chosen to ski in<br />
Aspen, consumers nonetheless had unfettered access to a wide range of<br />
alternative ski (and other vacation) destinations, such that the likelihood of the<br />
9 CTC Communications Corp. v. Bell Atlantic Corp., 77 F. Supp. 2d 124 (D. Me. 1999).<br />
10 Sun Dun v. Coca-Cola Co., 740 F. Supp. 381 (D. Md. 1990).<br />
11 Century Air Freight, Inc. v. American Airlines, Inc., 597 F. Supp. 564 (S.D.N.Y. 1984).<br />
12 For a more complete list of essential facilities (and attempted essential facilities) cases, as<br />
well as an important treatment of the essential facilities doctrine in U.S. antitrust law, see<br />
Abbott B. Lipsky, Jr. & J. Gregory Sidak, Essential Facilities, 51 STAN. L. REV. 1187 (1999).<br />
13 MCI Comm’ns Corp. v. American Tel. & Tel. Co., 708 F.2d 1081 (7th Cir. 1983).<br />
14 Verizon Comm’ns. v. Law Offices of Curtis V. Trinko LLP, 540 U.S. 398 (2004).<br />
15 See, e.g., PHILLIP E. AREEDA & HERBERT HOVENKAMP, ANTITRUST LAW (2004 supp.) at 199.
monopolization of Aspen’s ski hills affecting overall consumer welfare was<br />
essentially non-existent. 16 In such a circumstance, should it matter if a particular<br />
competitor is harmed? Is that a function of antitrust-relevant conduct on the<br />
part of another firm, or an unfortunate set of business decisions on the part of<br />
the first firm?<br />
As Phillip Areeda and Herbert Hovenkamp have famously said of the essential<br />
facilities doctrine, “[it] is both harmful and unnecessary and should be<br />
abandoned.” 17 As another antitrust expert has described it:<br />
At bottom, a plaintiff making an essential facilities argument is saying<br />
that the defendant has a valuable facility that it would be difficult to<br />
reproduce, and suggesting that is a reason for a court to intervene and<br />
impose a sharing duty. But at least in the vast majority of the cases, the<br />
fact that the defendant has a highly valued facility is a reason to reject<br />
sharing, not to require it, since forced sharing “may lessen the incentive<br />
for the monopolist, the rival, or both to invest in those economically<br />
beneficial facilities.” 18<br />
This perennial problem—antitrust laws being used to protect competitors rather<br />
than consumers—lies at the heart of claims surrounding Internet search as an<br />
essential facility.<br />
There is much tied up in the argument, and proponents have often been careful<br />
to at least go through the motions of drawing the rhetorical line back to<br />
consumers. In its fullest expression, it is claimed that harm to competitors now<br />
will mean the absence of competitors later and thus an unfettered monopoly<br />
with the intent and power to harm consumers. 19 It is also often argued that<br />
consumers (in this case Internet users searching for certain websites or the<br />
products they sell) are intrinsically harmed by the unavailability of access to the<br />
information contained in sites that are denied access to the search engine’s<br />
“essential facility.” 20<br />
16 The courts, however, did not agree.<br />
17 3A AREEDA & HOVENKAMP, ANTITRUST LAW 771c, at 173 (2002).<br />
18 R. Hewitt Pate, Refusals to Deal and Essential Facilities, Testimony Submitted to DOJ/FTC<br />
Hearings on Single Firm Conduct, Jul. 18, 2006, available at<br />
http://www.justice.gov/atr/public/hearings/single_firm/docs/218649.htm (quoting<br />
Trinko, 540 U.S. at 408).<br />
19 See, e.g., European Commission Launches Antitrust Investigation of Google, SEARCH<br />
NEUTRALITY.ORG, Nov. 30, 2010, http://www.searchneutrality.org (“Google is exploiting<br />
its dominance of search in ways that stifle innovation, suppress competition, and erode<br />
consumer choice.”). Meanwhile, complainants have gone to Europe where a showing of<br />
consumer harm is not necessary to prevail under its competition laws.<br />
20 As Oren Bracha and Frank Pasquale put it, “Search engines, in other words, often function<br />
not as mere satisfiers of predetermined preferences, but as shapers of preferences,” Federal<br />
Search Commission, 93 CORNELL L. REV. at 1185. Bracha and Pasquale also claim that “Market<br />
participants need information about products and services to make informed economic
The basic essential facilities case against Google is that it controls a bottleneck<br />
for the Internet—it is the access point for most consumers, and search results<br />
on Google determine which websites are successful and which end up in<br />
oblivion. 21 More particularly, it is argued that Google has used its control over<br />
this bottleneck to deny access by competitors to Google’s users. To understand<br />
this requires a brief discussion of the economics of Internet search and the<br />
relevant market in which it operates.<br />
The Basic Economics of Internet Search<br />
Implicit in claims that Google controls access to an essential facility is the<br />
premise that some relevant set of consumers (or competitors) can reach relevant<br />
content only (or virtually only) through Google. It is necessary, then, to<br />
assess whether Google’s search results pages are, in fact, without significant<br />
competition for the economic activity at their heart. Of course the economic<br />
activity at their heart is advertising. 22<br />
It is hard to conceive of Internet search—let alone Google’s website—as the<br />
only means of reducing search costs for potential consumers (Internet<br />
searchers) and prospective sellers. Leaving aside the incredible range of<br />
alternatives to the Internet for commerce, 23 off the top of my head, I can<br />
imagine Google’s competitor websites finding access to users by 1) advertising<br />
in print publications and TV; 2) using social networking sites to promote their<br />
sites; 3) being linked to by other websites including sites specializing in rating<br />
websites, online magazines, review sites, and the like; 4) implementing affiliate<br />
programs or other creative marketing schemes; 5) purchasing paid advertising,<br />
both in Google’s own paid search results, as well as on other, heavily-trafficked<br />
websites; and 6) securing access via Google’s general search competitors like<br />
Yahoo! and Bing. Competitors denied access to the top few search results at<br />
decisions. … [A]ttaining visibility and access to users is critical to competition and<br />
cooperation online. Centralized control or manipulation by search engines may stifle<br />
innovation by firms relegated to obscurity.” Id. at 1173-74.<br />
21 Id. at 1173 (“Concentrated control over the flow of information, coupled with the ability to<br />
manipulate this flow, may reduce economic efficiency by stifling competition.”).<br />
22 See KEN AULETTA, GOOGLED: THE END OF THE WORLD AS WE KNOW IT 16 (2009) (quoting<br />
Google CEO Eric Schmidt as saying, “We are in the advertising business”).<br />
23 There is a tendency for Web sites to view their Internet enterprises as different from their<br />
offline counterparts’, but, at root, most Internet sites (other than branded ones attached<br />
directly to offline stores) are founded by entrepreneurs who made a simple business decision<br />
to ply their trade online rather than off. That this decision may have foreclosed easy access<br />
to certain offline customers, or put the entrepreneur in a position where access to customers<br />
could be frustrated by certain competitive disadvantages specific to the Internet, does not<br />
convert these competitive disadvantages into special problems deserving of antitrust<br />
treatment. To do so would be to inappropriately and inefficiently insulate the online/offline<br />
business decision from the healthy effects of Schumpeter’s “perennial gale of creative<br />
destruction.” JOSEPH A. SCHUMPETER, CAPITALISM, SOCIALISM AND DEMOCRACY (1942).
Google’s site are still able to advertise their existence and attract users through a<br />
wide range of other advertising outlets—extremely wide, in fact: According to<br />
one estimate Google was responsible in 2007 for only about 7.5% of the<br />
world’s advertising. 24<br />
For Google to profit from its business—whether as a monopolist or not—it<br />
must deliver up to its advertisers a set of users. Interestingly, users of Google’s<br />
general search engine are mostly uninterested in the paid results. They click<br />
through the unpaid or “organic” search results by a wide margin ahead of paid<br />
results. 25 There is thus an asymmetry. On one side of its platform are<br />
advertisers who care about the quantity and quality (the likelihood that users<br />
who see an ad will click through to advertisers’ sites and purchase something<br />
while there) of the users on the other side. Meanwhile, users care very little<br />
about the quantity of advertisers and care only somewhat about the quality of<br />
advertisers (preferring greater relevance to lesser, but frequently ignoring paid<br />
results anyway). Nevertheless, the core of this enterprise is search result<br />
relevance. Greater relevance improves the quality of searchers from the<br />
advertisers’ point of view, ensuring that advertisers’ paid results are clicked on<br />
by the users most likely to find the advertiser’s site of interest and to purchase<br />
something there.<br />
But there are problems inherent in the ambiguity of search terms and the ability<br />
to “game the system” that prevent even the most sophisticated algorithms from<br />
offering up perfect relevance. First, search terms are often context-less, and a<br />
user searching for “jaguar” may be searching for information on the car<br />
company, the operating system, the big cat, or something else. 26 Along a<br />
different dimension, a user searching for “Nikon camera” might be looking to<br />
buy a Nikon camera or might be looking for a picture of a Nikon camera to post<br />
on his blog. Obviously advertisers care very much which of these users clicks<br />
on their paid result. At the same time, many undesirable websites (spam sites<br />
and the like) can and do take advantage of predictable search results to occupy<br />
desirable search result real estate to the detriment of the search engine, its users<br />
and its advertisers. Efforts to keep these sites out of the top results and to<br />
ensure maximum relevance from ambiguous search terms require a host of<br />
algorithm tweaks and even human interventions. That these may (intentionally<br />
or inadvertently) harm some websites’ rank in certain search results is consistent<br />
with a well-functioning search platform.<br />
24 See Erick Schonfeld, Estimates Put Internet Advertising at $21 Billion in U.S., $45 Billion Globally,<br />
TECHCRUNCH, Feb. 26, 2008, http://techcrunch.com/2008/02/26/estimates-put-internet-advertising-at-21-billion-in-us-45-billion-globally/.<br />
25 See, e.g., Neil Walker, Google Organic Click Through Rate (CTR), UK SEO CONSULTANT, May 11,<br />
2010, http://www.seomad.com/SEOBlog/google-organic-click-through-rate-ctr.html.<br />
26 See Bill Slawski, A Look at Google Midpage Query Refinements, SEO BY THE SEA, Apr. 20, 2006,<br />
http://www.seobythesea.com/?p=174.
Google offers its organic search results and its other services as a solution to the<br />
two-sided platform problem mentioned above: In order to attract paying<br />
advertisers, Google also has to attract (and match up) the advertisers’ target<br />
audience. Google offers everything it does to its users in an effort to attract<br />
these users and to glean information from them that facilitates its all-important<br />
matching (relevance) function. In the process, Google generates revenue from<br />
advertisers eager to “sell” to this audience. For a host of reasons, Google (like<br />
all search engines) does not charge searchers to access its various services, but it<br />
does charge advertisers. Just because search is an ancillary business to Google’s<br />
true advertising business does not necessarily mean it is not a relevant market<br />
for purposes of antitrust analysis; nevertheless it is essential to avoid the pitfall<br />
of examining one side of a two-sided market in isolation. As David Evans<br />
notes, “[t]he analysis of either side of a two-sided platform in isolation yields a<br />
distorted picture of the business.” 27 Two-sided market definition is complex<br />
and little understood—especially by non-experts throwing around various<br />
alleged markets in which companies like Google are said to be “dominant.”<br />
There is actually substantial reason to doubt the propriety of a narrow market<br />
definition limited to online search advertising. 28 Even where there are different<br />
purposes for different types of advertising—e.g. brand recognition for display<br />
ads and efforts to sell for search ads and other outlets like coupons—this is<br />
merely a difference in degree. Both are fundamentally forms of reducing the<br />
costs of a user’s search for a product, <strong>as</strong> we have understood since George<br />
Stigler’s seminal work on the subject in 1968, 29 and the relevant question is<br />
whether the difference is significant enough to render decisions in one market<br />
essentially unaffected by decisions or prices in the other.<br />
There is evidence that advertisers view online and offline advertising <strong>as</strong><br />
substitutes, and this applies not only to traditional advertisers but also to Internet<br />
companies. Thus, in 2009, Pepsi decided not to advertise during the 2010 Super<br />
Bowl, in order to focus instead on a particular type of online campaign. “This<br />
year for the first time in 23 years, Pepsi will not have ads in the Super Bowl<br />
telecast …. Instead it is redirecting the millions it has spent annually to the<br />
27 David S. Evans, Two-Sided Market Definition, ABA SECTION OF ANTITRUST LAW, MARKET<br />
DEFINITION IN ANTITRUST: THEORY AND CASE STUDIES (forthcoming), available at<br />
http://ssrn.com/abstract=1396751 at p. 9.<br />
28 Readers interested in a fuller treatment of the market definition question surrounding<br />
Google are directed toward Geoffrey A. Manne & Joshua D. Wright, Google and the Limits of<br />
Antitrust: The Case Against the Case Against Google, 34 HARV. J. L. & PUB. POL’Y 1 (2011)<br />
(forthcoming), from which much of the discussion of Google’s markets and economics in<br />
this essay is drawn.<br />
29 GEORGE JOSEPH STIGLER, THE ORGANIZATION OF INDUSTRY 201 (Univ. of Chi. Press 1983)<br />
(1968).
426 CHAPTER 7: IS SEARCH NOW AN “ESSENTIAL FACILITY?”<br />
Internet.” 30 And even Google itself advertises offline. 31 Another study suggests<br />
that there is indeed a trade-off between online and more traditional types of<br />
advertising: Avi Goldfarb and Catherine Tucker have demonstrated that<br />
display advertising pricing is sensitive to the availability of offline alternatives. 32<br />
And of course companies have limited advertising budgets, distributed across a<br />
broad range of media and promotional efforts. As one commentator notes: “By<br />
2011 web advertising in the United States was expected to climb to sixty billion<br />
dollars, or 13 percent of all ad dollars. This meant more dollars siphoned from<br />
traditional media, with the largest slice probably going to Google.” 33<br />
Advertising revenue on the Internet is driven initially by the size of the<br />
audience, with a significant multiplier for the likelihood that those consumers<br />
will purchase the advertisers’ products 34 (based on a viewer’s propensity to<br />
“click through” to the advertiser’s site). Google’s competition in selling ads thus<br />
comes, in varying degrees, not only from other search sites, but also from any<br />
other site that offers a service, product, or experience that consumers might<br />
otherwise find in Google’s “organic” search results, for which Google is not<br />
paid. For Google’s competitors, this means seeking forced access to its users.<br />
But access to eyeballs can be had from a large range of access points around the<br />
Web.<br />
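The revenue logic just described (advertising value driven by audience size, scaled by the likelihood that viewers click through and buy) can be put as back-of-the-envelope arithmetic. The function and all numbers below are hypothetical illustrations, not figures from this essay:<br />

```python
# Hypothetical sketch: expected advertiser revenue as the product of audience
# size, click-through rate, conversion rate, and value per sale.

def expected_ad_revenue(impressions, click_through_rate, conversion_rate, value_per_sale):
    """Expected value of an ad placement: how many people see it, the chance
    each clicks through, the chance a click becomes a sale, and the sale's value."""
    return impressions * click_through_rate * conversion_rate * value_per_sale

# A smaller but "ready-to-buy" audience can out-earn a much larger casual one,
# which is the sense in which purchase likelihood acts as a multiplier.
casual = expected_ad_revenue(1_000_000, 0.01, 0.02, 50.0)   # roughly 10,000
targeted = expected_ad_revenue(100_000, 0.05, 0.20, 50.0)   # roughly 50,000
assert targeted > casual
```

On these assumed numbers, an audience one-tenth the size but with higher purchase intent is worth several times more to advertisers, which is why "ready-to-buy" traffic commands the premium discussed below.<br />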
Social media sites like Twitter and Facebook are therefore significant access<br />
points, occupying, as they do, a considerable amount of Internet “eyeball” time.<br />
Pepsi’s diversion of advertising dollars from the Super Bowl to the Internet<br />
is not likely to have inured much to Google’s benefit, as the strategy was a<br />
“social media play,” building on the expressed brand loyalties and peer<br />
communications that propel social media. 35 In a world of scarce advertising<br />
dollars and effective marketing via social media sites, Google and all other<br />
advertisers, online and off, must compete with the growing threat to their<br />
revenue from these still-novel marketing outlets. “If Facebook’s community of<br />
30 Larry D. Woodard, Pepsi’s Big Gamble: Ditching Super Bowl for Social Media, ABC NEWS, Dec. 23,<br />
2009, http://abcnews.go.com/print?id=9402514.<br />
31 See Danny Sullivan, Google Pushes Chrome Browser Via Newspaper Ads, SEARCH ENGINE LAND,<br />
Nov. 21, 2010, http://searchengineland.com/google-pushes-chrome-browser-via-newspaper-ads-56600.<br />
32 Avi Goldfarb & Catherine Tucker, Search Engine Advertising: Pricing Ads to Context 96 (NET<br />
Institute Working Paper No. 07-23, 2007) available at<br />
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1021451&rec=1&srcabs=10084<br />
33 KEN AULETTA, GOOGLED: THE END OF THE WORLD AS WE KNOW IT 16 (2009).<br />
34 David S. Evans, The Economics of the Online Advertising Industry, 7 REV. OF NETWORK ECON.<br />
359, 359-60 (2008).<br />
35 See Larry D. Woodard, Pepsi’s Big Gamble: Ditching Super Bowl for Social Media, ABC NEWS, Dec.<br />
23, 2009, http://abcnews.go.com/print?id=9402514.
users got more of their information through [the Facebook] network, their<br />
Internet search engine and navigator might become Facebook, not Google.” 36<br />
The upshot: To the extent that inclusion in Google search results is about<br />
“Stiglerian” search-cost reduction for websites (and it can hardly be anything<br />
else), the range of alternate facilities for this function is nearly limitless.<br />
Finally, Google competes not only with other general search engines (and<br />
possibly all other forms of advertising) but also with so-called vertical search<br />
engines. These are search engines and e-commerce websites with search<br />
functionality that specializes in specific content: Amazon in books, music, and<br />
other consumer goods; Kayak in travel services; eBay in consumer auctions;<br />
WebMD in medical information and products; SourceTool in business-to-business<br />
supplies; Yelp in local businesses; and many others. To the extent that<br />
Internet users bypass Google and begin their searches at one of these<br />
specialized sites (as is increasingly the case), the value to these heavily-trafficked<br />
websites from access to Google’s users decre<strong>as</strong>es. 37<br />
Competition from vertical search engines is important because ad click-through<br />
rates likely are higher when consumers are actively searching for something to<br />
buy—just as search advertising targets consumers who express some interest in<br />
a particular search term, the effect is magnified if the searcher can be identified<br />
as an immediate consumer. Thus online retailers like CDnow that can establish<br />
their own brands and their own navigation channels 38 have a significant<br />
advantage in drawing searchers—and advertisers—away from Google: The fact<br />
that a consumer is performing a search on a retail site itself conveys important<br />
and valuable information to advertisers that is not otherwise available from<br />
most undifferentiated Google searches—it certainly increases the chance that<br />
the searcher is searching to buy a CD rather than learn something about the<br />
singer. Because this “ready-to-buy” traffic is the most valuable, there is a<br />
possibility of two separate search markets, with most high-value traffic<br />
bypassing general-purpose search engines for product search sites like eBay and<br />
Amazon.com, and with Google and other general-purpose search engines<br />
serving primarily non-targeted, lower-value traffic. The implication is that, while<br />
even relatively small-scale competition may present a potentially significant<br />
threat to Google’s search business, this threat does not depend on links to these<br />
sites from Google’s search results. And thus these competitors have a strong,<br />
36 KEN AULETTA, GOOGLED: THE END OF THE WORLD AS WE KNOW IT 172–73 (2009).
37 For example, in the thirty days ending on February 23, 2010, less than ten percent of visits to eBay.com originated from a search engine. See ALEXA, eBay.com Site Info, http://www.alexa.com/siteinfo/ebay.com.
38 See Donna L. Hoffman & Thomas P. Novak, How to Acquire Customers on the Web, HARV. BUS. REV., May–June 2000, at 3, 5, 7. (CDnow was acquired by Amazon.com in 2001.)
428 CHAPTER 7: IS SEARCH NOW AN “ESSENTIAL FACILITY?”
independent incentive to develop marketing programs outside of Google’s search pages—and there is good reason not to deputize Google in the process.
Is Google an Essential Facility?
Recall that the basic claim is that Google’s competitors are foreclosed from access to Google’s desirable (essential) marketing platform and thereby suffer significant harm. Of course, from the outset, this has it backwards (and this is a core problem with the essential facilities doctrine as a whole).

If there is a problem, it should be the problem of limited access by Google’s users to Google’s competitors. Sometimes the absence of access by competitors to consumers is the same thing as the absence of access by consumers to competitors, but it depends on how well the market has been defined. In the most fundamental sense Google has precisely zero control over access by consumers (meaning users who use Google to search the Internet) to competitors: Anyone with access to a browser can access any site on the Internet simply by typing its URL into the browser. Perhaps understanding this, proponents of the “Internet search is an essential facility” claim argue that mere access is insufficient, and that consumers are essentially ignorant of the valuable content on the web except as mediated by search engines, so that access remains subject to the search engine’s editorial control. To the typical Google user, according to this view, Google’s competitors are effectively non-existent unless they appear in the top few search results.
Now we are dangerously close to the sort of arbitrary market definition exercise, devoid of the discipline imposed by economics, that identifies an anticompetitive problem by narrowing the market until every company is a monopolist over some small group of consumers. Indeed, one can always define a market by focusing on idiosyncratic preferences or product variations. Justice Fortas decried this type of analysis in his dissent in Grinnell (regarding home security systems), and it merits quoting at length:
The trial court’s definition of the “product” market even more dramatically demonstrates that its action has been Procrustean—that it has tailored the market to the dimensions of the defendants. It recognizes that a person seeking protective services has many alternative sources. It lists “watchmen, watchdogs, automatic proprietary systems confined to one site, (often, but not always,) alarm systems connected with some local police or fire station, often unaccredited CSPS [central station protective services], and often accredited CSPS.” The court finds that even in the same city a single customer seeking protection for several premises may “exercise its option” differently for different locations. It may choose accredited CSPS for one of its locations and a different type of service for another.
But the court isolates from all of these alternatives only those services in which defendants engage. It eliminates all of the alternative sources despite its conscientious enumeration of them. Its definition of the “relevant market” is not merely confined to “central station” protective services, but to those central station protective services which are “accredited” by insurance companies.

There is no pretense that these furnish peculiar services for which there is no alternative in the market place, on either a price or a functional basis. The court relies solely upon its finding that the services offered by accredited central stations are of better quality, and upon its conclusion that the insurance companies tend to give “noticeably larger” discounts to policyholders who use accredited central station protective services. This Court now approves this strange red-haired, bearded, one-eyed man-with-a-limp classification. 39
In Internet search as well, complainants imply a market based on the fact that Google offers “better quality” access to a larger set of Internet users than the myriad existing alternatives. But claiming essentiality based on a competitor’s relatively high quality is deeply problematic.

This point is of great importance in assessing the economics of the essential facilities doctrine generally and its application to Internet search in particular. It is clear, even under a fairly expansive reading of the essential facilities doctrine, that even a monopolist has no duty to subsidize the efforts of a less-effective rival. 40 Arguably the Aspen Skiing case should have been tossed out on this basis. As a practical matter, the Aspen Ski Company, by entering into a joint marketing agreement with its smaller rival, Aspen Highlands, allowed Highlands to take advantage of its markedly larger productivity (both in developing ski terrain and amenities, as well as in marketing Aspen as a ski destination). Its subsequent decision to drop Highlands from its marketing program for failing to offer sufficient return on its investment should have been unobjectionable. 41
Similarly, the explicit claim in cases brought against Google by its allegedly-foreclosed rivals is that these (relatively minuscule) sites should have access to Google’s effective and inexpensive marketing tool. But it is by no means clear that Google does or should have this duty to promote its rivals (without compensation to Google, as it happens). This is particularly true when, as discussed above, other modes of access exist for competitors’ activities, even if
39 U.S. v. Grinnell Corp., 384 U.S. 563, 590–91 (1966) (emphasis added).
40 See Olympia Equip. Leasing Co. v. Western Union Tel. Co., 797 F.2d 370 (7th Cir. 1986).
41 See KEITH N. HYLTON, ANTITRUST LAW: ECONOMIC THEORY AND COMMON LAW EVOLUTION 205–06 (2003).
these modes of access are of lower quality or higher cost. Particularly where, as here, the alleged bottleneck arises not out of a combination with another firm or firms but out of unilateral conduct (success in the marketplace), the claim that a superior access point among many (inferior) access points should be pried open for the benefit of its competitors is specious.
It is worth noting that an alleged Google competitor, SourceTool, has made a version of this argument in the TradeComet complaint, 42 alleging that Google once engaged in profitable commerce with SourceTool (by selling SourceTool ads next to Google search results) and then penalized SourceTool to its (Google’s) economic detriment. 43 The shape of this argument is a transparent effort to remain under what is left of the essential facilities doctrine following Trinko. But notice that even if it is true that Google intentionally ended a profitable arrangement with SourceTool (which is by no means clear), the claim still doesn’t pass muster. It is almost impossible that Google could be receiving less revenue from whatever site has replaced SourceTool in the paid search result spots SourceTool once paid for. As a result, even if Google were forgoing a previously-profitable relationship with SourceTool, it is not, in fact, suffering any economic harm, because another advertiser has stepped into SourceTool’s shoes.
Of course the argument that Google’s competitors are effectively absent without (guaranteed?) access to Google’s top few search results proves too much. There is a scarcity of “top few search results,” and any effective search engine must have the ability to ensure that those results are the most relevant possible, as well as that they do not violate various quality, safety, moral or other standards that the search engine chooses to promote. “Forcing [owners of essential facilities] to share access may not enhance consumer welfare.” 44 Pure “neutrality” is neither possible nor desirable, and the exclusion of certain websites from these coveted positions should be deemed utterly unpersuasive in making out even a prima facie monopolization case against a search engine.
And it is not even the case that SourceTool, Foundem, 45 and other competing websites are absent from Google; it is, however, sometimes the case that these
42 Complaint, TradeComet.Com LLC v. Google, Inc., 693 F. Supp. 2d 370 (S.D.N.Y. 2009) (No. 09 Civ. 1400 (SHS)).
43 Id. ¶ 8.
44 KEITH N. HYLTON, ANTITRUST LAW: ECONOMIC THEORY AND COMMON LAW EVOLUTION 208 (2003).
45 Foundem is a “vertical search and price comparison” site in the UK. See www.foundem.co.uk. The company is at the heart of the “search neutrality” debate in Internet search. It has created a website to advocate its views on the neutrality issue at www.searchneutrality.org, and its claims are at the heart of the European Commission’s investigation of Google. See Foundem’s discussion of the EU action and its relationship to Foundem’s claims in European Commission Launches Antitrust Investigation of Google, SEARCHNEUTRALITY.ORG, Nov. 30, 2010, http://www.searchneutrality.org.
sites do not show up in the top few organic search results (and, often at the same time, Google’s own competing product search results do). But if access to the top few search results is required to ensure the requisite access sought by Google’s competitors, the relevant market has been narrowed considerably, creating a standard that can’t possibly be met, no matter how “neutral” a search engine’s results.
Meanwhile, if Foundem were to disappear from the face of the Earth, who, other than its investors and employees (and perhaps their landlords), would be harmed? The implicit claim (if an antitrust case is to be made) is that websites like Foundem impose a constraint on Google’s ability to extract monopoly rents (presumably from advertisers). But this is a curious claim to make while simultaneously arguing that Google itself is made “better” by the inclusion of Foundem in its search results (as in, searchers are indeed looking for Foundem in searches from which the site may be excluded, so its inclusion presumably increases Google’s attractiveness to its users and thus its advertisers), while also claiming that Foundem would cease to exist without access to the top few Google search results.
Google does not sell retail goods, and does not profit directly from its own product search offerings (which compete with Foundem), instead receiving benefit by increasing its customer base and the efficacy (presumably) of paid advertisements on its search pages that include a link to its own price comparison results. It is a remarkably tenuous claim that Google profits more by degrading its search results than by improving them. If the contrary claim is really true—if, that is, Google harms itself or its advertisers by intentionally penalizing competing sites like Foundem—then that argument and any evidence for it is absent from the current debate. And, of course, if Google is, as it claims, actually improving its product by applying qualitative decisions to demote sites like Foundem and others that, Google claims, merely re-publish information from elsewhere on the web with precious little original content, then Google’s efforts should be seen as a feature and not a bug.
Moreover, the extension of the essential facilities logic to competition between Google and competitors like Foundem, MapQuest or Kayak is extremely problematic. To the extent that Google and Foundem, for example, are competitors, they are competitors not in the advertising space but rather in the “information dissemination and retail distribution channel” space. I’m not sure what else to call it. Foundem earns revenue by directing customers to retail sites to purchase goods. In this sense, Foundem acts like a shopping mall. Google does the same, only instead of receiving a cut from the sale, as Foundem does, Google sells advertisements. Thus, when Foundem complains about access to Google’s site, it is a competing channel of distribution, complaining that it needs access to its competitor’s distribution channel in order to compete.
It’s a weird sort of complaint. It isn’t the same as the classic essential facilities sort of complaint where, to simplify, the owner of a vertically-integrated railroad
and rail transport company prevents access by other transport companies to its railroad line. Instead this would be like railroad company A arguing that railroad B must give A access to B’s tracks so A can sell access to those tracks to other rail transport companies.

But even this doesn’t completely capture the audacity of the complaint, because for the analogy to hold, railroad A would actually be asking the court to force railroad B to put up a sign at the head of its tracks, allowing railroad A to offer trains already on B’s railroad the opportunity to jump off B’s line and start over again on A’s line, which follows another route—but without knowing for sure whether the route is better or worse until they jump onto A’s tracks. Something like that. Again, it’s weird.
And note, of course, the problem that “at the head of the tracks” (as in something like “the first, second or third organic result”) is a problematic requirement, as only three sites at any given time can occupy those spots—but there may be many more than three firms complaining of Google’s conduct and/or affected by the vagaries of its product design decisions. Or, to keep with the shopping mall analogy, it’s as if the owners of any number of small, new shopping malls could require the owner of a large, established shopping mall to let each of them set up a bus line to ferry shoppers to the new malls as those shoppers enter the established mall. Even where the established mall has a geographic, reputational and resource advantage, no one would argue that this access was essential to efficient commerce, and the cost to the successful incumbent would be manifestly too high.
As discussed above, sites like Foundem do indeed have access to Google’s end users via any number of keywords on Google’s site. Type “UK price comparison site” into Google and a number of Google competitors come up, including Foundem (and Google’s own price comparison site is seemingly absent). The claim thus becomes one that is either inappropriately aggregated (“for all search terms on average that may direct users to Foundem, Foundem is effectively denied access to the top search results”) or else overly narrow (“we prefer customers to find us by typing ‘Nikon camera’ into Google, not by typing ‘price comparison Nikon camera’ into Google”). In any case, access is in fact available for these competitors, and “the indispensable requirement for invoking the [essential facilities] doctrine is the unavailability of access to the ‘essential facilities’; where access exists, the doctrine serves no purpose.” 46
Meanwhile, it is difficult to see how relevance (and thus efficiency) could be well-served by a neutrality principle that required a tool that reduces search costs to inherently increase those costs by directing searchers to a duplicate search on another site. If one is searching for a specific product and hoping to find price comparisons on Google, why on earth would that person be hoping to find not Google’s own efforts at price comparison, built right into its search engine, but
46 Trinko, 540 U.S. at 411.
instead a link to another site that requires several more steps before finding the information?
Seen this way, Google’s decision to promote its own price comparison results is a simple product pricing and design decision, protected by good sense and the Trinko decision (at least in the U.S.). Unlike the majority of its vertical search competitors, and by design, Google makes no direct revenue from users clicking through to purchase anything from its shopping search results, and this allows it to offer a different (and, to many consumers, a significantly better) set of results. The page has paid search results only in small boxes at the top and bottom, the information is all algorithmically generated, and retailers do not pay to have their information on the page. For this product design—by definition of great value to users (in effect lowering the price to them of their product search)—to merit Google’s investment, it is necessary that its own, more-relevant and less-expensive results receive priority. If this is generating something of value for Google it is doing so only in the most salutary fashion: by offering additional resources for users to improve their “search experience” and thus induce them to use Google’s search engine. To require “neutrality” in this setting is to impair the site’s ability to design and price its own product. Even the Aspen Skiing decision didn’t go that far, requiring access to a joint marketing arrangement but not obligating Aspen Ski Company to alter its prices for skiers seeking to access only its own slopes.
And the same analysis holds for assessments of Google’s other offerings (maps and videos, for example) that compete with other sites. Look for the nearest McDonald’s in Google and a Google Map is bound to top the list (but not be the exclusive result, of course). But why should it be any other way? In effect, what Google does is give you the Web’s content in as accessible and appropriate a form as it can—design decisions that, Google must believe, increase quality and reduce effective price for its users. By offering not only a link to McDonald’s website, as well as various other links, but also a map showing the locations of the nearest restaurants, Google is offering up results in different forms, hoping that one is what the user is looking for. There is no economic justification for requiring a search engine in this setting to offer another site’s content rather than its own simply because there happen to be other sites that do, indeed, offer such content (and would like cheaper access to consumers).
Conclusion
Search neutrality and forced access to Google’s results pages are based on the proposition that—Google’s users’ interests be damned—if Google is the easiest way competitors can get to potential users, Google must provide that access. The essential facilities doctrine, dealt a near-death blow by the Supreme Court in Trinko, has long been on the ropes. It should remain moribund here. On the one hand, Google does not preclude, nor does it have the power to preclude, users from accessing competitors’ sites; all users need do is type “foundem.com” into their web browser—which works even if it’s Google’s
own Chrome browser! To the extent that Google can and does limit competitors’ access to its search results page, it is not controlling access to an “essential facility” in any sense other than that in which Wal-Mart controls access to its own stores. “Google search results generated by its proprietary algorithm and found on its own web pages” do not constitute a market to which access should be forcibly granted by the courts or legislature.
The set of claims that is adduced under the rubric of “search neutrality” or the “essential facilities doctrine” against Internet search engines in general and, as a practical matter, Google in particular, is deeply problematic. These claims risk encouraging courts and other decision makers to find antitrust violations where none actually exist, threatening to chill innovation and efficiency-enhancing conduct. In part for this reason, the essential facilities doctrine has been relegated by most antitrust experts to the dustbin of history. As Joshua Wright and I conclude elsewhere:

Indeed, it is our view that in light of the antitrust claims arising out of innovative contractual and pricing conduct, and the apparent lack of any concrete evidence of anticompetitive effects or harm to competition, an enforcement action against Google on these grounds creates substantial risk for a “false positive” which would chill innovation and competition currently providing immense benefits to consumers. 47
47 Geoffrey A. Manne & Joshua D. Wright, Google and the Limits of Antitrust: The Case Against the Case Against Google, 34 HARV. J. L. & PUB. POL’Y at 62.
Some Skepticism About Search Neutrality

By James Grimmelmann *
The perfect search engine would be like the mind of God. 1

The God that holds you over the pit of hell, much as one holds a spider, or some loathsome insect, over the fire, abhors you, and is dreadfully provoked; his wrath towards you burns like fire; he looks upon you as worthy of nothing else, but to be cast into the fire … 2

If God did not exist, it would be necessary to invent him. 3
Search engines are attention lenses; they bring the online world into focus. They can redirect, reveal, magnify, and distort. They have immense power to help and to hide. We use them, to some extent, always at our own peril. And out of the many ways that search engines can cause harm, the thorniest problems of all stem from their ranking decisions. 4

What makes ranking so problematic? Consider an example. The U.K. technology company Foundem offers “vertical search” 5—it helps users compare prices for electronics, books, and other goods. That makes it a Google competitor. 6 But in June 2006, Google applied a “penalty” to Foundem’s
* Associate Professor of Law, New York Law School. I would like to thank Aislinn Black and Frank Pasquale for their comments. This essay is available for reuse under the Creative Commons Attribution 3.0 United States license, http://creativecommons.org/licenses/by/3.0/us/.
1 Charles Ferguson, What’s Next for Google, TECH. REV., Jan. 1, 2005, at 38, available at http://www.technologyreview.com/web/14065/ (quoting Sergey Brin, co-founder of Google).
2 Jonathan Edwards, Sinners in the Hands of an Angry God (sermon delivered July 8, 1741 in Enfield, Connecticut), available in 22 WORKS OF JONATHAN EDWARDS 411 (Harry S. Stout & Nathan O. Hatch eds., Yale University Press 2003).
3 Voltaire, Epître à l’auteur du livre des Trois imposteurs [Letter to the Author of The Three Impostors] (1768), available at http://www.whitman.edu/VSA/trois.imposteurs.html.
4 See James Grimmelmann, The Structure of Search Engine Law, 93 IOWA L. REV. 1, 17–44 (2007) (identifying nine distinct types of harm search engines can cause to users, information providers, and third parties).
5 See generally JOHN BATTELLE, THE SEARCH: HOW GOOGLE AND ITS RIVALS REWROTE THE RULES OF BUSINESS AND TRANSFORMED OUR CULTURE 274–76 (2005) (discussing “domain-specific search”).
6 See Google Product Search Beta, GOOGLE, http://www.google.com/prdhp.
website, causing all of its pages to drop dramatically in Google’s rankings. 7 It took more than three years for Google to remove the penalty and restore Foundem to the first few pages of results for searches like “compare prices shoei xr-1000.” 8 Foundem’s traffic, and hence its business, dropped off dramatically as a result. The experience led Foundem’s co-founder, Adam Raff, to become an outspoken advocate: creating the site searchneutrality.org, 9 filing comments with the Federal Communications Commission (FCC), 10 and taking his story to the op-ed pages of The New York Times, 11 calling for legal protection for the Foundems of the world.
Of course, the government doesn’t get involved every time a business is harmed by a bad ranking—or Consumer Reports would be out of business. 12 Instead, search-engine critics base their case for regulation on the immense power of search engines, which can “break the business of a Web site that is pushed down the rankings.” 13 They have the power to shape what millions of users, carrying out billions of searches a day, see. 14 At that scale, search engines are the new mass media 15—or perhaps the new meta media—capable of shaping public discourse itself. And while power itself may not be an evil, abuse of power is.

Search-engine critics thus aim to keep search engines—although in the U.S. and much of the English-speaking world, it might be more accurate to say simply “Google” 16—from abusing their dominant position. The hard part comes in defining “abuse.” After a decade of various attempts, critics have hit on the
7 See Foundem’s Google Story, SEARCHNEUTRALITY.ORG (Aug. 18, 2009), http://www.searchneutrality.org/foundem-google-story.
8 The Shoei XR-1000 is a motorcycle helmet—according to Foundem, it’s £149.99 plus £10 delivery from Helmet City.
9 About, SEARCHNEUTRALITY.ORG, Oct. 9, 2009, http://www.searchneutrality.org/about.
10 Reply Comments of Foundem, In the Matter of Preserving the Open Internet Broadband Industry Practices, GN Docket No. 09-191 (F.C.C.).
11 Adam Raff, Search, But You May Not Find, N.Y. TIMES, Dec. 27, 2009, at A27.
12 Cf. Bose Corp. v. Consumers Union, 466 U.S. 485 (1984) (holding Consumer Reports not subject to product disparagement liability for negative review of Bose speaker).
13 The Google Algorithm, N.Y. TIMES, July 14, 2010, at A30.
14 See GRANT ESKELSEN ET AL., THE DIGITAL ECONOMY FACT BOOK 12–13 (10th ed. 2009), http://pff.org/issues-pubs/books/factbook_10th_Ed.pdf.
15 See generally KEN AULETTA, GOOGLED: THE END OF THE WORLD AS WE KNOW IT (2009) (trying to understand Google by adopting the perspective of the media industry). Cf. Aaron Swartz, Googling for Sociopaths, RAW THOUGHT (Dec. 14, 2009), http://www.aaronsw.com/weblog/googled (describing Googled as “a history of [Google] as told by the incumbent sociopaths”).
16 See ESKELSEN ET AL., FACT BOOK, supra note 14.
idea of “neutrality” as a governing principle. The idea is explicitly modeled on network neutrality, which would “forbid operators of broadband networks to discriminate against third-party applications, content or portals.” 17 Like broadband Internet service providers (ISPs), search engines “accumulate great power over the structure of online life.” 18 Thus, perhaps search engines should similarly be required not to discriminate among websites.

For some academics, this idea is a thought experiment: a way to explore the implications of network neutrality ideas. 19 For others, it is a real proposal: a preliminary agenda for action. 20 Lawyers for ISPs fighting back against network neutrality have seized on it, either as a reductio ad absurdum or a way to kneecap their bitter rival Google. 21 Even the New York Times has gotten into the game, running an editorial calling for scrutiny of Google’s “editorial policy.” 22 Since New York Times editorials, as a rule, reflect no independent thought but only a kind of prevailing conventional wisdom, it is clear that search neutrality has truly arrived on the policy scene.

Notwithstanding its sudden popularity, the case for search neutrality is a muddle. There is a fundamental misfit between its avowed policy goal of protecting users and most of the tests it proposes to protect them. Scratch beneath the surface of search neutrality and you will find that it would protect
17 Barbara van Schewick, Towards an Economic Framework for Network Neutrality Regulation, 5 J. TELECOMM. & HIGH TECH. L. 329, 333 (2007), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=812991.
18 Frank Pasquale, Internet Nondiscrimination Principles: Commercial Ethics for Carriers and Search Engines, 2008 U. CHI. LEGAL FORUM 263, 298, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1134159 [hereinafter Pasquale, Internet Nondiscrimination Principles].
19 See Mark R. Patterson, Non-Network Barriers to Network Neutrality, 78 FORDHAM L. REV. 2843 (2010); Andrew Odlyzko, Network Neutrality, Search Neutrality, and the Never-ending Conflict Between Efficiency and Fairness in Markets, 8 REV. NETWORK ECON. 40 (2009).
20 See DAWN C. NUNZIATO, VIRTUAL FREEDOM: NET NEUTRALITY AND FREE SPEECH IN THE INTERNET AGE (2009) [hereinafter NUNZIATO, VIRTUAL FREEDOM]; Pasquale, Internet Nondiscrimination Principles, supra note 18; Oren Bracha & Frank Pasquale, Federal Search Commission: Access, Fairness, and Accountability in the Law of Speech, 93 CORNELL L. REV. 1149 (2008) [hereinafter Bracha & Pasquale, Federal Search Commission]; Jennifer A. Chandler, A Right to Reach an Audience: An Approach to Intermediary Bias on the Internet, 35 HOFSTRA L. REV. 1095 (2007).
21 Letter from Robert W. Quinn, Jr., Senior Vice President, AT&T, to Sharon Gillett, Chief, Wireline Competition Bureau, Federal Communications Commission (Sept. 25, 2009), available at http://graphics8.nytimes.com/packages/pdf/technology/20090925_ATT-Letter.pdf.
22 The Google Algorithm, supra note 13. But see Danny Sullivan, The New York Times Algorithm & Why It Needs Government Regulation, SEARCH ENGINE LAND (July 15, 2010) (parodying New York Times editorial on Google).
438 CHAPTER 7: IS SEARCH NOW AN “ESSENTIAL FACILITY?”
not search users, but websites. In the search space, however, websites are as
often users’ enemies as not; the whole point of search is to help users avoid the
sites they don’t want to see.
In short, search neutrality’s ends and means don’t match. To explain why, I will
deconstruct eight proposed search-neutrality principles:
1. Equality: Search engines shouldn’t differentiate at all among websites.
2. Objectivity: There are correct search results and incorrect ones, so search engines should return only the correct ones.
3. Bias: Search engines should not distort the information landscape.
4. Traffic: Websites that depend on a flow of visitors shouldn’t be cut off by search engines.
5. Relevance: Search engines should maximize users’ satisfaction with search results.
6. Self-interest: Search engines shouldn’t trade on their own account.
7. Transparency: Search engines should disclose the algorithms they use to rank web pages.
8. Manipulation: Search engines should rank sites only according to general rules, rather than promoting and demoting sites on an individual basis.
As we shall see, all eight of these principles are unusable as bases for sound search regulation.
I would like to be clear up front about the limits of my argument. Just because
search neutrality is incoherent, it doesn’t follow that search engines deserve a
free pass under antitrust, intellectual property, privacy, or other well-established
bodies of law. 23 Nor is search-specific legal oversight out of the question.
Search engines are capable of doing dastardly things: According to
BusinessWeek, the Chinese search engine Baidu explicitly shakes down
websites, demoting them in its rankings unless they buy ads. 24 It’s easy to tell
horror stories about what search engines might do that are just plausible enough
to be genuinely scary. 25 My argument is just that search neutrality, as currently
proposed, is unlikely to be workable and quite likely to make things worse. It
fails at its own goals, on its own definition of the problem.
23 This essay is not the place for a full discussion of these issues (although we will meet antitrust and consumer protection law in passing). Grimmelmann, The Structure of Search Engine Law, supra note 4, provides a more detailed map.
24 Chi-Chu Tschang, The Squeeze at China’s Baidu, BUSINESSWEEK, Dec. 31, 2008, http://www.businessweek.com/magazine/content/09_02/b4115021710265.htm (alleging that Baidu directly retaliates against sites that refuse to buy sponsored links by demoting them in its organic rankings).
25 See, e.g., Cory Doctorow, Scroogled, http://craphound.com/scroogled.html; Tom Slee, Mr. Google’s Guidebook, WHIMSLEY (Mar. 7, 2008), http://whimsley.typepad.com/whimsley/2008/03/mr-googles-guid.html.
Theory
Before delving into the specifics of search-neutrality proposals, it will help to
understand the principles said to justify them. There are two broad types of
arguments made to support search neutrality, one each focusing on users and
on websites. A search engine that misuses its ranking power might be seen
either as misleading users about what’s available online, or as blocking websites
from reaching users. 26 Consider the arguments in turn.
Users: Search helps people find the things they want and need. Good search
results are better for them. And since search is both subjective and personal,
users themselves are the ones who should define what makes search results
good. The usual term for this goal is “relevance”: relevant results are the ones
that users themselves are most satisfied with. 27 All else being equal, good search
policy should try to maximize relevance.
A libertarian might say that this goal is trivial. 28 Users are free to pick and
choose among search engines and other informational tools. 29 They will
naturally flock to the search engine that offers them the most relevant results;
the market will provide just as much relevance as it is efficient to provide. 30
There is no need for regulation; relevance, being demanded by users, will be
26 Other arguments for search neutrality reduce to these two. Bracha and Pasquale, for example, are concerned about democracy. They want “an open and relatively equal chance to all members of society for participation in the cultural sphere.” Bracha & Pasquale, Federal Search Commission, supra note 20, at 1183–84. Search engines provide that chance if individuals can both find (as users) and be found (as websites) when they participate in politics and culture. Similarly, Bracha and Pasquale’s economic efficiency argument turns on users’ ability to find market information, id. at 1173–75, and their fairness concern speaks to websites’ losses of “audience or business,” id. at 1175–76. Whatever interest society has in search neutrality arises from users’ and websites’ interests in it—so we are justified in focusing our attention on users and websites.
27 See BATTELLE, THE SEARCH, supra note 5, at 19–25.
28 For a clear statement of a libertarian perspective on search neutrality, see Mike Masnick’s posts at Techdirt on the subject, collected at http://www.techdirt.com/blog.php?tag=search+neutrality. Eric Goldman’s Search Engine Bias and the Demise of Search Engine Utopianism, 8 YALE J. L. & TECH. 188 (2006), makes a general case against the regulation of relevance on similar grounds.
29 In Google’s words, “Competition is just one click away.” Adam Kovacevich, Google’s Approach to Competition, GOOGLE POLICY BLOG (May 8, 2009), http://googlepublicpolicy.blogspot.com/2009/05/googles-approach-to-competition.html.
30 See Eric Goldman, A Coasean Analysis of Marketing, 2006 WIS. L. REV. 1151.
supplied by search engines. And this is exactly what search engines themselves
say: relevance is their principal, or only, goal. 31
The response to this point of view—most carefully argued by Frank
Pasquale 32—is best described as “liberal.” It focuses on maximizing the
effective autonomy of search users, but questions whether market forces
actually enable users to demand optimal relevance. For one thing, it questions
whether users can actually detect deviations from relevance. 33 The user who
turns to a search engine, by definition, doesn’t yet know what she’s looking for
or where it is. Her own knowledge, therefore, doesn’t provide a fully reliable
check on what the search engine shows her. The information she would need
to know that the search engine is hiding something from her may be precisely
the information it’s hiding from her—a relevant site that she didn’t know
existed. 34
Perhaps just as importantly, structural features of the search market can make it
hard for users to discipline search engines by switching. Search-neutrality
advocates have argued that search exhibits substantial barriers to entry. 35 The
web is so big, and search algorithms so complex and refined, that there are
substantial fixed costs to competing at all. 36 Moreover, the rise of personalized
search both creates switching costs for individual users 37 and also makes it
harder for them to share information about their experiences with multiple
search engines. 38
Websites: The case for protecting websites reaches back into free speech theory.
Jerome Barron’s 1967 article, Access to the Press—A New First Amendment Right, 39
31 See, e.g., Technology Overview, GOOGLE, www.google.com/corporate/tech.html; How Web Documents Are Ranked, YAHOO!, http://help.yahoo.com/l/us/yahoo/search/indexing/ranking-01.html; Ask Search Technology, ASK, http://sp.ask.com/en/docs/about/ask_technology.shtml.
32 See Pasquale, Internet Nondiscrimination Principles, supra note 18; Bracha & Pasquale, Federal Search Commission, supra note 20; Frank Pasquale, Asterisk Revisited: Debating a Right of Reply on Search Results, 3 J. BUS. & TECH. L. 61 (2008); Frank Pasquale, Rankings, Reductionism, and Responsibility, 54 CLEV. ST. L. REV. 115 (2006).
33 See Chandler, Right to Reach an Audience, supra note 20, at 1116; Patterson, Non-Network Barriers, supra note 19, at 2860–62.
34 See Bracha & Pasquale, Federal Search Commission, supra note 20, at 1183–84.
35 See id. at 1181–82.
36 See id. at 1181.
37 See Pasquale, Internet Nondiscrimination Principles, supra note 18, at 265.
38 See Frank Pasquale, Could Personalized Search Ruin Your Life?, CONCURRING OPINIONS (Feb. 7, 2008), http://www.concurringopinions.com/archives/2008/02/personalized_se.html.
39 80 HARV. L. REV. 1641 (1967).
argued that freedom of speech is an empty right in a mass-media society unless
one also has access to the mass media themselves. He thus argued that
newspapers should be required to open their letters to the editor and their
advertising to all points of view. 40 Although his proposed right of access is
basically a dead letter as far as First Amendment doctrine goes, 41 it captured the
imaginations of media-law scholars and media advocates. 42
Scholars have begun to adapt Barron’s ideas to online intermediaries, including
search engines. Dawn Nunziato’s book Virtual Freedom draws extensively on
Barron to argue that Congress may need to “authorize the regulation of
dominant search engines to require that they provide meaningful access to
content.” 43 Jennifer Chandler applies Barron’s ideas to propose a “right to
reach an audience” 44 that would give website owners various protections against
exclusion 45 and demotion by search engines. 46 Similarly, Frank Pasquale
suggests bringing “universal service” over into the search space, 47 perhaps
through a government-provided search engine. 48
The Barronian argument for access, however, needs to be qualified. The
free-speech interest in access to search engine ranking placement is really
audiences’ free-speech interest; the real harm is that search users have been
deprived of access to the speech of websites, not that websites have been
deprived of access to users. Put another way, websites’ access interest is
derivative of users’ interests. In the Supreme Court’s words, “The First
Amendment protects the right of every citizen to ‘reach the minds of willing
listeners.’” 49 Or, in Jerome
40 Id. at 1667.
41 See Miami Herald Pub’g Co. v. Tornillo, 418 U.S. 241 (1974) (striking down Florida law requiring newspapers to provide equal space for political responses).
42 See, e.g., Reclaiming the First Amendment: Constitutional Theories of Media Reform, 35 HOFSTRA L. REV. 917–1582 (symposium issue collecting papers from conference honoring the 40th anniversary of publication of Access to the Press).
43 NUNZIATO, VIRTUAL FREEDOM, supra note 20, at 150.
44 Chandler, Right to Reach an Audience, supra note 20, at 1103–17 (search engines), 1124–30 (proposed right).
45 Exclusion from a search index may sound like a bright-line category of abuse, but note that a demotion from, say, #1 to #58,610 will have the same effect. No one ever clicks through 5861 pages of results. Thus, in practice, any rule against exclusion would also need to come with a—more problematic—rule against substantial demotions.
46 Id. at 1117–18.
47 Pasquale, Internet Nondiscrimination Principles, supra note 18, at 289–92. His example, which focuses on Google’s scans of books for its Book Search project, is interesting, but is “universal access” only in a loose, metaphorical sense.
48 See Frank Pasquale, Dominant Search Engines: An Essential Cultural and Political Facility, infra 258.
49 Heffron v. Int’l Soc. for Krishna Consciousness, Inc., 452 U.S. 640, 655 (1981) (quoting Kovacs v. Cooper, 336 U.S. 77, 87 (1949)) (emphasis added).
Barron’s, “[T]he point of ultimate interest is not the words of the speakers but
the minds of the hearers.” 50 With these purposes in mind, let us turn to actual
search-neutrality proposals.
Equality
Scott Cleland observes that Google’s “algorithm reportedly has over 1,000
variables/discrimination biases which decide which content gets surfaced.” 51
He concludes that “Google is not neutral” and thus should be subject to any
FCC network-neutrality regulation. 52 On this view, a search engine does
something wrong if it treats websites differently, “surfac[ing]” some, rather than
others. This is a theory of neutrality as equality; it comes from the
network-neutrality debates, and it is nonsensical as applied to search.
Equality has a long pedigree in telecommunications. For years, common-carrier
regulations required the AT&T system to offer its services on equal terms to
anyone who wanted a phone. 53 This kind of equality is at the heart of proposed
network neutrality regulations: treating all packets identically once they arrive at
an ISP’s router, regardless of source or contents. 54 Whether or not equality in
packet routing is a good idea as a technical matter, the rule itself is simple
enough and relatively clear. One can, without difficulty, identify Comcast’s
forging of packets to terminate BitTorrent connections as a violation of the
principle. 55 As long as an ISP isn’t overloaded to the point of losing too many
packets, equality does what it’s supposed to: ensures that every website enjoys
access to the ISP’s network and customers.
Try to apply this form of equality to search and the results are absurd. Of
course Google differentiates among sites—that’s why we use it. Systematically
favoring certain types of content over others isn’t a defect for a search engine—
it’s the point. 56 If I search for “Machu Picchu pictures,” I want to see llamas in a
50 Barron, Access to the Press, supra note 39, at 1653.
51 Scott Cleland, Why Google Is Not Neutral, PRECURSOR BLOG (Nov. 4, 2009), http://precursorblog.com/content/why-google-is-not-neutral.
52 Id.
53 See generally JONATHAN E. NUECHTERLEIN & PHILIP J. WEISER, DIGITAL CROSSROADS 45–68 (2005).
54 For an accessible introduction to the technical issues, see Edward W. Felten, The Nuts and Bolts of Network Neutrality (2006), http://itpolicy.princeton.edu/pub/neutrality.pdf.
55 See In re Formal Compl. of Free Press & Public Knowledge Against Comcast Corp. for Secretly Degrading Peer-to-Peer Applications, WC Docket No. 07-52, Order, 23 F.C.C. Rcd. 13,028, 13,029–32 (discussing blocking), 13,050–58 (finding that blocking violated federal policy) (2008), vacated, Comcast v. FCC, 600 F.3d 642 (D.C. Cir. 2010).
56 See Karl Bode, Google Might Stop Violating “Search Neutrality” If Anybody Knew What That Actually Meant, TECHDIRT (May 7, 2010),
ruined city on a cloud-forest mountaintop, not horny housewives who whiten
your teeth while you wait for them to refinance your mortgage. Search
inevitably requires some form of editorial control. 57 A search engine cannot
possibly treat all websites equally, not without turning into the phone book. But
for that matter, even the phone book is not neutral in the sense of giving fully
equal access to all comers, as the proliferation of AAA Locksmiths and Aabco
Plumbers attests. Differentiating among websites, without something more, is
not wrongful.
Objectivity
If search engines must make distinctions, perhaps we should insist that they
make correct distinctions. Foundem, for example, argues that the Google
penalty was unfair by pointing to positive write-ups of Foundem from “the
UK’s leading technology television programme” and “the UK’s leading
consumer body,” and to its high search ranks on Yahoo! and Bing. 58 The
unvoiced assumption here is that search queries can have objectively right and
wrong answers. A search on “James Grimmelmann blog” should come back
with my weblog at http://laboratorium.net; anything else is a wrong answer.
But this view of what search is and does is wrong. A search for “apple” could
be looking for information about Fuji apples, Apple computers, or Fiona Apple.
“bbs” could refer to airgun pellets, bulletin-board systems, or bed-and-breakfasts.
Different people will have different intentions in mind; even the
same person will have different intentions at different times. Sergey Brin’s
theological comparison of perfect search to the “mind of God” 59 shows us why
perfect search is impossible. Not even Google is—or ever could be—
omniscient. The search query itself is necessarily an incomplete basis on which
to guess at possible results. 60
The objective view of search, then, fails for two related reasons. First, search
users are profoundly diverse. They have highly personal, highly contextual
goals. One size cannot fit all. And second, a search engine’s job always
involves guesswork. 61 Some guesses are better than others, but the search
http://www.techdirt.com/articles/20100504/1324279300.shtml (“[T]he entire purpose of search is to discriminate and point the user toward more pertinent results.”).
57 See Goldman, Search Engine Bias, supra note 28, at 115–18.
58 Foundem’s Google Story, supra note 7.
59 Supra note 1.
60 See generally ALEX HALAVAIS, THE SEARCH-ENGINE SOCIETY 32–55 (2009) (discussing difficulties of ascertaining meaning in search process).
61 See Eric Goldman, Deregulating Relevancy in Internet Trademark Law, 54 EMORY L.J. 507, 521–28 (2005).
engine will always have to guess. “James Grimmelmann blog” shouldn’t take
users to Toyota’s corporate page—but perhaps they were interested in my
guest-blogging at Concurring Opinions, or in blogs about me, or they have me
mixed up with Eric Goldman and were actually looking for his blog. Time
Warner Cable’s complaint that “significant components of [Google’s] Ad Rank
scheme are subjective” 62 is beside the point. Search itself is subjective. 63
Few scholars go so far as to advocate explicit re-ranking to correct search
results. 64 But even those who acknowledge that search is subjective sometimes
write as though it were not. Frank Pasquale gives a hypothetical in which
“YouTube’s results always appear as the first thirty [Google] results in response
to certain video queries for which [a rival video site] has demonstrably more
relevant content.” 65 One might ask, “demonstrably more relevant” by what
standard? Often the answer will be contentious.
In Foundem’s case, what difference should it make that Yahoo! and others liked
Foundem? So? That’s their opinion. Google had a different one. Who is to
say that Yahoo! was right and Google was wrong? 66 One could equally well
argue that Google’s low ranking was correct and Yahoo!’s high ranking was the
mistake. “compare prices shoei xr-1000” is not the sort of question that admits
62 Comments of Time Warner Cable Inc. 77, In the Matter of Preserving the Open Internet Broadband Industry Practices, GN Docket No. 09-191 (F.C.C. comments filed Jan. 14, 2010).
63 See Goldman, Search Engine Bias, supra note 28, at 112–13. This point should not be confused with a considered opinion on the question of how the First Amendment applies to search-ranking decisions. Search engines make editorial judgments about relevance, but they also present information that can only be described as factual (such as maps and addresses), extol their objectivity in marketing statements, and are perceived by users as having an aura of reliability. It is possible to make false statements even when speaking subjectively—for example, I would be lying to you if I said that I enjoy eating scallops. The fact that search engines’ judgments are expressed algorithmically, including in ways not contemplated by their programmers, complicates the analysis even further. The definitive First Amendment analysis of search-engine speech has yet to be written. Academic contributions to that conversation include Goldman, Search Engine Bias, supra note 28, at 112–15; Bracha & Pasquale, Federal Search Commission, supra note 20, at 1188–1201; Pasquale, Asterisk Revisited, supra note 32, at 68–85; NUNZIATO, VIRTUAL FREEDOM, supra note 20, passim (and particularly pages 149–51); Chandler, Right to Reach an Audience, supra note 20, at 1124–29; James Grimmelmann, The Google Dilemma, 53 N.Y.L.S. L. REV. 939, 946 (2009); Grimmelmann, The Structure of Search Engine Law, supra note 4, at 58–60. Some leading cases are listed in note 85, infra.
64 But see Sandeep Pandey et al., Shuffling a Stacked Deck: The Case for Partially Randomized Search Results, PROC. 31ST VERY LARGE DATABASES CONF. 781 (2005) (arguing for randomization in search results to promote obscure websites).
65 Pasquale, Internet Nondiscrimination Principles, supra note 18, at 296.
66 Cf. Rebecca Tushnet, It Depends on What the Meaning of “False” Is: Falsity and Misleadingness in Commercial Speech Doctrine, 41 LOYOLA L.A. L. REV. 101 (2008) (arguing that judgments about falsity frequently embody contested social policies).
of a right answer. This is why it doesn’t help to say that the Foundem vote is
four-to-one against Google. If deviation from the majority opinion makes a
search engine wrong, then so much for search engine innovation—and so much
for unpopular views. 67
Bias
Ironically, it is the goal of protecting unpopular views that drives the concern
with search engine “bias.” Lucas Introna and Helen Nissenbaum, for example,
are concerned that search engines will direct users to sites that are already
popular and away from obscure sites. 68 Alex Halavais calls for “resistance to
the homogenizing process of major search engines,” 69 including governmental
interventions. 70 These are structural concerns with popularity-based search.
Others worry about more particular biases. AT&T complains that “Google’s
algorithms unquestionably do favor some companies or sites.” 71 Scott Cleland
objects that Google demotes content from other countries in its country-specific
search pages. 72
The point that a technological system can display bias is one of those profound
observations that is at once both startling and obvious. 73 It naturally leads to
the question of whether, when, and how one could correct for the bias search
engines introduce. 74 But to pull that off, one must have a working
understanding of what constitutes search-engine bias. Batya Friedman and
Helen Nissenbaum define a computer system to be “biased” if it “systematically
and unfairly discriminates against certain individuals or groups of individuals in
favor of others.” 75 Since search engines systematically discriminate by design,
67 This last point should be especially troubling to Barron-inspired advocates of “access,” since the point of such a regime is to promote opinions that are not widely shared.
68 Lucas D. Introna & Helen Nissenbaum, Shaping the Web: Why the Politics of Search Engines Matters, 16 INFO. SOC. 169, 175 (2000).
69 HALAVAIS, SEARCH ENGINE SOCIETY, supra note 60, at 106.
70 Id. at 132–38.
71 Comments of AT&T Inc. 102, In the Matter of Preserving the Open Internet Broadband Industry Practices, GN Docket No. 09-191 (F.C.C. comments filed Jan. 14, 2010).
72 Cleland, Why Google Is Not Neutral, supra note 51.
73 In Langdon Winner’s phrase, “artifacts have politics.” LANGDON WINNER, THE WHALE AND THE REACTOR: A SEARCH FOR LIMITS IN AN AGE OF HIGH TECHNOLOGY 19 (University of Chicago Press 1986).
74 See, e.g., Pandey et al., Shuffling a Stacked Deck, supra note 64.
75 Batya Friedman & Helen Nissenbaum, Bias in Computer Systems, 14 ACM TRANS. ON COMPUTER SYS. 330, 332 (1996). See also Alejandro M. Diaz, Through the Google Goggles: Sociopolitical Bias in Search Engine Design (May 23, 2005) (unpublished B.A. thesis, Stanford University), available at http://epl.scu.edu/~stsvalues/readings/Diaz_thesis_final.pdf.
all of the heavy lifting in the definition is done by the word “unfair.” But this
just kicks the problem down the road. One still must explain when
discrimination is “unfair” and when it is not. Friedman and Nissenbaum’s
discussion is enlightening, but does not by itself help us identify which practices
are abusive. 76
The point that socio-technical systems have embedded biases also cuts against
search neutrality. We should not assume that if only the search engine could be
made properly neutral, the search results would be free of bias. Every search
result requires both a user to contribute a search query, and websites to
contribute the content to be ranked. Neither users nor websites are passive
participants; both can be wildly, profoundly biased.
On the website side, the web is anything but neutral. 77 Websites compete
fiercely, and not always ethically, for readers. 78 It doesn’t matter what the search
engine algorithm is; websites will try to game it. Search-engine optimization, or
SEO, is as much a fixture of the Internet as spam. Link farms, 79 spam blog
comments, hacked websites—you name it, and they’ll try it, all in the name of
improving their search rankings. A fully invisible search engine, one that
introduced no new values or biases of its own, would merely replicate the
underlying biases of the web itself: 80 heavily commercial, and subject to a truly
mind-boggling quantity of spam. Raff says that search algorithms should be
“comprehensive.” 81 But should users be subjected to a comprehensive
presentation of discount Canadian pharmaceutical sites?
On the user side, sometimes the bias is between the keyboard and the chair.
Fully de-biasing search results would also require de-biasing search queries—
and users’ ability to pick which results they click on. Take a search for “jew,”
for example. Google has been criticized both for returning anti-Semitic sites (to
76 If one fears, with Bracha and Pasquale, that "a handful of powerful gatekeepers" wield disproportionate influence, then the solution is simple: break up the bastards. If they reassemble or reacquire too much power, do it again. Neutrality will always be an imperfect half-measure if power itself is the problem.

77 See Clay Shirky, Power Laws, Weblogs, and Inequality, SHIRKY.COM (Feb. 8, 2003), http://www.shirky.com/writings/powerlaw_weblog.html (discussing vast disproportion of prominence between famous and obscure weblogs).

78 See IAN H. WITTEN ET AL., WEB DRAGONS: INSIDE THE MYTHS OF SEARCH TECHNOLOGY 145–75 (Morgan Kaufmann Publishers 2007).

79 A link farm is a group of automatically generated web sites that heavily link to each other. The point is to trick a popularity-based search engine into believing that all of the sites in the group are popular. See Grimmelmann, The Google Dilemma, supra note 63, at 946.

80 See Patterson, Non-Network Barriers, supra note 19, at 2854–55.

81 Raff, Search, But You May Not Find, supra note 11.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 447
American users) and for not returning such sites (to German users). 82 The inescapable issue is that Google has users who want to read anti-Semitic web pages and users who don't. One might call some of those users "biased," but if they are, it's not Google's fault.

Some bias is going to leak through as long as search engines help users find what they want. And helping users find what they want is such a profound social good that one should be skeptical of trying to inhibit it. 83 Telling users what they should see is a serious intrusion on personal autonomy, and thus deeply inconsistent with the liberal argument for search neutrality. If you want Google to steer users to websites with views that differ from their own, 84 your goal is not properly described as search neutrality. In effect, you have gone back to asserting the objective correctness of search results: Certain sites are good for users, like whole grains.
Traffic

The most common trope in the search debates is the website whose traffic vanishes overnight when it disappears from Google's search results. 85 Because so much traffic flows through Google, it holds websites over the flames of website hell, ready at any instant to let them fall in the rankings. Chandler's proposed right to reach an audience and Foundem's proposed "effective, accessible, and transparent appeal process" 86 attempt to protect websites from
82 See Grimmelmann, The Google Dilemma, supra note 63, at 943–45.

83 See James Grimmelmann, Don't Censor Search, 117 YALE L.J. POCKET PART 48 (2007).

84 See generally CASS SUNSTEIN, REPUBLIC.COM 2.0 (Princeton University Press 2007).

85 See, e.g., BATTELLE, THE SEARCH, supra note 5, at 153–59 (2bigfeet.com, main index); NUNZIATO, VIRTUAL FREEDOM, supra note 20, at 14–17 (various sites, AdWords and Google News); Chandler, Right to Reach an Audience, supra note 20, at 1110 (BMW Germany and Ricoh Germany, main index); Michael Y. Park, Journalist Who Exposes U.N. Corruption Disappears from Google, FOX NEWS, Feb. 18, 2008, http://www.foxnews.com/story/0,2933,331106,00.html (Inner City Press, Google News); Cleland, Why Google Is Not Neutral, supra note 51 (ExtremeTech.com and Fotolog.com, AdWords); Dan Macsai, G-Railed: Why Did Google Bury the Web's Oldest Entertainment Publication?, FASTCOMPANY.COM (Dec. 2, 2009), http://www.fastcompany.com/blog/dan-macsai/popwise/why-did-neutral-google-de-list-webs-oldest-entertainment-publication (Studio Briefing, AdWords and main index); Foundem's Google Story, supra note 7 (Foundem, main index). Opinions in lawsuits challenging demotions or exclusions include Langdon v. Google Inc., 474 F. Supp. 2d 622 (D. Del. 2007) (NCJusticeFraud.com and ChinaIsEvil.com, AdWords); Kinderstart.com LLC v. Google, Inc., No. C 06-2057 JF (RS), 2006 U.S. Dist. LEXIS 82481 (N.D. Cal. July 13, 2006) (Kinderstart.com, main index); and Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M, 2003 U.S. Dist. LEXIS 27193 (W.D. Okla. 2003) (SearchKing.com, main index).

86 Foundem's Google Story, supra note 7.
448 CHAPTER 7: IS SEARCH NOW AN "ESSENTIAL FACILITY?"
being dropped. Dawn Nunziato, for her part, would require search engines to open their sponsored links to political candidates. 87

A right to continued customer traffic would be a legal anomaly; offline businesses enjoy no such right. Some Manhattanites who take the free IKEA ferry to its store in Brooklyn eat at the nearby food trucks in the Red Hook Ball Fields. 88 The food truck owners would have no right to complain if IKEA discontinued the ferry or moved its store. Search neutrality advocates, however, would say that RedHookFoodTruck.com has a Jerome Barron-style free-speech interest in having access to the search engine's result pages, and thus has more right to complain if the Google ferry no longer comes to its neighborhood. 89 But, as we saw above, this is really an argument that users have a relevance interest in seeing the site. If no one actually wants to visit RedHookFoodTruck.com, then its owner shouldn't be heard to complain about her poor search ranking.
When push comes to shove, search neutrality advocates recognize that websites must plead their case in terms of users' needs. Chandler's modern right of access is a "right to reach a willing audience," 90 which she describes as "the right to be free of the imposition of discriminatory filters that the listener would not otherwise have used." 91 Even Foundem's Adam Raff presents his actual search-neutrality principle in user-protective terms: "search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance." 92 Relevance is, of course, the touchstone of users' interests, not websites'.

Indeed, looking at the rankings from a website's perspective, rather than from users', can be counterproductive to free-speech values. If users really find other websites more relevant, then making them visit RedHookFoodTruck.com impinges on their autonomy and on their free speech interests as listeners. For any given search query, there may be dozens, hundreds, thousands of competing websites. The vast majority of them will thus have interests that diverge from users'—and every incentive to override users' wishes.
87 NUNZIATO, VIRTUAL FREEDOM, supra note 20, at 150–51.

88 See Adam Kuban, Red Hook Vendors: A Quick Guide for the Uninitiated, SERIOUS EATS (July 18, 2008), http://newyork.seriouseats.com/2008/07/red-hook-vendors-soccer-tacos-guide-how-to-get-there-what-to-eat.html.

89 See Chandler, Right to Reach an Audience, supra note 20; NUNZIATO, VIRTUAL FREEDOM, supra note 20.

90 Chandler, Right to Reach an Audience, supra note 20, at 1099 (emphasis added).

91 Id. at 1103 (emphasis added).

92 Raff, Search, But You May Not Find, supra note 11 (emphasis added).
Even when users are genuinely indifferent among various websites, some search neutrality advocates think websites should be protected from "arbitrary" or "unaccountable" ranking changes as a matter of fairness. 93 We should call the websites that currently sit at the top of search engine rankings by their proper name—incumbents—and we should look as skeptically on their demands to remain in power as we would on any other incumbent's. The search engine that ranks a site highly has conferred a benefit on it; turning that gratuitous benefit into a permanent entitlement gets the ethics of the situation exactly backwards.

Indeed, giving highly-ranked websites what is in effect a property right in search rankings runs counter to everything we know about how to hand out property rights. Websites don't create the rankings; search engines do. Similarly, search engines are in a better position to manage rankings and prevent waste. And if each individual search ranking came with a right to placement, every search-results page would be an anti-commons in the making. 94

Thus, it is irrelevant that Foundem had a prominent search placement on Google before it landed in the doghouse. Just as the subjectivity of search means that search engines will frequently disagree with each other, it also means that a search engine will disagree with itself over time. From the outside looking in, we have no basis to say whether the initial high ranking or the subsequent low ranking made more sense. To give Foundem—and every other website currently enjoying a good search ranking—the right to continue where it is would lock in search results for all time, obliterating search-engine experimentation and improvement.
Relevance

Given the importance of user autonomy to search-neutrality theory, relevance is a natural choice for a neutrality principle. In Foundem's words, search results should be "based solely on relevance." 95 Chandler proposes a rule against "discrimination that listeners would not have chosen." 96 Bracha and Pasquale decry "search engines [that] highlight or suppress critical information" and thereby "shape and constrain [users'] choices"—that is, hide information that users would have found relevant. 97
93 Bracha & Pasquale, Federal Search Commission, supra note 20, at 1175–76.

94 See generally Michael Heller, The Tragedy of the Anticommons, 111 HARV. L. REV. 621 (1998) (arguing that when too many owners have exclusion rights over a resource, it is prone to underuse).

95 Search Neutrality, SEARCHNEUTRALITY.ORG (Oct. 11, 2009), http://www.searchneutrality.org/.

96 Chandler, Right to Reach an Audience, supra note 20, at 1098.

97 Bracha & Pasquale, Federal Search Commission, supra note 20, at 1177.
Relevance, however, is such an obvious good that its virtue verges on the tautological. Search engines compete to give users relevant results; they exist at all only because they do. Telling a search engine to be more relevant is like telling a boxer to punch harder. Of course, sometimes boxers do throw fights, so it isn't out of the question that a search engine might underplay its hand. How, though, could regulators tell? Regulators can't declare a result "relevant" without expressing a view as to why other possibilities are "irrelevant," and that is almost always going to be contested.

Here's an example: Foundem. Recall that Foundem is a "vertical search site" that specializes in consumer goods. Well, a great many vertical search sites are worthless. (If you don't believe me, please try using a few for a bit.) Like other kinds of sites that simply roll up existing content and slap some of their own ads on it—Wikipedia clones and local business directories also come to mind—they superficially resemble legitimate sites that provide something of value to users. 98 But only superficially. The "penalties" that reduce vertical search sites' Google ranks aren't an attempt to reduce competition at the expense of relevance; they're an attempt to implement relevance. 99 There are a few relatively good, usable product-search sites, but most of them are junk and good riddance to them. You're welcome to disagree—search is subjective—but I'd rather have the anti-vertical penalty in place than not. Those who would argue that Google's rankings don't reflect relevance have a heavy burden of proof, in the face of ample, easily verified evidence to the contrary.
In fact, behind almost every well-known story of search engine caprice, there is a more persuasive relevance-enhancing counter-story. For example, SourceTool, another vertical search engine, has sued Google under antitrust law for, in effect, demoting it in Google's rankings for search ads. 100 SourceTool, though, is a "directory" with a taxonomic logic of dubious utility—the United Nations Standard Products and Services Code—and almost no content of its own. It's the rare user indeed who will find SourceTool relevant. If you care about relevance and user autonomy, you should applaud Google's decision to demote SourceTool.
98 See Chris Lake, Foundem vs Google: A Case Study in SEO Fail, ECONSULTANCY (Aug. 18, 2009), http://econsultancy.com/blog/4456-foundem-vs-google-a-case-study-in-seo-fail; Little or No Original Content, GOOGLE WEBMASTER CENTRAL (updated June 10, 2009), www.google.com/support/webmasters/bin/answer.py?answer=66361.

99 See John Lettice, When Algorithms Attack, Does Google Hear You Scream?, THE REGISTER (Nov. 19, 2009), http://www.theregister.co.uk/2009/11/19/google_hand_of_god/.

100 See TradeComet.com LLC v. Google Inc., No. 09-CIV-1400 (S.D.N.Y. complaint filed Feb. 17, 2009). The District Court dismissed the case on the basis of the forum-selection clause in Google's advertiser agreement, without reaching the merits of the case. See TradeComet.com LLC v. Google Inc., 693 F. Supp. 2d 370 (S.D.N.Y. 2010).
Self-Interest

In practice, even as search-neutrality advocates claim "relevance" as their goal, they rely on proxies for it. The most common is self-interest. A Consumer Watchdog report accuses Google of "an abandonment of [its] pledge to provide neutral search capability" by "steering Internet searchers to its own services" to "muscle its way into new markets." 101 Foundem alleges that Google demotes it and other vertical search sites to fend off competition, and alleges that Google's links to itself give it "an unassailable competitive advantage." 102 Bracha and Pasquale worry that search engines can change their rankings "in response to positive or negative inducements from other parties." 103
Bad motive may lead to bad relevance, but it's also a bad proxy for it. The first problem is evidentiary. By definition, motivations are interior, personal. 104 Of course, the law has to guess at motives all the time, but the task is by its nature harder than looking to extrinsic evidence. People get it wrong. In 2009, an Amazon employee with a fat finger hit a wrong button and categorized tens of thousands of gay-themed books as "adult." 105 An angry mob of Netizens assumed the company had deliberately pulled the books from its search engine out of anti-gay animus, and used the Twitter hashtag #amazonfail to express their very public outrage. 106 Amazon's reclassification was a mistake (a quickly corrected one), and a vivid demonstration of the power of search algorithms—but not a case of bad motives. 107

In all but the most blatant of cases, in fact, a search engine will be able to tell a plausible relevance story about its ranking decisions. Proving that a relevance story is pretextual will be extraordinarily difficult, in view of the complexity and subjectivity of search. But it would also be disastrous to adopt the opposite point of view and presume pretext. The absence of bad motive is a negative that the search engine will often find impossible to prove. How can it
101 TRAFFIC REPORT: HOW GOOGLE IS SQUEEZING OUT COMPETITORS AND MUSCLING INTO NEW MARKETS (Consumer Watchdog 2010), http://www.consumerwatchdog.org/resources/TrafficStudy-Google.pdf.

102 Reply Comments of Foundem, supra note 10, at 1.

103 Bracha & Pasquale, Federal Search Commission, supra note 20, at 1170.

104 As an artificial corporate entity, a search engine may not even have motives other than the ones the law attributes to it.

105 See Nick Eaton, AmazonFail: An Inside Look at What Happened, AMAZON & THE ONLINE RETAIL BLOG (Apr. 13, 2009), http://blog.seattlepi.com/amazon/archives/166384.asp.

106 See Clay Shirky, The Failure of #amazonfail, SHIRKY.COM (Apr. 15, 2009), http://www.shirky.com/weblog/2009/04/the-failure-of-amazonfail/.

107 But see Mary Hodder, Why Amazon Didn't Just Have a Glitch, TECHCRUNCH (Apr. 14, 2009), http://techcrunch.com/2009/04/14/guest-post-why-amazon-didnt-just-have-a-glitch/.
establish, for example, that the engineer who added the anti-vertical penalty didn't have a lunchroom conversation with an executive who played up the competition angle? This is not to say that serious cases of abuse are implausible, 108 just that investigation will be unusually hard and that false positives will be dangerously frequent.

There is a nontrivial antitrust issue lurking here. In the United States, Google has a dominant market share in both search and search advertising, and one could argue that Google has started to leverage its position in anticompetitive ways. 109 Antitrust, however, approaches such questions with a well-developed analytical toolkit: relevant markets, market power, pro-competitive and anti-competitive effects, and so on. 110 Antitrust rightly focuses on the effects of business practices on consumers; search neutrality should not short-circuit that consumer-centric analysis by overemphasizing the role of a search engine's motives. Some things can be good for Google and good for its users.
Thus, when Google links to its own products, not only can there be substantial technical benefits from integration, but often Google is helping users by pointing them to services that really are better than the competition. Consumer Watchdog, for example, cries foul that Google "put its own [map] service atop all others for generic address searches," 111 and that Google Maps has taken half of the local search market at the expense of previously dominant MapQuest and Yahoo! Maps. 112 But perhaps MapQuest and Yahoo! Maps deserved to lose. Google Maps was groundbreaking when launched, and years later, it remains one of the best-implemented services on the Internet, with astonishingly clever scripting, flexible route-finding, and a powerful application programming interface (API). 113
108 Baidu's alleged shakedown (see supra note 24 and accompanying text), if true, would be an example. Willingness to buy Baidu search ads is not in itself a reliable indicator of relevance to Baidu searchers. But then again, even pay-for-placement was once considered a plausible model for main-column search results—and willingness to pay is not inherently a crazy proxy for relevance. See BATTELLE, THE SEARCH, supra note 5, at 104–14 (discussing GoTo's pay-for-placement model). See also Goldman, Coasean Analysis, supra note 30 (envisioning a future in which advertisers and users negotiate over access to users' attention). Indeed, search ads today are sold by auction. They're often as relevant as main-column search results, sometimes more so. It might be better to say that Baidu's real problems are monopoly pricing and (compulsory) stealth marketing.

109 See, e.g., Brad Stone, Sure, It's Big, But Is That Bad?, N.Y. TIMES, May 21, 2010, at BU1.

110 See generally Geoffrey A. Manne & Joshua D. Wright, Google and the Limits of Antitrust: The Case Against the Antitrust Case Against Google, HARV. J. L. & PUB. POL'Y (forthcoming).

111 TRAFFIC REPORT, supra note 101, at 5.

112 Id. at 5–7.

113 See, e.g., John Carroll, Google Maps and Innovation, A DEVELOPER'S VIEW (Oct. 12, 2005), http://www.zdnet.com/blog/carroll/google-maps-and-innovation/1499.
One form of self-interest that may be well enough defined to justify regulatory scrutiny is the straightforward bribe: a payment from a website to change its ranking, or a competitor's. Search-engine critics argue that search engines should disclose commercial relationships that bear on their ranking decisions. 114 This is a standard, sensible policy response to the fear of stealth marketing. 115 Indeed, the Federal Trade Commission (FTC) has specifically warned search engines not to mix their organic and paid search results. 116 More generally, the FTC endorsement guidelines provide that endorsements must "reflect the honest opinions, findings, beliefs, or experience of the endorser" 117 and that any connections between endorser and seller that "might materially affect the weight or credibility of the endorsement" 118 must be fully disclosed. These policies have a natural application to search engines. A search engine that factors payments from sponsors into its ranking decisions is lying to its users unless it discloses those relationships, and this sort of lie would trigger the FTC's jurisdiction. 119 This isn't a neutrality principle, or even unique to search; it's just a natural application of a well-established legal norm.
Transparency

Search-engine critics generally go further and argue that search engines should also be required to disclose their algorithms in detail:

• Introna and Nissenbaum: "As a first step we would demand full and truthful disclosure of the underlying rules (or algorithms) governing indexing, searching, and prioritizing, stated in a way that is meaningful to the majority of web users." 120

• Foundem: "Search Neutrality can be defined as the principle that search engines should be open and transparent about their editorial policies … ." 121
114 See, e.g., Pasquale, Internet Nondiscrimination Principles, supra note 18, at 286.

115 See generally Ellen P. Goodman, Stealth Marketing and Editorial Integrity, 85 TEX. L. REV. 83 (2006).

116 See Letter from Heather Hippsley, Acting Assoc. Dir., Div. of Adver. Practices, Fed. Trade Comm., to Gary Ruskin, Executive Dir. at Commercial Alert (June 27, 2007), available at http://www.ftc.gov/os/closings/staff/commercialalertletter.shtm.

117 16 CFR § 255.1(a).

118 16 CFR § 255.5.

119 Disclosure in common cases need not be onerous. Where, for example, a search engine auctions off sponsored links on its results pages, telling users that those links are auctioned off should generally suffice. See generally Letter from Heather Hippsley, supra note 116.

120 Introna and Nissenbaum, supra note 68, at 181.

121 Search Neutrality, SEARCHNEUTRALITY.ORG (Oct. 11, 2009), http://www.searchneutrality.org/search-neutrality.
• Pasquale: "[Dominant search engines] should submit to regulation that bans stealth marketing and reliably verifies the absence of the practice." 122

These disclosures are meant to inform users about what they're getting from a search engine (Introna and Nissenbaum), to inform websites about the standards they're being judged by (Foundem), 123 or to inform regulators about what the search engine is actually doing (Pasquale). 124
Algorithmic transparency is a delicate business. Full disclosure of the algorithm itself runs up against critical interests of the search engine. A fully public algorithm is one that the search engine's competitors can copy wholesale. 125 Worse, it is one that websites can use to create highly optimized search-engine spam. 126 Writing in 2000, long before the full extent of search-engine spam was as clear as it is today, Introna and Nissenbaum thought that the "impact of these unethical practices would be severely dampened if both seekers and those wishing to be found were aware of the particular biases inherent in any given search engine." 127 That underestimates the scale of the problem. Imagine instead your inbox without a spam filter. You would doubtless be "aware of the particular biases" of the people trying to sell you fancy watches and penis pills—but that will do you little good if your inbox contains a thousand pieces of spam for every email you want to read. That is what will happen to search results if search algorithms are fully public; the spammers will win.

For this reason, search-neutrality advocates now acknowledge the danger of SEO and thus propose only limited transparency. 128 Pasquale suggests, for example, that Google could respond to a question about its rankings with a list of a few factors that principally affected a particular result. 129 But search is immensely complicated—so complicated that it may not be possible to boil a ranking down to a simple explanation. When the law demands disclosure of complex matters in simple terms, we get pro forma statements and boilerplate.
122 Pasquale, Internet Nondiscrimination Principles, supra note 18, at 299 (emphasis added).

123 See also id. at 285 (arguing that search engines benefit from hidden algorithms because websites, lacking clear information about how to achieve high organic search rankings, must resort to buying paid search ads).

124 See also The Google Algorithm, supra note 13 (recommending required disclosure).

125 See Grimmelmann, Structure of Search Engine Law, supra note 4, at 49, 55.

126 Id. at 44–46, 56.

127 Introna and Nissenbaum, supra note 68, at 181.

128 See Pasquale, Internet Nondiscrimination Principles, supra note 18, at 297; Bracha & Pasquale, Federal Search Commission, supra note 20, at 1201–02; Chandler, Right to Reach an Audience, supra note 20, at 1117.

129 Pasquale, Internet Nondiscrimination Principles, supra note 18, at 296–97.
Consumer credit disclosures and securities prospectuses have brought important information into the open, but they haven't done much to aid the understanding of their average recipient.

Google's algorithm depends on more than 200 different factors. 130 Google makes about 500 changes to it a year, 131 based on ten times as many experiments. 132 One sixth of the hundreds of millions of queries the algorithm handles daily are queries it has never seen before. 133 The PageRank of any webpage depends, in part, on every other page on the Internet. 134 And even with all the computational power Google can muster, a full PageRank recomputation takes weeks. 135 PageRank is, as algorithms go, elegantly simple—but I certainly wouldn't want to have the job of making Markov chains and eigenvectors "meaningful to the majority of Web users." 136 In practice, any simplified disclosure is likely to leave room for the search engine to bury plenty of bodies.
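That elegance—and the gulf between it and a lay explanation—can be seen in a toy version of the computation. The sketch below (not Google's implementation; the four-page link graph is made up for illustration) computes PageRank by power iteration: the rank vector is repeatedly pushed through the damped Markov transition process until it settles into that process's dominant eigenvector.

```python
# Toy PageRank by power iteration. The rank vector converges to the
# dominant eigenvector of the damped Markov transition process.
# The link graph below is a hypothetical four-page web.

def pagerank(links, damping=0.85, iterations=100):
    """links maps each page to the list of pages it links to."""
    pages = sorted(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}  # random-jump term
        for page, outlinks in links.items():
            if outlinks:
                share = ranks[page] / len(outlinks)  # split rank among outlinks
                for target in outlinks:
                    new[target] += damping * share
            else:
                # Dangling page: spread its rank over every page
                for target in pages:
                    new[target] += damping * ranks[page] / n
        ranks = new
    return ranks

links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],  # no one links to d, so it ends up ranked lowest
}
ranks = pagerank(links)
print(sorted(ranks, key=ranks.get, reverse=True))  # → ['c', 'a', 'b', 'd']
```

Even this stripped-down version hints at the disclosure problem: page d's rank depends on every other page, and explaining *why* c outranks a requires walking through the whole converged system, not pointing to any single factor.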
Some scholars have suggested that concerns about transparency could be handled through regulatory opacity: The search engine discloses its algorithm to the government, which then keeps the details from the public. 137 This is a promising way of dealing with search engines' operational needs for secrecy, but it sharpens the question of regulators' technical competence. If the record is sealed, they won't have third-party experts and interested amici to walk them through novel technical issues. Everything will hinge on their own ability to evaluate the implications of small details in search algorithms. The track record of agencies and courts in dealing with other digital technologies does not provide grounds for optimism on this score. 138 Pasquale makes an important
130 See, e.g., Technology Overview, GOOGLE, http://www.google.com/corporate/tech.html.

131 See Steven Levy, Inside the Box, WIRED, Mar. 2010, at 96.

132 See Rob Hof, Google's Udi Manber: Search Is About People, Not Just Data, THE TECH BEAT (Oct. 1, 2009), http://www.businessweek.com/the_thread/techbeat/archives/2009/10/googles_udi_manber_search_is_about_people_not_just_data.html.

133 Id.

134 See AMY N. LANGVILLE & CARL D. MEYER, GOOGLE'S PAGERANK AND BEYOND: THE SCIENCE OF SEARCH (Princeton University Press 2006).

135 Id.

136 In The Google Dilemma, supra note 63, I didn't even try to explain the math to law professors.

137 Pasquale, Internet Nondiscrimination Principles, supra note 18, at 297–98; Bracha & Pasquale, Federal Search Commission, supra note 20, at 1194–96. See generally Viva R. Moffat, Regulating Search, 22 HARV. J. L. & TECH. 475 (2009) (discussing institutional choice issues in search regulation).

138 But see Frank Pasquale, Trusting (and Verifying) Online Intermediaries' Policing, supra at 258 (proposing "Internet Intermediary Regulatory Council" and arguing that it could develop
point that "it is essential that someone has the power to 'look under the hood,'" 139 but it is also important that algorithmic disclosure remain connected to a workable theory of what regulators are looking for and what they would do if they found it.
Manipulation

Perhaps the most interesting idea in the entire search neutrality debate is the “manipulation” of search results. It’s a slippery term, and used inconsistently in the search-engine debates—including by me. 140 In the dictionary sense of “process, organize, or operate on mentally or logically; to handle with mental or intellectual skill,” 141 all search results are manipulated, and the more skillfully the better. But in the dictionary sense of “manage, control, or influence in a subtle, devious, or underhand manner,” 142 it’s a bad thing indeed: no one likes to be manipulated. 143
In practice—although this is rarely made explicit—the concern is with what I have described elsewhere as “hand manipulation.” 144 This idea imagines the search engine as having both an automatic, general-purpose ranking algorithm and a human-created list of exceptions. Consumer Watchdog, for example, derides Google’s claim to rank results “automatically by algorithms,” saying, “It is hard to see how this can still be true, given the increasingly pronounced tilt toward its own services in Google’s search results.” 145 Foundem calls it “manual intervention,” “special treatment,” and “manual bias,” and documents how Google’s public statements have quietly backed away from claims that its rankings are “objective” and “automatic.” 146
Put this way, the distinction between objective algorithm and subjective manipulation is incoherent. Both kinds of decisions come from the same
sufficient technical expertise to “generate official and even public understanding of [search engines’] practices”).
139 Pasquale, Internet Nondiscrimination Principles, supra note 18, at 286.
140 Compare Grimmelmann, Structure of Search Engine Law, supra note 4, at 44 (“technical arms race between engines and manipulators”) with id., at 59–60 (“hand manipulation of results [by search engines]”).
141 Oxford English Dictionary (June 2010 draft).
142 Id.
143 See Bracha & Pasquale, Federal Search Commission, supra note 20, at 1176–79 (discussing effects of manipulation on user autonomy).
144 Grimmelmann, Structure of Search Engine Law, supra note 4, at 59.
145 TRAFFIC REPORT, supra note 101, at 8.
146 Foundem’s Google Story, supra note 7.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 457
source: the search engine’s programmers. 147 Nor can the algorithm provide a stable baseline against which to measure manipulation, since each “manipulation” is a change to the algorithm itself. It’s not like Bing has rooms full of employees looking over search results pages and making last-minute tweaks before the pages are delivered to users.
Academics, being more careful with concepts, have focused on intentionality: does the search engine intend the promotions and demotions that will result from an algorithmic change? Mark Patterson, for example, refers to “intentional manipulation of results.” 148 Bracha and Pasquale sharpen this idea to speak of “highly specific or local manipulations,” such as singling out websites for special treatment. 149 Chandler argues that “search engines should not manipulate individual search results except to address instances of suspected abuse.” 150 Google itself is remarkably coy about whether and when it changes rankings on an individual basis. 151
Surprisingly, no one has explained why special-casing in and of itself is a problem. One possibility is that it captures the distinction between individual adjudication and general rulemaking: changes that only affect a few websites trigger a kind of due process interest in individualized procedural protections. 152 There is also a kind of Rawlsian argument 153 here, that algorithmic decisions should be made from behind a veil of ignorance, not knowing which websites they will favor. For whatever reason, local manipulations make people nervous, nervous enough that most of the stories told to instill fear of search engines involve what is or looks like manipulation. 154
Local manipulation, however, is a distraction. The real goal is relevance. From that point of view, most local manipulations aren’t wrongful at all. Foundem should know; it benefited from a local manipulation. The penalty that afflicted
147 See Goldman, Search Engine Bias, supra note 28, at 112–15; Grimmelmann, Structure of Search Engine Law, supra note 4, at 59–60.
148 Patterson, Non-Network Barriers, supra note 19, at 2854.
149 Bracha & Pasquale, Federal Search Commission, supra note 20, at 1168.
150 Chandler, Right to Reach an Audience, supra note 20, at 1117 (emphasis added).
151 See, e.g., Lettice, When Algorithms Attack, supra note 99; James Grimmelmann, Google Replies to SearchKing Lawsuit, LAWMEME (Jan. 9, 2003), http://lawmeme.research.yale.edu/modules.php?name=News&file=article&sid=807.
152 Compare Londoner v. Denver, 210 U.S. 373, 386 (1908) (hearing required when tax assessment affects only a few people) with Bi-Metallic Inv. Co. v. Colorado, 239 U.S. 441, 445–46 (hearing not required when tax assessment affects all citizens equally).
153 See JOHN RAWLS, A THEORY OF JUSTICE (1971).
154 See, e.g., Lettice, When Algorithms Attack, supra note 99.
it for three years appears to have been a relatively general change to Google’s algorithm, one designed to affect a great many low-value vertical search sites. 155 When Foundem was promoted back to prominent search placement, that was actually the manipulation, since it affected Foundem and Foundem alone. Google thus “manipulated” its search results to exempt Foundem from what would otherwise have been a generally applicable rule. To condemn manipulation on the basis of its specificity is to say that Google acted more rightfully when it demoted Foundem in 2006 than when it promoted it back in 2009. 156
The point is that local manipulations, being quick and easy to implement, are often a useful part of a search engine’s toolkit for delivering relevance. Search-engine optimization is an endless game of loopholing. Regulators who attempt to prohibit unfair manipulations will have to wade quite far into the swamp of white-hat and black-hat SEO. 157 Prohibiting local manipulation altogether would keep the search engine from closing loopholes quickly and punishing the loopholers—giving them a substantial leg up in the SEO wars. Search results pages would fill up with spam, and users would be the real losers.
Conclusion

Search neutrality gets one thing very right: Search is about user autonomy. A good search engine is more exquisitely sensitive to a user’s interests than any other communications technology. 158 Search helps her find whatever she wants, whatever she needs to live a self-directed life. It turns passive media recipients into active seekers and participants. If search did not exist, then for the sake of human freedom it would be necessary to invent it. Search neutrality properly seeks to make sure that search is living up to its liberating potential.
Having asked the right question—are structural forces thwarting search’s ability to promote user autonomy?—search neutrality advocates give answers concerned with protecting websites rather than users. Yet with disturbing frequency, websites are not users’ friends. Sometimes they are. More often, though, websites simply want visitors and will do what it takes to grab them.
155 Id.
156 If you are bothered more by demotions than promotions, remember that search rankings are zero-sum. Foundem’s 50-place rise is balanced out by 50 one-place falls for other websites.
157 On the distinction between ethical, permitted “white-hat” SEO and unethical, forbidden “black-hat” SEO, see Frank Pasquale, Trusting (and Verifying) Online Intermediaries’ Policing, supra at 258. I believe that what Pasquale calls the intermediate “grey-hat” zone between the two is generally less grey than he and his sources perceive it to be.
158 Except, perhaps, the library reference desk. Unfortunately, librarians don’t scale.
If Flowers by Irene sells a bouquet for $30 that Bob’s Flowers sells for $50, then Bob’s interest in being found is in direct conflict with users’ interest in being directed to Irene. The last thing that Bob wants is for the search engine to maximize relevance. Search-neutrality advocates fear that Bob will pay off the search engine to point users at his site. But that’s not the only way the story can play out. Bob could also engage in self-help SEO to try to boost his ranking. In that case, the search engine may respond by demoting his site. And if that happens, then Bob has another card to play: search neutrality itself.

Regulators bearing search neutrality can inadvertently prevent search engines from helping users find the websites they want. The typical model assumed by search neutrality is of a website and a search engine corruptly conspiring to put one over on users. But much, indeed most, of the time, the real alliance is between search engines and users, together trying to sort through the clamor of millions of websites’ sales pitches. Giving websites search-neutrality rights gives them a powerful weapon in their wars with each other—one that need not be wielded with users’ interests in mind. 159 Search neutrality will be born with one foot already in the grave of regulatory capture.
There is a profound irony at the heart of the liberal case for search neutrality. Requiring search engines to behave “neutrally” will not produce the desired goal of neutral search results. The web is a place where site owners compete fiercely, sometimes viciously, for viewers, and users turn to intermediaries to defend them from the sometimes-abusive tactics of information providers. Taking the search engine out of the equation leaves users vulnerable to precisely the sorts of manipulation search neutrality aims to protect them from. Whether it ranks sites by popularity, by personalization, or even by the idiosyncratic whims of its operator, a search engine provides an alternative to the Hobbesian world of the unmediated Internet, in which the richest voices are the loudest, and the greatest authority on any subject is the spammer with the fastest server. Search neutrality is cynical about the Internet—but perhaps not cynical enough.
159 This has already happened in trademark law, which is supposed to prevent consumer confusion, but just as often is a form of offensive warfare among companies, consumer interests be damned. See Mark A. Lemley & Mark P. McKenna, Owning Mark(et)s (Stanford Law and Economics Olin Working Paper No. 395, May 2010), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1604845. For an exploration of the competitive dynamics of trademark in the search-engine context, see Goldman, Deregulating Relevancy, supra note 61.
Search Engine Bias & the Demise of Search Engine Utopianism

By Eric Goldman *
In the past few years, search engines have emerged as a major force in our information economy, helping searchers perform hundreds of millions (or even billions) of searches per day. 1 With this broad reach, search engines have significant power to shape searcher behavior and perceptions. In turn, the choices that search engines make about how to collect and present data can have significant social implications.
Typically, search engines automate their core operations, including the processes that search engines use to aggregate their databases and then sort/rank the data for presentation to searchers. This automation gives search engines a veneer of objectivity and credibility. 2 Machines, not humans, appear to make the crucial judgments, creating the impression that search engines bypass the structural biases and skewed data presentations inherent in any human-edited media. 3 Search engines’ marketing disclosures typically reinforce this perception of objectivity.

Unfortunately, this romanticized view of search engines does not match reality. Search engines are media companies. Like other media companies, search
* Associate Professor, Santa Clara University School of Law and Director, High Tech Law Institute. Home page: http://www.ericgoldman.org. Email: egoldman@gmail.com. I appreciate the comments of Nico Brooks, Soumen Chakrabarti, Ben Edelman, Elizabeth Van Couvering and the participants at the Yale Law School Regulating Search Symposium and the 2005 Association of Internet Researchers (AoIR) Annual Meeting. This essay focuses principally on American law and consumer behavior. Consumer behavior and marketplace offerings vary by country, so this discussion may not be readily generalizable to other jurisdictions.
1 In 2003, search engines performed over a half-billion searches a day. See Danny Sullivan, Searches Per Day, SEARCH ENGINE WATCH, Feb. 25, 2003, http://searchenginewatch.com/reports/article.php/2156461.
2 See Jason Lee Miller, Left, Right, or Center? Can a Search Engine Be Biased?, WEBPRONEWS.COM, May 10, 2005, http://www.webpronews.com/insidesearch/insidesearch/wpn-56-20050510LeftRightorCenterCanaSearchEngineBeBiased.html.
3 There is a broad perception that search engines present search results passively and neutrally. See Leslie Marable, False Oracles: Consumer Reaction to Learning the Truth About How Search Engines Work, CONSUMER REPORTS WEBWATCH, June 30, 2003, http://www.consumerwebwatch.org/dynamic/search-report-false-oraclesabstract.cfm; Maureen O’Rourke, Defining the Limits of Free-Riding in Cyberspace: Trademark Liability for Metatagging, 33 GONZ. L. REV. 277 (1998).
engines make editorial choices designed to satisfy their audience. 4 These choices systematically favor certain types of content over others, producing a phenomenon called “search engine bias.”

Search engine bias sounds scary, but this essay explains why such bias is both necessary and desirable. The essay also explains how emerging personalization technology will soon ameliorate many concerns about search engine bias.
Search Engines Make Editorial Choices

Search engines frequently claim that their core operations are completely automated and free from human intervention, 5 but this characterization is false. Instead, humans make numerous editorial judgments about what data to collect and how to present that data. 6

Indexing. Search engines do not index every scrap of data available on the Internet. Search engines omit (deliberately or accidentally) some web pages entirely 7 or may incorporate only part of a web page. 8
4 See, e.g., C. EDWIN BAKER, ADVERTISING AND A DEMOCRATIC PRESS (1994).
5 See, e.g., Does Google Ever Manipulate Its Search Results?, GOOGLE.COM, http://www.google.com/support/bin/answer.py?answer=4115&topic=368 (“The order and contents of Google search results are completely automated. No one hand picks a particular result for a given search query, nor does Google ever insert jokes or send messages by changing the order of results.”); Does Google Censor Search Results?, GOOGLE.COM, http://www.google.com/support/bin/answer.py?answer=17795&topic=368 (“Google does not censor results for any search terms. The order and content of our results are completely automated; we do not manipulate our search results by hand.”); Technology Overview, GOOGLE.COM, http://www.google.com/corporate/tech.html (“There is no human involvement or manipulation of results….”); How Can I Improve My Site’s Ranking?, GOOGLE.COM, http://www.google.com/support/webmasters/bin/answer.py?answer=34432&topic=8524 (“Sites’ positions in our search results are determined automatically based on a number of factors, which are explained in more detail at http://www.google.com/technology/index.html. We don’t manually assign keywords to sites, nor do we manipulate the ranking of any site in our search results.”); see also Complaint at 37-38, 52-56, KinderStart.com LLC v. Google, Inc., Case No. C 06-2057 RS (N.D. Cal. Mar. 17, 2006) (giving other examples of Google’s claims to be passive). Note that Google has subsequently revised some of these cited pages after its censorship controversy in China.
6 See generally Abbe Mowshowitz & Akira Kawaguchi, Bias on the Web, COMM. ACM, Sept. 2002, at 56 (distinguishing “indexical bias” and “content bias”).
7 See Judit Bar-Ilan, Expectations Versus Reality – Search Engine Features Needed for Web Research at Mid-2005, 9 CYBERMETRICS 2 (2005), http://www.cindoc.csic.es/cybermetrics/articles/v9i1p2.html.
8 For example, many search engines ignore metatags. See Eric Goldman, Deregulating Relevancy in Internet Trademark Law, 54 EMORY L.J. 507, 567-68 (2005). Search engines also incorporate only portions of very large files. See Bar-Ilan, supra note 7; Why Doesn’t My Site Have a Cached Copy or a Description?, GOOGLE.COM, http://www.google.com/support/bin/answer.py?answer=515&topic=365 (describing how some pages are “partially indexed”);
During indexing, search engines are designed to associate third party “metadata” (data about data) with the indexed web page. For example, search engines may use and display third party descriptions of the website in the search results. 9 Search engines may also index “anchor text” (the text that third parties use in hyperlinking to a website), 10 which can cause a website to appear in search results for a term the website never used (and may object to). 11
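The anchor-text mechanism just described can be sketched in a few lines: when building the index, the words of each inbound link are credited to the link’s target, so a page can match a query term that never appears on the page itself. This is an illustrative sketch only; the page names and link data are invented, and real engines weight anchor text within a ranking function rather than simply indexing it.

```python
# Sketch: crediting anchor text to the *target* page during indexing.
# All page names, bodies, and links below are invented for illustration.

from collections import defaultdict

def build_index(pages, links):
    """pages: url -> body text; links: (source, target, anchor_text) triples."""
    index = defaultdict(set)  # term -> set of matching urls
    for url, body in pages.items():
        for term in body.lower().split():
            index[term].add(url)
    for _source, target, anchor in links:
        for term in anchor.lower().split():
            index[term].add(target)  # anchor words count for the target page
    return index

pages = {"bio.example": "official biography"}
links = [
    ("blog1.example", "bio.example", "miserable failure"),
    ("blog2.example", "bio.example", "miserable failure"),
]
index = build_index(pages, links)

# The page never contains "miserable", yet it now matches that query.
assert "bio.example" in index["miserable"]
assert "miserable" not in pages["bio.example"]
```

This is exactly the vulnerability that “Google bombing” (footnote 11) exploits: coordinated linkers, not the page’s author, decide which terms the page matches.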
Finally, once indexed, search engines may choose to exclude web pages from their indexes for a variety of reasons, ranging from violations of quasi-objective search engine technical requirements 12 to simple capriciousness. 13
Ranking. To determine the order of search results, search engines use complex proprietary “ranking algorithms.” Ranking algorithms obviate the need for humans to make individualized ranking decisions for the millions of search
Has Google Dropped Their 101K Cache Limit?, RESEARCHBUZZ!, Jan. 31, 2005, http://www.researchbuzz.org/2005/01/has_google_dropped_their_101k.shtml (discussing how historically Google indexed only the first 101k of a document).
9 See My Site’s Listing Is Incorrect and I Need it Changed, GOOGLE.COM, http://www.google.com/webmasters/3.html. Google’s automated descriptions have spawned at least one lawsuit by a web publisher who believed the compilation created a false characterization. See Seth Fineberg, Calif. CPA Sues Google Over “Misleading” Search Results, ACCT. TODAY, Apr. 19, 2004, at 5, available at http://www.webcpa.com/article.cfm?articleid=193&pg-acctoday&print=yes.
10 See Jagdeep S. Pannu, Anchor Text Optimization, WEBPRONEWS.COM, Apr. 8, 2004, http://www.webpronews.com/ebusiness/seo/wpn-4-20040408AnchorTextOptimization.html.
11 For example, the first search result in Google and Yahoo! for the keyword “miserable failure” is President George W. Bush’s home page because so many websites have linked to the biography using the term “miserable failure.” See Tom McNichol, Your Message Here, N.Y. TIMES, Jan. 22, 2004, at G1. This algorithmic vulnerability has spawned a phenomenon called “Google bombing,” where websites coordinate an anchor text attack to intentionally distort search results. See John Hiler, Google Time Bomb, MICROCONTENT NEWS, Mar. 3, 2002, http://www.microcontentnews.com/articles/googlebombs.htm.
12 See, e.g., Stefanie Olsen, Search Engines Delete Adware Company, CNET NEWS.COM, May 13, 2004, http://news.com.com/2102-1024_3-5212479.html?tag=st.util.print (Google and Yahoo kicked WhenU.com out of their indexes for allegedly displaying different web pages to searchers and search engine robots, a process called “cloaking”).
13 This is the heart of KinderStart’s allegations against Google. See Complaint, KinderStart.com LLC v. Google, Inc., Case No. C 06-2057 (N.D. Cal. Mar. 17, 2006). Although the complaint’s allegations about Google’s core algorithmic search may not be proven, Google does liberally excise sources from Google News. For example, Google claims that “news sources are selected without regard to political viewpoint or ideology,” see Google News (Beta), GOOGLE.COM, http://news.google.com/intl/en_us/about_google_news.html#25, but Google dropped a white supremacist news source from Google News because it allegedly promulgated “hate content.” See Susan Kuchinskas, Google Axes Hate News, INTERNETNEWS.COM, Mar. 23, 2005, http://www.internetnews.com/xSP/article.php/3492361.
terms used by searchers, but they do not lessen the role of human editorial judgment in the process. Instead, the choice of which factors to include in the ranking algorithm (and how to weight them) reflects the search engine operator’s editorial judgments about what makes content valuable. Indeed, to ensure that these judgments are producing the desired results, search engines manually inspect search results 14 and make adjustments accordingly.
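The point can be made concrete with a toy ranker. The signals and weights below are invented for illustration and drawn from no real engine; the “editorial judgment” lives entirely in which factors appear and how heavily each is weighted.

```python
# A toy ranking function. The factors ("text_match", "popularity",
# "freshness"), their weights, and the page names are all invented;
# the editorial judgment is the choice of factors and weights itself.

def score(signals, weights):
    """Combine a page's per-factor signals into a single ranking score."""
    return sum(weights[factor] * signals[factor] for factor in weights)

weights = {"text_match": 0.7, "popularity": 0.2, "freshness": 0.1}

pages = {
    "niche.example":  {"text_match": 0.9, "popularity": 0.1, "freshness": 0.5},
    "portal.example": {"text_match": 0.6, "popularity": 0.9, "freshness": 0.5},
}

ranking = sorted(pages, key=lambda url: score(pages[url], weights), reverse=True)
# With these weights, the close textual match outranks the popular portal.
# Shift weight toward "popularity" (say 0.6) and the portal wins instead:
# same pages, same automation, different editorial judgment.
```

Nothing in the sort is “manual,” yet which page wins is wholly determined by a human’s prior choice of weights.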
Additionally, search engines claim they do not modify algorithmically-generated search results, but there is some evidence to the contrary. Search engines allegedly make manual adjustments to a web publisher’s overall ranking. 15 Also, search engines occasionally modify search results presented in response to particular keyword searches. Consider the following:
• Some search engines blocked certain search terms containing the keyword “phpBB.” 16
• In response to the search term “Jew,” for a period of time (including, at minimum, November 2005, when the author observed the phenomenon), Google displayed a special result in the sponsored link, saying “Offensive Search Results: We’re disturbed about these results as well. Please read our note here.” The link led to a page explaining the results. 17
• Reportedly, Ask.com blocked search results for certain terms like “pedophile,” “bestiality,” “sex with children” and “child sex.” 18
• Google removed some websites from its index in response to a Digital Millennium Copyright Act (DMCA) take-down demand from the Church of Scientology. However, Google displayed the following legend at the bottom of affected search results pages (such as search results for “scientology site:xenu.net”): “In response to a complaint we received under the US Digital Millennium Copyright Act, we have
14 See Posting of Eric Goldman to Technology & Marketing Law Blog, Google’s Human Algorithm, http://blog.ericgoldman.org/archives/2005/06/googles_human_a.htm (June 5, 2005, 14:11 EST) (Google hires students to manually review search results for quality purposes).
15 See Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M, at 4 (W.D. Okla. Jan. 13, 2003) (“Google knowingly and intentionally decreased the PageRanks assigned to both SearchKing and PRAN.”). This manual adjustment has also been alleged in the recent KinderStart lawsuit. See Complaint, KinderStart.com L.L.C. v. Google, Inc., Case No. C 06-2057 RS (N.D. Cal. Mar. 17, 2006).
16 See MSN Blockades phpBB Searchers, TRIMMAIL’S EMAIL BATTLES, Jan. 18, 2006, http://www.emailbattles.com/archive/battles/vuln_aacgfbgdcb_jd/.
17 See http://www.google.com/explanation.html.
18 See Jennifer Laycock, Ask.com Actively Censoring Some Search Phrases, SEARCH ENGINE GUIDE, June 25, 2006, http://www.searchengineguide.com/searchbrief/senews/007837.html. On Aug. 1, 2006, I was unable to replicate these results.
removed 2 result(s) from this page. If you wish, you may read the DMCA complaint that caused the removal(s) at ChillingEffects.org.” 19

Conclusion. Search engines have some duality in their self-perceptions, and this duality creates much confusion. 20 Search engines perceive themselves as objective and neutral because they let automated technology do most of the hard work. However, in practice, search engines make editorial judgments just like any other media company. Principally, these editorial judgments are instantiated in the parameters set for the automated operations, but search engines also make individualized judgments about what data to collect and how to present it. These manual interventions may be the exception and not the rule, but these exceptions only reinforce that search engines play an active role in shaping their users’ experiences when necessary to accomplish their editorial goals.
Search Engine Editorial Choices Create Biases

Search results ordering has a significant effect on searchers and web publishers. Searchers usually consider only the top few search results; the top-ranked search result gets a high percentage of searcher clicks, and click-through rates quickly decline from there. 21 Therefore, even if a search engine delivers hundreds or even thousands of search results in response to a searcher’s query, searchers
19 See Chris Sherman, Google Makes Scientology Infringement Demand Public, SEARCH ENGINE WATCH, Apr. 15, 2002, http://searchenginewatch.com/searchday/article.php/2159691.
20 See Danny Sullivan, KinderStart Becomes KinderStopped In Ranking Lawsuit Against Google, SEARCH ENGINE WATCH, July 14, 2006, http://blog.searchenginewatch.com/blog/060714-084842. This duality, if it ends up leading to the dissemination of false information, could also create some legal liability. See KinderStart v. Google, No. 5:06-cv-02057-JF (N.D. Cal. motion to dismiss granted July 13, 2006) (pointing out the potential inconsistency of Google’s position that PageRank is both Google’s subjective opinion and an objective reflection of its algorithmic determinations).
21 See iProspect Search Engine User Behavior Study, IPROSPECT, Apr. 2006, http://www.iprospect.com/premiumPDFs/WhitePaper_2006_SearchEngineUserBehavior.pdf (62% of searchers click on a search result on the first results page); Jakob Nielsen, The Power of Defaults, JAKOB NIELSEN’S ALERTBOX, Sept. 26, 2005, http://www.useit.com/alertbox/defaults.html (citing a study by Cornell professor Thorsten Joachims that the first search result gets 42% of clicks and the second search result gets 8%; further, when the first two search results are switched, the first search result gets 34%—meaning that positioning dictated searcher behavior); Nico Brooks, The Atlas Rank Report: How Search Engine Rank Impacts Traffic, ATLAS INSTITUTE DIGITAL MARKETING INSIGHTS, June 2004, http://app.atlasonepoint.com/pdf/AtlasRankReport.pdf (the first-ranked search result may get ten times the quantity of clicks as the tenth-ranked search result).
effectively ignore the vast majority of those search results. Accordingly, web publishers desperately want to be listed among the top few search results. 22

For search engines, results placement determines how the searcher perceives the search experience. If the top few search results do not satisfy the searcher’s objectives, the searcher may deem the search a failure. Therefore, to maximize searcher perceptions of search success, search engines generally tune their ranking algorithms to support majority interests. 23 In turn, minority interests (and the websites catering to them) often receive marginal exposure in search results.
To gauge majority interests, search engines frequently include a popularity metric in their ranking algorithm. Google’s popularity metric, PageRank, treats inbound links to a website as popularity votes, but votes are not counted equally; links from more popular websites count more than links from lesser-known websites. 24
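The weighted-voting idea can be sketched as a power iteration over a link graph (the full treatment is Langville & Meyer, supra note 134). This is a bare-bones illustration, not Google’s implementation: the damping factor of 0.85 is the value commonly cited in the PageRank literature, and the three-page “web” is invented.

```python
# Minimal sketch of PageRank-style popularity voting: each page splits its
# current score among the pages it links to, so a vote from a high-scoring
# page is worth more. The damping factor and toy graph are assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)  # split this page's vote
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy web: two small sites link to "elite"; "elite" links back to only one.
graph = {
    "elite": ["small_a"],
    "small_a": ["elite"],
    "small_b": ["elite"],
}
ranks = pagerank(graph)

# "elite" collects both small sites' votes, and "small_a" inherits value
# from elite's single (highly weighted) link, while "small_b" gets nothing.
assert ranks["elite"] > ranks["small_a"] > ranks["small_b"]
```

Note how the non-egalitarian weighting plays out even in three pages: small_a and small_b cast identical votes, but small_a ends up far ahead merely because the well-linked page happens to link back to it.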
Beyond promoting search results designed to satisfy majority interests, PageRank’s non-egalitarian voting structure causes search results to be biased towards websites with economic power 25 because these websites get more links due to their marketing expenditures and general prominence.
Indeed, popularity-based ranking algorithms may reinforce and perpetuate existing power structures. 26 Websites that are part of the current power elite get better search result placement, which leads to greater consideration of their messages and views. Furthermore, the increased exposure attributable to better placement means that these websites are likely to get more votes in the future,
22 See Michael Totty & Mylene Mangalindan, Web Sites Try Everything To Climb Google Rankings, WALL ST. J. ONLINE, Feb. 26, 2003, http://online.wsj.com/article/SB1046226160884963943.html?emailf=yes.
23 See Lucas D. Introna & Helen Nissenbaum, Shaping the Web: Why the Politics of Search Engines Matters, INFO. SOC’Y, July-Sept. 2000, at 169.
24 See Our Search: Google Technology, GOOGLE.COM, http://www.google.com/technology/.
25 See Niva Elkin-Koren, Let the Crawlers Crawl: On Virtual Gatekeepers and the Right to Exclude Indexing, 26 U. DAYTON L. REV. 179, 188 (2001); Frank Pasquale, Rankings, Reductionism, and Responsibility, SETON HALL PUBLIC LAW RESEARCH PAPER NO. 888327, at 25, Feb. 25, 2006, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=888327; Trystan Upstill et al., Predicting Fame and Fortune: PageRank or Indegree?, PROC. OF THE 8TH AUSTRALASIAN DOCUMENT COMPUTING SYMP., Dec. 15, 2003, http://research.microsoft.com/users/nickcr/pubs/upstill_adcs03.pdf (showing that BusinessWeek Top Brand, Fortune 500 and Fortune Most Admired companies get disproportionately high PageRank).
26 See Introna & Nissenbaum, supra note 23; Matthew Hindman et al., “Googlearchy”: How a Few Heavily-Linked Sites Dominate Politics on the Web, Mar. 31, 2003, http://www.princeton.edu/~mhindman/googlearchy--hindman.pdf.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 467
leading to a self-reinforcing process. 27 In contrast, minority-interest and disenfranchised websites may have a difficult time cracking through the popularity contest, potentially leaving them perpetually relegated to the search results hinterlands. 28
A number of commentators have lamented these effects and offered some proposals in response:
• Improve Search Engine Transparency. Search engines keep their ranking algorithms secret. 29 This secrecy hinders search engine spammers from gaining more prominence than search engines want them to have, but the secrecy also prevents searchers and commentators from accurately assessing any bias. To enlighten searchers, search engines could be required to disclose more about their practices and their algorithms. 30 This additional information has two putative benefits. First, it may improve market mechanisms by helping searchers make informed choices among search engine competitors. Second, it may help searchers determine the appropriate level of cognitive authority to assign to their search results.
• Publicly Fund Search Engines. Arguably, search engines have “public good”-like attributes, such as reducing the social costs of search behavior. If so, private actors will not incorporate these social benefits into their decision-making. In that case, public funding of search engines may be required to produce socially optimal search results. 31
27 See Egalitarian Engines, ECONOMIST, Nov. 17, 2005 (“there is a widespread belief among computer, social and political scientists that search engines create a vicious circle that amplifies the dominance of established and already popular websites”); see also Junghoo Cho & Sourashis Roy, Impact of Search Engines on Page Popularity, WWW 2004, May 2004, http://oak.cs.ucla.edu/~cho/papers/cho-bias.pdf; Upstill, supra note 25. But see Santo Fortunato et al., The Egalitarian Effect of Search Engines, Nov. 2005, http://arxiv.org/pdf/cs.CY/0511005 (questioning the consequences of the “rich-get-richer” effect).
28 See Cho & Roy, supra note 27; but see Filippo Menczer et al., Googlearchy or Googlocracy?, IEEE SPECTRUM, Feb. 2006 (providing empirical evidence suggesting that “search engines direct more traffic than expected to less popular sites”).
29 See Search King Inc. v. Google Tech., Inc., No. CIV-02-1457-M, at 3 n.2 (W.D. Okla. Jan. 13, 2003) (“Google’s mathematical algorithm is a trade secret, and it has been characterized by the company as ‘one of Google’s most valuable assets.’”); Stefanie Olsen, Project Searches for Open-Source Niche, CNET NEWS.COM, Aug. 18, 2003, http://news.com.com/2102-1032_3-5064913.html?tag=st_util_print.
30 See Introna & Nissenbaum, supra note 23.
31 See id.; Eszter Hargittai, Open Portals or Closed Gates? Channeling Content on the World Wide Web, 27 POETICS 233 (2000); cf. CASS SUNSTEIN, REPUBLIC.COM 170-72 (2001) (advocating publicly funded “deliberative domains”).
Indeed, there have been several proposals to create government-funded search engines. 32
• Mandate Changes to Ranking/Sorting Practices. Search engines could be forced to increase the exposure of otherwise-marginalized websites. At least five lawsuits 33 have asked judges to force search engines to reorder search results to increase the plaintiff’s visibility. 34
In addition to plaintiffs, some academics have supported mandatory reordering of search results. For example, Pandey et al. advocate a “randomized rank promotion” scheme in which obscure websites randomly get extra credit in ranking algorithms, appearing higher in the search results on occasion and thereby getting additional exposure to searchers. 35 In another essay in this collection, Frank Pasquale proposes that, when people think the search engines are providing false or misleading information, search engines should be forced to include a link to corrective information. 36
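The randomized-rank-promotion idea can be illustrated with a minimal sketch (my own simplified reading of the proposal, not Pandey et al.’s actual algorithm; the epsilon parameter and uniform sampling are assumptions):

```python
import random

def promote_randomly(ranked, k=10, epsilon=0.1, rng=random):
    """Return the top-k results, occasionally granting one slot to an
    obscure page drawn from outside the top k so it gains exposure.

    ranked: page ids ordered best-first by the base ranking algorithm.
    epsilon: probability that one top-k slot goes to a low-ranked page.
    """
    top, rest = list(ranked[:k]), list(ranked[k:])
    if rest and rng.random() < epsilon:
        slot = rng.randrange(len(top))       # which slot to give away
        top[slot] = rng.choice(rest)         # a randomly promoted page
    return top
```

With epsilon = 0 this reduces to ordinary ranking; raising epsilon trades a little relevancy for extra exposure of low-ranked sites, which is precisely the policy tension the text describes.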
Search Engine Bias Is Necessary and Desirable
Before trying to solve the problem of search engine bias, we should be clear about how search engine bias creates a problem that requires correction. From my perspective, search engine bias is the unavoidable consequence of search
32 See Kevin J. O’Brien, Europeans Weigh Plan on Google Challenge, INT’L HERALD TRIB., Jan. 18, 2006 (discussing a European initiative called Quaero, which is intended to break the American hegemony implicit in Google’s dominant market position); Graeme Wearden, Japan May Create Its Own Search Engine, CNET NEWS.COM, Dec. 21, 2005, http://news.com.com/Japan+may+create+its+own+search+engine/2100-1025_3-004037.html.
33 See Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M (W.D. Okla. Jan. 13, 2003); KinderStart.com LLC v. Google, Inc., No. C 06-2057 RS (N.D. Cal. dismissed July 13, 2006); Langdon v. Google, Inc., No. 1:06-cv-00319-JJF (D. Del. complaint filed May 17, 2006); Roberts v. Google, No. 1-06-CV-063047 (Cal. Superior Ct. complaint filed May 5, 2006); Datner v. Yahoo! Inc, Case No. BC355217 (Cal. Superior Ct. complaint filed July 11, 2006) [note: this list updated as of July 24, 2006].
34 As Google said in its response to the KinderStart lawsuit, “Plaintiff KinderStart contends that the judiciary should have the final say over [search engines’] editorial process. It has brought this litigation in the hopes that the Court will second-guess Google’s search rankings and order Google to view KinderStart’s site more favorably.” Motion to Dismiss at 1, KinderStart.com LLC v. Google, Inc., No. C 06-2057 RS (N.D. Cal. May 2, 2006).
35 See Sandeep Pandey et al., Shuffling a Stacked Deck: The Case for Partially Randomized Ranking of Search Engine Results, http://www.cs.cmu.edu/~olston/publications/randomRanking.pdf; cf. SUNSTEIN, supra note 31 (explaining that websites should be forced to link to contrary views as a way of increasing exposure to alternative viewpoints).
36 See Pasquale, supra at 401; see also Pasquale, supra note 25, at 28-30 (proposing that the link be displayed as an asterisk to the search results).
engines exercising editorial control over their databases. Like any other media company, search engines simply cannot passively and neutrally redistribute third-party content (in this case, web publisher content). If a search engine does not attempt to organize web content, its system quickly and inevitably will be overtaken by spammers, fraudsters and malcontents. 37 At that point, the search engine becomes worthless to searchers.
Instead, searchers (like other media consumers) expect search engines to create order from the information glut. To prevent anarchy and preserve credibility, search engines must exercise some editorial control over their systems. In turn, this editorial control necessarily will create some bias.
Fortunately, market forces limit the scope of search engine bias. 38 Searchers have high expectations for search engines: they expect search engines to read their minds 39 and infer their intent based solely on a small number of search keywords. 40 Search engines that disappoint (either by failing to deliver relevant results, or by burying relevant results under too many unhelpful results) are held
37 Every Internet venue accepting user-submitted content inevitably gets attacked by unwanted content. If left untended, the venue inexorably degrades into anarchy. See, e.g., Step-by-Step: How to Get BILLIONS of Pages Indexed by Google, MONETIZE BLOG, June 17, 2006, http://merged.ca/monetize/flat/how-to-get-billions-of-pages-indexed-by-Google.html (Google indexed over five billion “spam” pages from a single spammer before manually de-indexing the sites); Alorie Gilbert, Google Fixes Glitch That Unleashed Flood of Porn, CNET NEWS.COM, Nov. 28, 2005, http://news.com.com/2102-1025_3-5969799.html?tag=st.util.print (describing how Google Base, a venue for user-submitted content, was overtaken by pornographers: “the amount of adult content on Google Base was staggering considering Google only launched the tool a week ago.”); Josh Quittner, The War Between alt.tasteless and rec.pets.cats, WIRED, May 1994, at 46 (describing how a group of anarchists, for fun, took over a USENET newsgroup about pets).
38 See Mowshowitz & Kawaguchi, supra note 6, at 60 (market forces are the best way to counter adverse effects of search engine bias).
39 See Our Philosophy, GOOGLE.COM, http://www.google.com/corporate/tenthings.html (“The perfect search engine … would understand exactly what you mean and give back exactly what you want.”); Chris Sherman, If Search Engines Could Read Your Mind, SEARCH ENGINE WATCH, May 11, 2005, http://searchenginewatch.com/searchday/article.php/3503931.
40 Searchers routinely use a very small number of keywords to express their search interests. See iProspect.com, Inc., iProspect Natural SEO Keyword Length Study, Nov. 2004, http://www.iprospect.com/premiumPDFs/keyword_length_study.pdf (eighty-eight percent of search engine referrals are based on only one or two keywords); see also Declan Butler, Souped-Up Search Engines, NATURE, May 11, 2000, at 112, 115 (citing an NEC Research Institute study showing that up to 70% of searchers use only a single keyword as a search term); Bernard J. Jansen et al., Real Life Information Retrieval: A Study of User Queries on the Web, 32 SIGIR FORUM 5, 15 (1998) (stating that the average keyword length was 2.35 words; one-third of searches used one keyword and 80% used three keywords or fewer); Jakob Nielsen, JAKOB NIELSEN’S ALERTBOX, Search: Visible and Simple, May 13, 2001, http://www.useit.com/alertbox/20010513.html (stating that the average keyword length was 2.0 words).
accountable by fickle searchers. 41 There are multiple search engines available to searchers, 42 and few barriers to switching between them. 43
As a result, searchers will shop around if they do not get the results they want, 44 and this competitive pressure constrains search engine bias. If a search engine’s bias degrades the relevancy of search results, searchers will explore alternatives even if searchers do not realize that the results are biased. Meanwhile, search engine proliferation means that niche search engines can segment the market and cater to underserved minority interests. 45 Admittedly, these market forces
41 See Kim Peterson, Microsoft Learns to Crawl, SEATTLE TIMES, May 2, 2005 (MSN Search “learned that the arcane searches were the make-or-break moments for Web searchers. People weren’t just happy when a search engine could find answers to their most bizarre, obscure and difficult queries. They would switch loyalties.”); Bob Tedeschi, Every Click You Make, They’ll Be Watching You, N.Y. TIMES, Apr. 3, 2006, http://www.nytimes.com/2006/04/03/business/03ecom.html?ei=5090&en=9e55ae64f692433a&ex=1301716800&partner=rssuserland&emc=rss&pagewanted=print.
42 In addition to the recent launch of major new search engines by providers like MSN, the open-source software community is developing Nutch to allow anyone to build and customize his or her own web search engine. http://nutch.apache.org/; see also Olsen, Open-Source Niche, supra note 29. While there are multiple major search engines, the market may still resemble an oligopoly; a few major players (Google, Yahoo, MSN, Ask Jeeves) have the lion’s share of the search engine market. However, this may construe the search engine market too narrowly. Many types of search providers compete with the big mass-market search engines, ranging from specialty search engines (e.g., Technorati) to alternative types of search technology (e.g., adware) to non-search information retrieval processes (e.g., link navigation). Ultimately, every search engine competes against other search engines and these other search/retrieval options.
43 See Rahul Telang et al., An Empirical Analysis of Internet Search Engine Choice, Aug. 2002 (on file with author). For example, search engines use the same basic interface (a white search box), and searchers rarely use advanced search features that might require additional learning time at other search engines.
44 See Grant Crowell, Understanding Searcher Behavior, SEARCH ENGINE WATCH, June 14, 2006, http://searchenginewatch.com/showPage.html?page=3613291 (citing a Kelsey Research study that 63% of searchers used two or more search engines); Press Release, Vividence, Inc., Google Wins Users’ Hearts, But Not Their Ad Clicks (May 25, 2004), http://www.vividence.com/public/company/news+and+events/press+releases/2004-05-25+ce+rankings+search.htm (stating that up to 47% of searchers try another search engine when their search expectations are not met).
45 See Rahul Telang et al., The Market Structure for Internet Search Engines, 21 J. MGMT. INFO. SYS. 137 (2004), available at http://www.heinz.cmu.edu/~rtelang/engine_jmis_final.pdf (describing how searchers sample heterogeneous ranking algorithms, which support a diversity of search engines); Mário J. Silva, The Case for a Portuguese Web Search Engine, http://xldb.fc.ul.pt/data/Publications_attach/tumba-icwi2003-final.pdf (describing the value of a Portuguese-oriented search engine); Jeffrey McMurray, Social Search Promises Better Intelligence, ASSOCIATED PRESS, July 9, 2006 (discussing niche search engines that draw on social networking); cf. Jakob Nielsen, Diversity is Power for Specialized Sites, JAKOB NIELSEN’S ALERTBOX, June 16, 2003, http://www.useit.com/alertbox/20030616.html (describing how specialized sites will flourish on the Internet).
are incomplete—searchers may never consider what results they are not seeing—but they are powerful nonetheless.
In contrast, it is hard to imagine how regulatory intervention will improve the situation. First, regulatory solutions become a vehicle for normative views about what searchers should see—or should want to see. 46 How should we select among these normative views? What makes one bias better than another?
Second, regulatory intervention that promotes some search results over others does not ensure that searchers will find the promoted search results useful. Determining relevancy based on very limited data (such as decontextualized keywords) is a challenging process, and search engines struggle with this challenge daily. Due to the complexity of the relevancy matching process, government regulation rarely can do better than market forces at delivering results that searchers find relevant. As a result, searchers likely will find some of the promoted results irrelevant.
The clutter of unhelpful results may hinder searchers’ ability to satisfy their search objectives, undermining searchers’ confidence in search engines’ mind-reading abilities. 47 In this case, regulatory intervention could counterproductively degrade search engines’ value to searchers. Whatever the adverse consequences of search engine bias, the consequences of regulatory correction are probably worse. 48
Technological Evolution Will Moot Search Engine Bias
Currently, search engines principally use “one-size-fits-all” ranking algorithms to deliver homogeneous search results to searchers with heterogeneous search objectives. 49 One-size-fits-all algorithms exacerbate the consequences of search engine bias in two ways: (1) they create winners (websites listed high in the
46 See, e.g., Susan L. Gerhart, Do Web Search Engines Suppress Controversy?, FIRST MONDAY, Jan. 2004, http://www.firstmonday.org/issues/issue9_1/gerhart/. Gerhart argues that search engines do not adequately prioritize search results that expose controversies about the search topic. However, her argument assumes that controversy-related information has value to consumers, an assumption that deserves careful evaluation.
47 See Eric Goldman, A Coasean Analysis of Marketing, 2006 WIS. L. REV. 1151, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=912524.
48 See Susan P. Crawford, Shortness of Vision: Regulatory Ambition in the Digital Age, 74 FORDHAM L. REV. 695 (2005) (discussing the shortcomings of regulatory intervention in organic information systems).
49 See James Pitkow et al., Personalized Search, COMM. ACM, Vol. 45:9 (Sept. 2002) at 50-1.
search results) and losers (those with marginal placement), and (2) they deliver suboptimal results for searchers with minority interests. 50
These consequences will abate when search engines migrate away from one-size-fits-all algorithms towards “personalized” ranking algorithms. 51 Personalized algorithms produce search results that are custom-tailored to each searcher’s interests, so searchers will see different results in response to the same search query. For example, Google offers searchers an option that “orders your search results based on your past searches, as well as the search results and news headlines you’ve clicked on.” 52
Personalized ranking algorithms represent the next major advance in search relevancy. One-size-fits-all ranking algorithms have inherent limits on their maximum relevancy potential, and further improvements in one-size-fits-all algorithms will yield progressively smaller relevancy benefits. Personalized algorithms transcend those limits, optimizing relevancy for each searcher and thus implicitly doing a better job of searcher mind-reading. 53
Personalized ranking algorithms also reduce the effects of search engine bias. Personalized algorithms mean that there are multiple “top” search results for a particular search term instead of a single “winner,” 54 so web publishers will not compete against each other in a zero-sum game. In turn, searchers will get results more influenced by their idiosyncratic preferences and less influenced by the embedded preferences of the algorithm-writers. Also, personalized algorithms necessarily will diminish the weight given to popularity-based metrics (to give more weight to searcher-specific factors), reducing the structural biases due to popularity.
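A hypothetical sketch of the shift described above: a personalized ranker that blends a global popularity score with a per-user affinity score, so two searchers can see different “top” results for the same query. (The scoring scheme, the alpha weight, and the example sites and users are illustrative assumptions, not any search engine’s actual method.)

```python
def personalized_rank(candidates, popularity, user_affinity, alpha=0.4):
    """Order candidate pages for one user.

    popularity: global, query-independent scores (e.g., link popularity).
    user_affinity: this user's inferred interest in each page.
    alpha: weight on popularity; lower alpha means more personalization.
    """
    def score(page):
        return (alpha * popularity.get(page, 0.0)
                + (1 - alpha) * user_affinity.get(page, 0.0))
    return sorted(candidates, key=score, reverse=True)

candidates = ["bigbrand.com", "nichehobby.org"]
popularity = {"bigbrand.com": 0.9, "nichehobby.org": 0.2}
# Two hypothetical users with different click histories:
fan = {"nichehobby.org": 0.95, "bigbrand.com": 0.1}
casual = {"bigbrand.com": 0.5, "nichehobby.org": 0.0}
```

For the hobbyist, nichehobby.org scores 0.4 × 0.2 + 0.6 × 0.95 = 0.65 against 0.42 for bigbrand.com, so the niche site wins the top slot; for the casual user the order flips. There is no single zero-sum “winner” across all searchers, and the popularity signal carries less structural weight.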
50 See Michael Kanellos, Microsoft Aims for Search on Its Own Terms, CNET NEWS.COM, Nov. 24, 2003, http://news.com.com/2102-1008_3-5110910.html?tag=st.util.print (quoting a Microsoft researcher as saying “If the two of us type a query [into a search engine], we get the same thing back, and that is just brain dead. There is no way an intelligent human being would tell us the same thing about the same topic.”); David H. Freedman, Why Privacy Won’t Matter, NEWSWEEK, Apr. 3, 2006; Personalization of Placed Content Ordering in Search Results, U.S. Patent App. 0050240580 (filed July 13, 2004).
51 See Pitkow, supra note 49, at 50.
52 What’s Personalized Search?, GOOGLE.COM, http://www.google.com/support/bin/answer.py?answer=26651&topic=1593.
53 See Jaime Teevan et al., Personalizing Search via Automated Analysis of Interests and Activities, SIGIR ’05, http://haystack.lcs.mit.edu/papers/teevan.sigir05.pdf; Terry McCarthy, On the Frontier of Search, TIME, Aug. 28, 2005 (“Search will ultimately be as good as having 1,000 human experts who know your tastes scanning billions of documents within a split second.”) (quoting Gary Flake, Microsoft Distinguished Engineer).
54 See Kevin Lee, Search Personalization and PPC Search Marketing, CLICKZ NEWS, July 15, 2005, http://www.clickz.com/experts/search/strat/print.php/3519876.
Personalized ranking algorithms are not a panacea—any process where humans select and weight algorithmic factors will produce some bias 55—but personalized algorithms will eliminate many of the current concerns about search engine bias.
Conclusion
Complaints about search engine bias implicitly reflect some disappointed expectations. In theory, search engines can transcend the deficiencies of predecessor media to produce a type of utopian media. In practice, search engines are just like every other medium—heavily reliant on editorial control and susceptible to human biases. This fact shatters any illusions of search engine utopianism.
Fortunately, search engine bias may be largely temporal. In this respect, I see strong parallels between search engine bias and the late-1990s keyword metatag “problem.” 56 Web publishers used keyword metatags to distort search results, but these techniques worked only so long as search engines considered keyword metatags in their ranking algorithms. When search engines recognized the distortive effects of keyword metatags, they changed their algorithms to ignore keyword metatags. 57 Search result relevancy improved, and the problem was solved without regulatory intervention.
Similarly, search engines naturally will continue to evolve their ranking algorithms and improve search result relevancy—a process that, organically, will cause the most problematic aspects of search engine bias to largely disappear. To avoid undercutting search engines’ quest for relevance, this effort should proceed without regulatory distortion.
55 Personalized algorithms have other potentially adverse consequences, such as creating self-reinforcing information flows. See SUNSTEIN, supra note 31. For a critique of these consequences, see Goldman, Coasean Analysis, supra note 47.
56 See generally Goldman, Deregulating Relevancy, supra note 8.
57 See Danny Sullivan, Death of a Meta Tag, SEARCH ENGINE WATCH, Oct. 1, 2002, http://www.searchenginewatch.com/sereport/print.php/34721_2165061.
CHAPTER 8
WHAT FUTURE FOR PRIVACY?
475
Privacy Protection in the Next Digital Decade: “Trading Up” or a “Race to the Bottom”? 477
Michael Zimmer
The Privacy Problem: What’s Wrong with Privacy? 483
Stewart Baker
A Market Approach to Privacy Policy 509
Larry Downes
Privacy Protection in the Next Digital Decade: “Trading Up” or a “Race to the Bottom”?
By Michael Zimmer *
As is apparent to most citizens of contemporary, industrialized society, people no longer live in fixed locations and spaces. Instead, people are on the move in their personal, professional, intellectual, and social spheres. Within and across these spheres, mobility, rather than permanence, is likely to be the norm. Manuel Castells captures this feature of modern life in his theory of the space of flows, arguing that “our society is constructed around flows: flows of capital, flows of information, flows of technology, flows of organizational interaction, flows of images, sounds, and symbols.” 1 These flows—particularly information flows—constitute what Castells describes as the “network society,” where “networks constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power and culture.” 2
Nowhere is Castells’s “network society” more apparent than in our contemporary global digital information network, with the Internet as its backbone. Originating from a handful of universities and research laboratories in the 1960s, the Internet began to take shape as a ubiquitous information network with the emergence of the “dot-com” economy in the 1990s. Dot-com business models varied—and met varied levels of success—but most relied on the rapid delivery of services and exchange of information. While much of the dot-com economy burst with the dot-com bubble in 2000, the Internet remained a powerful network enabling robust flows of information, continually modifying “experience, power and culture,” just as Castells described.
In the past digital decade, the Internet has provided new linkages and spaces for information flows, and has particularly emerged as a potent infrastructure for the flow and capture of personal information. These flows take many forms and stem from various motivations. Large-scale web advertising platforms and search engines utilize robust infrastructures to collect data about web browsing and search activities in order to provide relevant advertising. Users’ consumption habits are captured by online service providers like Amazon and Netflix, fueling powerful recommendation systems meant to improve user
* School of Information Studies, University of Wisconsin-Milwaukee
1 MANUEL CASTELLS, THE RISE OF THE NETWORK SOCIETY 412 (1996).
2 Id. at 469.
satisfaction. Individuals openly share personal information with friends and colleagues on social networking services such as Facebook and LinkedIn, and their thoughts with the world on platforms like Blogger and Twitter. Looking back at the past decade, the Internet has become a platform for the open flow of personal information—flows that are largely voluntarily provided by users—and as such, appear to have validated Scott McNealy’s (in)famous 1999 remark that “You have zero privacy anyway … get over it.” 3
Notwithstanding McNealy’s view, privacy has remained a central concern amid the open information flows in our contemporary network society, including worries about the growing size and role of networked databases, 4 the possibility of tracking and surveillance by Internet service providers 5 and Web search engines, 6 privacy threats from digital rights management technologies, 7 and growing concerns about protecting the privacy of users of social networking sites and related Web 2.0 services. 8
While scholars continue to detail possible threats to privacy spawned by the last decade of innovations on the Internet, governments have struggled with whether—and how—to regulate information flows across these global networks
3 Polly Sprenger, Sun on Privacy: ‘Get Over It,’ WIRED, Jan. 26, 1999, http://www.wired.com/politics/law/news/1999/01/17538.
4 SIMSON GARFINKEL, DATABASE NATION: THE DEATH OF PRIVACY IN THE 21ST CENTURY (2000).
5 Colin J. Bennett, Cookies, Web Bugs, Webcams and Cue Cats: Patterns of Surveillance on the World Wide Web, 3(3) ETHICS AND INFORMATION TECHNOLOGY 195, 197-210 (2001); Paul Ohm, The Rise and Fall of Invasive ISP Surveillance, 2009 UNIVERSITY OF ILLINOIS LAW REVIEW 1417-1496.
6 M. Goldberg, The Googling of Online Privacy: Gmail, Search-Engine Histories, and the New Frontier of Protecting Private Information on the Web, 9 LEWIS & CLARK LAW REVIEW 249-272 (2005); Michael Zimmer, The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance, in WEB SEARCHING: MULTIDISCIPLINARY PERSPECTIVES 77-99 (Amanda Spink & Michael Zimmer, eds., 2008).
7 Julie E. Cohen, A Right to Read Anonymously: A Closer Look at ‘Copyright Management’ in Cyberspace, 28(4) CONNECTICUT LAW REVIEW 981-1039 (1996); Julie E. Cohen, DRM and Privacy, 18 BERKELEY TECHNOLOGY LAW JOURNAL 575-617 (2003).
8 Ralph Gross & Alessandro Acquisti, Information Revelation and Privacy in Online Social Networks (ACM Workshop on Privacy in the Electronic Society, Alexandria, VA, 2005); Michael Zimmer, The Externalities of Search 2.0: The Emerging Privacy Threats When the Drive for the Perfect Search Engine Meets Web 2.0, FIRST MONDAY, Mar. 3, 2010, http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2136/1944; Joseph Bonneau & Sören Preibusch, The Privacy Jungle: On the Market for Data Protection in Social Networks (The Eighth Workshop on the Economics of Information Security (WEIS 2009)); James Grimmelmann, Facebook and the Social Dynamics of Privacy, 95(4) IOWA LAW REVIEW 1137 (2009); Marc Parry, Library of Congress, Facing Privacy Concerns, Clarifies Twitter Archive Plan, THE CHRONICLE OF HIGHER EDUCATION, June 1, 2010, http://chronicle.com/blogPost/Library-of-Congress-Facing/23818/.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 479
to protect the privacy of their citizens. Given the diversity of interests, histories, and cultural contexts, a complicated terrain of transnational laws and policies for the protection of privacy and personal data flows across networks has emerged across the globe. Some jurisdictions have opted for broad, and relatively strict, laws regulating the collection, use and disclosure of personal information, such as Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) 9 or the European Union’s Data Protection Directive. 10 The United States, by contrast, maintains a sectoral approach to privacy legislation, with laws addressing only specific types of personal information. For example, the Health Insurance Portability and Accountability Act (HIPAA) 11 protects personal medical information, the Fair Credit Reporting Act 12 regulates the collection and flow of personal financial data, and the Video Privacy Protection Act 13 makes the wrongful disclosure of video rental records illegal.
The differences between the Canadian/EU approaches to privacy and that of the United States have been well documented and analyzed. 14 Put bluntly, Canadian and EU regulators can be described as embracing a more paternalist approach to data protection policy, aiming to preserve a fundamental human right of their citizens through preemptive governmental action. In contrast, the governance of privacy in the U.S. typically emerges only after some informational harm has occurred, often taking the form of industry self-regulation or very targeted legislation, with the responsibility of initiating enforcement resting on the harmed data subject herself. As Dorothee Heisenberg summarizes, “In practical terms, the EU and the US reached very different conclusions about the rights of businesses and individuals related to personal data.” 15 While the EU and Canada focus on direct and preemptive
9 R.S., 1985, c. P-21, http://laws.justice.gc.ca/en/P-21/index.html.
10 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:HTML.
11 Health Insurance Portability and Accountability Act of 1996, H. Rept. 104-736, http://www.gpo.gov/fdsys/search/pagedetails.action?granuleId=CRPT-104hrpt736&packageId=CRPT-104hrpt736.
12 Fair Credit Reporting Act (FCRA), 15 U.S.C. § 1681 et seq., http://www.ftc.gov/os/statutes/031224fcra.pdf.
13 Video Privacy Protection Act of 1988, Pub. L. 100-618 (codified at 18 U.S.C. § 2710), http://www.law.cornell.edu/uscode/18/2710.html.
14 See, e.g., DOROTHEE HEISENBERG, NEGOTIATING PRIVACY: THE EUROPEAN UNION, THE UNITED STATES, AND PERSONAL DATA PROTECTION (2005); COLIN J. BENNETT & CHARLES D. RAAB, THE GOVERNANCE OF PRIVACY: POLICY INSTRUMENTS IN GLOBAL PERSPECTIVE (2003).
15 Heisenberg, supra note 14, at 2.
regulation of the collection and use of personal data, prohibiting “excess” data collection and restricting use to the original and stated purposes of the collection, the U.S. framework begins with the assumption that most data collection and use is both acceptable and beneficial, that guidelines should be primarily voluntary and non-invasive, and that regulation should address only documented instances of abuse.
This difference in regulatory approaches to privacy—and the underpinning tensions between different jurisdictions’ views of the rights of data subjects—becomes further complicated given the increasing flows of personal information across and between transnational networks, and thus across jurisdictions. Internet companies like Google have customers accessing their products and services from across the globe, with data processing and storage facilities equally scattered. A Canadian citizen, for example, might be accessing a Google product in the United States, while the record of that particular information exchange might be stored on a server in Ireland. Each jurisdiction has its own complex set of regulations and rights assigned to the treatment of any personal information shared and stored.
These kinds of scenarios have prompted growing concern about whether the global diversity of privacy governance will result in a “race to the bottom,” in which corporate interests in processing personal data migrate to jurisdictions where there is little or no control over the circulation and capture of personal information flows, or a “race to the top,” in which privacy policies are fashioned to the highest possible standards so that a jurisdiction is perceived as the “best” protector of personal information flows. After considering the available evidence, political scientists Colin Bennett and Charles Raab have suggested that privacy protection is actually improving globally—a “trading up” of the governance of privacy. 16 Companies are, on the whole, not moving around in order to avoid strict privacy regulations, such as those developed in the EU; instead, there has been a gradual increase in awareness and action on the issue of privacy. Examples of this “trading up” include Facebook’s strengthening of its privacy policies and practices in reaction to an investigation by the Office of the Privacy Commissioner of Canada, and Google’s modifying its Web cookie practices and partially anonymizing its search logs in response to Norwegian privacy regulators. 17 In each case, large multinational Internet companies reacted to strong regional privacy laws in ways that benefited all users across the globe.
16 Bennett & Raab, supra note 14.
17 Office of the Privacy Commissioner of Canada, Facebook Agrees to Address Privacy Commissioner’s Concerns, Aug. 27, 2009, http://www.priv.gc.ca/media/nr-c/2009/nrc_090827_e.cfm; Nate Anderson, Google To Anonymize Logs In A Nod To Privacy Advocates, ARS TECHNICA, Mar. 2007, http://arstechnica.com/business/news/2007/03/google-to-anonymize-logs-in-a-nod-to-privacy-advocates.ars.
Offsetting this positive note, however, is the realization that privacy protection may not be “trading up” as rapidly as other global factors, such as the extensive, intensive processing of personal data across borders and platforms; the increased focus on economic growth through the use of electronic communications and information infrastructures; and the harmonization of law enforcement and security objectives. Bennett and Raab go to some length to expose the limitations of relying solely on individual countries to impose isolated privacy policies in the face of a globally networked computer system permitting—indeed encouraging—transnational information flows. 18 While state-specific data protection governance might have been sufficient in the past, they argue, in today’s digitally networked society any country’s efforts to protect its citizens will inescapably be linked with (as well as dependent on) the actions and laws of other, often disparate, jurisdictions.
This leads to obvious problems when, for example, a legal approach like that of the United States, with its emphasis on self-regulation and public-sector enforcement, meets a different philosophy, such as the more top-down, paternalistic approach to data protection held by Canada and the European Union. This clash between U.S. and non-U.S. standards for governing personal information flows has prompted large multinational companies dependent on the relatively unfettered flow of information across global digital networks to lobby for some middle ground. In the case of the U.S. and the European Union, the result was the 2000 Safe Harbor agreement 19 between the two global economic powers, which sought to avoid the most egregious misuse of Europeans’ private data while creating a semi-permanent “cease fire” that would allow transatlantic data (and hence commerce) to flow, despite failing to meet the letter, and perhaps not even the intent, of the EU Data Protection Directive. In the end, while U.S.-based companies are forced to provide more privacy protections than U.S. law demands, the Safe Harbor provisions are weaker than the full European Data Protection Directive. As Heisenberg explains, “the evolution … of the [European Union] Commission’s stance on data protection seems to have been one of softening a bit” during the Safe Harbor negotiations, as the “Commission began to accommodate the US as privacy legislation clashed with first commercial, and then security concerns.” 20
So, while there has been no clear “race to the bottom” in global privacy protections, the “trading up” to an increased level of protection of personal
18 Bennett & Raab, supra note 14.
19 2000/520/EC: Commission Decision of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce (notified under document number C(2000) 2441), http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000D0520:EN:HTML.
20 Heisenberg, supra note 14, at 136.
information flows on our transnational digital networks has not materialized as quickly or clearly as one might expect. Heisenberg correctly notes that with the Safe Harbor agreement, the EU was able to force the U.S. to deal with privacy issues that might otherwise have been ignored, extract some minor concessions, and show that the EU’s privacy standard was significant, granting the EU something like a “first-mover advantage” in future trans-border privacy disputes. 21 Yet, beyond isolated examples of Internet companies’ hesitant acquiescence to non-U.S. regulatory bodies—like the Facebook and Google examples provided above—new norms of personal data protection are unlikely to emerge in the next digital decade, as data protection officials in Europe have begun to publicly question the appropriateness of the current levels of protection. 22
Recalling Castells’ warning that “networks constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power and culture,” we are left to consider the status of privacy protections in the next digital decade. Our network society will continue to grow in size and density, as well as in its global importance and interconnectedness. Without concerted efforts to ensure a “trading up” in global privacy protections—a renewed commitment to the rights of data subjects embodied in the Canadian and European Union approaches to data protection—those caught within the inescapable “diffusion of networking logic” may have little control over how the increased flows of their personal information will modify “experience, power and culture” over the next digital decade.
21 Id. at 170.
22 W. Scott Blackmer, The Information Law Group, European Reservations?, Aug. 26, 2010, http://www.infolawgroup.com/2010/08/articles/eu-1/european-reservations/.
The Privacy Problem:
What’s Wrong with Privacy?
By Stewart Baker *
Why are privacy groups so viscerally opposed to government action that could reduce the risks posed by exponential technologies? The cost of their stance was made clear on September 11, 2001. That tragedy might not have occurred if not for the aggressive privacy and civil liberties protections imposed by the Foreign Intelligence Surveillance Court and the Department of Justice’s Office of Intelligence; and it might have been avoided if border authorities had been able to use airline reservation data to screen the hijackers as they entered the United States.
But even after 9/11, privacy campaigners tried to rebuild the wall and to keep the Department of Homeland Security (DHS) from using airline reservation data effectively. They failed; too much blood had been spilled.
But in the fields where disaster has not yet struck—computer security and biotechnology—privacy groups have blocked the government from taking even modest steps to head off danger.
I like to think that I care about privacy, too. But I had no sympathy for the privacy crusaders’ ferocious objection to any new government use of technology and data. Where, I wondered, did their objection come from?
So I looked into the history of privacy crusading. And that’s where I found the answer.
The Birth of the Right of Privacy
In the 1880s, Samuel Dennis Warren was near the top of the Boston aristocracy. He had finished second in his class at Harvard Law School. He founded a law firm with the man who finished just ahead of him, Louis Brandeis, and they prospered mightily. Brandeis was a brilliant, creative lawyer and social reformer who would eventually become a great Supreme Court justice.
But Samuel Dennis Warren was haunted. There was a canker in the rose of his life. His wife was a great hostess, and her parties were carefully planned. When
* Stewart A. Baker is a partner in the Washington office of Steptoe & Johnson LLP. He returned to the firm following 3½ years at the Department of Homeland Security as its first Assistant Secretary for Policy.
Warren’s cousin married, Mabel Warren held a wedding breakfast and filled her house with flowers for the event. The papers described her home as a “veritable floral bower.”
No one should have to put up with this. Surely you see the problem. No? Well, Brandeis did.
He and Warren both thought that, by covering a private social event, the newspapers had reached new heights of impertinence and intrusiveness. The parties and guest lists of a Boston Brahmin and his wife were no one’s business but their own, he thought. And so was born the right to privacy.
Angered by the press coverage of these private events, Brandeis and Warren wrote one of the most frequently cited law review articles ever published. In fact, “The Right to Privacy,” which appeared in the 1890 Harvard Law Review, is more often cited than read—for good reason, as we’ll see. 1 But a close reading of the article actually tells us a lot about the modern concept of privacy.
Brandeis, 2 also the father of the policy-oriented legal brief, begins the article with a candid exposition of the policy reasons why courts should recognize a new right to privacy. His argument is uncompromising:
The press is overstepping in every direction the obvious bounds of propriety and of decency. Gossip is no longer the resource of the idle and of the vicious, but has become a trade, which is pursued with industry as well as effrontery … To occupy the indolent, column upon column is filled with idle gossip, which can only be procured by intrusion upon the domestic circle. The intensity and complexity of life, attendant upon advancing civilization, have rendered necessary some retreat from the world, and man, under the refining influence of culture, has become more sensitive to publicity, so that solitude and privacy have become more essential to the individual; but modern enterprise and invention have, through invasions upon his privacy, subjected him to mental pain and distress, far greater than could be inflicted by mere bodily injury … Even gossip apparently harmless, when widely and persistently circulated, is potent for evil … When personal gossip attains the dignity of print, and crowds the space available for matters of real interest to the community, what
1 Samuel Warren & Louis D. Brandeis, The Right to Privacy, 4 HARVARD L. REV. 193 (1890).
2 Because the article owes much of its current fame to Brandeis’s later career, I will from this point on discuss only his views without each time laboriously giving credit, if that is the right word, to his coauthor.
wonder that the ignorant and thoughtless mistake its relative importance … Triviality destroys at once robustness of thought and delicacy of feeling. 3
What does Brandeis mean by this? To be brief, he thinks it should be illegal for the newspapers to publish harmless information about himself and his family. That, he says, is idle gossip, and it distracts “ignorant and thoughtless” newspaper readers from more high-minded subjects. It also afflicts the refined and cultured members of society—like, say, Samuel Dennis Warren and his wife—who need solitude but who are instead harassed by the fruits of “modern enterprise and invention.”
What’s remarkable about “The Right to Privacy” is that the article’s title still inspires reverence, even though its substance is, well, laughable.
Is there anyone alive who thinks it should be illegal for the media to reveal the guest list at a prominent socialite’s dinner party or to describe how elaborate the floral arrangements were? Today, it’s more likely that the hostess of a prominent dinner party will blog it in advance, and that the guests will send Twitter updates while it’s under way. For most socialites, what would really hurt is a lack of media coverage. To be blunt, when he complains so bitterly about media interest in a dinner party, Brandeis sounds to modern ears like a wuss.
Equally peculiar is the suggestion that we should keep such information from the inferior classes lest they abandon self-improvement and wallow instead in gossip about their betters. That makes Brandeis sound like a wuss and a snob.
He does sound quite up-to-date when he complains that “modern enterprise and invention” are invading our solitude. That is a familiar complaint. It’s what privacy advocates are saying today about Google, not to mention the National Security Agency (NSA). Until you realize that he’s complaining about the scourge of “instantaneous photographs and newspaper enterprise.” 4 Huh? Brandeis evidently thinks that publishing a private citizen’s photo in the newspaper causes “mental pain and distress, far greater than could be inflicted by mere bodily injury.” 5
If we agreed today, of course, we probably wouldn’t have posted 5 billion photographs of ourselves and our friends on Flickr. 6
3 The Right to Privacy, at 196 (1890).
4 Id. at 195.
5 Id. at 196.
6 Zack Sheppard, 5,000,000,000, FLICKR BLOG, Sept. 19, 2010, http://blog.flickr.net/en/2010/09/19/5000000000/.
Spirit of the Privacy Movement Today
Anachronistic as it seems, the spirit of Brandeis’s article is still the spirit of the privacy movement. The right to privacy was born as a reactionary defense of the status quo, and so it remains. Then, as now, new technology suddenly made it possible to spread information more cheaply and more easily. This was new, and uncomfortable. But apart from a howl of pain—pain “far greater than … mere bodily injury”—Brandeis doesn’t tell us why it’s so bad. I guess you had to be there—literally. Unless you were an adult when photography came to newspapers, you’ll probably never really understand what the fuss was about. We’ve all been photographed, and most of us aren’t happy with the results, at least not all the time. But that’s life, and we’ve learned to live with it. Most of us can’t imagine suing to prevent the distribution of our photographs—which was the tort Brandeis wanted the courts to create.
We should not mock Brandeis too harshly. His article clearly conveys a heartfelt sense of invasion. But it is a sense of invasion we can never share. The sensitivity about being photographed or mentioned in the newspapers, a raw spot that rubbed Brandeis so painfully, has callused over. So thick is the callus that most of us would be tickled, not appalled, to have our dinner parties make the local paper, especially if the coverage included our photos.
And that’s the second thing that Brandeis’s article can tell us about more contemporary privacy flaps. His brand of resistance to change is still alive and well in privacy circles, even if the targets have been updated. Each new privacy kerfuffle inspires strong feelings precisely because we are reacting against the effects of a new technology. Yet as time goes on, the new technology becomes commonplace. Our reaction dwindles away. The raw spot grows a callus. And once the initial reaction has passed, so does the sense that our privacy has been invaded. In short, we get used to it.
At the beginning, of course, we don’t want to get used to it. We want to keep on living the way we did before, except with a few more amenities. And so, like Brandeis, we are tempted to ask the law to stop the changes we see coming. There’s nothing more natural, or more reactionary, than that.
Most privacy advocates don’t see themselves as reactionaries or advocates for the status quo, of course. Right and left, they cast themselves as underdogs battling for change against the entrenched forces of big government. But virtually all of their activism is actually devoted to stopping change—keeping the government (and sometimes industry) from taking advantage of new technology to process and use information.
But simply opposing change, especially technological change, is a losing battle. At heart, the privacy groups know it, which may explain some of their shrillness and lack of perspective. Information really does “want to be free”—or at least
cheap. And the spread of cheap information about all of us will change our relationship to the world. We will have fewer secrets. Crippling government by preventing it from using information that everyone else can get will not give us back our secrets.
In the 1970s, well before the personal computer and the Internet, privacy campaigners persuaded the country that the FBI’s newspaper clipping files about U.S. citizens were a threat to privacy. Sure, the information was public, they acknowledged, but gathering it all in one file was viewed as vaguely sinister. The attorney general banned the practice in the absence of some legal reason for it, usually called an investigative “predicate.”
So, in 2001, when Google had made it possible for anyone to assemble a clips file about anyone in seconds, the one institution in the country that could not print out the results of its Google searches about Americans was the FBI. This was bad for our security, and it didn’t protect anyone’s privacy either.
The privacy campaigners are fighting the inevitable. The “permanent record” our high school principals threatened us with is already here—in Facebook. Anonymity, its thrills and its freedom, has been characteristic of big cities for centuries. But anonymity will also grow scarce as data becomes easier and easier to gather and correlate. We will lose something as a result, no question about it. The privacy groups’ response is profoundly conservative in the William F. Buckley sense—standing athwart history yelling, “Stop!” 7
I’m all for conservatism, even in unlikely quarters. But using laws to fight the inevitable looks a lot like Prohibition. Prohibition was put in place by an Anglo-Saxon Protestant majority that was sure of its moral superiority but not of its future. What the privacy community wants is a kind of data Prohibition for government, while the rest of us get to spend more and more time in the corner bar.
That might work if governments didn’t need the data for important goals such as preventing terrorists from entering the country. After September 11, though, we can no longer afford the forced inefficiency of denying modern information technology to government. In the long run, any effective method of ensuring privacy is going to have to focus on using technology in a smart way, not just on trying to make government slow and stupid.
7 See William F. Buckley Jr., Publisher’s Statement, NATIONAL REVIEW, Nov. 19, 1955, at 5, available at www.nationalreview.com/articles/223549/our-mission-statement/william-f-buckley-jr.
The Evolution of Technology & the “Zone of Privacy”
That doesn’t mean we have to give up all privacy protection. It just means that we have to look for protections that work with technology instead of against it. We can’t stop technology from making information cheap and reducing anonymity, but we can deploy that same technology to make sure that government officials can’t misuse data and hide their tracks. This new privacy model is partly procedural—greater oversight and transparency—and partly substantive—protecting individuals from actual adverse consequences rather than hypothetical informational injuries.
Under this approach, the first people who should lose their privacy are the government workers with access to personal data. They should be subject to audit, to challenge, and to punishment if they use the data for improper purposes. That’s an approach that works with emerging technology to build the world we want to live in. In contrast, it is simple Luddism to keep government from doing with information technology what every other part of society can do.
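The audit-and-accountability model sketched above—officials may use personal data, but cannot quietly cover their tracks—can be illustrated in code. The following is a minimal, hypothetical sketch, not any agency’s actual system: an append-only access log in which each record of who looked at whose data, and for what stated purpose, is hashed together with the hash of the previous record. All names here (`AuditLog`, `record`, `verify`) are illustrative assumptions.

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only log of data accesses. Each entry's hash covers the
    previous entry's hash, so editing or deleting any past entry breaks
    verification of the chain."""

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def record(self, official, subject, purpose):
        """Append one access record: who accessed whose data, and why."""
        entry = {
            "official": official,
            "subject": subject,
            "purpose": purpose,
            "time": time.time(),
            "prev": self._last_hash,
        }
        # Hash the entry body (including the previous hash) deterministically.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)

    def verify(self):
        """Recompute the whole chain; any after-the-fact edit is detected."""
        prev = self.GENESIS
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if entry["prev"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True
```

The design choice mirrors the essay’s point: the log does not prevent access, it makes every access attributable and tamper-evident, shifting protection from “deny the data” to “audit the user.” In practice such a log would also need to be replicated or anchored outside the agency so the chain itself cannot be silently rebuilt.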
The problem is that Luddism always has appeal. “Change is bad” is a slogan that has never lacked for adherents, and privacy advocates sounded alarm after alarm with that slogan as the backdrop when we tried to put in place a data-based border screening system.
But would we really thank our ancestors if they’d taken the substance of Brandeis’s article as seriously as its title? If, without a legislature ever considering the question, judges had declared that no one could publish true facts about a man’s nonpolitical life, or even his photograph, without his permission?
I don’t think so. Things change. Americans grow less private about their sex lives but more private about financial matters. Today, few of us are willing to have strangers living in our homes, listening to our family conversations, and then gossiping about us over the back fence with the strangers who live in our friends’ homes. Yet I’ll bet that both Brandeis and Warren tolerated without a second thought the limits that having servants put on their privacy.
Why does our concept of privacy vary from time to time? Here’s one theory: Privacy is allied with shame. We are all ashamed of something about ourselves, something we would prefer that no one, or just a few people, know about. We want to keep it private. Sometimes, of course, we should be ashamed. Criminals always want privacy for their acts. But we’re also ashamed—or at least feel embarrassment, the first cousin of shame—about a lot of things that aren’t crimes.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 489<br />
We may be ashamed of our bodies, at least until we’re sure we won’t be mocked for our physical shortcomings. Privacy is similar; we are often quite willing to share information about ourselves, including what we look like without our clothes, when we trust our audience, or when the context makes us believe that our shortcomings will go unnoticed. Most of us would rather be naked with our spouse than with a random stranger. And we would not appear at the office in our underwear, even if it covers more than the bathing suit we wore at the beach on the weekend.
For that reason, enforced nudity often feels like a profound invasion of our privacy. At least at first. In fact, though, we can get used to it pretty quickly, as anyone who has played high school sports or served in the army can attest. That’s because the fear of mockery is usually worse than the experience. So when we discover that being naked in a crowd of other naked people doesn’t lead to mockery and shame, we begin to adapt. We develop a callus where we once were tender.
The things that Brandeis considered privacy invasions are similar. Very few of us are happy the first time we see our photograph or an interview in the newspaper. But pretty soon we realize it’s just not that big a deal. Our nose and our style of speech are things that the people we know have already accepted, and no one else cares enough to embarrass us about them. The same is true when we Google ourselves and see that a bad review of our dinner-theater performance is number three on the list. Our first reaction is embarrassment and unhappiness, but the reaction is oddly evanescent.
If this is so, then the “zone of privacy” is going to vary from time to time and place to place—just as our concept of physical modesty does. The zone of privacy has boundaries on two sides. We don’t care about some information that might be revealed about us, probably because the revelation causes us no harm—or we’ve gotten used to it. If the information is still embarrassing, we want to keep it private, and society may agree. But we can’t expect privacy for information that society views as truly shameful or criminal.
Over time, information will move into and out of the zone of privacy on both sides. Some information will simply become so unthreatening that we’ll laugh at the idea that it is part of the privacy zone. Photographs long ago entered that category, despite Brandeis’s campaigning. Some information will move from criminal evidence into the zone of privacy, as sexual preference has. Conversely, information may move in the other direction: the fact that a man beats his wife is no longer protected by a zone of familial privacy, as it once was; now it’s viewed as evidence of a crime.
The biggest privacy battles will often be in circumstances where the rules are changing. The subtext of many Internet privacy fights, for example, is whether some new measure will expose the identities of people who download
490 CHAPTER 8: WHAT FUTURE FOR PRIVACY?<br />
pornography or copyrighted music and movies. Society is divided about how shameful it is to download these items, and it displaces that moral and legal debate into a fight about privacy.
Divorce litigation, for instance, is brutal in part because information shared in a context of love and confidence ends up being disclosed to the world in a deliberately harmful way. Often the activity in question (like making a telephone call or a credit card purchase) is something that the individual does freely, with clear knowledge that some other people (his bank or his phone company) know what he is doing. Sometimes the activities are proudly public in nature—protests against government policy, for example.
In those cases, the privacy concern is not that the bank or the phone company (or our spouse) actually has the information, but rather what they will do with the information they have—whether they will use the data in ways we didn’t expect or give the data to someone who can harm us. We want to make sure the data will not be used to harm us in unexpected ways.
And that helps explain why privacy advocates are so often Luddite in inclination. Modern technology keeps changing the ways in which information is used. Once, we could count on practical obscurity—the difficulty of finding bits of data from our past—to protect us from unexpected disclosures. Now, storage costs are virtually nil, and processing power is increasing exponentially. It is no longer possible to assume that your data, even though technically public, will never actually be used. It is dirt cheap for data processors to compile dossiers on individuals, and to use the data in ways we didn’t expect.
Some would argue that this isn’t really “privacy” so much as a concern about abuse of information. However it’s defined, though, the real question is what kind of protection it is reasonable for us to expect. Can we really write a detailed legislative or contractual pre-nup for each disclosure, setting forth exactly how our data will be used before we hand it over? I doubt it. Maybe we can forbid obvious misuses, but the more detailed we try to get, the more we run into the problem that our notions of what is private, and indeed of what is embarrassing, are certain to change over time. If so, does it make sense to freeze today’s privacy preferences into law?
In fact, that’s the mistake that Brandeis made—and the last lesson we can learn from the odd mix of veneration and guffawing that his article provokes. Brandeis wanted to extend common law copyright until it covered everything that can be recorded about an individual. The purpose was to protect the individual from all the new technologies and businesses that had suddenly made it easy to gather and disseminate personal information: “the too enterprising
press, the photographer, or the possessor of any other modern device for recording or reproducing scenes or sounds.” 8
This proposal is wacky in two ways. First, it tries to freeze in 1890 our sense of what is private and what is not. Second, it tries to defy the gravitational force of technology.
Every year, information gets cheaper to store and to duplicate. Computers, iPods, and the Internet are all “modern devices” for “reproducing scenes or sounds,” which means that any effort to control reproduction of pictures, sounds, and scenes becomes extraordinarily difficult if not impossible. In fact, it can’t be done.
There is a deep irony here. Brandeis thought that the way to ensure the strength of his new right to privacy was to enforce it just like state copyright law. If you don’t like the way “your” private information is distributed, you can sue everyone who publishes it. One hundred years later, the owners of federal statutory copyrights in popular music and movies followed this prescription to a T. They began to use litigation to protect their data rights against “the possessor[s] of any other modern device for … reproducing scenes or sounds,” 9 a class that now included many of their customers. The Recording Industry Association of America (RIAA) sued consumers by the tens of thousands for using their devices to copy and distribute songs.
Unwittingly, the RIAA gave a thorough test to Brandeis’s notion that the law could simply stand in front of new technology and bring it to a halt through litigation. There aren’t a lot of people who think that has worked out well for the RIAA’s members, or for their rights.
Brandeis wanted to protect privacy by outlawing the use of a common new technology to distribute “private” facts. His approach has fared no better than the RIAA’s. Information that is easy to gather, copy, and distribute will be gathered, copied, and distributed, no matter what the law says.

It may seem a little bit odd for me to criticize Brandeis and other privacy campaigners for resisting the spread of technology. After all, we can’t simply accept the world that technology and commerce serve up.
It’s one thing to redirect the path of technological change by a few degrees. It’s another to insist that it take a right angle. Brandeis wanted it to take a right angle; he wanted to defy the changes that technology was pressing upon him. So did the RIAA.

8 Warren & Brandeis, supra note 1 at 206.

9 Id.
Both were embracing a kind of Luddism—a reactionary spasm in the face of technological change. They were doomed to fail. The new technologies, after all, empowered ordinary citizens and consumers in ways that could not be resisted. If the law tries to keep people from enjoying the new technologies, in the end it is the law that will suffer.
But just because technologies are irresistible does not mean that they cannot be guided, or cannot have their worst effects offset by other technologies. The solutions I’m advocating will only work if they allow the world to keep practically all the benefits of the exponential empowerment that new technology makes possible.
Privacy for the Real World: Proposed Solutions
So what’s my solution to the tension between information technology and our current sense of privacy? The short answer is that we should protect privacy, but not by defying the course of technology or by crippling government when it investigates crimes. We can do it by working with technology, not against it. In particular, we can use information technology to make sure that government officials lose their privacy when they misuse data that has been gathered for legitimate reasons. Information technology now makes it easier to track every database search made by every user, and then to follow any distribution of that data outside the system. In other words, it can make misuse of the data in government files much more difficult and much more dangerous.
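The watch-the-watchers mechanism described here can be sketched in a few lines of code. This is a minimal illustration under stated assumptions, not a description of any real government system; the class, field names, and log format are all hypothetical.

```python
from datetime import datetime, timezone

class AuditedStore:
    """A record store in which every lookup leaves a trace.

    Hypothetical sketch: the point is that the watchers are watched.
    Each search is logged before any data is returned.
    """

    def __init__(self, records):
        self._records = records
        self._audit_log = []  # in practice: append-only and separately secured

    def lookup(self, official_id, subject, reason):
        # Record who searched, for whom, why, and when -- even when the
        # result is empty, so an oversight body can replay every query.
        self._audit_log.append({
            "official": official_id,
            "subject": subject,
            "reason": reason,
            "time": datetime.now(timezone.utc).isoformat(),
        })
        return self._records.get(subject)

    def searches_by(self, official_id):
        # Oversight hook: list every search a given official has run.
        return [e for e in self._audit_log if e["official"] == official_id]
```

An auditor could then review every entry returned by `searches_by(...)` and challenge any lookup whose recorded reason doesn’t hold up, which is exactly the audit-and-punish regime the text proposes.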
But before talking about what might work, let’s take a closer look at some of the ideas that don’t.

Ownership of Personal Data
The first privacy solution is one we’ve already seen. It’s the Brandeisian notion that we should all “own” our personal data. That has some appeal, of course. If I have a secret, it feels a lot like property. I can choose to keep it to myself, or I can share it with a few people whom I trust. And I would like to believe that sharing a secret with a few trusted friends doesn’t turn it into public property. It’s like my home. Just because I’ve invited one guest home doesn’t mean the public is welcome.
But in the end, information is not really like property. Property can only be held by one person at a time, or at most by a few people. But information can be shared and kept at the same time. And those with whom it is shared can pass it on to others at little or no cost. If you ever told a friend about your secret crush in junior high, you’ve already learned that information cannot be controlled like property. As Ben Franklin is credited with saying, “Three may keep a secret if two of them are dead.” 10 The redistribution of information cannot be easily controlled in the best of times, and Moore’s Law is making the control of information nearly impossible. 11
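The doubling rule described in footnote 11 is easy to make concrete. A rough back-of-envelope sketch follows; the 1971 baseline of roughly 2,300 transistors (Intel’s first microprocessor) is an illustrative assumption, and real chips only loosely track the idealized curve.

```python
def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Idealized Moore's-law projection: the transistor count doubles
    every `doubling_years` years from an illustrative 1971 baseline."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Two decades of doubling turns thousands into millions; four decades
# turns them into billions -- which is why the "practical obscurity"
# that once protected scattered personal data has evaporated.
```

The same exponential works in reverse for storage cost, which is the sense in which the text says storage costs are now “virtually nil.”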
The recording and movie industries discovered the same thing. If these industries, with their enormous lobbying and litigation budgets, cannot control information that they own as a matter of law, the rest of us are unlikely to be able to control information about ourselves. Gossip is not going to become illegal simply because technology amplifies it.
That’s why Brandeis’s proposal never really got off the ground, at least not as he envisioned it. Buoyed by Brandeis’s prestige, the idea that private facts are private property lingered on in the courts for years, but what survived of his proposal is scarcely recognizable today.
In fact, so transformed is Brandeis’s privacy doctrine that it is now described, accurately, as a “right of publicity,” which surely would have him turning in his grave. Currently, most states honor Brandeis by allowing lawsuits for unauthorized commercial use of a person’s likeness, either by statute or judge-made law.
Over time, courts lost sight of Brandeis’s purpose. They began to take the analogy to property literally. Brandeis wanted to treat private information like property because that was the only way to give a remedy for the “mental pain and distress, far greater than could be inflicted by mere bodily injury,” that he thought a man suffered when his photo was published without permission. But as people got used to having their pictures taken, the mental pain and distress slowly drained out of the experience.
All that was left was the property analogy. And so judges began shrinking the right until it only had bite in the one set of circumstances where the right to control one’s image actually feels like a property right—when the image is worth real bucks. Thus, the courts require disgorgement of profits made when a celebrity’s name, face, voice, or even personal style is used without permission to sell or endorse products. As a result, the right to exploit a celebrity’s image really is property today; it can be sold, transferred, and even inherited.
10 Benjamin Franklin, POOR RICHARD’S ALMANAC, July 1735.

11 Moore’s Law describes the long-term trend that the number of transistors that can be placed inexpensively on an integrated circuit has doubled approximately every two years. It is named after Intel co-founder Gordon E. Moore, who described the trend in the essay Cramming More Components Onto Integrated Circuits, ELECTRONICS MAGAZINE 4, 1965.
There’s only one problem with this effort to turn privacy into property: it hasn’t done much for privacy. It simply protects the right of celebrities to make money off their fame. In fact, by monetizing things like celebrity images, it rewards those who have most relentlessly sacrificed their privacy to gain fame.

The right of publicity is well named. It is the right to put your privacy up for sale. Not surprisingly, a lot of people have been inspired to do just that. Ironically, Brandeis’s doctrine has helped to destroy the essence of what he hoped to preserve.
Oh, and in the process, Brandeis’s approach has stifled creativity and restricted free speech—muzzling artists, social commentators, and businesspeople who want to make creative use of images that are an essential part of our cultural environment. It’s a disaster. Slowly, courts are waking up to the irony and limiting the right of publicity.
The same “private information as property” approach has also made a modest appearance in some consumer privacy laws, and it’s worked out just as badly. At bottom, consumer privacy protection laws like the Right to Financial Privacy Act 12 treat a consumer’s data like a consumer’s money: You can give your data (or your money) to a company in exchange for some benefit, but only if you’ve been told the terms of the transaction and have consented. Similarly, the Cable Communications Policy Act of 1984 prevents cable providers from using or releasing personal information in most cases unless the providers get the customer’s consent. The fruit of this approach is clear to anyone with a bank account or an Internet connection. Everywhere you turn, you’re confronted with “informed consent” and “terms of service” disclosures; these are uniformly impenetrable and non-negotiable. No one reads them before clicking the box, so the “consent” is more fiction than reality; certainly it does little to protect privacy. Indeed, it’s turning out a lot like the right of publicity. By treating privacy as property, consumer privacy protection law invites all of us to sell our privacy.

And we do. Only for most of us, the going price turns out to be disconcertingly cheap.
Mandatory Predicates for Information Access

The second way of protecting privacy is to require what’s called a “predicate” for access to information. That’s a name only a lawyer could love. In fact, the whole concept is one that only lawyers love.
12 The Right to Financial Privacy Act of 1978, Pub. L. No. 95-630, 92 Stat. 3695 (1978) (codified as amended at 12 U.S.C. § 3401 et seq.).
Simply put, the notion is that government shouldn’t get certain private information unless it satisfies a threshold requirement—a “predicate” for access to the data. Lawyers have played a huge role in shaping American thinking about privacy, and the predicate approach has been widely adopted as a privacy protection. But its value for that purpose is quite doubtful.
The predicate approach to privacy can be traced to the Fourth Amendment, which guarantees that “no Warrants shall issue, but upon probable cause.” Translated from legalese, this means that the government may not search your home unless it has a good reason to do so. When the government asks for a search warrant, it must show the judge “probable cause”—evidence that the search will likely turn up criminal evidence or contraband. Probable cause is the predicate for the search.
When a flap arose in the 1970s over the FBI practice of assembling domestic security dossiers on Americans who had not broken the law, the attorney general stepped in to protect their privacy. He issued new guidelines for the FBI. He was a lawyer, so he declared that the FBI could not do domestic security investigations of Americans without a predicate.
The predicate wasn’t probable cause; that was too high a standard. Instead, the attorney general allowed the launching of a domestic security investigation only if the bureau presented “specific and articulable facts giving reason to believe” that the subject of the investigation may be involved in violence. 13
Actually, the story of the FBI guidelines shows why the predicate approach often fails. The dossiers being assembled by the FBI were often just clippings and other public information. They usually weren’t the product of a search in the classic sense; no federal agents had entered private property to obtain the information. Nonetheless, the FBI guidelines treated the gathering of the information itself as though it were a kind of search.
In so doing, the guidelines were following in Brandeis’s footsteps—treating information as though it were physical property. The collection of the information was equated to a physical intrusion into the home or office of the individual. Implicitly, this approach assumes that data can be locked up like property.
But that analogy has already failed. It failed for Brandeis and it failed for the RIAA. It failed for the FBI guidelines, too. As clippings became easier to retrieve, clippings files became easier to assemble. Then Google made it possible for anyone to assemble an electronic clips file on anyone. There was nothing secret about the clippings then. They were about as private as a bus terminal.

13 The Right to Financial Privacy Act of 1978, Pub. L. No. 95-630, 92 Stat. 3695 (1978) (codified at 12 U.S.C. § 3414(a)(5)(A)), amended by the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act of 2001, Pub. L. 107–56, 115 Stat. 272, § 505(b), www.law.cornell.edu/uscode/pdf/uscode12/lii_usc_TI_12_CH_35_SE_3401.pdf.
But the law was stuck in another era. Under the guidelines, only the FBI and CIA needed a predicate to do Google searches. You have to be a pretty resilient society to decide that you want to deny to your law enforcement agencies a tool that is freely available to nine-year-old girls and terrorist gangs. Resilient, but stupid. (Not surprisingly, the guidelines were revised after 9/11.)
That’s one reason we shouldn’t treat the assembling of data as though it were a search of physical property. As technology makes it easier and easier to collect data, the analogy between doing that and conducting a search of a truly private space will become less and less persuasive. No one thinks government agencies should have a predicate to use the White Pages. Soon, predicates that keep law enforcement from collecting information in other ways will become equally anachronistic, leaving law enforcement stuck in the 1950s while everyone else gets to live in the twenty-first century.
I saw this lawyerly affinity for predicates up close at DHS. The issue was laptop searches at the border. The government has always had the right to search anything crossing the border without probable cause. Smugglers are smart and highly motivated; they would find a way to exploit any limitations on the authority to conduct searches. The first Congress knew that quite well, and in 1789, two months before it sent the Fourth Amendment to the states for approval, Congress gave the customs service “full power and authority” to search “any ship or vessel, in which they shall have reason to suspect any goods, wares or merchandise subject to duty shall be concealed.” 14
Obviously, DHS and its border predecessors didn’t search laptops in 1789. But they did search books, papers, correspondence, and anything else that could store information. That was the law for two hundred years, with one exception. The Supreme Court has ruled that a few extraordinarily intrusive techniques—body cavity searches and forced x-rays—require a “reasonable suspicion.” 15
Laptops are treated like books and papers. They are searched whenever border officials think that such a search is likely to be productive. Even the famously liberal Ninth Circuit, the court of appeals that includes California, has had no trouble approving that practice, 16 and for good reason—laptop searches pay off.

14 An Act to Regulate the Collection of Duties, 1st Cong., 1st Sess., Stat. 1, Ch. V, Sec. 24 at 43 (July 31, 1789).

15 U.S. v. Flores-Montano, 541 U.S. 149 (2004).
In 2006, for example, border officials at the Minneapolis-St. Paul airport referred a suspect traveler to secondary inspection. There they found that his computer contained video clips of improvised explosive devices, or IEDs—a weapon of choice for terrorists in Afghanistan and Iraq—being used to kill soldiers and destroy vehicles, along with a video on martyrdom. He was also carrying a manual on how to make IEDs.
Despite two hundred years of history and precedent, as well as the proven value of searching electronic media, privacy groups launched a campaign against laptop searches toward the end of the Bush administration. This was a strange and unhappy era in the debate over privacy. By 2005, privacy advocates had found a growing audience for claims that the Bush administration had abandoned all limits in pursuing terrorism—that it had swung the pendulum violently away from privacy and in favor of government authority.
The privacy advocates’ solution to the laptop issue was the lawyer’s favorite—a predicate requirement. Laptops should not be searched at the border, they argued, unless the border official could articulate some specific reason for conducting the search. That argument was rejected by both the Bush and the Obama administrations after careful consideration.
We rejected it for two reasons. It wouldn’t have protected privacy in any meaningful way, and it would have helped criminals like pedophiles and terrorists defeat our border defenses. Other than that, it was jim-dandy.
Why wouldn’t it help protect privacy? Because, as a practical matter, no border official today searches a laptop without some reasonable suspicion about the traveler. The exponential increase in commercial jet travel and the unforgiving thirty-second rule mean that only one traveler in two hundred is sent to secondary inspection for a closer look. Once there, many travelers quickly satisfy the officials that they don’t deserve more detailed inspection.
Everyone at the border is busy; border officers don’t have the luxury of hooking up the laptops of random travelers for inspection without a good reason. Officers who waste their time and DHS’s resources that way are going to hear from their supervisors long before they hear from the travelers’ lawyers.
If border officials only search laptops today when they have a good reason to do so, why not make that a requirement? What harm can it do to make reasonable suspicion a predicate for laptop searches at the border? Plenty.
16 U.S. v. Arnold, 523 F.3d 941 (9th Cir. 2008).
Requiring reasonable suspicion before a laptop search will open every border search to litigation. And in court, it may be hard to justify even some very reasonable judgments.
Inevitably, enforcement of a predicate requirement for border searches will produce litigation. The litigation will focus on the motives of the border officials. The courts will tell those officials that some reasons are not good enough. Defense lawyers will want to see the personnel records of border officials, hoping to show that they’ve inspected a disproportionate number of laptops belonging to minorities, or to Saudis, or to men, or any other pattern that might get the case thrown out. Border officials will have to start keeping detailed records justifying each laptop search. New paperwork and new procedures will clog the inspection process, backing up travelers and penalizing any inspector who searches a laptop.
Wait a minute, you might ask: what if those officials are racists or sexists?
Let’s assume that this concern is legitimate, at least sometimes, and that there are biased officials at work on the border. Surely there’s a better way to find them and get them off the job than to count on criminal defense lawyers exposing them on the witness stand years after the event.
By now, notice, we’re not even talking about privacy anymore. The “predicate” solution has, in effect, changed the subject. We’re talking about the motives of border officials, or ethnic profiling, or something—but it isn’t privacy. We’re also moving the whole discussion into territory that lawyers find comfortable but that ordinary people might question.
The Fourth Amendment approach to privacy assumes that privacy is best protected by letting criminals challenge the search that produced the evidence against them. But before adopting that solution, we ought to be pretty sure that the benefits will match the cost of letting guilty defendants go free—and that isn’t obvious here.
Limits on Information Use

That leaves the third approach to privacy, one we've already seen in action. If requiring a predicate is the lawyer's solution, this third approach is the bureaucrat's solution. It is at heart the approach adopted by the European Union: Instead of putting limits on when information may be collected, it sets limits on how the information is used.
The European Union's data protection principles cover a lot of ground, but their unifying theme is imposing limits on how private data is used. Under those principles, personal data may only be used in ways that are consistent with the purposes for which the data were gathered. Any data that is retained must be relevant to the original purposes and must be stored securely to prevent misuse.
The EU's negotiating position in the passenger name records conflict was largely derived from this set of principles. The principles also explain Europe's enthusiasm for a wall between law enforcement and intelligence. If DHS gathered reservation data for the purpose of screening travelers when they cross the border, why should any other agency be given access to the data? This also explains the EU's insistence on short deadlines for the destruction of PNR data. Once it had been used to screen passengers, it had served the purpose for which it was gathered and should be promptly discarded.
There is a core of sense in this solution. It focuses mainly on the consequences of collecting information, and not on the act of collection. It doesn't try to insist that information is property. It recognizes that when we give information to others, we usually have an expectation about how it will be used, and as long as the use fits our expectations, we aren't too fussy about who exactly gets to see it. By concentrating on how personal information is used, this solution may get closer to the core of privacy than one that focuses on how personal information is collected.
It has another advantage, too. In the case of government databases, focusing on use also allows us to acknowledge the overriding importance of some government data systems while still protecting against petty uses of highly personal information.
Call it the deadbeat-dad problem, or call it mission creep, but there's an uncomfortable pattern to the use of data by governments. Often, personal data must be gathered for a pressing reason—the prevention of crime or terrorism, perhaps, or the administration of a social security system. Then, as time goes on, it becomes attractive to use the data for other, less pressing purposes—collecting child support, perhaps, or enforcing parking tickets. No one would support the gathering of a large personal database simply to collect unpaid parking fines; but "mission creep" can easily carry the database well beyond its original purpose. A limitation on use prevents mission creep, or at least forces a debate about each step in the expansion.
That's all fine. But in the end, this solution is also flawed.
It, too, is fighting technology, though less obviously than the predicate and property approaches. Data that has already been gathered is easier to use for other purposes. It's foolish to pretend otherwise. Indeed, developments in information technology in recent years have produced real strides in searching unstructured data or in finding relationships in data without knowing for sure that the data will actually produce anything useful. In short, there are now good reasons to collate data gathered for widely differing purposes, just to see the patterns that emerge.
This new technical capability is hard to square with use limitations or with early destruction of data. For if collating data in the government's hands could have prevented a successful terrorist attack, no one will congratulate the agency that refused to allow the collation because the data was collected for tax or regulatory purposes, say, and not to catch terrorists.
What's more, use limitations have caused great harm when applied too aggressively. The notorious "wall" between law enforcement and intelligence was at heart a use limitation. It assumed that law enforcement agencies would gather information using their authority, and then would use the information only for law enforcement purposes. Intelligence agencies would do the same. Or so the theory went. But strict enforcement of this use limitation was unimaginably costly. In August 2001, two terrorists were known to have entered the United States. As the search for them began, the government's top priority was enforcing the wall—keeping intelligence about the terrorists from being used by the "wrong" part of the FBI. Government lawyers insisted that law enforcement resources could not be used to pursue intelligence that two known al Qaeda agents were in the country.
This was a fatal blunder. The criminal investigators were well-resourced and eager. They might have found the men. The intelligence investigators, in contrast, had few resources and did not locate the terrorists, at least not until September 11, when the terrorists' names were discovered on the manifests of the hijacked planes. It was a high price to pay for the modest comfort of "use" limitations.
Like all use limitations, the "wall" between law enforcement and intelligence sounded reasonable enough in the abstract. While no one could point to a real privacy abuse arising from cooperation between the intelligence and law enforcement agencies in the United States, it was easy to point to the Gestapo and other totalitarian organizations where there had been too much cooperation among agencies. What, the argument ran, was the harm in a little organizational insurance against misuse of personal data? The rules allowed cooperation where that was strictly necessary, and we could count on the agencies to crowd right up to the line in doing their jobs. Or so we thought. In fact, we couldn't. As the pressure and the risk ratcheted up, agents were discouraged from pushing for greater communication and cooperation across the wall. All the Washington-wise knew that the way to bureaucratic glory and a good press lay in defending privacy. Actually, more to the point, they knew that bad press and bureaucratic disgrace were the likely result if your actions could be characterized as hurting privacy. Congress would hold hearings; appropriators would zero out your office; the second-guessing arms of the Justice Department, from the inspectors general to the Office of Professional Responsibility, would feast on every detail of your misstep. So, what might have been a sensible, modest use restriction preventing the dissemination of information without a good reason became an impermeable barrier.
That's why the bureaucratic system for protecting privacy so often fails. The use restrictions and related limits are abstract. They make a kind of modest sense, but if they are enforced too strictly, they prevent new uses of information that may be critically important.
And often they are enforced too strictly. You don't have to tell a bureaucrat twice to withhold information from a rival agency. Lawsuits, bad press, and Congressional investigations all seem to push against a flexible reading of the rules. If a use for information is not identified at the outset, it can be nearly impossible to add the use later, no matter how sensible the change may seem. This leads agencies to try to draft broad uses for the data they collect, which defeats the original point of setting use restrictions.
It's like wearing someone else's dress. Over time, use restrictions end up tight where they should be roomy—and loose where they should be tight. No one is left satisfied.
The Audit Approach: Enforced Accountability

So what will work? Simple: accountability, especially electronically-enforced accountability.
The best way to understand this solution is to begin with Barack Obama's passport records—and with "Joe the Plumber." These were two minor flaps that punctuated the 2008 presidential campaign. But both tell us something about how privacy is really protected these days.
In March of 2008, Barack Obama and Hillary Clinton were dueling across the country in weekly primary showdowns. Suddenly, the campaign took an odd turn. The Bush administration's State Department announced that it had fired or disciplined several contractors for examining Obama's passport records. Democrats erupted. It wasn't hard to jump to the conclusion that the candidate's files had been searched for partisan purposes.17 After an investigation, the flap slowly deflated. It soon emerged that all three of the main presidential candidates' passport files had been improperly accessed. Investigators reported that the State Department was able to quickly identify who had examined the files by using its computer audit system. This system flagged any unusual requests for access to the files of prominent Americans. The fired contractors did not deny the computer record. Several of them were charged with crimes and pleaded guilty. All, it turned out, had acted purely out of "curiosity."

17 Karen Tumulty, Snooping Into Obama's Passport, TIME, Mar. 21, 2008, http://www.time.com/time/politics/article/0,8599,1724520,00.html.
Six months later, it was the Republicans' turn to howl about privacy violations in the campaign. Samuel "Joe" Wurzelbacher, a plumber, became an overnight hero to Republicans in October 2008 after he was practically the only person who laid a glove on Barack Obama during the campaign. The candidate made an impromptu stop in Wurzelbacher's Ohio neighborhood and was surprised when the plumber forced him into a detailed on-camera defense of his tax plan. Three days later, "Joe the Plumber" and his taxes were invoked dozens of times in the presidential debates.
The price of fame was high. A media frenzy quickly stripped Wurzelbacher of anonymity. Scouring the public record, reporters found that the plumber had been hit with a tax lien; they also found government data that raised doubts about the status of his plumbing license.
Reporters weren't the only ones digging. Ohio state employees also queried confidential state records about Wurzelbacher. In all, they conducted eighteen state records checks on him. They asked whether the plumber owed child support, whether he'd ever received welfare or unemployment benefits, and whether he was in any Ohio law enforcement databases. Some of these searches were proper responses to media requests under Ohio open records laws; others looked more like an effort to dig dirt on the man.
Ohio's inspector general launched an investigation and in less than a month was able to classify all but one of the eighteen records searches as either legitimate or improper.18 Thirteen searches were traced and deemed proper, but three particularly intrusive searches were found improper; they had been carried out at the request of a high-ranking state employee who was also a strong Obama supporter. She was suspended from her job and soon stepped down. A fourth search was traced to a former information technology contractor who had not been authorized to search the system he accessed; he was placed under criminal investigation.
What do these two flaps have in common? They were investigated within weeks of the improper access, and practically everyone involved was immediately caught. That's vitally important. Information technology isn't just taking away your privacy or mine. It's taking away the privacy of government workers even faster. Data is cheap to gather and cheap to store. It's even getting cheap to analyze.

18 See State of Ohio, Office of Inspector General, Report of Investigation, File ID Number 2008299, Nov. 20, 2008, www.judicialwatch.org/documents/2009/IGReport.pdf.
So it isn't hard to identify every official who accessed a particular file on a particular day. That's what happened here. And the consequences for privacy are profound.
If the lawyer's solution is to put a predicate between government and the data, and the bureaucrat's solution is to put use restrictions on the data, then this is the auditor's solution. Government access to personal data need not be restricted by speed bumps or walls. Instead, it can be protected by rules, so long as the rules are enforced.
What's new is that network security and audit tools now make it easy to enforce the rules. That's important because it takes the profit motive out of misuse of government data. No profit-motivated official is going to take the risk of stealing personal data if it's obvious that he'll be caught as soon as people start to complain about identity theft. Systematic misuse of government databases is a lot harder and more dangerous if good auditing is in place.
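The mechanism described above, where every access leaves a trace tied to a specific official, can be sketched in a few lines of code. This is an illustrative toy, not a model of any agency's actual system; the class, field, and user names are all invented.

```python
from datetime import datetime, timezone

class AuditedStore:
    """Toy record store in which every read is logged for later review.
    Illustrative only: not modeled on any agency's actual audit system."""

    def __init__(self, records):
        self._records = records
        self.audit_log = []  # append-only list of access events

    def read(self, user, record_id, reason):
        # Log before returning data, so every lookup leaves a trace
        # even if the caller discards the result.
        self.audit_log.append({
            "user": user,
            "record": record_id,
            "reason": reason,
            "time": datetime.now(timezone.utc).isoformat(),
        })
        return self._records.get(record_id)

    def accesses_to(self, record_id):
        """The investigator's question: who looked at this file, and why?"""
        return [e for e in self.audit_log if e["record"] == record_id]
```

The point of the design is that logging happens inside the only code path that can reach the data, so an official cannot browse a file without answering, in advance, the question an auditor will later ask.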
Take another look at why government officials accessed these files. It wasn't to steal identities. The reason most of these people accessed the data was simple curiosity. Even the one access that may have been for more reprehensible reasons—the woman who checked confidential child support and welfare records for Joe the Plumber—was quickly caught and the data never leaked.
The speed and nearly complete effectiveness of the audit process in these cases tells us that network auditing tools can transform the way we enforce the rules for handling data in government. For example, if we catch every error, we can improve compliance and at the same time reduce the penalties for mistakes. Harsh penalties are not the most effective way to enforce rules. In fact, they're usually a confession of failure.
When we catch every offender, we can afford to lower the penalty. Lighter, more certain penalties for privacy violations serve another purpose, too. We've talked a lot about the oddly protean nature of privacy. Not causing harm in unexpected ways is at the core of the concept, but it's nearly impossible to write detailed rules spelling out what is and is not a violation of privacy. Indeed, the effort to write such rules and stick to them is what gave us the wall, and thousands of American dead. So something must be left to discretion. Government employees must use good sense in handling personal data. If they don't, they should be punished. But if we are confident that we can identify any questionable use of personal data and correct it quickly, the punishments can be smaller. They can be learning experiences rather than penological experiences.
So why did we criminally prosecute the poor schlubs whose hobby was looking at the passport pictures of famous people? The election happened. Everything that touched on the election was put under a microscope. Evil motives were always ascribed to the other side. The State Department had to make a blood sacrifice to show that accessing the data was not part of an evil plot by one party against the other. Opening a criminal investigation was a way of condemning the access in the clearest possible fashion. That the poor schlubs probably only deserved demotions counted for little in the super-heated atmosphere of a presidential campaign.
That shows one of the problems with the audit approach. It is too easily turned into a phony privacy scandal. In both the Wurzelbacher and Obama cases, the audits did their job. With one possible exception, they caught the government staff who broke the rules. They prevented any harm to either Wurzelbacher or Obama. And they made sure that the officials who were responsible would never repeat their errors.
The system worked. Privacy was protected. But that's certainly not the impression that was left by coverage of the affairs. Indeed, the chairman of the Senate Judiciary Committee, Senator Leahy, used the passport flap to tout new legislation strengthening privacy protections on government databases. From a political point of view, then, the system failed. There were no thanks for the government officials who put the system in place, who checked the audit logs, who confronted and disciplined the wrongdoers, and who brought the solved problem to public attention. To the contrary, they were pilloried for allowing the access in the first place—even though preventing such access is an impossible task unless we intend to re-erect walls all across government.
How's that for irony? Audits work. But they work too well. Every time they catch someone and put a stop to misuse of personal data, they also provide an opening for political grandstanding. In the end, the finger pointing will discourage audits. And that will mean less privacy enforcement. So, the more we turn every successful audit into a privacy scandal, the less real privacy we're likely to have.
That would be a shame, because the auditor's solution to the problem is the only privacy solution that will get more effective as technology advances. And we're going to need more solutions that allow flexible, easy access to sensitive databases while still protecting privacy.
If the plight of government investigators trying to prevent terrorist attacks doesn't move you, think about the plight of medical technicians trying to keep you alive after a bad traffic accident.
The Obama administration has launched a long-overdue effort to bring electronic medical records into common use. But the privacy problem in this area is severe. Few of us want our medical records to be available to casual browsers. At the same time, we can't personally verify the bona fides of the people accessing our records, especially if we're lying by the side of the road suffering from what looks like brain or spine damage.
But the electronic record system won't work if it can't tell the first responders that you have unusual allergies or a pacemaker. It has to do that quickly and without a lot of formalities. Auditing access after the fact is likely to be our best answer to this problem, as it is to the very similar problem of how to let law enforcement and intelligence agencies share information smoothly and quickly in response to changing and urgent circumstances. The Markle Foundation has done pioneering work in this area, and its path-breaking 2003 report on privacy and security in the war on terror recommends embracing technologies that watch the watchers.19 A unique mix of security, privacy, and technology experts managed to reach agreement in that report; they found that one key to protecting privacy without sacrificing security was a network that included "access control, authentication, and full auditing capability."20
The Markle report urges that large databases with personal information use emerging technologies that can identify all users of the system with certainty and then give them access that depends on their roles at any particular time. This includes "the ability to restrict access privileges so that data can be used only for a particular purpose, for a finite period of time, and by people with the necessary permissions."21 The technologies they cited are not pie in the sky. They exist today: "smart cards with embedded chips, tokens, biometrics, and security circuits" as well as "[i]nformation rights management technologies."22 The Markle task force later did a thoughtful paper on one of those technologies, which would preserve audit logs even if high-ranking officials seek to destroy or modify them later.23
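The "immutable audit log" idea can be illustrated with a standard technique: chain each log entry to a cryptographic hash of the entry before it, so that any later deletion or edit breaks the chain and is detectable. The sketch below is a generic illustration of that concept, not the Markle task force's actual design.

```python
import hashlib
import json

class HashChainedLog:
    """Tamper-evident audit log: each entry commits to the digest of the
    entry before it, so editing or deleting a past entry breaks the chain.
    A generic sketch of the idea, not the Markle task force's design."""

    GENESIS = "0" * 64  # placeholder digest before the first entry

    def __init__(self):
        self.entries = []

    def append(self, event):
        prev = self.entries[-1]["digest"] if self.entries else self.GENESIS
        body = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((prev + body).encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev, "digest": digest})

    def verify(self):
        """Recompute every link; return False if any entry was altered."""
        prev = self.GENESIS
        for e in self.entries:
            body = json.dumps(e["event"], sort_keys=True)
            if e["prev"] != prev:
                return False
            if hashlib.sha256((prev + body).encode()).hexdigest() != e["digest"]:
                return False
            prev = e["digest"]
        return True
```

In a real deployment the chain's latest digest would also be published or escrowed outside the system, so that even an administrator who can rewrite the whole log cannot do so undetected.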
These technologies can be very flexible. This makes them especially suitable for cases where outright denial of data access could have fatal results. The tools can be set to give some people immediate access, or to open the databases in certain situations, with an audit to follow. They can monitor each person with access to the data and learn that person's access patterns—what kinds of data, at what time, for how long, with or without copying, and the like. Deviations from the established pattern can have many consequences. Perhaps access will be granted but the person will be alerted that an explanation must be offered within twenty-four hours. Or access could be granted while a silent alarm sounds, allowing systems administrators to begin a real-time investigation.

19 Markle Foundation Task Force, Creating a Trusted Network for Homeland Security, Dec. 2003, http://www.markle.org/downloadable_assets/nstf_report2_full_report.pdf.
20 Id. at 15.
21 Id.
22 Id.
23 Markle Foundation Task Force, Implementing a Trusted Information Sharing Environment: Using Immutable Audit Logs to Increase Security, Trust, and Accountability (2006), available at http://www.markle.org/downloadable_assets/nstf_IAL_020906.pdf.
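The pattern-deviation idea just described can be reduced to a toy heuristic. Here a simple per-day volume threshold stands in for the richer behavioral profiling the text envisions; note that access is still granted, and the alarm is silent, intended for the auditors rather than the user. The class name, user names, and threshold are invented for illustration.

```python
from collections import Counter

class AccessMonitor:
    """Flags reads that deviate from a user's expected pattern.
    A toy volume threshold stands in for real behavioral profiling."""

    def __init__(self, daily_limit=10):
        self.daily_limit = daily_limit
        self.counts = Counter()  # (user, day) -> number of reads that day

    def record(self, user, day):
        """Log one read. Access is always granted; the return value is a
        silent alarm for the auditors, not a denial."""
        self.counts[(user, day)] += 1
        if self.counts[(user, day)] > self.daily_limit:
            return f"alert: {user} made {self.counts[(user, day)]} reads on {day}"
        return None
```

A production system would learn each user's baseline (hours, record types, copy behavior) rather than apply one fixed limit, but the architecture is the same: grant first, watch always, escalate on deviation.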
There's a kind of paradox at the heart of this solution. We can protect people from misuse of their data, but only by stripping network users of any privacy or anonymity when they look at the data. The privacy campaigners aren't likely to complain, though. In our experience, their interest in preserving the privacy of intelligence and law enforcement officers is pretty limited.
When I was general counsel of the National Security Agency, a well-known privacy group headed by Marc Rotenberg filed a Freedom of Information Act request asking the NSA to assemble all documents and emails sent "to or from Stewart Baker." Then as now, the NSA was forbidden to assemble files on American citizens who were not agents of a foreign power. Even so, Rotenberg was asking NSA to assemble a dossier on me. Since NSA and I were locked in a battle with Rotenberg over encryption policy at the time, the purpose of the dossier was almost certainly to look for embarrassing information that might help Rotenberg in his political fight. Indeed, Rotenberg claimed when I confronted him that he was planning to scrutinize my dossier for evidence of misconduct.
Had the FBI or NSA assembled a dossier on their political adversaries, it would have been a violation of law. In fact, it would have caused a privacy scandal. But Rotenberg saw no irony in his request. It wasn't a privacy problem, in his view, because government officials deserve no privacy.
I still think Rotenberg's tactics were reprehensible: He had singled me out for a selective loss of privacy because he didn't like my views. But I've come to appreciate that there's a core of truth to his view of government. Anyone who has access to government files containing personal data has special responsibilities. He should not expect the same privacy when he searches that data as he has while he's surfing the net at home. And now that technology makes it easy to authenticate and track every person, every device, and every action on a network, perhaps it's time to use that technology to preserve everyone else's privacy.
In the end, that's the difference between a privacy policy that makes sense and one that doesn't. We can't lock up data that is getting cheaper every day. Pretending that it's property won't work. Putting "predicates" between government and the data it needs won't work, and neither will insisting that data may only be used for purposes foreseen when it was collected.
What we can do is use new information technology tools to deter government officials from misusing their access to that data.
As you know by now, I think that some technology poses extraordinary risks. But we can avoid the worst risks if we take action early. We shouldn't try to stop the trajectory of new technology. But we can bend it just a little. Call it a course correction on an exponential curve.
That's also true for privacy. The future is coming—like it or not. Our data will be everywhere. But we can bend the curve of technology to make those who hold the data more accountable. Bending the exponential curve a bit—that's a privacy policy that could work. And a technology policy that makes sense.
A Market Approach to Privacy Policy

By Larry Downes*
Privacy: The Problem

What happens when the cost of deleting information is higher than the cost of retaining it?
The answer is that nothing gets deleted. In the age of cloud computing, mobile devices, and social networking, what that really means is that more and more data—some of it enterprise data, some of it personal information, and more and more of it something that merges the two—is being saved.
Soon, perhaps already, much of it will be consolidated, aggregated, reorganized, and mined for valuable patterns, behaviors, and insights. Privacy has become an unintended casualty of Moore's Law—collateral damage from friendly fire.
That, at least, is one way of thinking about privacy in the digital age, one that has been on my mind for the last several months. I wrote about the privacy problem in my recent book, The Laws of Disruption, in which I argued that the real solution to concerns about privacy in the digital age would be the emergence of robust markets for private information, where consumers would be able to trade personal information with other individuals and enterprises when doing so generated mutual benefit.1
The privacy problem has morphed since then into the latest terror of the digital age, surpassing earlier shibboleths, such as copyright piracy, identity theft, cyber war and net neutrality.2 Daily media coverage of the latest privacy policy change, hacking incident, stolen government laptop or inadvertent disclosure has raised the stakes and the tension in a problem that, it seems, people react to with such strong emotions that rational discussion of any solution is now impossible.3

* Larry Downes is an Internet analyst and consultant, helping clients develop business strategies in an age of constant disruption caused by information technology. He is the author of Unleashing the Killer App: Digital Strategies for Market Dominance (Harvard Business School Press 1998) and, most recently, of The Laws of Disruption: Harnessing the New Forces That Govern Life and Business in the Digital Age (Basic Books 2009) [hereinafter The Laws of Disruption].
1 See Larry Downes, The Laws of Disruption, Law Two: Privacy.
2 As I've written elsewhere, all of these problems share a common core. Each raises the fundamental question about the nature of digital life and by whom and how its basic infrastructure is to be governed. In some sense, each is another view of the same regulatory problem, seen through lenses that are equally unfocused, but in different ways. See Larry Downes, After the Deluge, More Deluge, The Technology Liberation Front, July 22, 2010, http://techliberation.com/2010/07/22/after-the-deluge-more-deluge/.
The privacy crisis is very much on the mind of regulators around the world, who see the emergence of privacy fears among consumers as the latest and perhaps the best opportunity to gain a toehold in regulating (and perhaps taxing) content on the web. Nearly all of the earlier efforts—including outright censorship, imposition of protectionist laws on global e-commerce, and enforcement of strict copyright, trademark and patent regimes onto the evolving collaborative ethos of digital life—have failed utterly.4 By aligning themselves with consumer interests (and perhaps helping to stoke the fires of anxiety), regulators may have at last found their point of entry into the market for Internet regulation.
That certainly seems to be the attitude adopted by the once-moribund U.S. Federal Trade Commission (FTC), which began a series of workshops in late 2009 aimed at exploring “the privacy challenges posed by the vast array of 21st century technology and business practices that collect and use consumer data.” 5 On January 28th, 2010, which was also dubbed Data Privacy Day by the nonprofit group The Privacy Projects, 6 the second workshop in the FTC’s three-part 7 series took place at the University of California, Berkeley campus. Attendees heard from government, business, and public interest speakers on
3 Recent examples include the Google Maps drive-by, see Robert Graham, Technical Details of the Street View WiFi Payload Controversy, ERRATA SECURITY, May 19, 2010, http://erratasec.blogspot.com/2010/05/technical-details-of-street-view-wifi.html, Facebook’s on-going changes to its privacy policy and user options, Twitter’s FTC settlement, see Press Release, Federal Trade Commission, Twitter Settles Charges that it Failed to Protect Consumers’ Personal Information; Company Will Establish Independently Audited Information Security Program, June 24, 2010, http://www.ftc.gov/opa/2010/06/twitter.shtm, the botched launch of Google Buzz, see Danny Goodwin, Google to Pay $8.5 Million in Buzz Privacy Class Action Settlement, SEARCHENGINEWATCH.COM, Nov. 3, 2010, http://blog.searchenginewatch.com/101103-081738, shocking behavior on Chatroulette, the conviction of Google executives in an Italian court over a video showing the bullying of a minor with disabilities by a Google Video user, see Reuters, Italy Convicts Google Execs for Down Syndrome Video, Feb. 24, 2010, http://www.wired.com/epicenter/2010/02/google-executive-convicted-in-italy-for-downs-video/, and the launch of “social shopping” websites such as Blippy and Swipely.
4 See generally, THE LAWS OF DISRUPTION, supra note 1.
5 See Exploring Privacy: A Roundtable Series, Fed. Trade Comm’n, www.ftc.gov/bcp/workshops/privacyroundtables/ [hereinafter FTC Roundtable Series].
6 For more information on Data Privacy Day, visit http://dataprivacyday2010.org.
7 See FTC Roundtable Series, supra note 5.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 511
whether and how the FTC should regulate the private collection and use of data to protect consumer privacy interests. The conversation that day, characterized by histrionic rhetoric, self-congratulatory moralizing, and an utter lack of focus, reflects well the current state of the so-called “privacy problem.”
Why is the FTC holding such hearings in the first place? The agency’s charter, which has evolved over its long history, includes policing anticompetitive behavior 8 and enforcing a Congressional ban on “unfair and deceptive acts or practices.” So far, the FTC’s main contribution to the debate about digital privacy has been the drafting of non-binding guidelines for consumer notice of online services’ privacy policies, the so-called “Fair Information Practice Principles (FIPs).” 9 In the United States, the adoption of the FIPs is voluntary, but failure to abide by them can lead to FTC enforcement. 10
The limits of the so-called “notice” regime are pretty obvious. Consumers don’t read privacy policies. Even if they did, they would find them absurdly long, most of them written in some of the worst legalese I’ve ever seen.
Notice is also difficult to achieve in practice. During the last few years, Facebook has repeatedly landed in trouble for its mostly-admirable efforts to craft a working privacy regime for its now 500 million users. The generally poor response to these efforts, I think, stems from a growing privacy paranoia fueled by the media and governments, kindled by the growing pains of a company that by its nature deals with very personal, even intimate, information and whose growth rate challenges pretty much everything.
At the same time, the company’s founder, Mark Zuckerberg, has demonstrated remarkably poor timing and nearly perfect political tone deafness. Even as the company dug itself out of criticism of a new set of privacy tools in the fall, Zuckerberg told an audience in January 2010 that “the social norm” for sharing private information had “evolved.” 11 Well, he is only 25 years old, and many of
8 The FTC recently reached a settlement with Intel on a broad regulatory action brought against the company. See Press Release, Fed. Trade Comm., FTC Settles Charges of Anticompetitive Conduct Against Intel, Aug. 4, 2010, http://www.ftc.gov/opa/2010/08/intel.shtm.
9 For more information on the Fair Information Practice Principles, visit http://www.ftc.gov/reports/privacy3/fairinfo.shtm.
10 The FTC can take action if a website violates its own policy on the basis of its jurisdiction over deceptive practices. But there is no requirement for a website to have a privacy policy, and no explicit privacy protection in U.S. law for most categories of personally-identifiable information.
11 See Marshall Kirkpatrick, Facebook’s Zuckerberg Says Age of Privacy is Over, READWRITEWEB, Jan. 9, 2010, http://www.readwriteweb.com/archives/facebooks_zuckerberg_says_the_age_of_privacy_is_ov.php.
his comments, including these, have been unfairly taken out of context by mainstream, business, and technology media companies. 12
The limits of notice become acute when the data collection device is not a computer. Cell phones, in addition to taking on more and more data functions, collect a great deal of information about the location and movements of their users—they have to in order to function. But many applications such as Loopt 13 and Foursquare 14 take advantage of GPS data to offer services not possible on a fixed computing device, including locating friends or providing location-specific ads. On the smaller screen of a cell phone, reading any document is difficult enough. What consumer in any case is going to read a separate privacy policy for every application they download?
Even as it continues to refine and promote FIPs, the FTC has held hearings, workshops and other information-gathering sessions regarding emerging technologies that seem to raise new and worrisome privacy concerns. These have included Radio Frequency ID tags, targeted or “behavioral” advertisements, cookies and now Flash-based “super cookies.” 15
There was plenty of evidence at the Berkeley session of these and more of what Microsoft’s Peter Cullen called “anxiety-based conversations.” 16 This year’s themes include the dangers of mobile computing, social networking, and cloud-based computing, as well as continuing hand-wringing over targeted or contextual advertising.
The structure of these conversations doesn’t change much over time. The new technology is discussed by law professors, company representatives, and FTC
12 Caroline McCarthy, Facebook Follies: A Brief History, CNET NEWS.COM, May 13, 2010, http://news.cnet.com/8301-13577_3-20004853-36.html; Guilbert Gates, Facebook Privacy: A Bewildering Tangle of Options, N.Y. TIMES, May 12, 2010, http://www.nytimes.com/interactive/2010/05/12/business/facebook-privacy.html.
13 Loopt is a mobile mapping service that allows users to find local information using the geographic location of their mobile phones. For more information, visit http://www.loopt.com.
14 Foursquare is a location-based social networking application that allows users to connect with friends in their geographic location. For more information, visit http://foursquare.com.
15 The super cookie “problem,” piled on by many speakers at the Berkeley workshop, turned out to be a red herring, easily controlled by engineered fixes to major browsers. It’s hard not to read too much into that example. See Berin Szoka, Privacy Innovation: Adobe Flash Supports Private Browsing & Deletes Flash Cookies, THE TECH. LIBERATION FRONT, Feb. 17, 2010, http://techliberation.com/2010/02/17/privacy-innovation-adobe-flash-supports-private-browsing-deletes-flash-cookies/.
16 Private Conversation with Peter Cullen, Jan. 28, 2010 (notes on file with author).
staff (everybody but the engineers who know how the technology works), who try to point out the benefits and, mostly, the dangers of the technology. Extremists on either end of the spectrum call for the FTC to either ban or ignore the development. In typical fashion, satisfied that all sides have been heard, the agency takes the problem under advisement, and then waits for the next crisis du jour to emerge. No legislation or regulations are enacted.
Well, that’s probably just as well—assuming there is no real privacy crisis that needs to be addressed, or rather that needs to be addressed by an organization with the FTC’s institutional limitations. I mean no disrespect to the FTC’s hardworking staff. By institutional limits, I am thinking of the inherent constraints on a U.S. regulatory agency. These include the mismatch of a national regulator supervising behavior that is natively global, problems in revising rules and jurisdictions for an environment that is evolving at accelerating speeds using a process designed to be slow and deliberative, and the dangers of solving what are largely engineering problems with staff whose expertise is policy—and offline policy at that.
It’s probably clear that I don’t think there is a crisis. But I admit that it’s difficult to know. Both sides in this non-debate have an unfortunate habit of relying on unscientific survey data and anecdotal evidence, the latter of which, on closer inspection, turns out to be highly incomplete if not urban myth. Pam Dixon of the World Privacy Forum, for example, told a story about grocer Whole Foods using facial recognition software in stores to collect data from the tomato aisle for what Dixon called “direct marketing purposes.” But according to the Forum’s own report, The One Way Mirror Society (whose investigation of the Whole Foods story was limited to parsing company press releases), the software could at best distinguish gender, not specifically identify customers. 17 Direct marketing is targeted to an individual (or their likely interests) rather than demographic characteristics such as gender. So whatever Whole Foods was doing, it wasn’t direct marketing.
The surveys that purportedly show a privacy panic are, for the most part, poorly constructed and unscientifically executed. For some reason, law professors and their students, who have at best a casual acquaintance with the methods and rigors of any social science, are the ones called on by private and public actors to conduct these studies.
Consider, for example, two questions about the same technical feature of cell phones. If you ask for an emotional response to the statement, “My cell phone
17 Pam Dixon, The One-Way-Mirror Society: Privacy Implications of the New Digital Signage Networks, WORLD PRIVACY FORUM, Jan. 27, 2010, http://www.worldprivacyforum.org/pdf/onewaymirrorsocietyfs.pdf.
tracks where I go” you’ll get one answer, but if you phrase it, “My cell phone can tell me where I am,” you’ll get a very different result. It’s the same feature. The “findings” are useless. The choice and wording of the questions are the only valuable information, in that they reveal a great deal about the authors of the survey. But the surveys tell us nothing about the respondents or the choices they would make when faced with real-world tradeoffs between restricting data and the benefits that flow from it—such as getting more relevant ads or a greater quality and quantity of “free” online content and services supported by advertising revenue.
Privacy: Defining the Problem
There’s a bigger problem here, and that is with the terms of the debate. In most conversations, no one knows what anyone else means by “privacy,” or what information is included in the term “personally-identifiable information,” which drives much of the privacy regulation in the European Union. The discussion at the FTC’s Berkeley roundtable, as with all privacy discussions, conflated several different information problems into one, freely mixing and matching issues and regulatory solutions that don’t actually go together. Until we separate the problems and solve them individually, the chances for meaningful policy solutions are nil.
To start with, it’s essential to understand the unique properties of information as an economic good. Information has very different properties from traditional commodities such as farm products, timber, and oil. Information can be used simultaneously by everyone, for one thing, and when we’re done using it, it’s still there, potentially more valuable because of the use. These remarkable features make information what economists call a non-rivalrous or “public” good, and they are the main reason that information now drives economic activity in much of the developed world. (The other is the continued decline in the cost of computing power.)
So rather than talking about who “owns” private information or who is “stealing” data (words that make more sense when talking about traditional commodities), I find it much more constructive to talk about whether any particular use of information is “productive” or “destructive.” A productive use of information is one that makes it more valuable, including collaboration, remixing, and validation. Destructive uses leave the information less valuable, and include misrepresentation, misidentification, and dilution. 18
18 Some information uses include both productive and destructive elements. Arguably, remixing and other information sampling adds value to information protected by copyright and trademark law while potentially diluting markets the law protects on behalf of the information producer. See Larry Downes, Viacom v. YouTube: The Principle of Least Cost Avoidance, The Tech. Liberation Front, June 26, 2010,
As I argue in my book, The Laws of Disruption, privacy laws are best understood as legal protections against destructive uses of information by different categories of users. Seen that way, there is not one overarching and overwhelming privacy problem, but several very different privacy problems, each deserving of particular analysis and, one hopes, its own resolution. Here are the main categories of destructive information uses:
Information User: Destructive Uses

Criminals: Identity theft, phishing, malware and other forms of fraud

Commercial enterprises: Surreptitious collection of consumer information for sale or use in marketing, often without adequate compensation or revenue sharing with the consumer

Other consumers, friends, family: Stalking, bullying, accidental or intentional disclosure of embarrassing or secret information

Government and other state actors: Unlawful search and seizure, accidental disclosure

News media: Publication of defamatory or erroneous information that damages reputation

Employers and business associates: Eavesdropping and other monitoring to identify poor performance, violation of employer rules, or business secrets

Insurers and health care professionals: Collection and use of known and potential risks to determine coverage or the danger of accidental disclosure
Though many of these destructive uses were discussed at the FTC hearing, it should be noted at the outset that the agency’s charter extends only to the first and second categories. 19 (To be fair, FTC staff frequently reminded the speakers to limit their discussion to topics over which the agency had jurisdiction.) Why, then, did the speakers repeatedly bring up all the others? For one thing, some
http://techliberation.com/2010/06/26/viacom-v-youtube-the-principle-of-least-cost-avoidance/. For now, I’ll stick to the “easier” problems of uses that are almost purely destructive.
19 See Fed. Trade Comm’n, Privacy Initiatives, Introduction, http://www.ftc.gov/privacy. As the agency explains the scope of its privacy initiatives, “The Federal Trade Commission is educating consumers and businesses about the importance of personal information privacy, including the security of personal information. Under the FTC Act, the Commission guards against unfairness and deception by enforcing companies’ privacy promises about how they collect, use and secure consumers’ personal information. Under the Gramm-Leach-Bliley Act, the Commission has implemented rules concerning financial privacy notices and the administrative, technical and physical safeguarding of personal information, and it aggressively enforces against pretexting. The Commission also protects consumer privacy under the Fair Credit Reporting Act and the Children’s Online Privacy Protection Act.”
of the most lurid stories suggesting a privacy crisis come from the other categories, making them irresistible to those arguing for a crisis response. Unfortunately, most parties in the privacy “debate” so far have shown little interest in cabining the discussion to manageable and discrete problems when an emotional point is there to be scored.
Indeed, one important reason to evaluate the categories of destructive information use separately is to help us see that some goals of the privacy movement are mutually exclusive. In the abstract, for example, most people are uncomfortable with the proliferation of surveillance cameras in urban locations. But listen to the indignation that erupts the minute a serious crime or terrorist act occurs and the police turn out not to have caught it on film.
Or take the often-repeated example of the victim of domestic violence, used as a stalking horse for the proposition that search engines, cell phone carriers, and other service providers, who collect bits and pieces of information that might be used to identify and locate an individual, should immediately purge their databases, lest they fall into the wrong hands.
Turn the problem around, however, and you can make the exact opposite case. For the victim, it’s important to erase all traces of their online activity. But for the perpetrator, effective law enforcement requires as much information as possible. 20 Optimally, we’d like to tell information collectors to purge data about victims but retain it for criminals, but of course, we don’t know who is who until after the fact. What’s a data collector to do? 21
This isn’t a hypothetical problem. Lawmakers in the United States and the European Union are simultaneously putting pressure on phone companies, search engines and social networking sites to both purge and retain the same data, a kind of whipsaw that has led these providers uncharacteristically to call for new laws—laws that would give them a straight answer on what is expected of them.
It’s also worth noting that in the United States in particular, most of the existing legal protections for private information are squarely aimed at deterring destructive uses by criminals and by governments themselves. Every year I have to convince another batch of students that the right to privacy recognized by U.S. courts and grounded in the U.S. Constitution does not apply to conflicts
20 Declan McCullagh, Web Searches Lead to Murder Conviction, CNET NEWS.COM, Feb. 12, 2010, http://news.cnet.com/8301-13578_3-10452471-38.html.
21 See Miguel Helft, For Data, Tug Grows Over Privacy vs. Security, N.Y. TIMES, Aug. 2, 2010, http://www.nytimes.com/2010/08/03/technology/03blackberry.html. See also Lance Whitney, German Court Rules Against Data Retention Policy, CNET NEWS.COM, March 2, 2010, http://news.cnet.com/8301-13578_3-10462117-38.html.
with commercial enterprises, parents, friends, or the news media. (Indeed, the latter are strongly protected by the First Amendment against such regulation.) With at least a whiff of First Amendment rationale, even the common law torts that deal with conflicts between individuals over information use—including rights of publicity, defamation, and “false light” claims—have fallen into disrepute during the last fifty years. 22
Here there are also economic forces at work: information technology has erased the temporary mask of anonymity created by 19th century urban life, reviving a social goal of transparency as old as Hawthorne’s The Scarlet Letter. 23 American life, built of equal parts frontier necessity and Puritan aspiration, calls for complete and accurate information about each other. That goal increasingly weighs more heavily in private disputes over what Warren and Brandeis in 1890 famously termed “the right to be let alone.” 24 More about that in a moment.
Why the focus on state actors? The principal fear of the drafters of the Constitution, obviously informed by the experience of the colonies, was potential tyranny from government. The government, after all, practically holds a monopoly on the coercive power of the military and the ability to incarcerate. For historical reasons, the focus is very different in Europe and many parts of Asia, which have enacted strong privacy laws that align citizens and democratic governments against everybody else. This difference between U.S. and European law is perhaps the broadest example of the effort to apply terrestrial laws to digital life generally. 25
In the United States, distrust of government evolved to include a fear that private information could be misused to achieve the same ends as more overt repression. These fears were underscored by the late 20th century scandals,
22 See Haynes v. Alfred A. Knopf, Inc., 8 F.3d 1222 (7th Cir. 1993) (opinion by Posner).
23 NATHANIEL HAWTHORNE, THE SCARLET LETTER (Ticknor, Reed & Fields 1850).
24 Samuel Warren & Louis D. Brandeis, The Right to Privacy, 4 HARV. L. REV. 193 (1890) (discussed below). See also Peggy Noonan, The Eyes Have It, WALL ST. J., May 21, 2010, http://online.wsj.com/article/NA_WSJ_PUB:SB10001424052748703559004575256732042885638.html.
25 See Adam Liptak, When American and European Ideas of Privacy Collide, N.Y. TIMES, Feb. 26, 2010, http://www.nytimes.com/2010/02/28/weekinreview/28liptak.html; Kevin J. O’Brien, Europe Says Search Firms Are Violating Data Rules, N.Y. TIMES, May 26, 2010, http://www.nytimes.com/2010/05/27/technology/27data.html; Jessica E. Vascellaro, Ten Countries Ask Google to do More to Protect Privacy, WALL ST. J., April 20, 2010, http://online.wsj.com/article/NA_WSJ_PUB:SB10001424052748704671904575194992879579682.html.
including The Pentagon Papers and later Watergate, which led to the first comprehensive anti-wiretapping law at the federal level. 26
So, with notable exceptions, all existing U.S. privacy laws afford limited—and perhaps inadequate—protection to citizens against their governments. That fact is often whitewashed in the furor of the privacy debate. The ACLU of Northern California, for example, recently published a white paper called Cloud Computing: Storm Warnings for Privacy? 27 The paper points out the mismatch between existing privacy law and the reality of cloud computing, where personal information is turned over for storage and processing to a variety of third parties and often to their unnamed and changing business partners.

The report never quite says so directly, but all the proposed reforms are aimed at curbing the ability of state actors, not private parties, to gain access and make use of data in the cloud. It is not “consumers,” as the report characterizes them, who need to be worried about their “privacy protections” in the cloud. It is citizens. I think the ACLU is right to be worried about government access to cloud data sources, but I wish it wouldn’t pretend to have a broader agenda than it does.
Though the right to privacy against government is now firmly established, it’s also worth remembering that this is a relatively new right. Despite all the talk of one’s right to privacy, you will scour the Bill of Rights in vain for any reference to privacy, even against state actors.
It wasn’t until the 1960s that the Supreme Court began to interpret the First Amendment’s free speech provisions, along with the Fourth Amendment’s prohibition against unreasonable searches and seizures of people and their property, as implying a more general right to privacy enforceable against state actors. 28 In key cases, including Griswold v. Connecticut (birth control), Roe v. Wade (abortion), and more recently Lawrence v. Texas (homosexuality), the Court struggled to rein in government intrusions into the private lives of citizens, intrusions the Founders would never have imagined possible. Lacking a
26 See The Omnibus Crime Control and Safe Streets Act of 1968, 18 U.S.C. §§ 2510 et seq.
27 Am. Civil Liberties Union of N. Cal., Cloud Computing: Storm Warnings for Privacy?, Jan. 2010, http://www.dotrights.org/sites/default/files/Cloud%20Computing%20Issue%20Paper.pdf.
28 See, e.g., Griswold v. Connecticut, 381 U.S. 479, 483 (1965) (“In other words, the First Amendment has a penumbra where privacy is protected from governmental intrusion.”); Roe v. Wade, 410 U.S. 113, 152 (1973) (“The Constitution does not explicitly mention any right of privacy. In a line of decisions, however, going back perhaps as far as [1891], the Court has recognized that a right of personal privacy, or a guarantee of certain areas or zones of privacy, does exist under the Constitution.”).
Constitutional provision that guaranteed the right to privacy for basic human activities, the Court more-or-less invented one.

At the same time, as 20th century society came to recognize that information was a kind of property (something with value, in any case), the perception of a right to privacy emerged. It was found, to use Justice Douglas’s famous but unfortunate phrase, in “the penumbras and emanations” of the Bill of Rights. 29 But to reiterate, the right to privacy, as it is currently understood, is a right to be free of unreasonable interference from government, not from each other or the businesses with which we interact.
Some of today’s most vocal privacy advocates are calling for a broader right of privacy, one that could be asserted against any or all of the destructive information uses and perhaps against many of the productive uses as well.
An earlier effort to create such a right, it is worth noting, failed. In 1890, Samuel Warren and Louis Brandeis wrote a famous article for the Harvard Law Review in which they called for the creation of a general right of privacy, defined as the “right to be let alone.” 30 The article was inspired by Warren’s personal experience. Warren’s wife was the daughter of a U.S. Senator, and Warren was appalled to discover that their daughter’s wedding was reported, with photographs, in The Washington Post. 31
Warren and Brandeis’s proposal led to some experimentation, mostly at the state level and mostly through judge-made law. Common law courts invented new tort injuries for damage to reputation, rights of publicity, and expanded forms of defamation, including the portrayal of someone in a “false light.” Many of these rights are no longer recognized, or have become nearly impossible to enforce.
Under the Supreme Court’s 1964 decision in New York Times v. Sullivan, 32 for example, public officials must prove actual malice to recover damages for defamation. Extending that decision, the Court held in the Florida Star 33 case that a rape victim could not prevent a newspaper from publishing information from a police report about the crime committed against her. More recently,
29 Griswold, 381 U.S. at 484. Justice Thomas, a strict constructionist, has a sign on his desk
asking visitors to kindly keep their penumbras off his emanations.
30 Warren & Brandeis, supra note 24.
31 What would Warren think of today’s celebrity media, or reality TV, in which non-celebrities
volunteer to give themselves the celebrity treatment?
32 376 U.S. 254 (1964).
33 Florida Star v. B.J.F., 491 U.S. 524 (1989).
520 CHAPTER 8: WHAT FUTURE FOR PRIVACY?
homeowners have been rebuffed in efforts to prohibit Google Maps from
photographing their homes. 34
The Warren and Brandeis experiment failed for a good reason, at least from an
economic standpoint. Many of the uses of private information that Warren and
Brandeis sought to outlaw are in fact productive uses, generating substantial
social value that outweighs the costs to individuals of having such information
made public.
It may be very important for you to keep secret the fact that you have a criminal
record, a communicable disease, a Swiss bank account or a secret liaison with an
employee. But it is more valuable to the rest of us to know these things about
you, if only to know how many others have the same attribute so we can take
appropriate actions—quarantine the sick, pass stricter banking laws, etc. The
benefits of disclosure, the courts have determined, generally outweigh the costs
of secrecy.
Warren and Brandeis weren’t entirely wrong, however. It’s important to note
that the same “private” information can also be used destructively. I might
overestimate the risks of hiring a former felon, for example, or even exclude
potential tenants for my apartments based on an irrational reliance on personal
traits and associated stereotypes. That’s why we have anti-discrimination laws,
one of the notable exceptions where protections against misuse of personally-identifiable
information extend to commercial and other non-governmental
users. But it is an exception, narrowly focused on a particular abuse.
Generally speaking, in fact, privacy legislation in the U.S. has only been enacted
when legislators find particular and persistent market failures—failures, that is,
to use personally-identifiable information in rational ways. Along with
anti-discrimination laws, we have laws that control private information use in credit
reporting, medical records, and identity theft and other forms of financial
fraud. 35 In each of these cases, the law is focused on a particular destructive use
in a particular context, with remedies for consumers narrowly tailored to leave
as much information unprotected as possible.
Privacy: The Solution
Particular solutions to particular information use failures are, I believe, a model
lawmakers ought to be encouraged to continue following.
34 See Steven Musil, Google Wins Street View Privacy Suit, CNET NEWS.COM, Feb. 18, 2009,
http://news.cnet.com/8301-1023_3-10166532-93.html.
35 See Larry Downes, If Feds Fail, What Can Stop Identity Theft? CIO INSIGHT, July 2005,
http://www.cioinsight.com/c/a/Past-Opinions/If-Feds-Fail-What-can-Stop-Identity-Theft/.
But maybe it’s time to revisit Warren and Brandeis’s call for broader privacy
protections against non-state actors. The argument in favor would go
something like this: Now that the cost of information retention is less than the
cost of information deletion, commercial enterprises may soon wield the same
kind of coercive power that until now has usually been the domain of
governments.
Perhaps the same kinds of risks to society that led the courts to recognize a
“zone of privacy” for information collection and use now justify extending that
zone to the complicated web of enterprises that collect, consolidate, and resell
information about a wide range of consumer behaviors—as well as to
companies that by design collect intimate information, including social
networking sites and anything mobile.
But note that even if this argument carries the day—that is, if the benefits of
reducing destructive uses of private information in some of the other categories
exceed the costs (including inadvertent limits on some productive uses, such as
the advertising that supports so much “free” media)—it doesn’t necessarily
follow that the FTC or any other government entity is the right institution to
define, enact, and enforce those new legal rights.
Again, the FTC’s authority is limited to investigating anticompetitive behavior
and “unfair” or “deceptive” trade practices—terms that have clear meanings
under the agency’s statute, its previous decisions and policy statements, and
court cases interpreting the law.
Moreover, there is the agency’s tendency simply to react with dismay to new
technologies and then to move on when the technology soon after resolves its
own problems. This habit—not limited to the FTC by any means—leads me to
doubt its institutional capacity to define what kind of new privacy rights
consumers should have, independent of always-changing technological
capacities. The FTC’s staff invokes the phrase “privacy by design” like a mantra
in conversation with policy representatives and consumer advocates. The hope
behind that phrase is that future technologies can be engineered to protect
privacy interests. But the phrase is meaningless without first defining what
interests are to be protected.
There are bigger reasons to question whether federal or state government is
well-suited to the role of privacy cop. For one thing, competing government
interests in effective law enforcement create a kind of regulatory schizophrenia
over the use of privacy-enhancing technologies. As an example, consider
encryption. Given that one common theme of destructive information use in
several categories, as noted in the chart above, involves accidental disclosure
(including access gained illegally by hackers or by fraud), it might be thought
that more widespread use of encryption technology would reduce those risks.
But the federal government has been ambiguous at best about encouraging
enterprises to adopt encryption. In the 1990s, basic encryption methods were
classified as a “munition” by the Department of State (and later, the
Department of Commerce). Phil Zimmermann, the creator of the open source
encryption program known as Pretty Good Privacy, was the subject of a lengthy
criminal investigation for exporting “weapons” without a license. Only under
intense pressure from the courts in a series of First Amendment decisions did
the Commerce Department finally liberalize these export controls.
There was also the memorable fight in the early 1990s over the Clipper Chip, a
government-developed encryption technology for use in cell phones. 36 Had the
chip been made mandatory or even widely deployed (opponents, including the
Electronic Frontier Foundation (EFF), successfully prevented that outcome), it
would have given law enforcement a back door into mobile phone calls and
might have led to the prohibition of other encryption technologies. 37
Federal and state governments also receive failing grades at adopting and using
the kinds of safe data handling practices some think these governments should
enforce against everyone else. That failure is evidenced by numerous
embarrassing breaches and stolen unsecured laptops containing millions of
records of sensitive citizen data. Government in this sense offers a good
example (several, actually) of how not to manage privacy—the worst, not the
best, safe handling practices. Finally, a general problem of state, as opposed to
federal, legislation is the potential for fifty different definitions of privacy, all
imposed on what is a global information economy.
More broadly, there’s a disturbing irony to handing privacy enforcement over to
governments, an argument I have also made with regard to enforcement of net
neutrality principles. Enforcement powers would require the agency to examine
the information and how it was disclosed. For agencies to investigate and
punish banned uses of private information, they would necessarily need to have
access to the data sources in question.
If I complain to the FTC, for example, that a social networking enterprise is
selling my information in violation of some future privacy protection law, how
will the agency evaluate my claim without looking deeply into the company’s
information practices? Like any good auditor, it will need to follow the flow of
information from beginning to end to determine what, if any, violations of law
occurred.
36 For more information on the Clipper Chip, including government documents and public
response, visit http://epic.org/crypto/clipper.
37 The Clipper Chip fight was instrumental in the rise of EFF, proving once again the law of
unintended consequences. The government lost the battle, but more importantly, it
inadvertently helped to organize a permanent and effective opposition.
Yet state actors, as noted above, are currently the most restricted from seeing
this information. In a bitter irony, enforcement of privacy rights against
enterprises would have the effect of exposing to the government data that it
otherwise would be forbidden to see. As a result, there is the potential for
abuse of that privilege in the name of law enforcement or other government
priorities (e.g., terrorism or tax fraud). 38
Who else can define, let alone enforce, new privacy protections against
enterprises and other non-state actors? One obvious alternative is
self-regulation by the enterprises themselves, perhaps encouraged by the threat of
government intervention if the market fails.
There’s already a good deal of self-regulation, including voluntary adoption of
the FIPs and certification from third parties such as TRUSTe 39 and the Better
Business Bureau. 40 Much of this self-regulation, however, presumes the
usefulness of the notice regime—an assumption which, as noted, no one holds
any more.
The potential is there, however, for more effective self-regulation. Even
companies that make money from information collection and its use are
worried about the privacy problem, or at least the perception of one. Some are
even calling for new legislation, in part to solve the whipsaw problem described
above.
Microsoft and other companies that are counting on cloud computing as the
next major computing architecture are eager for regulatory frameworks that will
both allay consumer anxiety and give cloud-based service providers safe harbors
in which to develop their offerings.
An emerging consensus among a wide range of technology companies and
information service providers would move the discussion from guidelines about
notice of private information collection to rules about information use. The
“use” approach would develop acceptable principles for how collected and
retained data (private or otherwise) can be used, by whom, and with what
controls reserved to consumers to limit or block those uses.
As part of the Business Forum for Consumer Privacy (BFCP), Microsoft and
others are calling for a legislated framework for use-based rules. 41 What does
38 See Declan McCullagh, Amazon Fights Demand for Customer Records, CNET NEWS.COM, April
19, 2010, http://news.cnet.com/8301-13578_3-20002870-38.html.
39 See http://www.truste.com.
40 See http://www.bbb.org/us/Business-Accreditation.
41 The Business Forum for Consumer Privacy, A Use and Obligations Approach to Protecting Privacy:
A Discussion Document, Dec. 7, 2009,
that mean? According to a 2009 BFCP white paper, a “use-and-obligations”
model requires all organizations to be transparent, offer and honor appropriate
choice, ensure that risks to consumers related to use are assessed and managed,
and implement security for the information they maintain. 42 The white paper
describes a taxonomy of use types (for example, marketing, internal operations)
within the enterprise category, and tries to define an appropriate set of default
rules that should apply to each.
The BFCP approach is certainly more productive than the anxiety-based
conversations. Focusing on use, for one thing, moves the conversation away
from the emotional subject of privacy to the more rational subject of propriety,
by which I mean the recognition that both enterprises and consumers
participate in the creation of valuable new sources of data and both should have
rights to monetize that value. Consumers ought to be able to buy out
enterprises for productive uses they don’t want, and vice-versa.
In this scenario, privacy policy evolves from self-regulation by the market to
actually becoming a market for private and other data. Assuming this market
works, data will be put to the most highly-valued productive uses, and those
uses that are not valued or are destructive will not be implemented. 43 As with
any market, the government will stand in the background to ensure the rules are
obeyed and to intervene when market failures occur.
One immediate concern about the creation of a privacy market is that
consumers will have no voice in its creation or operation. Consumers, after all,
are individuals, easily overwhelmed by the economic might of large
corporations, which can be expected to develop rules that put consumers at an
unfair disadvantage in the information marketplace.
That would have been an entirely reasonable concern in the pre-digital age, and
one of the principal justifications for the creation of private and public
consumer watchdog groups such as the FTC in the first place (the agency is
nearly 100 years old). Such groups, at least in theory, lobby and sue on behalf
http://www.huntonfiles.com/files/webupload/CIPL_Use_and_Obligations_White_Paper.pdf.
42 See Hunton & Williams LLP, Business Forum for Consumer Privacy Introduces New Data Protection
Model, Dec. 21, 2009,
http://www.huntonprivacyblog.com/2009/12/articles/events/business-forum-for-consumer-privacy-introduces-new-data-protection-model/.
43 This emerging privacy market, operating with minimal transaction costs or “friction,” could
present a wonderful opportunity to test Ronald Coase’s theory that absent transaction costs,
stakeholders will necessarily bargain to the most productive use of a given resource. See
Ronald Coase, “The Problem of Social Cost,” 3 Journal of Law & Economics 1 (1960) (the
so-called “Coase Theorem”).
of large groups of consumers to ensure their interests are fairly represented in a
wide variety of market activities. (The class-action mechanism is another
example.) To use the economic terminology, these legal constructs overcome
the collective action problem of individual users whose individual losses may be
too small to justify the expense of enforcing or even negotiating rights.
But the same technologies that create the privacy problem are also proving to
be the source of its solution. Even without government intervention,
consumers increasingly have the ability to organize, identify their common
demands, and enforce their will on enterprises. 44
Facebook’s journey to a fair privacy policy continues to be instructive here.
The company’s ongoing privacy crisis actually started in early 2009, when
Facebook announced what was in fact a modest change to its terms of service.
Some users who read the modification objected to it, and used the very tools
Facebook provides for group formation to create a “People Against the New
Terms of Service” page, which quickly signed up 100,000 fans. 45 (Still a
relatively small number given Facebook’s size: now over 500 million users
worldwide.)
The revolt and (perhaps more influential) the ensuing bad publicity led
Facebook to withdraw the changes—no lawsuit or legislation necessary. Even
more, the company soon announced that it was changing its entire approach to
governance. In the future, the company said, it would rewrite its user
agreement to be understandable to lay readers, and circulate future changes as
proposals to the entire population of users.
If enough users objected to a proposed change, the changes would be put to a
vote. (The changes to privacy settings that set off the firestorm in the fall of
2009 were circulated ahead of time, but didn’t qualify for a vote, perhaps
suggesting that Facebook’s “constitution” still needs some tweaking.)
So assuming that buyers and sellers have equal access and power, how would a
market for private data work in practice? 46 There are already some good
44 Steve Lohr, You Want My Personal Data? Reward Me for It, N.Y. TIMES, July 16, 2010,
http://www.nytimes.com/2010/07/18/business/18unboxed.html; L. Gordon Crovitz,
Privacy Isn’t Everything on the Web, WALL ST. J., May 24, 2010,
http://online.wsj.com/article/NA_WSJ_PUB:SB10001424052748704546304575260470054326304.html.
45 To view the Facebook page, visit
http://www.facebook.com/group.php?gid=77069107432.
46 Venture capitalists have begun to recognize the potential of privacy markets and are
investing accordingly. See Pui-Wing Tam & Ben Worthen, Funds Invest in Privacy Start-ups,
WALL ST. J., June 20, 2010,
prototypes in place in the form of consumer loyalty and affinity programs that
have been around for decades. Grocery store scanner data can already reveal
which items are bought at the same time, but tying that data to particular
demographic characteristics (gender and zip code are the most important for
marketing purposes) requires the cooperation of the customer.
So, in exchange for using a club card and giving the store permission to connect
purchases to a particular customer, the consumer gets special discounts. In
effect, the store recognizes the value of the identifying information and pays the
consumer for it.
That approach is attractive for several reasons. For one thing, it is entirely
voluntary—consumers decide whether to sign up for the programs, and even
whether to participate in any individual transaction. This proto-market also
operates with very low transaction costs. Instead of negotiating for the
purchase of each individual data element with each individual consumer, a
framework is established that allows both parties to opt out at will.
Discounted prices become a new form of currency requiring no oversight. No
lawyers and no regulators are involved.
The loyalty program also makes explicit what I think is the most important
feature of the privacy debate, one that is usually lost in the extreme rhetoric of
both sides. Information only attains its true monetizable value when every
participant in the transaction has an incentive to provide it, authenticate it, and
protect it. Retail data is of limited use to the supply chain without demographic
linkages. Individual data is not marketable unless someone is able to collect and
analyze large volumes of it. As with all information exchanges, the whole is
greater than the sum of its parts.
Today, even fifty years into the computer revolution, most aspects of retail
engineering, including production planning, pricing and promotions, product
design, forecasting, and marketing, operate with precious little real information.
Most are more black arts than science. With the potential capture of complete
life-cycle transaction data, there is at last the hope of discipline, and with it great
increases in efficiency.
And in the next digital decade, we now have the potential for information to be
collected post-sale: How did you like the product? Did it perform as expected?
What other products did you use it with? It becomes even clearer that without
http://online.wsj.com/article/NA_WSJ_PUB:SB10001424052748703438604575315182025721578.html.
cooperation from the consumer, no real value will come from the vast data
warehouses that enterprises have built but rarely use effectively. 47
The privacy marketplace is already here, and there is every indication that as
technology continues to evolve, an increasingly robust and valuable set of
institutions will develop alongside it. As new forms of data enter the digital
domain, new tools and techniques will emerge to harness their value and
allocate it among those who develop it.
In order for the technology of information processing to reach its potential,
however, the histrionics of the privacy debate must stop. Instead, consumers
must be encouraged and educated to think about information use in terms of
productive and destructive costs and benefits. In short, we must learn to think
rationally about information value. If the FTC and other regulators are looking
for a role in solving the problems of online privacy, a good starting point would
be to contribute constructively to the emergence of this organic, elegant
solution.
47 See LARRY DOWNES, THE STRATEGY MACHINE: BUILDING YOUR BUSINESS ONE IDEA AT A
TIME (Harper Business 2002), Chapter 3, The Information Supply Chain. See also Jules
Polonetsky & Christopher Wolf, Solving the Privacy Dilemma, THE HUFFINGTON POST, July 27,
2010, http://www.huffingtonpost.com/jules-polonetsky/solving-the-privacy-dilem_b_660689.html.
CHAPTER 9
CAN SPEECH BE POLICED IN A BORDERLESS WORLD?
The Global Problem of State Censorship & the Need to Confront It 531
John G. Palfrey, Jr.
The Role of the Internet Community in Combating Hate Speech 547
Christopher Wolf
The Global Problem of State Censorship & the Need to Confront It
By John G. Palfrey, Jr. *
Speech is policed through technical Internet filtering in more than three dozen
states around the world. This practice is increasingly widespread. States
including China, Iran, Syria, Tunisia, and Uzbekistan have extensive Internet
filtering regimes in place. Censorship using technological filters is often
combined with restrictive laws related to what the press can publish, opaque
surveillance practices, and severe penalties for people who break the state’s
rules for using the Internet. This trend has been emerging, and documented
with precision, for nearly a decade. 1
An empirical study of technical Internet filtering tells only part of the story,
however. Speech is policed actively in parts of the world with regimes that are
substantially more democratic than China or Iran. Through mechanisms that
include surveillance, “encouraging” self-censorship, intellectual property
restrictions, and defamation laws, virtually every state in the world polices
online speech through multiple means. 2 As more and more of everyday life has
moved onto the Internet, so has regulation of that activity. These forms of
regulation are driven by the same types of concerns that animate the regulation
of speech in traditional environments.
* John Palfrey is the Henry N. Ess III Professor of Law at Harvard Law School and Faculty
Co-director of the Berkman Center for Internet & Society at Harvard University. This
chapter draws upon research by the OpenNet Initiative, which is a collaboration that joins
researchers at the Citizen Lab at the Munk Centre, University of Toronto (Prof. Ron
Deibert, principal investigator), the SecDev Group (formerly the University of Cambridge,
where Rafal Rohozinski is principal investigator), and the Berkman Center (where the author
and Jonathan Zittrain are co-principal investigators). The author is grateful to the large
number of researchers who have participated in gathering, over nearly a decade, the data on
which this chapter draws. Parts, though not all, of this argument have been published in
other volumes.
1 See www.opennet.net for the results of the OpenNet Initiative’s research since 2002. See
also Ronald Deibert, John Palfrey, Rafal Rohozinski, & Jonathan Zittrain, eds., ACCESS
DENIED: THE PRACTICE AND POLICY OF GLOBAL INTERNET FILTERING (MIT Press, 2008)
and Ronald Deibert, John Palfrey, Rafal Rohozinski, & Jonathan Zittrain, eds., ACCESS
CONTROLLED: THE SHAPING OF POWER, RIGHTS, AND RULE IN CYBERSPACE (MIT Press,
2010), in which variations on these themes appear throughout.
2 Like technical Internet filtering, this is not a new phenomenon. See Adam D. Thierer, 190
Internet Censors? Rising Global Threats to Online Speech, 38 TechKnowledge, July 26, 2002,
www.cato.org/pub_display.php?pub_id=11535.
The social, political, and cultural issues that give rise to online speech
restrictions are familiar. Child safety is one of the primary drivers of online
speech regulation, most commonly related to pornography. Police use
surveillance to track the misdeeds of wrong-doers, in turn causing a chilling
effect on online activities. Business people, authors, musicians, and others want
their intellectual property rights vindicated to the fullest extent, prompting
extensive intellectual property restrictions online. Those who perceive harm to
their reputations seek the extension of defamation laws to the digital world. In
each of these instances, speech restrictions cross traditional and digital
environments more or less seamlessly.
Is It True that “All Politics Is Local” in the Digitally-Mediated World?
The problem of policing speech on the Internet arises at every level of
government, from local political conversations about community norms to the
international debate over Internet governance. The principal difference across
these multiple levels of governance is the willingness to address the difficult
problems of when to regulate speech online and the ramifications that flow
from doing so. State legislatures fight over whether to constrain speech in
schools when young people engage in cyber-bullying. 3 In the United States,
Congress takes up the same issue from the perspective of federal law
enforcement and funding for schools and education. At the same time, the
topic of speech-based controls is an important aspect of global Internet
regulation. However, it rarely makes it onto the agenda at the international
level as a genuine, openly-discussed matter.
At the local level, there is often substantial appetite to regulate speech online,
primarily as a response to fears about child safety. The clearest example at the
local level in the United States is the effort to curb cyber-bullying through
school environments, which state, regional, and municipal governments seek to
regulate. 4 Speech, largely online in social network environments, is policed by
local regulation that bars young people from expression that may harm their
peers psychologically. While the debate rages as to the appropriateness of these
regulations, dozens of states in the United States have enacted, or have
considered enacting, these types of laws. Courts have split as to the
constitutionality of such provisions. In short, intense local concerns,
3 The First Amendment Center published a fine series of essays and research materials related
to the speech restrictions associated with the anti-bullying efforts, with special attention paid
to state legislative activities. See First Amendment Ctr., Online Symposium: Cyberbullying &
Public Schools,
www.firstamendmentcenter.org/collection.aspx?item=cyberbullying_public_schools.
4 Sameer Hinduja & Justin W. Patchin, Cyberbullying Research Center, State Cyberbullying Laws:
A Brief Review of State Cyberbullying Laws and Policies, July 2010,
www.cyberbullying.us/Bullying_and_Cyberbullying_Laws_20100701.pdf.
such as how young people treat one another, are driving legislative
attempts to regulate online speech through school-based enforcement. 5
At the federal level, online speech restrictions have arisen repeatedly as
proposals and as enacted law, in the United States and around the world.
Twenty-five years after the creation of the .COM top-level domain, it is
apparent that national governments can, and often do, assert sovereignty over
the acts of their citizens in the online environment, including limitations on
their speech. At a basic level, in the United States and in many other
jurisdictions, speech that is deemed to be harmful in the offline world is
considered equally unlawful in the online environment. For the speaker, there is
no free pass simply because the utterance in question appears online.
The primary difference between the policing of speech online and offline lies in<br />
terms of how intermediaries are treated. Under United States federal law, and<br />
the national law of many other jurisdictions, intermediaries that enable people<br />
to publish speech online are exempt from liability in most c<strong>as</strong>es, for example,<br />
from claims of defamation under Section 230 of the Communications Decency<br />
Act. 6 There are exceptions to this rule, even in the United States: Criminal<br />
matters and copyright violations fall outside of the statute’s safe harbor<br />
provisions. 7 In other jurisdictions, regulation of intermediaries may soon be the<br />
law. For instance, in Sweden, the Data Inspection Board issued a report in July<br />
2010 <strong>as</strong>serting that companies offering social media services, such <strong>as</strong> blogs,<br />
Facebook, or Twitter, have a legal obligation to monitor personal data posted to<br />
the pages on their site. 8<br />
At the international level, we observe extensive policing of speech online, but discussion of the issues involved is largely invisible—or else not happening at all. While there is extensive and healthy debate about many aspects of the problem of Internet governance, the discussion does not reach the hard problems of when online speech regulation should be permitted. There are many issues worthy of the attention of the many capable minds focused on Internet governance, but the topic of speech regulation rarely makes the list of what is, in fact, publicly discussed and vetted seriously. 9 The primary focus for
5 See Ronald Collins, A Look at “Cyber-bullying and Public Schools,” March 31, 2009, www.firstamendmentcenter.org/analysis.aspx?id=21410.
6 47 U.S.C. § 230.
7 See id.
8 See Swedish Data Inspection Board Report, July 5, 2010, www.datainspektionen.se/in-english/ (on intermediary liability); see also Companies Responsible for Social Media Content, THE LOCAL: SWEDEN’S NEWS IN ENGLISH, July 5, 2010, www.thelocal.se/27606/20100705/.
9 The International Telecommunication Union (ITU), official host of the World Summit on the Information Society (WSIS) in Geneva, has held several events designed to refine the debate further. Through these events, the ITU has convinced dozens of observers to
534 CHAPTER 9: CAN SPEECH BE POLICED IN A BORDERLESS WORLD?
Internet governance discussions remains the management of Internet resources, including the domain name system and related policy issues. Discussion of the non-profit Internet Corporation for Assigned Names and Numbers (ICANN) continues to play a central role. ICANN occupies an arcane bit of turf—essentially, the port allocation business. That is important in some respects but does not appear to concern most users of the Internet, particularly in a world in which most people find Internet resources through search engines and, increasingly, mobile devices and applications. 10 As an example, at the Internet Governance Forum 2009 meeting in Egypt, the first substantive panel of the event was devoted to traditional ICANN-related matters such as the transition from Internet Protocol version 4 (IPv4) to IPv6 and the addition of new top-level domains (TLDs). 11 Possible topics for consideration other than ICANN reform and these highly specific technical issues, each arguably more important to the end-users of the Internet and their sovereigns, have included a fund for developing countries to build Internet infrastructure, the quandary of what to do about spam, and a cluster of intellectual property concerns.
Internet speech restrictions should serve as the focal point for the world’s heads of state and their designees when Internet governance is on the table. While online speech restrictions raise a wide array of issues, a discussion of Internet filtering would home in on whether states actually want their citizens to have full access to the Internet or not. It would help guide a public conversation about what is truly most important about having access to the Internet and the extent to which states place a premium, if at all, on the global flow of information. Without collective action, the Internet will likely continue to become balkanized into a series of local networks, each governed by local laws, technologies, markets, and norms. As Jonathan Zittrain argued in Who Rules the Net?, the predecessor of this collection, we may be headed toward a localized version of the Internet, governed in each instance by local laws. 12 If such a version of the Internet is inevitably part of our future, there ought to be open and transparent
publish what comprises an extensive body of work on this topic on the ITU website. In addition, long-time experts in this field, such as Prof. Milton Mueller of Syracuse, and others, have constructed helpful models to structure the conversation. For suggestions on further information of this general nature, please see www.netdialogue.org, a joint project of Harvard Law School and Stanford Law School.
10 Witness the abysmal turnout for ICANN’s election of 2000, in which a free and open election for five ICANN directors attracted fewer than 100,000 votes globally.
11 Internet Governance Forum of 2009, Managing Critical Internet Resources, Transcript, Nov. 16, 2009, www.intgovforum.org/cms/2009/sharm_el_Sheikh/Transcripts/Sharm%20El%20Sheikh%2016%20November%202009%20Managing%20Critical%20Internet%20Resources.pdf.
12 Jonathan Zittrain, Be Careful What You Ask For, in WHO RULES THE NET? INTERNET GOVERNANCE AND JURISDICTION 13-30 (Adam Thierer et al. eds., Cato Inst. 2003).
consideration of ways to embrace it while preserving the elements of the network that matter most.

The Internet Filtering Problem
The world may appear borderless when viewed from cyberspace, but geopolitical lines are, in fact, well-established online. The fact that extensive Internet filtering occurs at a national level around the world is clearly documented. Through a collaborative research effort called the OpenNet Initiative, 13 the Citizen Lab at the University of Toronto, the Berkman Center for Internet and Society at Harvard University, and the SecDev Group are working together to compare Internet filtering practices of states in a systematic, methodologically rigorous fashion. In the past several years, OpenNet Initiative has sought to reach substantive conclusions about the nature and extent of Internet filtering in roughly 70 states and to compare practices across regions of the world. The OpenNet Initiative has released extensive reports that document and provide context for Internet filtering, previously reported anecdotally, in each of the states that it has studied closely. Reports released to date have focused on states in the Middle East and North Africa, Asia, and Central Asia, where the world’s most extensive filtering takes place. OpenNet Initiative’s research also covers states in every region of the world, including North America and Western Europe, where forms of speech regulation other than technical Internet filtering at the state level are the norm.
Filtering implementations (and their respective scopes and levels of effectiveness) vary widely among the countries OpenNet Initiative has studied. China continues to institute by far the most intricate filtering regime in the world, with blocking occurring at multiple levels of the network and covering content that spans a wide range of topic areas. Though its filtering program is widely discussed, Singapore, by contrast, blocks access to only a handful of sites, each pornographic in nature. Most other states that OpenNet Initiative is studying implement filtering regimes that fall between the poles of China and Singapore, each with significant variation from one to the next. These filtering regimes are properly understood only in the political, legal, religious, and social context in which they arise.
Internet filtering occurs in different ways in different parts of the world. Some states implement a software application developed by one of a small handful of U.S.-based technology providers. Burma, in the first incarnation of its filtering regime, used an open source filtering product called DansGuardian. 14 Others rely less heavily on technology solutions and more extensively on “soft controls.” Sometimes the filtering regime is supported explicitly by the state’s

13 For more information, see www.opennetinitiative.net/.
14 For more information on the DansGuardian filtering product, see dansguardian.org/.
legal code; in other cases, the filtering regime is carried out through a national security authority. In yet other instances, the regulation is simply presumed to be permissible. The content blocked spans a wide range of social, religious, and political information. Studies by OpenNet Initiative have reviewed whether individual citizens could access sites in a “global basket” of bellwether sites, testing every jurisdiction across a variety of sensitive areas—akin to a stock index sorted by sector—as well as a list of websites likely to be sensitive in some categories, but only in some countries.
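At bottom, the basket-testing approach described above reduces to tallying accessibility results per content category within a jurisdiction. A minimal sketch in Python, with invented data and hypothetical category names (nothing here reflects OpenNet Initiative's actual tooling):

```python
# Hedged sketch: aggregating accessibility test results by content category,
# in the spirit of the "global basket" testing described above.
# The records below are invented for illustration.
from collections import defaultdict

# Each record: (category, site, accessible?) from tests run inside one jurisdiction.
results = [
    ("politics", "site-a", False),
    ("politics", "site-b", True),
    ("social", "site-c", False),
    ("social", "site-d", False),
]

def block_rates(records):
    """Return the fraction of tested sites found blocked, per category."""
    totals, blocked = defaultdict(int), defaultdict(int)
    for category, _site, accessible in records:
        totals[category] += 1
        if not accessible:
            blocked[category] += 1
    return {c: blocked[c] / totals[c] for c in totals}
```

Comparing such per-category rates across jurisdictions, rather than raw site lists, is what allows the sector-by-sector comparisons the essay describes.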
Extent, Character & Locus of Filtering

More than three dozen states around the world practice technical Internet filtering of various sorts. 15 That number has grown over time. Those states that do filter the Internet have established a network of laws and technical measures to carry out substantial amounts of filtering that could allow the practice to become further embedded in their political and cultural environments. Web content is constantly changing, which poses a problem for the censors. Mobile devices and social networks have further complicated the task of speech regulation online. No state yet studied, even China, seems able to carry out its Web filtering in a comprehensive manner, i.e., consistently blocking access to a range of sites meeting specified criteria. China appears to be the most nimble at responding to the shifting Web, likely reflecting the devotion of the greatest resources, not to mention political will, to the filtering enterprise.
A state wishing to filter its citizens’ access to the Internet has several initial options: Domain Name System (DNS) filtering, Internet Protocol (IP) filtering, or Uniform Resource Locator (URL) filtering. 16 Most states with advanced filtering regimes implement URL filtering, as it can avoid the more drastic over-filtering or under-filtering situations presented by the other choices, discussed below. 17 To implement URL filtering, a state must first identify where to place the filters; if the state directly controls the Internet service provider(s) (ISP), the answer is clear. Otherwise, the state may require private or semi-private ISPs to implement the blocking as part of their service. The technical complexities presented by URL filtering become non-trivial as the
15 See Deibert et al., ACCESS DENIED, supra note 1.
16 Nart Villeneuve, Why Block IP Addresses?, NART VILLENEUVE: INTERNET CENSORSHIP EXPLORER, Feb. 14, 2005, www.nartv.org/2005/02/14/why-block-by-ip-address/.
17 For instance, IP filtering forces the choice of blocking all sites sharing an IP address. A recent OpenNet Initiative bulletin found more than 3,000 web sites blocked in an attempt to prevent access to only 31 sites. See Collateral Blocking: Filtering by South Korean Government of Pro-North Korean Websites, OpenNet Initiative Bulletin 009, Jan. 31, 2005, www.opennetinitiative.net/bulletins/009/. DNS blocking requires an entire domain and all subdomains to be either wholly blocked or wholly unblocked. See Villeneuve, supra note 16.
number of users grows to millions rather than tens of thousands. Some states appear to have limited overall access to the Internet in order to keep URL filtering manageable. The government of Saudi Arabia, for example, made filtering a prerequisite for public Internet access, delaying any such access for several years until the resources to filter were in place.
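The trade-offs among the three filtering options can be made concrete with a toy sketch. Everything below is hypothetical (the blocklists, addresses, and function names are invented for illustration): IP filtering blocks every site sharing an address, DNS filtering blocks a whole domain and its subdomains, while URL filtering can single out one path.

```python
# Illustrative comparison of the three filtering granularities described
# above. All blocklist entries and addresses are hypothetical.
from urllib.parse import urlparse

BLOCKED_IPS = {"203.0.113.7"}         # IP filtering: one entry blocks every site on the address
BLOCKED_DOMAINS = {"example.org"}     # DNS filtering: a whole domain and all its subdomains
BLOCKED_URLS = {"example.net/forum"}  # URL filtering: a specific host-and-path prefix only

def blocked_by_ip(resolved_ip: str) -> bool:
    return resolved_ip in BLOCKED_IPS

def blocked_by_dns(hostname: str) -> bool:
    # A blocked domain takes down itself and every subdomain.
    return any(hostname == d or hostname.endswith("." + d) for d in BLOCKED_DOMAINS)

def blocked_by_url(url: str) -> bool:
    # Match the requested host + path against blocked prefixes.
    parsed = urlparse(url)
    target = parsed.netloc + parsed.path
    return any(target.startswith(entry) for entry in BLOCKED_URLS)
```

Under this sketch, blocking example.net/forum by URL leaves example.net/news reachable, whereas blocking the host’s IP address would take both down along with any unrelated sites on that address, which is precisely the over-blocking described in note 17.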
Citizens with technical knowledge can generally circumvent filters that a state has put in place. Some states acknowledge as much: The overseer of Saudi Arabia’s filtering program, run through the state’s Internet Services Unit, admits that technically savvy users simply cannot be stopped from accessing blocked content. Expatriates in China, as well as those citizens who resist the state’s control, frequently find up-to-date proxy servers through which they can connect to the Internet while evading filters. While no state will ultimately win a game of cat-and-mouse with those citizens who are resourceful and dedicated enough to employ circumvention measures, many users will never do so—rendering filtering regimes at least partially effective despite the obvious workarounds.
Some of the earliest theorizing about control in the online environment, as discussed in Who Rules the Net?, 18 suggested that such state-run control of Internet activity would not work. But states such as China have proven that an ambitious state can, by devoting substantial technical, financial, and human resources, exert a large measure of control over what its citizens do online. States, if they want, can erect certain forms of gates at their borders, even in cyberspace, and can render them effective through a wide variety of modes of control. 19 These controls have proven the claims of Jack L. Goldsmith and others who have emphasized the extent to which the online environment can be regulated and the ways in which traditional international relations theory governs in cyberspace just as it does in real space. 20
That does not mean that the issue is simple. For starters, states ordinarily need a great deal of help in carrying out filtering and surveillance regimes. Enter ISPs, many of which require a license from the government to lawfully provide Internet access to citizens. Much Internet filtering is effected by these private ISPs under respective states’ jurisdictions, though some governments partially centralize the filtering operation at private Internet Exchange Points (topological crossroads for network traffic) or through explicit state-run clearing points established to serve as gatekeepers for Internet traffic. Some governments implement filtering at public Internet access points such as the computers found within cybercafés. Such filtering can take the form of software used in many American libraries and schools for filtering purposes, or “normative” filtering—government-encouraged interventions by shop-owners and others as citizens surf the Internet in a public place.

18 Supra note 12.
19 See Jack L. Goldsmith and Tim Wu, WHO CONTROLS THE INTERNET? ILLUSIONS OF A BORDERLESS WORLD 65-86 (Oxford University Press 2006).
20 See Jack L. Goldsmith, Against Cyberanarchy, in WHO RULES THE NET?: INTERNET GOVERNANCE AND JURISDICTION (Thierer et al., eds., Cato Inst. 2003).
Sometimes technical controls are not enough to constrain speech in the manner that the censors want. The exercise of more traditional state powers can have a meaningful impact on Internet usage without requiring the complete technical inaccessibility of particular categories of content. China, Vietnam, Syria, and Iran have each jailed “cyber-dissidents.” 21 Against this backdrop, the blocking of Web pages may be intended to deliver a message to users that state officials monitor Internet usage—in other words, making it clear to citizens that “someone is watching what you do online.” This message is reinforced by methods of determining after the fact what sites a particular user has visited, such as requiring passports to set up accounts with ISPs and tightening controls on users at cybercafés.
As we learn more and more about how Internet filtering takes place, the problems of “governing” the Internet come more sharply into relief—about how control is exerted, about how citizens in one state can or cannot connect to others in another state, about the relationship between each state and its citizens, and about the relationships between states.
Types of Content Filtered

Around the world, states are blocking access to information online based upon its content—or what applications hosted at certain sites can do—for political, religious, cultural, security, and social reasons. Sensitivities within these categories vary greatly from country to country. These sensitivities often track, to a large extent, local conflicts. The Internet content blocked for social reasons—commonly child safety, pornography, information about gay and lesbian issues, and sex education—is more likely to be the same across countries than the political and religious information to which access is blocked.
21 Reporters Sans Frontières, Internet Enemies: China, en.rsf.org/internet-enemie-china,36677.html (last accessed Aug. 25, 2010) (“Thirty journalists and seventy-two netizens are now behind bars for freely expressing their views.”); Reporters Sans Frontières, Internet Enemies: Viet Nam, en.rsf.org/internet-enemie-viet-nam,36694.html (last accessed Aug. 25, 2010) (“Vietnam is the world’s second biggest prison for netizens: it now has seventeen of them behind bars.”); Reporters Sans Frontières, Internet Enemies: Syria, en.rsf.org/internet-enemie-syria,36689.html (last accessed Aug. 25, 2010) (“At least four netizens are currently behind bars.”); Reporters Sans Frontières, Internet Enemies: Iran, http://en.rsf.org/internet-enemie-iran,36684.html (last accessed Aug. 25, 2010) (“Some thirty netizens have been arrested since June 2009, and a dozen are still being detained.”).
Several states carry out extensive filtering on certain topics. OpenNet Initiative testing has shown cases in which 50% or more of the sites tested on a given topic (like sex education) or application (such as anonymization tools) are inaccessible. Very rarely does any state manage to achieve complete filtering on any topic or application. The only areas in which 100% filtering is approached are pornography and anonymizers (sites that, if left unfiltered, would defeat filtering of other sites by allowing a user to access any Internet destination through the anonymizers’ gateways). States like Burma, which reportedly monitors e-mail traffic, also block a high percentage of free e-mail service providers. Such complete, or near-complete, filtering is, moreover, found only in countries that have outsourced the task of identifying pornographic sites to one of several for-profit American companies, and it is inevitably accompanied by over-blocking. Outside of these three areas, OpenNet Initiative testers are consistently able to access some material of a similar nature to the sites that were blocked.
Filtering & Over-breadth

Internet filtering is almost impossible to accomplish with any degree of precision. There is no way to stem the global flow of information in a consistently accurate fashion. A country deciding to filter the Internet must make an “over-broad” or “under-broad” decision at the outset: The filtering regime will block access to either too much or too little Internet content. Very often, this decision is tied to whether to use a home-grown system or to adopt a commercial software product, such as SmartFilter, WebSense, or an offering from security provider Fortinet, each of which is made in the United States and is believed to be licensed to countries that filter the Internet. Bahrain, for instance, has opted for an “under-broad” solution for pornography; its ISPs appear to block access to a small and essentially fixed number of “black-listed” sites. Bahrain may seek to block access to pornographic material online, but in practice it blocks only a token amount of such material. The United Arab Emirates, by contrast, seems to have made the opposite decision by attempting to block much more extensively in similar categories, thereby sweeping into its filtering basket a number of sites that appear to have innocuous content by any metric. And Yemen was rebuked by the United States-based WebSense for allegedly using the company’s filtering system to block access to material that was not pornographic in nature, contrary to the company’s policies. 22
Most of the time, states make blocking determinations to cover a wide range of Web content, commonly grouped around a second-level domain name or the IP address of a Web service (such as www.twitter.com or 66.102.15.100), rather
22 See Jillian C. York, WebSense Bars Yemen’s Government from Further Software Updates, OpenNet Initiative, Aug. 12, 2009, opennet.net/blog/2009/08/websense-bars-yemens-government-further-software-updates.
than based on the precise URL of a given Web page (such as www.twitter.com/username), or a subset of content found on that page (such as a particular image or string of text). Iran, for instance, has used such an approach to block a cluster of weblogs that the state prefers not to reach its citizens. This approach means that the filtering process will often not distinguish between permissible and impermissible content so long as any impermissible content is deemed “nearby” from a network standpoint.
Because of this wholesale acceptance or rejection of a particular speaker or site, it is difficult to know exactly what speech was deemed unacceptable for citizens to access. It is even harder to ascertain why, exactly, the speech is blocked. Bahrain, a country in which we found only a handful of blocked sites at the outset of our first round of testing, blocked access to a discussion board at www.bahrainonline.org. The message board likely contains a combination of messages that would be tolerated independently as well as some that would appear to meet the state’s criteria for filtering. Likewise, we found minimal blocking for internal political purposes in the United Arab Emirates, but the state did block a site that essentially acted as a catalog of criticism of the state. Our tests cannot determine whether the trigger was the material covering human rights abuses or the discussion of historical border disputes with Iran, but inasmuch as the discussion of these topics is taking place within a broadly dissent-based site, the calculation we project onto the censor in the United Arab Emirates looks significantly different than that for a site with a different ratio of “offensive” to “approved” content.
For those states using commercial filtering software and update services to maintain a current list of blocked sites matching particular criteria, OpenNet Initiative has noted multiple instances where such software has mistaken sites containing gay and lesbian content for pornography. For instance, the site for the Log Cabin Republicans of Texas was blocked by the U.S.-based SmartFilter as pornography, which was therefore the apparent basis for its blocking by the United Arab Emirates. (OpenNet Initiative research shows that gay and lesbian content is itself often targeted for filtering, and even when it is not explicitly targeted, states may not be overly concerned with its unavailability.) 23
As content changes ever faster on the Web and generalizations by URL or domain become more difficult to make—thanks in part to the rise of simpler, faster, and aggregated publishing tools, like those found on weblog sites and via other social networking applications—accurate filtering is becoming harder for filtering regimes to achieve unless they are willing to ban nearly everything. Mobile devices have further added to the complexity of the problem from the censor’s viewpoint.

23 OpenNet Initiative, INTERNET FILTERING IN THE UNITED ARAB EMIRATES IN 2004-2005: A COUNTRY STUDY, Feb. 2005, opennet.net/studies/uae.
For example, free web hosting domains tend to group an enormous array of changing content and thus provoke very different responses from state governments. In 2004, Saudi Arabia blocked every page on freespace.virgin.net and www.erols.com. 24 However, research indicated the www.erols.com sites had been only minimally blocked in 2002, and the freespace.virgin.net sites had been blocked in 2002, but were accessible in 2003 before being re-blocked in 2004. In all three tests, Saudi Arabia imposed URL blocking on www.geocities.com (possibly through SmartFilter categorization), but only blocked 3% of more than a thousand sites tested in 2004. Vietnam blocked all sites tested on the www.geocities.com and members.tripod.org domains. In OpenNet Initiative’s recent testing, it has found that Turkey and Syria have been blocking all blogs hosted on the free Blogspot service. 25
China’s response to the same problem provides an instructive contrast. When China became worried about bloggers, it shut down the main blogging domains for weeks in the summer of 2004. When the domains came back online, the blogging systems contained filters that would reject posts containing particular keywords. 26 Even Microsoft’s MSN Spaces blogging software prevented writers from publishing terms like “democracy” from China. In effect, China moved to a content-based filtering system, but determined that the best place for such content evaluation was not the point of Web page access, but the point of publication, and it possessed the authority to force these filters on the downstream application provider. This approach is similar to that taken with Google in response to the accessibility of disfavored content via Google’s caching function. Google was blocked in China until a mechanism was implemented to prevent cache access. 27 These examples clearly demonstrate the lengths to which regimes will go to preserve “good” access instead of simply blocking an entire service.
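The core of such publication-point filtering is keyword matching applied before a post is ever accepted. A hedged sketch of the idea (the keyword list and function name are invented; real systems match far more elaborately, including against Chinese-language terms):

```python
# Sketch of keyword filtering applied at the point of publication, as the
# Chinese blog platforms described above did. The banned-keyword list here
# is hypothetical and purely illustrative.
BANNED_KEYWORDS = {"democracy", "tiananmen"}

def accept_post(text: str) -> bool:
    """Return True if the post contains no banned keyword (case-insensitive)."""
    lowered = text.lower()
    return not any(keyword in lowered for keyword in BANNED_KEYWORDS)
```

Moving the check to the publisher's submission form, rather than the reader's network path, is what lets the censor suppress a sentence without blocking the whole service.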
These examples also demonstrate the increasing reliance by states on “just-in-time” filtering, rather than filtering that occurs in the same, consistent way over time. While the paradigmatic case of Internet filtering was initially the state that wished to block its citizens from viewing any pornography online at any time
24 Saudi Arabia blocked every page on www.erols.com except for the root page at www.erols.com itself, potentially indicating a desire to manage perceptions as to the extent of the blocking.
25 All data from OpenNet Initiative testing can be found in the country-by-country summaries at www.opennet.net/.
26 Filtering by Domestic Blog Providers in China, OpenNet Initiative Bulletin 008, Jan. 14, 2005, www.opennetinitiative.net/bulletins/008/.
27 This mechanism turned out to be extremely rudimentary, as outlined in a previous OpenNet Initiative bulletin. See Google Search & Cache Filtering Behind China’s Great Firewall, OpenNet Initiative Bulletin 006, Sept. 3, 2004, www.opennetinitiative.net/bulletins/006/.
(for instance, Saudi Arabia), the phenomenon of a state blocking particular speech or types of speech at a sensitive moment has become commonplace. For instance, China blocked applications such as Twitter and YouTube at the time of the 20th anniversary of the Tiananmen Square demonstrations in June 2009. A few weeks later, Iran blocked similar applications, including Facebook, at the time of demonstrations in the streets of Tehran. These blocks are often lifted once the trouble has passed. One means of tracking these changes in the availability of applications and websites is a project called Herdict.org, which enables people from around the world to submit real-time reports on what they can and cannot access. 28
Alternate approaches that demand a finer-grained means of filtering, such as the<br />
use of automated keywords to identify and expunge sensitive information on<br />
the fly, or greater manual involvement in choosing individual Web pages to be<br />
filtered, are possible, so long as a state is willing to invest in them. China in<br />
particular appears prepared to make such an investment, one mirrored by<br />
choices demonstrated through more traditional media. For example, China<br />
allows CNN to be broadcast within the country with a form of time delay, so<br />
the feed can be temporarily turned off, as when, in one case, stories about the<br />
death of Zhao Ziyang were broadcast. 29 Online policing of speech, even in<br />
what appears to be a “borderless world,” can be carried out through technical<br />
controls at many layers.<br />
Law, Surveillance & Soft Controls<br />
Just as dozens of states use technical means to block citizens from accessing<br />
content on the Internet, most also employ legal and other soft controls.<br />
Surveillance practices are most commonly coupled with outright technical<br />
censorship. Many states that filter use a combination of media,<br />
telecommunications, national security, and Internet-specific laws and regulatory<br />
schemes to restrict the publication of and access to information on the Internet.<br />
States often require ISPs to obtain licenses before providing Internet access to<br />
citizens. Some states—China and Turkey, for instance, which have each<br />
enacted special regulations to this effect—apply pressure on cybercafés and<br />
ISPs to monitor Internet usage by their customers. With the exceptions of<br />
Saudi Arabia and Qatar, no country seems to explicitly communicate to the<br />
public about its process for blocking and unblocking content on the Internet.<br />
Most countries, instead, have a series of broad laws that cover content issues<br />
online, both empowering the state to carry out filtering<br />
28 See www.herdict.org. The histories of reports of these just-in-time blocking patterns can<br />
be viewed from this website.<br />
29 See Eric Priest, Reactions to the Internet & Society 2004 Session on “Business” (December 11, 10:45-<br />
12:15), Spring 2005,<br />
cyber.law.harvard.edu/blogs/gems/tka/EPriestReactionPaper2.pdf.
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET 543<br />
regimes, and putting citizens on general notice not to publish or to access<br />
content online that violates certain norms.<br />
Often these soft controls are exercised through social norms or through control<br />
at the far edges of the network. Sometimes the state requires non-governmental<br />
organizations and religious leaders to register before using the Internet to<br />
communicate about their work. In China and in parts of the former Soviet<br />
Union, very often the most fearsome enforcer of the state’s will is the old<br />
woman on one’s block, who may or may not be on the state’s payroll. The<br />
control might be exercised, as in Singapore, largely through family dynamics.<br />
The call by the local police force to the Malaysian blogger to come and talk<br />
about his web publishing might have as much of an effect on expression as any<br />
law on the books or technical blocking system.<br />
Whether through advanced information technology, legal mechanisms, or soft<br />
controls, a growing number of states around the world are seeking to control<br />
the global flow of information. Ordinarily, this control takes the form of<br />
blocking, through technical means, the state’s citizens from accessing certain<br />
information online. In other instances, the blocking stops the state’s citizens<br />
from publishing information online, in effect disallowing people outside the<br />
state from hearing the voices of the state’s citizens. As a result, most filtering<br />
regimes cause a chilling effect on the use of information technologies as a<br />
means of free expression, whether for political, religious, or cultural purposes.<br />
From “How to Police Speech Online”<br />
to “How to Limit Speech<br />
Restrictions Online”<br />
It is commonplace to argue that states have generally regulated the Internet<br />
lightly, but that claim is increasingly untrue. The author of a chapter in an important<br />
recent book wrote, “governments exercise relatively little control over the<br />
Internet, even though it has a tremendous impact on society.” 30 This statement<br />
misleads readers into thinking that the Internet might somehow be a freer, more<br />
open environment than traditional spaces are. Such a statement might have<br />
been true in the United States fifteen years ago. But as of today, it is inaccurate.<br />
From a global perspective, both the importance of digitally-mediated<br />
communications and the extent of regulation of speech continue to grow over<br />
time.<br />
The policing of speech in a borderless world brings with it a series of problems<br />
that merit public discussion. The types of controls that take place at a local<br />
30 Harold Kwalw<strong>as</strong>ser, Internet Governance, in CYBERPOWER AND NATIONAL SECURITY, 491<br />
(Franklin D. Kramer et al., eds., Nat’l Defense Univ. Press 2009).
level, about bullying or hate speech or any other issue that may arise, can be<br />
vetted in a public meeting of a school district or in a state legislature. The<br />
online speech issues that are consistently on federal agendas in the United<br />
States—intellectual property restrictions, child safety, network neutrality,<br />
defamation, and so forth—tend to play out in public forums such as legislatures<br />
and the courts. Of course, there are instances where the debate about<br />
speech controls occurs behind closed doors at the local and national levels as<br />
well. Corporate-level filtering, which can affect millions of employees, is one<br />
example where the issues are rarely vetted in a meaningful way. But the place<br />
where the debate is most consistently and conspicuously absent is on the<br />
international stage. It ought to merit meaningful consideration in the Internet<br />
governance debate.<br />
The practice of state-mandated Internet filtering, and related regulations like<br />
surveillance, is now a widely known fact, but the hard problems that stem from<br />
this practice are infrequently discussed as a matter of public policy outside of<br />
human rights and academic circles and the occasional national-level hearing.<br />
The blocking and surveillance of citizens’ activity on the Internet—by virtue of<br />
the network’s architecture, an issue of international dimensions—calls for<br />
discussion at a multilateral level. Rather than fretting over the finer points of<br />
the domain name system, time would be better spent in Internet governance<br />
discussions on issues like transparency in Internet filtering or broad issues of<br />
interconnection of the global network. The Internet filtering problem offers<br />
much more to be gained—through frank discussion, if not action—and<br />
provides an exercise worthy of an extraordinary gathering of world leaders who<br />
want to talk about the global “Information Society.”<br />
On one level, Internet filtering is a private matter between a state and its<br />
citizens as to what information citizens may access online. 31 States that censor<br />
the Internet assert the right to sovereignty. From the state’s perspective, the<br />
public interest, as defined in one state, such as Saudi Arabia, is different from<br />
the public interest as defined by the state in Uzbekistan, China, or the United<br />
Kingdom. States can and do exercise their sovereignty through control of the<br />
information environment.<br />
But even if one accepts the state sovereignty argument, that viewpoint should<br />
not end the conversation about Internet filtering. The state-based censorship<br />
31 Some states make an effort to suggest that their citizens (in Saudi Arabia and the United<br />
Arab Emirates specifically) are largely in support of the filtering regime, particularly when it<br />
comes to blocking access to pornographic material. For instance, the agency responsible for<br />
both internet access and filtering in Saudi Arabia conducted a user study in 1999, and<br />
reported that 45% of respondents thought “too much” was blocked, 41% thought it<br />
“reasonable,” and 14% found it “not enough.” These studies stand for the proposition, in<br />
the context of our report, that some states that filter seek to make the case that their filtering<br />
regime enjoys popular support, not that such support necessarily exists.
and surveillance practices of any state affect the citizens and businesses of other<br />
states in the context of the interconnected global communications network.<br />
Increasingly, the censorship and surveillance practices of states reach past the<br />
web-browsing habits of their own citizens. High-profile debates between<br />
Canadian company Research in Motion, maker of BlackBerry smartphones, and<br />
the United Arab Emirates and Saudi Arabia about the ability of the state to<br />
intercept BlackBerry communications make this point clear. 32 The Internet<br />
blocking that takes place in one state can also affect the network at a technical<br />
level in other jurisdictions, as Pakistan found out when it brought down<br />
YouTube globally for two hours in early 2008. 33<br />
A global discussion about the relationship between these filtering and<br />
surveillance practices and human rights is necessary and could be extremely<br />
fruitful. Specifically, states might consider rules that relate to common<br />
standards for transparency in Internet filtering and surveillance practices as they<br />
relate to individuals and those corporations drawn into the process. On a<br />
broader level, the issue raised here is about interconnection between states and<br />
the citizens of those states—and ultimately about what sort of an Internet we<br />
want to be building and whether the global flow of information is a sustainable<br />
vision.<br />
For instance, we have yet to grapple with the ethical interests at play in filtering. States<br />
vary greatly in terms of how explicitly the filtering regime is discussed and the<br />
amount that citizens can come to know about it. No state studied by the OpenNet<br />
Initiative makes its block list generally available. 34 The world leaders who gather<br />
32 See Adam Shreck, UAE BlackBerry Crackdown Affects Visitors Too, ASSOCIATED PRESS, August<br />
3, 2010,<br />
www.google.com/hostednews/ap/article/ALeqM5iJ1MLhAMIeRDhT4heu4LKwxgH3QD9HBKDJO0.<br />
See also Anthony DiPaola & Vivian Salama, UAE to Suspend BlackBerry<br />
Service Citing Security, BLOOMBERG, August 1, 2010, www.businessweek.com/news/2010-<br />
08-01/u-a-e-to-suspend-blackberry-services-citing-security.html.<br />
33 See Declan McCullagh, How Pakistan Knocked YouTube Offline (and how to make sure it never<br />
happens again), CNET, February 25, 2008, news.cnet.com/8301-10784_3-9878655-7.html.
34 Saudi Arabia publishes its rationale and its blocking practices on an easily accessible website,<br />
at www.isu.net.sa/saudi-internet/contenet-filtring/filtring.htm (“The Internet Services<br />
Unit oversees and implements the filtration of web pages in order to block those pages of<br />
an offensive or harmful nature to the society, and which violate the tenants of the Islamic<br />
religion or societal norms. This service is offered in fulfillment of the directions of the<br />
government of Saudi Arabia and under the direction of the Permanent Security Committee<br />
chaired by the Ministry of the Interior.”). In Saudi Arabia, citizens may suggest sites for<br />
blocking or for unblocking, in either Arabic or English, via a public website. Most sites<br />
include a block-page, indicating to those seeking to access a website that they have reached a<br />
disallowed site. Most states have enacted laws that support the filtering regime and provide<br />
citizens with some context for why and how it is occurring, though rarely with any degree of<br />
precision. As among the states we have studied, China is one of the states that obscures the<br />
nature and extent of its filtering regime to the greatest extent through a long-running series<br />
of conflicting public statements about its practices in this respect.
periodically at United Nations-sponsored meetings and at the Internet<br />
Governance Forums could make the most of their leadership by seeking to<br />
establish a set of best practices related to Internet filtering and the transparency<br />
related to filtering regimes. They might also focus profitably on the difficult<br />
problems facing those multinational companies that do business in regimes that<br />
require filtering and surveillance of the network in ways that would not be<br />
legally permissible in the company’s home jurisdiction, as the Global Network<br />
Initiative has in its private capacity. 35 The issue of speech regulation in a<br />
borderless world is too important to leave only to the handful of companies,<br />
right-minded though they may be, that are seeking to do the right thing in a<br />
geopolitically complex regulatory environment.<br />
The critical question in the next digital decade is not whether speech can be<br />
policed in a borderless world, but whether we should set new limits on the<br />
extent and manner in which it is policed today. We should ask, too, whether it<br />
is today easier, in fact, than in the past to police speech in a digitally-mediated<br />
world, and what the ramifications are for civil liberties if so. The Internet is<br />
becoming larger and more fractured each day. We should not pretend the<br />
Internet is a “lightly regulated” medium as though the calendar on our kitchen<br />
wall reads “1985” at the top. Trends that support more speech from more<br />
people in more places around the globe—using mobile applications generally,<br />
such as blogs, wikis, Twitter, SMS, and so forth—are countered by the<br />
increasing sophistication and reach of Internet filtering and surveillance<br />
practices. A richer understanding of the complexities at play in Internet filtering<br />
and other speech restrictions would help develop a foundation that does not yet<br />
exist for building a sustainable, and truly global, network that will continue to<br />
bring with it the innovation, jobs, services, and other social benefits that it<br />
promises.<br />
35 See www.globalnetworkinitiative.org.
The Role of the Internet<br />
Community in Combating<br />
Hate Speech<br />
By Christopher Wolf *<br />
In less than twenty years, the Internet has evolved to become an<br />
unprecedented tool for information, communication, entertainment and<br />
commerce. Much of the progress was made possible by the protections<br />
of the First Amendment and the absence of legal constraints. But there<br />
is a growing dark side to the Internet, which raises the question of how<br />
to police harmful content. Online child predators are a well-known<br />
blight, a frequent focus of headlines and television news. Cyber-bullying<br />
also has received significant attention recently, especially in the wake of<br />
teen suicides.<br />
Less reported but equally troubling is the fact that the Internet has<br />
become a technology embraced by racists, anti-Semites, homophobes<br />
and bigots of all kinds to spread their messages of hate. The online<br />
haters use all of the tools of the Internet, from static websites, to<br />
streaming audio and video, to social networking sites like Facebook.<br />
No longer relegated to meeting in dark alleys and the basements of<br />
abandoned buildings, or to mailing their propaganda in plain brown<br />
wrappers, hate groups have a platform to reach millions around the<br />
world. They seek to victimize minorities, to embolden and mobilize<br />
like-minded haters, and to recruit followers. And in their wake, an online<br />
culture has developed—aided by the mask of anonymity—in which<br />
people who would never consider themselves members of hate groups<br />
employ racial, religious and other epithets as part of their vocabulary in<br />
posting comments to news stories on mainstream sites and in other<br />
aspects of online life. In turn, the common appearance of such epithets<br />
desensitizes readers, making hate speech and the denigration of<br />
minorities “normal.”<br />
* Christopher Wolf is a partner in the law firm Hogan Lovells LLP where he leads the Privacy<br />
and Information Management practice. He founded and chairs the Internet Task Force of<br />
the Anti-Defamation League, and is Past Chair of the International Network Against<br />
Cyber-Hate (INACH).
One recent example of how haters are using the Internet occurred on the<br />
Fourth of July in 2010. As Americans were celebrating that event, a new<br />
“Event” was announced on Facebook, entitled “Kill a Jew Day.” 1 The<br />
Facebook “host” for the Event wrote, “You know the drill guys,” and he<br />
urged followers to engage in violence “anywhere you see a Jew” between<br />
July 4 and July 22. A Nazi swastika adorned the Event page.<br />
The posting of that sickening Event prompted a wave of anti-Semitic<br />
rants on Facebook in support of the targeting of Jewish people. But it<br />
also prompted a counter-event on Facebook entitled “One Million<br />
Strong Against Kill a Jew Day” (whose supporters actually numbered,<br />
more modestly, in the thousands). 2 And, pursuant to the Facebook<br />
Terms of Service, complaints about the “Kill a Jew Day” event to<br />
Facebook administrators resulted in the company disabling the Event<br />
page.<br />
The outrage over the Facebook Event site was justified, not just because<br />
of the vile anti-Semitism underneath it or the glorified display of a<br />
swastika. People also objected to the site because they know that<br />
Internet messages can and do inspire violence. Online anti-Semitic hate<br />
speech has been implicated in real-world acts of violence, such as an<br />
attack on Nobel Laureate and Holocaust survivor Elie Wiesel by a<br />
Holocaust denier in 2007, 3 and the 2009 murder of a guard at the<br />
Holocaust Memorial Museum in Washington, D.C., by a white<br />
supremacist who maintained his own online hate site and who was<br />
egged on by fellow haters. 4 Words have consequences, and indeed<br />
inspire acts of hate and violence.<br />
This recent example of online hate provides another opportunity to<br />
examine what society’s response to online hate speech should be. What<br />
1 See Yaakov Lappin, “Kill a Jew” Page on Facebook Sparks Furor, THE JERUSALEM POST, July 5,<br />
2010, http://www.jpost.com/JewishWorld/JewishNews/Article.aspx?id=180456.<br />
2 See Amada Schwartz, Anti-Semitism v. Facebook, JEWISH JOURNAL, July 13, 2010,<br />
http://www.jewishjournal.com/community/article/antisemitism_vs_facebook_20100713/.<br />
3 Suzanne Herel, Holocaust Survivor, Nobel Peace Prize Winner Elie Wiesel Attacked in S.F. Hotel,<br />
S.F. CHRONICLE, Feb. 9, 2007, http://www.sfgate.com/cgi-bin/blogs/nwzchik/detail?blogid=32&entry_id=13385.
4 Michael E. Ruane, Paul Duggan & Clarence Williams, At a Moment of Sorrow, A Burst of<br />
Deadly Violence, THE WASHINGTON POST, June 11, 2009,<br />
http://www.washingtonpost.com/wp-dyn/content/article/2009/06/10/AR2009061001768.html.
is the best way to police sites like the “Kill a Jew Day” Event page on<br />
Facebook, and the thousands of hate-filled videos uploaded to YouTube,<br />
and the white supremacist websites designed to recruit young people,<br />
glorifying violence against minorities?<br />
One visceral response to the proliferation of online hate is: “There<br />
ought to be a law.” Legal rules are the way a society decrees what is right<br />
and what is wrong, and since hate speech is wrong, it seems logical that<br />
the law would be employed to police it. A legal ban on hate speech and<br />
the criminalization of its publication is indeed an alternative in some<br />
jurisdictions. But, of course, it is not an option in the United States where<br />
the First Amendment gives broad latitude to virtually all speech, even the<br />
most repugnant. (Only direct threats against identifiable targets are<br />
criminalized.)<br />
Legislatures around the world have heeded the call for laws<br />
encompassing Internet hate speech. The hate speech protocol to the<br />
Cybercrime Treaty is a prime example of a heralded legal solution to the<br />
problem. 5 It was designed to eliminate racist sites from the Internet<br />
through criminal penalties.<br />
From Brazil to Canada, and from South Africa to Great Britain, there are<br />
legal restrictions on hate speech, online and offline. In much of Europe,<br />
denial of the Holocaust (online or offline) is forbidden. In Germany,<br />
even displaying the swastika is a crime. The enforcement of laws against<br />
Holocaust deniers—given the bitterly sad history of those countries—<br />
serves as a message to all citizens (especially impressionable children)<br />
that, in light of the horrors of genocide inflicted there, it is literally<br />
unspeakable to deny the Holocaust.<br />
Still, there are many who believe that prosecutions, such as that of<br />
Holocaust denier David Irving in Austria, 6 promoted his visibility and<br />
stirred up his benighted supporters, rather than quelling future hate<br />
speech and enlightening the public.<br />
Moreover, laws against hate speech have not demonstrably reduced hate<br />
speech or deterred haters. The hate speech protocol to the Cybercrime<br />
5 The Convention on Cybercrime, Nov. 23, 2001, Europ. T.S. No. 185,<br />
http://conventions.coe.int/Treaty/en/Treaties/Html/185.htm.
6 See Austria Holds “Holocaust Denier,” BBC NEWS, Nov. 17, 2005,<br />
http://news.bbc.co.uk/2/hi/europe/4446646.stm.
Treaty, for example, has not reduced online hate. The shield of Internet<br />
anonymity and the viral nature of online hate make legal policing an<br />
unrealistic challenge, except in cases where authorities want to “set an<br />
example.” And since the U.S., with its First Amendment, is essentially a<br />
safe haven for virtually all web content, shutting down a website in<br />
Europe or Canada through legal channels is far from a guarantee that the<br />
contents have been censored for all time. The borderless nature of the<br />
Internet means that, like chasing cockroaches, squashing one does not<br />
solve the problem when there are many more waiting behind the walls—<br />
or across the border.<br />
Many see prosecution of Internet speech in one country as a futile<br />
gesture when the speech can re-appear on the Internet almost<br />
instantaneously, hosted by an Internet service provider (ISP) or social<br />
networking site in the United States. Moreover, in the social networking<br />
era, the ability of people to upload far outpaces the ability of the police to<br />
track and pursue offending speech.<br />
Like the prosecution in Austria of David Irving, the prosecutions in<br />
Germany of notorious Holocaust deniers and hate site publishers Ernst<br />
Zundel 7 and Frederick Töben 8 sent a message of deterrence to those<br />
who make it their life’s work to spread hate around the world: they may<br />
well go to jail as well. And, again, such prosecutions expressed society’s<br />
outrage at the messages. But all one need do is enter the names of those<br />
criminals in a search engine to find websites of supporters paying<br />
homage to them as martyrs and republishing their messages.<br />
Even some free speech advocates around the world applaud the use of<br />
the law to censor speech when it is hate speech because of the pernicious<br />
effects of hate speech on minorities and children, and because of its<br />
potential to incite violence. But many of those same people object to the<br />
use of the law by repressive regimes like China to censor speech it deems<br />
to be objectionable as hate directed towards the Chinese government. It<br />
is not easy to draw the line between good and bad state use of censorship<br />
because defining what is hate speech can be quite subjective. Giving the<br />
state the power to censor is problematic, especially given the potential for<br />
abuse.<br />
7 See Zundel Gets Five Years from German Court, WINNIPEG FREE PRESS, Feb. 16, 2007,<br />
http://www.winnipegfreepress.com/historic/32115694.html.
8 See Steve Kettmann, German Hate Law: No Denying It, WIRED, Dec. 15, 2000,<br />
http://www.wired.com/politics/law/news/2000/12/40669#ixzz0jTf5lnLh.
This is not to say that law has no role to play in fighting online hate<br />
speech—far from it. But countries with speech codes intended to<br />
protect minorities should make sure that proper discretion is employed<br />
in using those laws against Internet hate speech, lest the enforcement be<br />
seen as ineffectual and result in a diminished respect for the law. And,<br />
again, the realities of the Internet are such that shutting down a website<br />
through legal means in one country is far from a guarantee that the<br />
website is shuttered for all time.<br />
Thus, the law is but one, albeit limited, tool in the fight against online<br />
hate.<br />
Counter-speech—exposing hate speech for its deceitful and false<br />
content, setting the record straight, and promoting the values of<br />
tolerance and diversity—has an important role to play in the policing of<br />
online hate speech. That is the thrust of the First Amendment. To<br />
paraphrase U.S. Supreme Court Justice Brandeis, sunlight is still the best<br />
disinfectant—it is always better to expose hate to the light of day than to<br />
let it fester in the darkness. One answer to hate speech is more speech.<br />
The Facebook event “One Million Strong Against Kill a Jew Day,” even<br />
if far short of a million members, is a vivid example of the power of<br />
counter-speech as a vehicle for society to stand up against hate speech.<br />
And, of course, education from an early age on Internet civility and<br />
tolerance would go far to stem the next generation of online haters.<br />
An equally important and powerful tool against hate speech is the<br />
voluntary cooperation of the Internet community, including Internet<br />
Service Providers, social networking companies and others. When<br />
Facebook enforced its Terms of Service (which requires users not to<br />
“post content that: is hateful, threatening, or … incites violence…” 9 ) and<br />
disabled the “Kill a Jew Day” Event page, 10 that was a powerful example of<br />
an Internet company exercising its own First Amendment rights to<br />
ensure that it remained an online service with standards of decency. That<br />
voluntary act was quick and effective. A legal action against Facebook for<br />
hosting the site—impossible in the U.S. but viable elsewhere around the<br />
world—would have been expensive, time-consuming and no more<br />
9 Statement of Rights and Responsibilities, Facebook,<br />
http://www.facebook.com/terms.php.<br />
10 Lappin, supra note 1.
effective. The chilling effect of a legal action against Facebook might<br />
have led to undue restrictions by Facebook on future user postings.<br />
Voluntary enforcement by Internet companies of self-established<br />
standards against hate speech is effective. If more Internet companies in<br />
the U.S. block content that violates their Terms of Service, it will at least<br />
be more difficult for haters to gain access through respectable hosts. The<br />
challenge, of course, is with social media sites where postings occur<br />
constantly and rapidly. Social media companies normally wait for a user<br />
complaint before they investigate hate speech posted on their service, but<br />
the proliferation of hate-filled postings outpaces the effectiveness of such<br />
a “notice and take down” arrangement. New monitoring techniques to<br />
identify hate speech as it is posted may be in order.<br />
In the era of search engines as the primary portals for Internet users, 11<br />
cooperation from the Googles of the world is an increasingly important<br />
goal. The example of the Anti-Defamation League 12 and Google with<br />
the site “Jew Watch” is a good one. 13 The high ranking of Jew Watch in<br />
response to a search inquiry was not due to a conscious choice by<br />
Google, but was solely a result of an automated system of ranking.<br />
Google placed text on its site explaining the ranking, gave users a clear<br />
explanation of how search results are obtained in order to refute the<br />
impression that Jew Watch was a reliable source of information, and<br />
linked to the ADL site for counter-speech. 14<br />
In short, vigilance and voluntary standards are more effective than the law in dealing with the increasing scourge of online hate speech. Hate speech can be “policed” in a borderless world, but not principally by the traditional police of law enforcement. The Internet community must continue to serve as a “neighborhood watch” against hate speech online, “saying something when it sees something,” and working with online providers to enforce community standards.

11 A study by Nielsen found that 37 percent of respondents used search engines when looking for information, as compared to 34 percent using portals and only 18 percent using social media. See Jon Gibs, Social Media: The Next Great Gateway for Content Discovery?, NIELSEN, Oct. 5, 2009, http://blog.nielsen.com/nielsenwire/online_mobile/social-media-the-next-great-gateway-for-content-discovery/.
12 The Anti-Defamation League was founded to combat anti-Semitism. For more information, see http://www.adl.org/.
13 See Google Responds to Jew Watch Controversy, WEBPRONEWS, April 15, 2004, http://www.webpronews.com/topnews/2004/04/15/google-responds-to-jew-watch-controversy.
14 Google, An Explanation of Our Search Results, http://www.google.com/explanation.html.
CHAPTER 10
WILL THE NET LIBERATE THE WORLD?

Can the Internet Liberate the World? 557
Evgeny Morozov

Internet Freedom: Beyond Circumvention 565
Ethan Zuckerman
Can the Internet Liberate the World?

By Evgeny Morozov *

It may be useful to start by laying out the basics. Anyone pondering the question posed in the title of this essay most likely assumes that there exist some powerful forces of oppression from which the world could and should be liberated. A list of such problems is as infinite as it is intuitive: From poverty to racism and from pollution to obesity, “oppressive forces” seem to be all around us.
Yet, for some reason, these are rarely the kind of problems that one wants to fight with the help of technology, let alone the Internet. It’s in solving political rather than socio-economic problems that the Internet is presumed to hold the greatest promise. Most specifically, it is its ability to undermine repressive governments that is widely discussed and admired, even more so as Internet companies like Google find themselves struggling with the likes of the Chinese government.
Two features of the Internet are often praised in particular: 1) its ability to quickly disseminate any kind of information—including the information that authoritarian governments may not like—and 2) its ability to allow like-minded individuals to find each other, to mobilize supporters and to collectively pursue future goals—including democratization. The hype surrounding Iran’s Twitter Revolution of 2009 was probably the strongest public manifestation of high hopes for the transformative potential of the Internet; only a rare pundit did not predict the eventual collapse of the Iranian regime under the barrage of angry tweets from its citizens.
Still, such praise is not without merit. Even the hardest skeptics would grant these two features to the Internet; to deny that it does enhance the citizens’ ability to inform (and get informed) and to mobilize—what the Internet theorist Clay Shirky calls “ridiculously easy group forming” 1—would be to deny the obvious. The skeptics would also have no trouble acknowledging that both of these features are constantly under threat, as governments keep implementing new systems of censorship and control.

* Evgeny Morozov is the author of THE NET DELUSION: THE DARK SIDE OF INTERNET FREEDOM (Public Affairs, 2011). He’s also a visiting scholar at Stanford University, a fellow at the New America Foundation and a contributing editor to Foreign Policy magazine.
1 CLAY SHIRKY, HERE COMES EVERYBODY: THE POWER OF ORGANIZING WITHOUT ORGANIZATIONS 54 (Penguin Press 2008), quoting social scientist Sébastien Paquet.
Nor does anyone really contest the fact that the Internet has proved quite resilient against such attacks, giving rise to numerous tools to circumvent government censorship. For many, the fact that an institution as powerful as the U.S. government has trouble reining in a fluid and mostly virtual organization like WikiLeaks is a testament to the power of the Internet, even though the morality of WikiLeaks’ actions (in publishing leaked information about the U.S. military occupation of Afghanistan) is still widely disputed.
However, conceding that the Internet helps to disseminate information and mobilize campaigners around certain causes is not quite the same as conceding that authoritarian regimes are doomed or that democracy is inevitable. There may be good independent reasons to campaign for greater freedom of expression on the Internet—but one shouldn’t presume that such freedoms would necessarily translate into democratization.
For the Net—and its two powerful features discussed above—to be able to “liberate the world” from authoritarianism, one needs to make a few further assumptions. First, one needs to assume that modern authoritarian regimes derive their power primarily by suppressing the activities that the Internet helps to amplify: i.e., dissemination of information and popular mobilization around specific causes. Second, one also needs to assume that the Internet won’t have any other political and social effects that may—if only indirectly—create new modes of “oppression,” entrenching authoritarianism as a result. In other words, the Internet can only deliver on its liberating promise as long as the things it offers are also the things that the fight against authoritarianism requires—and as long as it doesn’t produce any other regime-strengthening effects that may inadvertently undermine that fight.
On initial examination, the first assumption seems to hold. There is no shortage of suppression of both information and mobilization opportunities in modern authoritarian states. Their rulers have not lost the desire to guard their secrets or regulate how their citizens participate in public life. The fact that some forms of censorship persist even in democratic societies suggests that governments do not really aspire to lift all the digital gates and let information flow freely. The urge, then, is to find ways to break through those gates—and the Internet seems to excel at the job. If there is one thing that techies and hackers know how to do well, it’s to build tools to pierce firewalls.
But suppose that such tools can be found and can even achieve the kind of scale where all Internet users in China or Iran have access to them. What would the effect be on their populations and their governments? I’d like to propose that one’s answer to this question depends mostly on one’s views about the sources of legitimacy of modern authoritarianism.

Those who believe that such legitimacy is derived primarily through the brainwashing of their citizens are justifiably very excited about the Internet. Moreover, they are usually very quick to predict the inevitable fall of authoritarian governments. After all, their theoretical conception of authoritarianism posits that once the information gates are open, brainwashing loses much of its effectiveness; people realize they have been lied to all along—and they rebel as a result.
At first sight, the contemporary global situation may seem to vindicate such views. Many modern authoritarian governments—and here cases like Belarus, China, Russia come to mind—enjoy strong levels of support from vast swathes of their populations. One may quibble about the ways by which the Kremlin has solidified its power in the last decade—many of those ways are far from democratic—but virtually all opinion polls reveal that the Kremlin’s policies are genuinely popular. Ditto China, where the government is one of the most trusted institutions in the country, enjoying a level of trust that the U.S. Congress could only dream of.
Is it all because of brainwashing? If the answer is “yes,” then there are, indeed, good reasons to be optimistic about the power of the Internet. The moment the authoritarian governments’ monopoly over information disappears, any manipulations of truth that were possible in an age of information scarcity would be impossible.
This, I’d like to propose, may be a very simplistic reading of the situation—and a reading that is also extremely insensitive to historical and social forces. There is much more to the legitimacy of modern authoritarian states than just their skillful manipulation of information. Many authoritarian regimes—Belarus, China and Russia are again excellent examples, but one could also add Azerbaijan, Kazakhstan, and Vietnam to the list—have made genuine advances in economic development, many of them thanks to their embrace of technology in general and the Internet in particular. Furthermore, in many of these countries, the once extremely contentious political life has stabilized as well, allowing their populations to enjoy a rare period of peace and prosperity—even if it came at the hefty price of having their governments tighten the valves on freedom of expression or freedom of assembly.
It seems disingenuous to argue that modern Russians or Chinese do not appreciate the fact that today they purchase considerably more commodities—including luxurious ones—than they could 20 years ago; or that they can travel the world much more freely; or that they—at least online—can consume any kind of entertainment they want, regardless of its origins. Placed in the historical context and compared against other possible scenarios of where these two countries may have been had their rulers not embarked on a series of reforms, such achievements look even more impressive.
Should it turn out that large segments of the populations of authoritarian states are well aware of the kind of human rights violations that are needed in order to sustain the impressive rates of economic progress—or, even more shockingly, that they are actually supportive of such violations—a strategy of “un-brainwashing” simply would not work. If modern authoritarianism is as much a product of a social contract as modern democracy is, then changing the attitudes of those who have long given up the fight for freedom would take much more than just exposing them to facts.
The big question that Western do-gooders should be asking themselves here is not whether the Internet can liberate the world but whether the world actually wants to be liberated. Above all, this is a question of whether capitalism unburdened by democratic norms and ideals is sustainable in the long term—a possibility that goes directly against the theory that the logic of capitalism inevitably leads to democracy, a view that was extremely popular in the early 1990s. 2
If capitalism can get by without democracy, the ability to spread subversive information that might reveal the horrors of the regime looks considerably less impressive, for the populations that the West seeks to liberate are already well aware of what’s going on, and many of them may have simply chosen to look the other way in expectation of a better life for their children.
Granted, it’s not just facts that may help change their attitudes. One may use the Net to distribute subversive poetry and fiction that would reawaken (or, in most cases, create from scratch) the political consciousness of those living under authoritarianism. Technology would certainly be of great help here, both in terms of distributing such materials and in terms of protecting those who access them. But the ability of such materials to incite people to democratic change is not just a function of how many people read them; rather, it’s a function of how well-argued such materials are.
It seems that even if the West succeeded in distributing 1984, Darkness at Noon, or Brave New World to every single citizen of an authoritarian state, this might not lead to a revolution, simply because those books offer a poor critique of the actually existing modern-day authoritarianism, which has come to terms with both Western popular culture and globalization.

2 See, e.g., Commission on Security & Cooperation in Europe (U.S. Helsinki Commission), Briefing on Twitter Against Tyrants: New Media In Authoritarian Regimes, Oct. 22, 2009, available at http://csce.gov/index.cfm?FuseAction=ContentRecords.ViewDetail&ContentRecord_id=462&Region_id=0&Issue_id=0&ContentType=H,B&ContentRecordType=B&CFID=32177263&CFTOKEN=96274551; Nicholas Kristof, Tear Down This Cyberwall!, N.Y. TIMES, June 17, 2009, available at http://www.nytimes.com/2009/06/18/opinion/18kristof.html?_r=1; L. Gordon Crovitz, Mrs. Clinton, Tear Down this Cyberwall, WALL ST. JOURNAL, May 3, 2010, available at http://online.wsj.com/article/SB10001424052748704608104575219022492475364.html; FRANCIS FUKUYAMA, THE END OF HISTORY AND THE LAST MAN (1992).
If support for modern authoritarianism is not the product of ignorance and brainwashing but rather of a rational calculation that, under the present conditions, authoritarianism is the best way to generate and preserve economic growth, the Internet’s ability to disseminate and mobilize, while very impressive in itself, may not deliver the kind of benefits that so many in the West expect. To put it simply, citizens of authoritarian states may not be uninformed—so getting them informed is going to be of only limited value. As such, the Net’s ability to liberate the world is severely constrained by the absence of a strong intellectual vision for what a liberated Russia or China would look (and work) like.
Now, if the Soviet experience is anything to judge by, revolutions don’t need the absolute majority of the population to be successful. In other words, it may be possible that a small group of politically active citizens could take advantage of political openings at the right moment and push for significant reform—or the overthrow of a government altogether. In situations like this, the Internet’s ability to mobilize may indeed come in very handy.
Several caveats are in order here. First, obviously, this doesn’t mean that the Internet can help create such political openings—those are usually created by structural factors. As much as it is tempting to believe that it was fax machines and photocopiers that brought down communism in Eastern European countries, one would probably be better off studying their dismal economic record in the late 1980s. So far, it seems that the information revolution—which many have taken to mean the end of authoritarianism everywhere—has, overall, had a positive impact on the rates of economic growth in modern authoritarian regimes—and, to this extent, that revolution may actually have strengthened these regimes.
The second caveat is that there is little certainty that the group with the best ability to mobilize will also be the group with the most impressive democratic agenda. Once again, the Soviet example—with a tiny group of Bolsheviks gaining control of a country as massive as Russia—is quite instructive. Not all revolutions are democratic in character, and more than one of them ended up with the least democratic groups gaining power. Al-Qaeda is far better at using technology to mobilize the masses than the liberal voices of the Middle East; the Russian nationalists, likewise, are far more creative online than the democratic and pro-Western opposition.
Third—and, perhaps, most important—what happens in between political openings matters a great deal as well. It’s simply not true that, from the perspective of an authoritarian state, all social mobilization is harmful. Take the case of China. Thanks to the extremely vibrant nationalist sector of the country’s blogosphere, the Chinese government is often pushed to adopt a much more aggressive posture—towards Taiwan, Japan, South Korea—than it might otherwise have done. Such hawkishness in its foreign policy may or may not bolster its legitimacy; the short answer is that we have to look at the context. For our purposes, it seems clear that we won’t be able to understand the role of the Internet—even if one concedes that it is, indeed, conducive to more mobilization and contestation—if we study that role outside of the socio-political environment in which it is embedded. The assumption that “greater opportunities for social mobilization equals greater odds that democracy will prevail in the long run” simply is not true.
This last point highlights the problem with the second grand assumption that we still need to examine—namely, that the Net won’t have any other primary or secondary effects on the quality and sustainability of authoritarianism. The account that prioritizes the role of information dissemination and mobilization usually rests on a very simplistic, even reductionist, theory of authoritarianism. It presumes the existence of an authoritarian chimera—the government—which controls its citizens through a combination of surveillance and coercion. Citizens, the theory goes, would take immediate advantage of the Internet and use it to push against the government.
But why wouldn’t the government do the same to push against the citizens? In order to understand the overall impact of the Net on the “struggle for liberation,” one must study how it may have also facilitated government monitoring and control of what its citizens do. Even the most optimistic observers of the Net would concede that social networking, fun as it is, may not necessarily be the best way to protect one’s data—both because no social networking site is secure from occasional data leaks and because secret police around the world have now, inadvertently, obtained the ability to map the connections between different activists, see how they are related to foreign funders, and so on.
Furthermore, there is a vibrant and rapidly-expanding global market in activities like face recognition, which makes the identification of those who participate in anti-government protests much easier—often this actually happens by comparing party photos they themselves upload to social networking sites with the photos taken at the protest rallies. Seasoned activists may, of course, be smart enough to steer away from social networking sites, but this hardly applies to the rest of the population.
But conceding that it’s not just anti-government activists who have been empowered by the Net is only part of the story. The truth is that we simply can’t easily classify all social forces into “pro-” and “anti-government” simply based on the location of their offices (e.g., the secret police are in; the unions are out). In reality, modern authoritarian regimes derive their power from a much more diverse pool of resources than sheer brute force or surveillance. To understand what makes modern authoritarian regimes tick, one thus needs to look at a whole range of other political, social, and cultural factors: religion, history, nationalism, geography (e.g., the relations between the federal center and the periphery), rates of economic growth, corruption, government efficiency, fear of a foreign invasion and so forth. Many of these factors have successfully been co-opted by modern authoritarian rulers to justify and prolong their rule.
It’s easy to imagine an authoritarian regime that would become stronger as a result of (a) an increase in religious sentiment among its population, (b) the promotion of a particular interpretation of recent history that would justify the current political regime as inevitable and an unambiguous improvement over its predecessors, and (c) impressive rates of economic growth, with little corruption or government bureaucracy. Likewise, it’s easy to imagine how all of these developments would be amplified if (d) the Internet ends up providing more access to more religious materials to more believers (e.g., through mobile phones), (e) governments find a way to hire and compensate loyal bloggers for touting a particular reading of history, and (f) governments set up websites that allow citizens to report on corrupt officials, problems with existing infrastructure, or government waste.
That last development may seem like a good thing—until one realizes that an authoritarian government with less government waste is not necessarily a weaker authoritarian government. It may actually be more effective and the country may enjoy faster rates of economic growth—but that, alas, still does not always translate into a more democratic government.
All of this is to say that the only way to understand how the Internet influences authoritarianism is to first define a theory of authoritarianism itself—preferably, a theory that goes beyond Manichean theories of “the totalitarian state” versus “the dissidents”—and then use it to closely investigate how the Internet affects each of its components.
As such, our ability to harvest the potential of the Net to “liberate the world” depends not so much on our ability to understand the Net but on our ability to understand the world itself. It’s much easier to understand how the Internet affects government efficiency than to understand how government efficiency affects government legitimacy under conditions of capitalism-friendly authoritarianism.
Political scientists, unfortunately, don’t have much to boast of on this front: their understanding of this completely new breed of authoritarianism is at best rudimentary—and their understanding of how such a fluid and complex technology as the Internet can affect it is even worse. Given the immense poverty of our current conceptual apparatus, even if the Net does end up liberating the world, most likely we won’t know it for quite some time.
Internet Freedom: Beyond Circumvention

By Ethan Zuckerman *
U.S. Secretary of State Hillary Clinton’s January 2010 speech on Internet Freedom signaled a strong interest from the State Department in encouraging the use of the Internet to promote political reforms in closed societies. 1 It makes sense that the State Department would look to support existing projects to circumvent Internet censorship. The New York Times reports that a group of senators subsequently urged the Secretary to apply existing funding to support the development and expansion of censorship circumvention programs. 2
My colleagues Hal Roberts, John Palfrey and I have studied the development of Internet circumvention systems over the past five years, and released a study last year that compared the strengths and weaknesses of different circumvention tools. 3 Some of my work at The Berkman Center for Internet & Society at Harvard University is funded by a U.S. State Department grant that focuses on the continuing study and evaluation of these sorts of tools. As a result, I spend a lot of time coordinating efforts between tool developers and people who need access to circumvention tools to publish sensitive content.
I strongly believe that we need strong, anonymized and usable censorship circumvention tools. But I also believe that we need lots more than censorship circumvention tools, and I fear that both funders and technologists may over-focus on this one particular aspect of Internet freedom at the expense of other avenues. I wonder whether we’re looking closely enough at the fundamental limitations of circumvention as a strategy and asking ourselves what we’re hoping Internet freedom will do for users in closed societies.
* Senior Researcher at the Berkman Center for Internet and Society at Harvard University. Thanks to Hal Roberts, Janet Haven and Rebecca MacKinnon for help editing and improving this essay. They’re responsible for the good parts. You can blame the rest on me.
1 Hillary Rodham Clinton, Secretary of State, Remarks on Internet Freedom at the Newseum (Jan. 21, 2010), http://www.state.gov/secretary/rm/2010/01/135519.htm.
2 Brad Stone, Aid Urged for Groups Fighting Internet Censors, N.Y. TIMES, Jan. 20, 2010, http://www.nytimes.com/2010/01/21/technology/21censor.html?_r=1. For more information on Tor, see http://www.torproject.org. For more information on Psiphon, see http://psiphon.ca. For more information on Freegate, see http://www.ditinc.us/freegate.
3 HAL ROBERTS, ETHAN ZUCKERMAN & JOHN PALFREY, 2007 CIRCUMVENTION LANDSCAPE REPORT: METHODS, USES, AND TOOLS (March 2009), http://dash.harvard.edu/bitstream/handle/1/2794933/2007_Circumvention_Landscape.pdf?sequence=2.
So here’s a provocation: We can’t circumvent our way around Internet censorship.

I don’t mean that Internet censorship circumvention systems don’t work. They do—our research tested several popular circumvention tools in censored nations and discovered that most can retrieve blocked content from behind the Chinese firewall or a similar system. 4 There are problems with privacy, data leakage, the rendering of certain types of content, and particularly with usability and performance, but the systems we tested can indeed circumvent censorship.

What I mean is this: We couldn’t afford to scale today’s existing circumvention tools to “liberate” all of China’s Internet users even if they all wanted to be liberated.
Circumvention systems share a basic mode of operation—they act as proxies to let users retrieve blocked content. A user is blocked from accessing a website by her Internet Service Provider (ISP) or that ISP’s ISP. She may want to read a page from Human Rights Watch’s (HRW) website, which is accessible at IP address 70.32.76.212. But that IP address is on a national blacklist, and she’s prevented from receiving any content from it. So, she points her browser to a proxy server at another address—say, 123.45.67.89—and asks a program on that server to retrieve a page from the HRW website. Assuming that 123.45.67.89 isn’t on the national blacklist, she should be able to receive the HRW page via the proxy.
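The proxy transaction described above can be sketched in a few lines of client-side code. This is a minimal illustration, not any real tool’s implementation: the IP addresses come from the essay’s example, while the port and the helper’s name are my own assumptions.

```python
import urllib.request

# Minimal sketch of the client side of the proxy transaction described above.
# The addresses come from the essay's example; the port (8080) and the
# function name are hypothetical.
PROXY = "http://123.45.67.89:8080"
BLOCKED_URL = "http://70.32.76.212/"  # Human Rights Watch, per the essay

def fetch_via_proxy(url, proxy=PROXY):
    """Ask the proxy to retrieve the page instead of contacting the
    blacklisted address directly through the censored ISP."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    )
    return opener.open(url, timeout=10).read()

# fetch_via_proxy(BLOCKED_URL) should return the HRW page, assuming the
# proxy's own address has not yet landed on the national blacklist.
```

From the censor’s side, of course, defeating this is as simple as adding 123.45.67.89 to the same blacklist—which is why circumvention tools need a steady supply of fresh proxy addresses.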
During the transaction, the proxy is acting like an Internet service provider. Its ability to provide reliable service to its users is constrained by bandwidth—bandwidth to access the destination site and to deliver the content to the proxy user. Bandwidth is costly in aggregate, and it costs real money to run a proxy that’s heavily used.
Some systems have tried to reduce these costs by asking volunteers to share them—the first release of Citizen Lab’s Psiphon used home computers hosted by volunteers around the world as proxies, and then used their consumer bandwidth to access the public Internet. Unfortunately, in many countries, consumer Internet connections are optimized to download content and are much slower when uploading content. These proxies could access the Human Rights Watch website pretty quickly, but they took a very long time to deliver the page to the user behind the firewall. As a result, Psiphon is no longer primarily focused on trying to make volunteer-hosted proxies work. Tor, on the other hand, still is—but Tor nodes are frequently hosted by universities and companies that have access to large pools of bandwidth. Still, available bandwidth is a major constraint of the Tor system. The most usable circumvention systems today—virtual private network (VPN) tools like Relakks 5 or WiTopia 6—charge users between $3 and $6 per month to defray bandwidth costs.

4 See, generally, ROBERTS, ZUCKERMAN & PALFREY, supra note 3.
Assume that systems like Tor, Psiphon and Freegate receive additional funding from the U.S. State Department. How much would it cost to provide proxy Internet access for … well, China? China reports 384 million Internet users, 7 meaning we’re talking about running an ISP capable of serving more than 25 times as many users as the largest U.S. ISP. 8 According to the China Internet Network Information Center (CNNIC), China consumes 998,217 Mbps of international Internet bandwidth. 9 It’s hard to get estimates for what ISPs pay for bandwidth, though conventional wisdom suggests prices between $0.05 and $0.10 per gigabyte. Using $0.05 as the cost per gigabyte, the cost to provide the uncensored Internet to China would be $13,608,000 per month, or $163.3 million a year in pure bandwidth charges, not including the costs of proxy servers, routers, system administrators and customer service. Faced with a bill of that magnitude, the $45 million U.S. senators are asking Secretary Clinton to spend quickly looks pretty paltry. 10
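The back-of-envelope arithmetic is easy to reproduce. The sketch below assumes the reported 998,217 Mbps is consumed around the clock for a 30-day month at $0.05 per gigabyte; those constants are assumptions, and at full utilization they yield roughly $16.2 million per month, so the chapter’s $13,608,000 figure implies a somewhat lower average utilization:

```python
MBPS = 998_217             # CNNIC's reported international bandwidth
PRICE_PER_GB = 0.05        # assumed transit price in USD, low end of the range
SECONDS_PER_MONTH = 30 * 24 * 3600

def monthly_cost(mbps, price_per_gb, utilization=1.0):
    """Turn a link rate (megabits/second) into a monthly transfer bill."""
    bits = mbps * 1e6 * SECONDS_PER_MONTH * utilization
    gigabytes = bits / 8 / 1e9
    return gigabytes * price_per_gb

# Running the link flat-out for a 30-day month gives roughly $16.2M;
# the chapter's $13.6M/month corresponds to average utilization of
# about 84% of the reported capacity.
full = monthly_cost(MBPS, PRICE_PER_GB)
```

Whichever utilization one assumes, the result lands in the mid-eight figures per month, which is the point of the comparison with the $45 million appropriation.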
There’s an additional complication—we’re not just talking about running an ISP—we’re talking about running an ISP that’s very likely to be abused by bad actors. Spammers, fraudsters and other Internet criminals use proxy servers to conduct their activities, both to protect their identities and to evade defenses like those of free webmail providers, which prevent users from signing up for dozens of accounts by limiting an IP address to a certain number of signups in a given time period. For example, Wikipedia found that many users used open proxies to deface its site, and it now reserves the right to block proxy users from editing pages. 11 Proxy operators have a tough balancing act—for their proxies to be useful, people need to be able to use them to access sites like Wikipedia or YouTube, but if people use those proxies to abuse those sites, the proxy will be blocked. As such, proxy operators can find themselves at war with their own users, trying to ban bad actors to keep the tool useful for the rest of their users.
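The signup-limiting defense described above can be sketched as a sliding-window counter keyed by source IP. The class name, limit and window here are illustrative, not any provider’s actual policy:

```python
import time
from collections import defaultdict, deque

class SignupLimiter:
    """Allow at most `limit` signups per source IP within `window` seconds.
    All users exiting through one proxy share a single IP's budget, which
    is why abuse by a few users can get a whole proxy blocked."""

    def __init__(self, limit=3, window=3600):
        self.limit = limit
        self.window = window
        self.seen = defaultdict(deque)   # ip -> timestamps of recent signups

    def allow(self, ip, now=None):
        now = time.time() if now is None else now
        q = self.seen[ip]
        while q and now - q[0] > self.window:
            q.popleft()                  # drop events outside the window
        if len(q) >= self.limit:
            return False                 # budget for this IP is exhausted
        q.append(now)
        return True
```

From the webmail provider’s side this is cheap and effective; from the proxy operator’s side it means one abusive user can exhaust the shared budget for everyone behind the proxy.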
5 For more information on Relakks, see http://www.relakks.com.
6 For more information on WiTopia, see http://www.witopia.net.
7 Chris Buckley, China Internet Population Hits 384 million, REUTERS, Jan. 15, 2010, http://www.reuters.com/article/idUSTOE60E06S20100115.
8 See Top 23 U.S. ISPs by Subscriber: Q3 2008, ISP PLANET, http://www.ispplanet.com/research/rankings/usa.html.
9 See China Internet Network Info. Ctr., Internet Fundamental Data, http://www.cnnic.net.cn/en/index/0O/index.htm (last visited July 29, 2010).
10 Brad Stone, Aid Urged for Groups Fighting Internet Censors, N.Y. TIMES, Jan. 20, 2010, http://www.nytimes.com/2010/01/21/technology/21censor.html.
11 See Open Proxies, WIKIPEDIA, http://en.wikipedia.org/wiki/Wikipedia:Open_proxies.
568 CHAPTER 10: WILL THE NET LIBERATE THE WORLD?
I’m skeptical that the U.S. State Department can or wants to build or fund a free ISP that can be used by millions of simultaneous users, many of whom may be using it to commit click fraud or send spam. 12 I know—because I’ve talked with many of them—that the people who fund blocking-resistant Internet proxies don’t think of what they’re doing in these terms. Instead, they assume that proxies are used only in special circumstances, to access blocked content.
Here’s the problem: A government like China is blocking a lot of content. As Donnie Dong notes in a recent blog post, five of the ten most popular websites worldwide are blocked in China. 13 Those sites include YouTube and Facebook, sites that eat bandwidth through large downloads and long sessions. Perhaps it would be realistic to act as an ISP to China if we were just providing access to Human Rights Watch—but it’s not realistic if we’re providing access to YouTube, too.
Proxy operators have dealt with this question by putting constraints on the use of their tools. Some proxy operators block access to YouTube because it’s such a bandwidth hog. Others block access to pornography, both because it uses bandwidth and to protect the sensibilities of their sponsors. Others constrain who can use their tools, limiting access to people coming from Iranian or Chinese IP addresses, trying to reduce bandwidth use by American high school kids whose schools have blocked YouTube. In deciding who or what to block, proxy operators are offering their personal answers to a complicated question: What parts of the Internet are we trying to open up to people in closed societies? As we’ll address in a moment, that’s not such an easy question to answer.
Imagine for a moment that we could afford to proxy China, Iran, Myanmar and others’ international traffic. We figure out how to keep these proxies unblocked and accessible (it’s not easy—the operators of heavily used proxy systems are engaged in a fast-moving cat and mouse game) and determine how to mitigate the abuse challenges presented by open proxies. We still have problems.
Most Internet traffic is domestic. In China, we estimate that, at minimum, 95% of total traffic is within the country. Domestic censorship matters a great deal, and perhaps a great deal more than censorship at national borders. As Rebecca MacKinnon documented in China’s Censorship 2.0, 14 Chinese companies censor user-generated content in a complex, decentralized way. As a result, a good deal of controversial material is never published in the first place, either because it’s blocked from publication or because authors decline to publish it for fear of having their blog account locked or cancelled. We might assume that if Chinese users had unfettered access to Blogger, they’d publish there. Perhaps not—people use the tools that are easiest to use and that their friends use. A seasoned Chinese dissident might use Blogger, knowing she’s likely to be censored—an average user, posting photos of his cat, would more likely use a domestic platform and not consider the possibility of censorship until he found himself posting controversial content.
12 Matthew Broersma, Researchers Eye Open Proxy Attacks, TECHWORLD, Nov. 15, 2007, http://news.techworld.com/security/10663/researchers-eye-open-proxy-attacks.
13 Donnie Dong, Google’s Angry, Sacrifice and the Accelerated Splitting Internet, BLAWGDOG, Jan. 13, 2010, http://english.blawgdog.com/2010/01/googles-angry-sacrifice-andaccelerated.html.
In promoting Internet freedom, we need to consider strategies to overcome censorship inside closed societies. We also need to address “soft censorship”: the co-opting of online public spaces by authoritarian regimes, which sponsor pro-government bloggers, seed sympathetic message board threads, and pay for sympathetic comments. Evgeny Morozov offers a thoroughly dark view of authoritarian use of social media in “How Dictators Watch Us on the Web.” 15
We also need to address a growing menace to online speech—attacks on sites that host controversial speech. When Turkey blocks YouTube 16 to prevent Turkish citizens from seeing videos that defame Ataturk, it prevents 20 million Turkish Internet users from seeing everything on YouTube. When someone—the Myanmar government, patriotic Burmese, mischievous hackers—mounts a distributed denial of service attack on The Irrawaddy, 17 an online newspaper highly critical of the Myanmar government, this temporarily prevents everyone everywhere from seeing it.
Circumvention tools help Turks who want to see YouTube get around a government block, but they don’t help Americans, Chinese or Burmese see The Irrawaddy if the site has been taken down by a Distributed Denial of Service (DDoS) 18 or hacking attack. Publishers of controversial online content have begun to realize that they’re not just going to face censorship by national filtering systems—they’re going to face a variety of technical and legal attacks that seek to make their servers inaccessible.
14 Rebecca MacKinnon, China’s Censorship 2.0: How Companies Censor Bloggers, 14 FIRST MONDAY (Feb. 2009), http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2378/2089.
15 Evgeny Morozov, How Dictators Watch Us on the Web, PROSPECT, Nov. 18, 2009, http://www.prospectmagazine.co.uk/2009/11/how-dictators-watch-us-on-the-web.
16 Nico Hines, YouTube Banned in Turkey After Video Insults, THE TIMES, Mar. 7, 2007, http://www.timesonline.co.uk/tol/news/world/europe/article1483840.ece.
17 Aung Zaw, The Burmese Regime’s Cyber Offensive, THE IRRAWADDY, Sept. 18, 2008, http://www.irrawaddy.org/opinion_story.php?art_id=14280.
18 A DDoS attack uses multiple computer systems to target and attack a single system or website, thus preventing users from accessing the targeted system.
There’s quite a bit publishers can do to increase the resilience of their sites to DDoS attack and to make their sites more difficult to filter. To avoid blockage in Turkey, YouTube could increase the number of IP addresses that lead to the web server and use a technique called “fast-flux DNS” 19 to give the Turkish government more IP addresses to block. They could maintain a mailing list to alert users to unblocked IP addresses where they could access YouTube, or create a custom application that disseminates unblocked IPs to YouTube users who download the application. These are all techniques employed by content sites that are frequently blocked in closed societies.
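In rough terms, fast-flux DNS pairs a very short TTL with answers drawn from a large rotating pool of front-end addresses, so a blacklist built from today’s lookups is stale tomorrow. Here is a toy model of the server-side rotation; the pool, answer size and TTL are invented for illustration:

```python
import itertools

class FastFluxZone:
    """Toy model of a fast-flux authoritative DNS server: each query gets
    the next few addresses from a large pool, with a short TTL so resolvers
    come back quickly and receive a fresh, different answer."""

    def __init__(self, pool, answer_size=3, ttl=60):
        self.cycle = itertools.cycle(pool)   # endless rotation over the pool
        self.answer_size = answer_size
        self.ttl = ttl

    def resolve(self):
        """Return (addresses, ttl) for one DNS query."""
        return [next(self.cycle) for _ in range(self.answer_size)], self.ttl

# RFC 5737 documentation addresses stand in for a real front-end pool.
pool = [f"192.0.2.{i}" for i in range(1, 101)]
zone = FastFluxZone(pool)
first, ttl = zone.resolve()    # one slice of the pool, short TTL
second, _ = zone.resolve()     # the next, different slice
```

A censor that blocks the addresses in `first` has done nothing about the addresses in `second`, which is why the technique forces blacklist maintainers into a chase rather than a one-time block.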
YouTube doesn’t utilize these anti-blocking measures for two reasons. First, it has historically preferred to negotiate with nations that filter the Internet to make YouTube sites accessible again, rather than to work against these nations by fighting filtering. (This may be changing, now that Google has decided to disengage from China due to censorship and hacking issues.) Second, YouTube doesn’t really have an economic incentive to be unblocked in Turkey. If anything, being blocked in Turkey, and perhaps even in China, may even be to its economic advantage, since serving these countries is likely to be unprofitable.
Sites that enable distribution of user-created content are supported by advertising. Advertisers are generally more excited about reaching users in the U.S., who have credit cards, more disposable income and an inclination to buy online, than users in China or Turkey. Some suspect that the introduction of “lite” versions of services like Facebook is designed to serve users in the developing world at lower cost, since those users rarely create income for the sites. 20 In economic terms, it may be hard to convince Facebook, YouTube and others to continue providing services to closed societies, where they have a tough time selling ads. We also may need to ask more of them, pressing them to take steps to ensure that they remain accessible and useful in censorious countries.
In short:
• Internet circumvention is difficult and expensive. It can make it easier for people to send spam and steal identities.
• Circumventing censorship through proxies gives people access to international content, but doesn’t address domestic censorship, which likely affects the majority of people’s Internet behavior.
19 Fast-flux DNS prevents the identification of a host server’s IP address.
20 Brad Stone & Miguel Helft, In Developing Countries, Web Grows Without Profit, N.Y. TIMES, April 26, 2009, http://www.nytimes.com/2009/04/27/technology/startups/27global.html?_r=1.
• Circumventing censorship doesn’t offer a defense against DDoS or other attacks that target publishers.
To figure out how to promote Internet freedom, we need to start addressing the question: “How do we think the Internet changes closed societies?” In other words, do we have a “theory of change” 21 behind our desire to ensure people in Iran, Burma, China, etc., can access the Internet? Why do we believe this is a priority for the U.S. State Department or for public diplomacy as a whole?
Much work on Internet censorship isn’t motivated by a theory of change—it’s motivated by a deeply-held conviction—one that I share—that the ability to share information is a basic human right. Article 19 of the Universal Declaration of Human Rights states that “Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.” 22 The Internet is the most efficient system we’ve ever built to allow people to seek, receive and impart information and ideas, and therefore, we need to ensure everyone has unfettered Internet access.
The problem with the Article 19 approach to censorship circumvention is that it doesn’t help us prioritize. It simply makes it imperative that we solve what may be an unsolvable problem.
If we believe that access to the Internet will change closed societies in a particular way, we can prioritize access to those aspects of the Internet. Our theory of change helps us figure out what we must provide access to. The three theories I list below are rarely explicitly stated, but I believe they underlie much of the work behind censorship circumvention.
The Suppressed Information Theory: If we can provide certain suppressed information to people in closed societies, they’ll rise up, challenge their leaders and usher in a different government. We might choose to call this the “Hungary ‘56 theory” 23—reports of struggles against communist governments around the world, reported into Hungary via Radio Free Europe, encouraged Hungarians to rebel against their leaders. (Unfortunately, the U.S. didn’t support the revolutionaries militarily—as many in Hungary had expected—and the revolution was brutally quashed by a Soviet invasion.)
21 Mark Schmitt, The “Theory of Change” Primary, THE AMERICAN PROSPECT, Dec. 21, 2007, http://www.prospect.org/cs/articles?article=the_theory_of_change_primary.
22 The Universal Declaration of Human Rights, art. 19, G.A. Res. 217A(III), U.N. GAOR, 3d Sess., U.N. Doc. A/810 (Dec. 10, 1948), http://www.un.org/en/documents/udhr/index.shtml.
23 For more information on the Hungarian Revolution of 1956, see Hungarian Revolution of 1956, WIKIPEDIA, http://en.wikipedia.org/wiki/Hungarian_Revolution_of_1956.
Or we could term this the “North Korea theory,” because a state as closed as North Korea might be a place where suppressed information—about the fiscal success of South Korea, for instance—could provoke revolution. Barbara Demick’s beautiful piece in the New Yorker, “The Good Cook,” gives a sense of how little information most North Koreans have about the outside world and how different the world looks from Seoul. 24 Nonetheless, even North Korea is less informationally isolated than we think—The Dong-A Ilbo, a South Korean newspaper, reports an “information belt” along the North Korea/China border where calls on smuggled mobile phones are possible between North and South Korea. 25 Other nations are far more open—the Chinese tend to be extremely well informed about both domestic and international politics, both through using circumvention tools and because Chinese media reports a great deal of domestic and international news.
It’s possible that access to information is a necessary, though not sufficient, condition for political revolution. It’s also possible that we overestimate the power and potency of suppressed information, especially as information is so difficult to suppress in a connected age.
The Twitter Revolution Theory: If citizens in closed societies can use the powerful communications tools made possible by the Internet, they can unite and overthrow their oppressors. This is the theory that led the U.S. State Department to urge Twitter to postpone a period of scheduled downtime during the Iran election protests. 26 While it’s hard to make the case that technologies of connection are going to bring down the Iranian government, 27 good examples exist, like the role of the mobile phone in helping to topple President Estrada in the Philippines. 28
There’s been a great deal of enthusiasm in the popular press for the Twitter revolution theory, but careful analysis reveals some limitations. The communications channels opened online tend to be compromised quickly, used for disinformation and for monitoring activists. And when protests get out of hand, governments of closed societies don’t hesitate to pull the plug on networks—China has blocked Internet access in Xinjiang for months, and Ethiopia turned off SMS on mobile phone networks for years after they were used to organize street protests. And it’s worth noting that prophesied “twitter revolutions” in Moldova and Iran both failed in the face of authoritarian governments.
24 Barbara Demick, The Good Cook, THE NEW YORKER, Nov. 2, 2009, at 58, http://www.newyorker.com/reporting/2009/11/02/091102fa_fact_demick.
25 North Koreans Directly Connect with South Koreans via Chinese Cell Phones, ASK A KOREAN!, Jan. 17, 2010, http://askakorean.blogspot.com/2010/01/excellent-article-on-dong-ilboabout.html.
26 Sue Pleming, U.S. State Department Speaks to Twitter Over Iran, REUTERS, June 16, 2009, http://www.reuters.com/article/idUSWBT01137420090616.
27 See Cameron Abadi, Iran, Facebook, and the Limits of Online Activism, FOREIGN POLICY, Feb. 12, 2010, http://www.foreignpolicy.com/articles/2010/02/12/irans_failed_facebook_revolution.
28 See Joseph Estrada Controversies, WIKIPEDIA, http://en.wikipedia.org/wiki/Joseph_Estrada#Controversies.
The Public Sphere Theory: Communication tools may not lead to revolution immediately, but they provide a new rhetorical space where a new generation of leaders can think and speak freely. In the long run, this ability to create a new public sphere, parallel to the one controlled by the state, will empower a new generation of social actors, though perhaps not for many years.
Marc Lynch made a pretty persuasive case for this theory in a talk last year about online activism in the Middle East. 29 In the former Soviet Union, samizdat (self-published, clandestine media) was probably more important as a space for free expression than it was as a channel for disseminating suppressed information. 30 The emergence of leaders, like Vaclav Havel, whose authority was rooted in cultural expression as well as political power, makes the case that simply speaking out is powerful. But the long timescale of this theory makes it hard to test.
The theory we accept shapes our policy decisions. If we believe that disseminating suppressed information is critical—either to the public at large or to a small group of influencers—we might focus our efforts on spreading content from Voice of America or Radio Free Europe. Indeed, this is how many government forays into censorship circumvention began—national news services began supporting circumvention tools so their content, painstakingly created in languages like Burmese or Farsi, would be accessible in closed societies. This is a very efficient approach to anti-censorship—we can ignore many of the problems associated with proxy abuse and focus on prioritizing news over other, less-important bandwidth-hogging uses, like the video of the cat flushing the toilet. Unfortunately, we’ve got a long track record showing that this form of anti-censorship doesn’t magically open closed regimes, which suggests that increasing our reliance on this strategy might be a poor idea.
29 Ethan Zuckerman, Marc Lynch Asks Us to be Realistic About Digital Activism in the Middle East, April 27, 2009, http://www.ethanzuckerman.com/blog/2009/04/27/marc-lynch-asks-us-to-be-realistic-about-digital-activism-in-the-middle-east.
30 See generally Peter Steiner, Introduction: On Samizdat, Tamizdat, Magnitizdat, and Other Strange Words, 29 POETICS TODAY 613 (2008), http://poeticstoday.dukejournals.org/cgi/reprint/29/4/613.pdf.
If we adopt the Twitter Revolution theory, we should focus on systems that allow for rapid communication within trusted networks. This might mean tools like Twitter or Facebook, but it probably means tools like LiveJournal and Yahoo! Groups, which gain their utility through exclusivity, allowing small groups to organize outside the gaze of the authorities. If we adopt the public sphere approach, we want to open any technologies that allow public communication and debate—blogs, Twitter, YouTube, and virtually anything else that fits under the banner of Web 2.0. This, unfortunately, presents technical challenges that are proving extremely difficult to solve.
What does all this mean in terms of how the U.S. State Department should allocate its money to promote Internet Freedom? My goal was primarily to outline the questions it should be considering, rather than offering specific prescriptions. But here are some possible implications of these questions:
If we believe the U.S. government should be exporting “Internet freedom”—and there are good reasons to argue that a government, and particularly the U.S. government, shouldn’t take on this task—we need to continue supporting circumvention efforts, at least in the short term. But we need to disabuse ourselves of the idea that we can “solve” censorship through circumvention. We should support circumvention until we find better technical and policy solutions to censorship, not because we can tear down the Great Firewall by spending more on proxies, etc.
Second, if we want more people using circumvention tools, we need to find ways to make these systems fiscally sustainable. Sustainable circumvention is becoming an attractive business for some companies. 31 It needs to be part of a comprehensive Internet freedom strategy, and we need to develop strategies that are sustainable and provide low- to zero-cost access to users in closed societies.
Third, as we continue to fund circumvention, we need to address the use of these tools to send spam, commit fraud and steal personal data. We might do this by relying less on IP addresses as a fundamental means of regulating bad behavior, but we have to find a solution that protects networks against abuse while maintaining the possibility of anonymity, a difficult balancing act.
Additionally, we need to shift our thinking from helping users in closed societies access blocked content to helping publishers reach all audiences. In doing so, we may gain those publishers as a valuable new set of allies as well as opening a new class of technical solutions.
31 Lara Farrar, Cashing in on Internet Censorship, CNN, Feb. 19, 2010, http://www.cnn.com/2010/TECH/02/18/internet.censorship.business/?hpt=Sbin.
Furthermore, if our goal is to allow people in closed societies to access an online public sphere or to use online tools to organize protests, we need to bring the administrators of these tools into the dialog. Secretary Clinton suggests that we make free speech part of the American brand identity—let’s find ways to challenge companies to build blocking resistance into their platforms and to consider Internet freedom as a central part of their business mission. We need to address the fact that making platforms unblockable has a cost for content hosts and that their business models currently don’t reward companies for providing services to blocked users.
The U.S. government should treat Internet filtering—and more aggressive hacking and DDoS attacks—as a barrier to trade. The U.S. should strongly pressure governments in open societies like Australia and France to resist the temptation to restrict Internet access, as this behavior helps China and Iran make the case that their censorship is in line with international norms. And we need to fix U.S. Treasury regulations that make it difficult and legally ambiguous for companies like Microsoft and projects like SourceForge 32 to operate in closed societies. If we believe in Internet Freedom, the first step is rethinking these policies so they don’t hurt ordinary Internet users.
Finally, if attempts to export Internet freedom are to be met with something other than cynicism or skepticism, the U.S. government needs to do a better job of protecting free speech domestically. The pressure exerted by individual Senators and by the State Department on companies like Amazon and PayPal to terminate services to WikiLeaks calls into question the U.S. government’s commitment to online free speech. If the U.S. wants countries like China to consider a more free and open Internet, control of the Internet in the U.S. must also follow the rule of law, and not fall victim to political expediency.
If we take seriously Secretary Clinton’s call, the danger is that we increase our speed marching in the wrong direction. As we embrace the goal of Internet Freedom, now is the time to ask what we’re hoping to accomplish and to shape our strategy accordingly.
32 SourceForge is an open source code repository from which software can be developed and downloaded. For more information, see http://sourceforge.net/.