Democratic Structures in Cyberspace


Jennifer Chung

Jason Linder

Ian Liu

Wendy Seltzer

May Tse


I. Executive Summary

A. The Study of Democracy in Cyberspace

B. Online Communities That Self-Govern

C. Connecting Real Space Government with Cyberspace

D. Developing a New Architecture for Voting

E. Internet Governance of the Future

II. Introduction

A. Overview

B. Architecture

1. Flexibility and Ends

2. Challenge and Critique

C. Democracy

1. Democracy Defined

2. Democracy’s Discontents

D. Responses

1. Criteria

2. Deliberative Polling

E. Deliberative Polling in Cyberspace

F. Caveats

1. Universal Service

2. Limitations

III. Decisionmaking on the Internet

A. Introduction

B. Usenet

1. History of Usenet

2. Social Norms as a Means of Governance

3. The End of Democracy?

4. Usenet II

5. Short Analysis of the Hierarchical Newsgroup Creation Structures

C. MUDs, MOOs, and Other Rural Connotatives

D. Case Study: LambdaMOO

1. The Democratic Dream

2. The Empire Strikes Back

IV. Governance of the Internet

A. Introduction

B. Internet architecture - a 'democratic' protocol

C. The Rise of DNS

D. Interconnectedness of networks – The Market and the Internet

E. Universal service of the Internet - 'The last mile'

F. Domain Name Policy - Internet Governance

1. Current Governing Institutions and Social Norms

2. Open standards

G. A case study: Domain Name Reform

1. Background

2. Evolution in governance structure

3. POC/CORE governance structure

4. Voluntary multilateralism

5. POC/CORE Representation

6. The US NTIA proposal - Green Paper

7. The Structure of the NTIA Proposal

8. Representation under the NTIA Proposal

9. Reactions to the Green Paper

10. A step forward - the White Paper

11. The ICANN board

12. International Representation

13. Representation structure of the two proposals

V. Government by the Internet

A. Introduction

B. Current Online Voting Architectures

C. The Deliberative Poll Goes Online

1. The Deliberative Poll Experiment

D. Some Further Possible Architectures

E. Feasibility Issues

VI. Theory and Practice of Internet Democracy

A. Introduction

B. ICANN

C. Technology of democracy on the Internet

D. Theory: Membership and Representation

1. The problem of scale

2. Membership and Citizenship

E. Real World Meets the Net: ICANN as a Test of Both

1. ICANN Representation

2. Theory of the Deliberative Poll

3. Technology of the Poll

Appendix A: The Deliberative Poll

 

  I. Executive Summary

 

As the Internet grows in importance in our everyday lives, the line between cyberspace and real space begins to blur. The prevalence of email communications and the rise of electronic commerce are only the first of the opportunities the Internet offers. Along with its potential for social and economic interactions, the Internet raises questions about governance. It offers a new forum for debate and discussion about real world politics and calls upon us to understand and now redefine the Net's own governing structures. The Internet gives us new lenses and new tools for the study of democracy itself.

Cyberspace is still in its formative stages. Now is the time that we must ask who will be given the authority to shape it, and how. Thus far, it has been characterized as a whole by benevolent oligarchy. The Internet's structure has been developed and controlled by the United States government, the various engineering groups who created and standardized its protocols, and more recently by other governments and corporate entities. That form of governance no longer seems appropriate -- the oligarchs no longer want the role, and their subjects want a say in the network's direction. On the small scale, by contrast, the Internet appears anarchic. Small groups have "colonized" portions of the network's frontier and made their own governance choices: from democracy (newsgroups, perhaps) to dictatorship (some moderated lists and online services). In the still-evolving network infrastructure, is there a non-geographic federalism appropriate to cyberspace?

    A. The Study of Democracy in Cyberspace

      Studying democratic structures in cyberspace recalls us to first principles -- democracy is a particular structure for law-making and governance, characterized by the sovereignty of the governed. Ideally, the structure guarantees equality, participation, deliberation, protection of minority rights, and transparency of decision-making and administration. Cyberspace, with its new architectures of interaction and new ways of forming communities, allows us to think about these democratic values apart from their traditional grounding in geographic jurisdictions. It offers us the chance to build off "communities of choice," the newsgroups and listservs of users with proximate interests, rather than locations. It allows convenient and powerful information access. The possible multiplication, division, or concealment of identity raises its own questions about the meaning and basis of citizenship. The Internet offers a sphere in which to recreate and rethink democracy; it gives us an opportunity to design new technological architectures to help reshape social norms.

    B. Online Communities That Self-Govern

      Online communities, which have been developing for years, provide a good starting point for studying the Internet’s relationship to democracy. Small communities formed from MUD's, MOO's, BBS's, and Usenet groups resemble local neighborhoods where the citizens of the group govern themselves by choosing their own rules and punishments. Yet these communities differ from real space neighborhoods in the relative permanence of their memberships. In a physical neighborhood, you must relocate to escape the offensive acts of a neighbor, which is not always feasible. In cyberspace, a few keystrokes will switch you from one online group to another. The ease of entry and exit in online groups would appear to sap the power from community norms. Yet many such communities persist. One explanation is that in these small communities, it takes a longer time and more effort to develop an online identity, and this investment is lost when one leaves the community. Establishment of identity may give the user a stake in a particular community, impelling him to attempt to resolve differences with online neighbors rather than moving out. Are these online communities the towns or states of the Internet? If so, what are the consequences of an individual being involved in multiple online communities? When does a user's identity in cyberspace begin to mirror, blur into, and change their real space identity?

    C. Connecting Real Space Government with Cyberspace

      The potential for online resolution of real space issues raises its own questions. The Internet offers a powerful tool to connect individual citizens with their government. Its information delivery could enhance transparency, allowing individuals checks on the actions of representatives. Alternatively, it could remove the intermediary and facilitate direct many-to-many participation. The geographic motivation for representative democracy is diminished, but the wealth of information and complexity of decisions may increase. How does this change the face of representation? If a direct democracy is more technically feasible, does that mean that it should be attempted, or should representative democracy still be preserved?

      How might real space governance be facilitated online? The CGI forms already available through a simple web browser can replicate the voting booth where yes/no answers are sought. We can streamline the process by providing easy access to the voting profiles of political parties, and easy hyperlinked access to their platforms and explanations.
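A yes/no ballot collected through such a form reduces, on the server side, to parsing the submission and enforcing one ballot per voter. The sketch below is purely illustrative: the field names and the duplicate-vote rule are our assumptions, not a description of any deployed system.

```python
# Illustrative server-side logic behind a yes/no web ballot form.
# Field names ("voter", "choice") and the one-vote-per-voter rule are
# assumptions for this sketch, not drawn from any real voting system.
from urllib.parse import parse_qs

def record_vote(ballots, form_body):
    """Parse one urlencoded submission (e.g. 'voter=alice&choice=yes')
    and record it, enforcing one ballot per voter."""
    fields = parse_qs(form_body)
    voter = fields.get("voter", [None])[0]
    choice = fields.get("choice", [None])[0]
    if not voter or voter in ballots:
        return False              # missing identity or duplicate ballot
    if choice not in ("yes", "no"):
        return False              # only yes/no answers are sought
    ballots[voter] = choice
    return True

def tally(ballots):
    """Count the recorded yes/no ballots."""
    counts = {"yes": 0, "no": 0}
    for choice in ballots.values():
        counts[choice] += 1
    return counts
```

Even this toy version makes visible the design questions the paper raises: the server must know who is voting in order to prevent duplicates, which is exactly the tension with anonymity discussed below.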

    D. Developing a New Architecture for Voting

      Yet if we seek a more informed electorate, the deliberative polling concept developed by James Fishkin offers some promise. In Fishkin's model, a pool or sample of voters is given information on a topic and put into group discussions on issues before voting. Online, a combination of hyperlinked documents and media, and real time and asynchronous conversations can simulate this experience. Can we provide the right incentives to encourage informed deliberation? A type of proportional representation, more feasible in cyberspace, in which individuals could add weight to their vote through participation in areas of particular concern, offers one approach. If not yet ready for the prime time of national government, such a protocol might be appropriate for corporate stockholders or membership organizations.
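One way to picture such participation-weighted voting is as a tally in which each ballot carries a base weight plus a capped bonus for documented deliberation. The particular weighting rule below (base weight of 1, half a point per deliberative session, capped at 3) is a hypothetical illustration of the idea, not a scheme proposed in the deliberative polling literature.

```python
# Hypothetical participation-weighted tally: a ballot gains weight from
# documented participation in deliberation on the topic. The constants
# here are illustrative assumptions, not prescribed by any real scheme.

def vote_weight(sessions_attended, bonus_per_session=0.5, cap=3.0):
    """Base weight of 1, plus a capped bonus per deliberative session."""
    return min(1.0 + bonus_per_session * sessions_attended, cap)

def weighted_tally(votes):
    """votes: list of (choice, sessions_attended) pairs.
    Returns total weighted support for each choice."""
    totals = {}
    for choice, sessions in votes:
        totals[choice] = totals.get(choice, 0.0) + vote_weight(sessions)
    return totals
```

The cap matters: without it, heavy participators could dominate outright, trading one democratic pathology for another.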

      In any online voting implementation scheme, the technological architecture will raise issues. To enable anonymous voting, a form of digital identity must be established to verify eligibility and to prevent a person from casting multiple votes. Various encryption algorithms offer different answers. For instance, a self-adjudicating protocol uses several layers of encryption and computation to enable participants to vote without involving a third party. Computation is reduced with a central vote repository, but an independent third party must both administer and count the ballots. A multiple voting organization structure distributes the control, relying on two separate parties, one to administer and one to count the ballots. Before scheduling an online vote with real space effects, however, one must ensure universal Internet access to all potential voters.
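The division of labor in the two-party structure just described can be sketched as two mutually ignorant parties: an administrator who verifies eligibility and issues opaque tokens, and a counter who accepts ballots against valid tokens without ever learning voter identities. Real protocols would use blind signatures so that even the administrator cannot link a token to a voter; the random tokens and class names below are simplifying assumptions meant only to show the separation of duties.

```python
# Minimal sketch of splitting control between two voting organizations:
# the Administrator checks eligibility and issues an opaque token; the
# Counter tallies ballots against valid tokens, never seeing identities.
# In a real scheme, blind signatures would keep the Administrator from
# linking tokens to voters; plain random tokens are a simplification.
import secrets

class Administrator:
    def __init__(self, eligible_voters):
        self.eligible = set(eligible_voters)
        self.issued = set()          # voters who already received a token
        self.valid_tokens = set()    # shared with the counter; no names

    def issue_token(self, voter):
        if voter not in self.eligible or voter in self.issued:
            return None              # ineligible, or token already issued
        self.issued.add(voter)
        token = secrets.token_hex(16)
        self.valid_tokens.add(token)
        return token

class Counter:
    def __init__(self, valid_tokens):
        self.valid_tokens = valid_tokens   # knows tokens, not identities
        self.used = set()
        self.tally = {}

    def cast(self, token, choice):
        if token not in self.valid_tokens or token in self.used:
            return False             # forged or replayed token
        self.used.add(token)
        self.tally[choice] = self.tally.get(choice, 0) + 1
        return True
```

Neither party alone can both identify voters and see how they voted, which is the point of distributing control; collusion between the two parties, of course, defeats the protection.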

    E. Internet Governance of the Future

In the immediate future, the governance of the Internet itself raises all these questions. The aforementioned small online communities cannot remain disparate forever, and the time will come when the Internet as a whole needs some form of centralized government. The United States government has provisionally granted that governance to ICANN, a non-profit California corporation. As part of the agreement, however, ICANN has committed to developing structures by which "members of the Internet community" can contribute to the decisions that affect them and appeal those decisions to a neutral arbiter.

Who is a "member," and by what mechanisms can this expansive "community" be defined and brought together? Here, the Internet is both the question and the tool by which it must be answered. In the coming months, it will fall to the "members of the Internet community" to answer these fundamental questions about themselves even as they construct the foundations for practical decision-making. The technologies of online voting and deliberative polling, combined with an understanding of the types of sub-community that have already carved out their own spaces on the Internet, give them some guidance. In turn, we should take the construction of government for and in cyberspace as an opportunity to reexamine democracy itself, and apply the lessons and tools of Internet democracy to its real space counterpart.

  II. Introduction

    A. Overview

      This paper explores the Internet’s potential to support indigenous democratic forms and to facilitate democratic institutions in real space. These two topics culminate in a discussion of the viability of deliberative polling – a relatively new and little used democratic tool – as a workable structure for governance of the Internet, a task which requires careful mediation between the constituencies found on the Internet and the complex interests of real space sovereigns.

    B. Architecture

We shape our buildings; thereafter they shape us.

-- Winston Churchill

    1. Flexibility and Ends
      Our class, the Law of Cyberspace, has been an exploration of cyberspace’s new forms of architecture – also known as code – which is technology "that makes cyberspace as it is." The building blocks for these new structures have included, inter alia, digital identity, encryption, content control, spread spectrum, and trusted systems. Each of these technologies has the power to regulate – to constrain, prod, and shape – the behavior of those who use cyberspace. Indeed, these architectures cannot help but regulate; at a minimum, they determine how users interface with their virtual world and with each other.

      These architectures are not "natural" in the way their real space analogues are; they do not come to us already shaped, leaving us little ability to alter their features. In contrast, real space comes, in some unsophisticated sense, "pre-made." It can sometimes be altered, but such alteration often requires great study and effort. Given a choice, for example, we might wish to "redesign" nicotine to render it nonaddictive, but we probably cannot do so, or at least cannot do so right now. Exactly the opposite holds for the code of cyberspace: it is just so much software and hardware, originally conceived and coded by men and women. However it is designed, it could have been otherwise and, with relatively little cost, it can still be otherwise.

      Cyber code’s flexibility carries with it sometimes strong implications for social, economic, and governmental systems and structures, both those existing in real space and those (more or less) tied to the Internet. Choosing from the range of options available for the implementation of a particular architecture is not merely choosing one set of technical specifications over another; it is simultaneously a choice between sets of competing social, economic, and governmental values and visions.

      Take digital identity, for example. We, as programmers, can code a virtual world requiring no, or almost no, user identification. In such a world, people may choose anonymity, pseudonymity, honest self-identification, or an amalgamation of each. Such a world maximizes one type of freedom. A user is free to construct for himself whatever identity he wishes and comport himself accordingly.

      Alternatively, we may encode identification requirements and restrictions upon our users. We may require, for example, that every user supply adequate proof of age and then code their age into their user profile – the digital identity that marks them wherever they go in cyberspace. The age identification requirement can serve many independent goals. It may further governmental purposes by, say, restricting underage users’ access to porn. The restriction might also have beneficial market effects. In an easily imaginable scenario, we may run an online service which hosts retailers. These retailers may seek to segment users into target markets demarcated by age and we may wish to help them so target. And finally, an age identification requirement may aim at reinforcing, or even altering, social norms. Perhaps, for example, we wish to make it easy to avoid the use of inappropriate language around minors.

      To generalize from this example, code’s flexibility allows it to be used to facilitate substantive governmental, social, and economic ends. And this is certainly one way in which we have become accustomed to talking about the Internet. This paper is representative of this approach to the Internet. We will, for the most part, be discussing the ways in which the Internet’s technological composition and possibilities may be used to further regulatory ends, both in real space and within the Internet itself.

    2. Challenge and Critique

It is important, though, to note that the flexibility of code in cyberspace, and the manifold of markedly different worlds available because of it, does not merely facilitate our governmental, social, and economic ends. The flexibility also implicitly criticizes the existing regimes of real space.

Markets, social norms, and governments are (more or less) plastic, and changes in these structures can have varied and far-reaching social implications. Moreover, there exists a wide range of forms these structures may take: "[t]he political, social, and economic institutions established in the rich industrial countries represent a small part of a much larger range of possibilities."

Cyberspace’s flexible new architectures carry the potential to re-shape markets, society, and government in ways and to a degree heretofore scarcely imagined. Cyberspace’s flexibility thus implicitly requires a justification, or rather re-justification, of our existing social, legal, governmental, and economic relations, in the form of an answer to the following question: if cyberspace allows us to do things differently, perhaps much better than we do them now, why don’t we?

Our class has seen and grappled with the challenges that cyberspace architecture brings to our existing regimes. Depending on your bent, some of the challenges have been exciting; problems and boundaries that once seemed insuperable dissolve or take on wildly new, often more difficult, forms in cyberspace. The issue of sovereignty is much like this, as are, perhaps, the possibilities created by spread spectrum technology. Some of the challenges have been frightening; what was seen by many as a more or less good status quo has been threatened by the advent of powerful, malleable code. The discussion of protecting privacy takes shape along those lines, as does the discussion of the threat content control may pose to free speech. And finally, some of the challenges have been less categorizable, but certainly no less perplexing, requiring nimble redefinition of real world concepts and sensitivity to the interplay of policy and technological considerations. Property and intellectual property, subjects already rich in definitional puzzles, are topics like this. Property and intellectual property in cyberspace require translation of concepts, and we may lose (or gain) much in that translation. Other topics that fit, more or less, into this category may include digital identity, libel, and universal access.

The possibilities available in cyberspace implicitly critique the existing governmental regime especially sharply. We have embodied in our Constitution fundamental values which we, as a nation, believe should give shape to our government and which should serve as checks against governmental abuses of individual liberty. The checks are commonly called "rights," and they include, inter alia, a right to vote, to bear arms, to speak freely and peaceably assemble, and a right to be secure in our homes against unreasonable search and seizures by law enforcement officials. Our fundamental values also find expression in the Constitution in the form of guarantees of procedural fairness and of equal protection of the laws, especially for certain "suspect" classes.

The implementation of these rights and guarantees is shaped around real world architecture. Insofar as this is true, the governmental regime that administers them will be challenged to translate them into cyberspace. We have seen this phenomenon in our class in the context of content control, as mentioned above.

But our constitutionally-grounded governmental regime faces a sharper challenge than just the now somewhat familiar one of translation into a vast and vastly different new cyber-world. The values that underlie our Constitution are ideals which, when implemented in real space, sometimes yield to practical necessities. The exceptions to the Fourth Amendment’s prohibition against unreasonable search and seizure may be seen as this sort of trimming of our rights, aimed at furthering policy goals difficult to satisfy in real space in any other way. The compromise of our most basic shared values to "fit" with the real world is likely undesirable, at least more so than compromise in non-constitutional settings. The possibility that cyberspace might allow for fuller exercises of our constitutional guarantees and rights seems to require that we either enjoy those fuller exercises or be given compelling justification why we should not.

Pragmatic concessions have limited not only the implementation of our values and the protection of our rights; we have conceded to practical demands in the very shape of our government. In The Federalist, James Madison offers two arguments advocating representative democracy over direct democracy. The first embodies a substantive political theory and view of human nature. Madison thought representative government would minimize the effects of faction and partisanship, because a representative scheme is able to

refine and enlarge the public views, by passing them through the medium of a chosen body of citizens, whose wisdom may best discern the true interest of their country, and whose patriotism and love of justice will be least likely to sacrifice it to temporary or partial considerations.

 

Our representative scheme, along with the separation of governmental powers into three branches and two levels (state and federal), thus supposedly acts against the natural, human inclinations of officials to sacrifice considerations of public good to immediate personal interest.

Madison’s second argument is of a different nature. He notes two distinctions between a direct democracy and a republic, i.e., a representative democracy. First, as just discussed, is the delegation of government to elected officials. Second, interestingly, is "the greater number of citizens, and greater sphere of country, over which the latter [i.e., a republic] may be extended." Elsewhere, Madison explains this distinction: "In a democracy the people meet and exercise government in person; in a republic they assemble and administer it by their representatives and agents. A democracy, consequently, must be confined to a small spot. A republic may be extended over a large region."

This statement is an observation about the constraints of real world code, an observation about the interaction of human nature and large geographic distances. A direct democracy may work in small communities, but cannot work with a large number of citizens spread over large geographic areas.

A republic, in contrast, actually becomes more robust in Madison’s view as it increases in size. As a republic’s size increases, the number of factions likely found within it also increases, each becomes more difficult to organize, and they will come to counteract each other:

Extend the sphere, and you take in a greater variety of parties and interests; you make it less probable that a majority of the whole will have a common motive to invade the rights of other citizens; or if such a common motive exists, it will be more difficult for all who feel it to discover their own strength, and to act in unison with each other.

 

This second argument is at root a practical one, based firmly upon the constraints of real world code. It takes only relatively few elected officials to run a representative democracy, as opposed to the whole polity, thereby making administration of larger areas easier and more efficient. Increased size, in turn, makes it harder for factions to organize effectively, thereby reducing the threat they pose to the public good. Madison, of course, could not foresee that the telephone, airplane, radio, television, or Internet would enable incredibly effective nationwide organizing.

And so, Madison’s claim is, in essence, that even were a true, direct democracy desirable – a position Madison certainly did not defend – it is a practical impossibility in a country as large as the United States. Madison’s argument contemplated only thirteen states and some few millions of citizens; his argument has only gained force as new states and an exploding population have further constrained citizens’ opportunities to directly decide together, as a community, issues of national importance.

This constraint, though, does not exist in cyberspace, or at least does not have to exist there. With sufficient time, money, and universalized service, we could create an Internet-based system in which every eligible voter could vote on every national political issue. The absence of the relevant real world constraints raises an important question then: if we can have direct democracy, why don’t we?

One of Madison’s two key justifications for representative democracy is, or by hypothesis soon could be, no longer germane. If our representative democracy is thereby no longer sufficiently justified, we may have to give it up or modify it. Over a hundred years ago, Oliver Wendell Holmes groused that

[i]t is revolting to have no better reason for a rule of law than that so it was laid down in the time of Henry IV. It is still more revolting if the grounds upon which it was laid down have vanished long since, and the rule simply persists from blind imitation of the past.

 

Holmes’ view as to a rule of law surely applies a fortiori to a system of government. Living under an entire system of government now foundationless is far worse than living under a single rule of ungrounded law.

A modern echo of Holmes’ view arises in a discussion of the possible approaches to the distribution of domain names. Therein, Sharon Eisner Gillett and Mitchell Kapor warn of a grievous pitfall: "The worst outcome would be to institutionalize an allocation process that stays in use long after the rationale for it has disappeared."

The question that thus arises is this: has the rationale for our system of representational democracy disappeared in the wake of the technological advances attendant upon the development of the Internet? Our system of government has its own peculiar "allocation process[es]," which (arguably) made sense in 1787 and (again, arguably) for generations thereafter. They were structured to prevent ordinary citizens from having too much control or direct input, while also aiming at restraining those in government from infringing too far upon the rights and well-being of their constituents. But, "what is practical changes over time, and it is better to consider and reject the possibility of user control [e.g., direct democracy] for a good reason than not to think of it at all."

Eisner Gillett and Kapor sketch out a spectrum useful for comparing governmental and administrative regimes, marked at its poles by "… two extremes of management style: centralized and decentralized." In the range of available governmental regimes, a totalitarian regime stands at one pole and a completely direct democracy stands at the other. As Eisner Gillett and Kapor describe the latter pole: "The ultimate in decentralization is to put users [read: citizens] in control." What we currently have, a Jeffersonian democracy whose central feature is the election of popular representatives, falls somewhere in between the two poles and certainly short of total "user control." Is our place on the spectrum still justified, if indeed it ever was?

The arguments available in response to that question, although of great importance, are beyond the scope of this paper. All that can be noted in passing are some subsidiary questions that any sufficient answer would have to address: does cyberspace, in fact, dissolve the purely practical difficulties involved in implementing direct democracy? If so, is Madison’s first argument for representative democracy a sufficient justification for our current system of government? If not, are other persuasive lines of justification available? What are they, and what are their counter-arguments?

    C. Democracy

    1. Democracy Defined
      While it is important to bear in mind the critical stance the code of cyberspace affords, this paper focuses on the positive contributions cyberspace may make to democracy in real space and the challenges of governing cyberspace itself. And so, with our understanding of code in hand, we turn now to an overview of democracy.

      The word democracy itself is notoriously vague in scope and ambiguous in meaning: it has "several different meanings which cannot be reconciled with one another." Moreover, political theorists and philosophers have spent more than two millennia now debating questions concerning democracy’s nature and desirability. Although engaging, that debate is outside the bounds of this paper.

      For our purposes, we can rely on a relatively minimal and more or less uncontroversial conception of democracy. Democracy, simply put, is a style of governance reinforced by a strong set of social norms. It is a bottom-up approach to decisionmaking, in which the governed decide which rules they will live by, either directly (through plebiscite, initiative, or town meeting, for example) or through mediating institutions, such as legislatures and executives.

      James Fishkin articulates four political conditions that characterize democracy. First is political equality, in which "citizen’s preferences count equally in a process that can plausibly be viewed as representative of everyone." This condition has a constitutional and long-standing legal foundation in the United States.

      Second is the prevalence of political deliberation, in which "a wide range of competing arguments is given careful consideration in small-group, face-to-face discussion." Of the four conditions, this is probably the one least agreed upon by political theorists. Again, however, that debate is beyond the proper scope of this paper. It is sufficient to say at this juncture that an emphasis on deliberation has enjoyed a long history among political theorists, that it continues to do so today, and that it is an integral part of this paper.

      Deliberation is, in any event, an unsurprising outgrowth of Fishkin’s third condition, participation. Participation as a condition for democracy means, very simply, having "a significant proportion of the citizenry … engaged in the process." Political participation often lends itself to, and is in turn strengthened by, the formation of civil and political associations, which serve as community spaces in which political deliberation may take place. Some political theorists and observers see these associations as critical to the health of a democratic society, both because they instill norms of "cooperation, solidarity, and public-spiritedness," and because they have ameliorative effects on political processes. Indeed, the value and importance of these benefits have led one theorist to remark that "[i]nterest in public issues and devotion to public causes are the key signs of civic virtue."

      Fourth and finally on Fishkin’s list of conditions for democracy comes non-tyranny. This condition requires that "the political process avoids, whenever possible, depriving any portion of the citizenry of rights or essential interests. … [T]he process …must … avoid the ‘tyranny of the majority.’" The Constitution embodies some measure of minority protection in it; determining just how much has been a recurrent problem in Constitutional jurisprudence and scholarship.

      In addition to these political conditions, citizens must share a basic set of social practices, expectations, and understandings relating to their decisionmaking if they are to legislate effectively for themselves. These closely track the political conditions just set forth and include, at a minimum: a desire for information, a willingness to participate in political life, a desire to deliberate about important issues, and an insistence on equality among citizens. As we implied above when discussing the condition of participation, in a well-functioning democracy social norms and associations will engender a more robust, effective government, which will, in turn, further foster civically-oriented social norms.

    2. Democracy’s Discontents

Unfortunately, contemporary American democracy appears to many to be anything but well-functioning. Views regarding the health of our democratic institutions and practices are probably as numerous and as wide-ranging as definitions of democracy. Again, however, this paper’s focus precludes entering the important discussion surrounding these views.

On one account, the account this paper addresses, citizens of democracy are disconnected from their government. In the words of Roberto Unger, citizens are beset by "exhaustion and perplexity" when faced with the task of formulating new political alternatives. Most ordinary, working citizens do not form the civic and political associations stressed by Tocqueville and Robert Putnam. As a result, they do not enjoy the political benefits of aggregation and issue articulation that result from these associations, and find themselves "part of a fragmented and marginalized majority, powerless to reshape the collective basis of the collective problems they face."

America’s political ailment stems at least in part from the breakdown of civic and political associations and the norms that underlie them. Each of the social norms listed at the end of the immediately preceding section – a desire for information, participation, deliberation, and equality – appears to be in decline.

First, citizens are not informed about issues, candidates, and current events. Though television, newspaper, and radio should be convenient sources of political information, Americans consistently demonstrate a poor understanding of what government does, how it does it, and why. As an example, a recent Chicago Tribune article reports that:

Several polls have shown Americans think the United States contributes 40 percent of all aid to poorer countries, with a total outlay of 15 percent to 20 percent of the federal budget, a bigger percentage than any other industrial country. In fact, the United States contributes only 12 percent of all aid and spends less than 1 percent of its federal budget on aid, a smaller percentage than any other wealthy nation.

 

Such rudimentary political information is a necessary condition for political deliberation and for anything beyond merely superficial political participation.

Unfortunately, citizens lack not only political information; they lack the incentive or desire to become informed. Their political ignorance is "rational," in the sense that "the time and effort required to overcome it do not represent a reasonable investment" given the incredibly small chance an individual vote has of affecting the outcome of a political election. To overcome this problem, our political system must be tweaked, perhaps overhauled, so that it creates sufficient incentives for citizens to become informed about political affairs.

Second, citizens are not participating in politics and political life. Their political "exhaustion and perplexity" translates into discouragingly low voter turnout rates. A 1998 post-election Christian Science Monitor article ironically reassures us that "[v]oter turnout Tuesday did not hit a record low, as some had predicted. Turnout was average for a midterm election, at 38 percent, slightly below 1994’s 38.8 percent." While this turnout may not be the worst in American history, nor even particularly aberrant, the fact that just slightly over a third of eligible voters exercised their right to vote is cause for justified concern.

One problematic consequence of citizens’ low levels of participation is that their legislators and administrators do not know who they are, how they live, or what they need. In the words of John Perry Barlow, describing the "atrocity" of the CDA, we are governed by "people who haven’t the slightest idea who we are or where our conversation is being conducted." Former FCC Chairman Reed Hundt’s description of the media as an "anti-individual, exclusively mass market, conglomerate-dominated centralized model of lowest common denominator (lcd) content" applies as well to our modern political system. Hundt’s goal of replacing the current, one-directional flow of media information "with unrestricted capacity to send and limitless capacity to choose" could certainly serve also as a partial solution to our problem of political participation. Creating a public space in which citizens can exchange views and become involved in political discourse is a powerful way of increasing participation. It is not a sufficient remedy, however; as with the problem of information above, citizens must be given sufficient incentive to participate, in addition to the opportunity.

Third, citizens are not deliberating on matters of public importance. If citizens are uninformed and not participating in the political and governmental process, it is scarcely imaginable that they could be deliberating. In a healthy democracy, a candidate would attempt to persuade voters to support her on the merits of her views and a candidate would win through the "force of the better argument."

In contrast, the approach to politics today begins with the premise that citizens’ views on issues are set. The goal then becomes to discover citizens’ views and shape candidates’ positions to conform to them. The political process thereby becomes hostage to the impulsive views of an unreflective majority.

Fourth and finally, there is a common perception that the political process does not afford equal access to all citizens. Instead, citizens perceive that elites (rich, professional politicians) and special interests such as the Christian Coalition, the NRA, and unions exert influence in vast disproportion to their numbers. This perceived inequality – in which the wealthy, the well-organized, and the insiders conspire to predetermine the outcomes of elections – further erodes whatever small incentive citizens feel for informing themselves, participating, and deliberating.

    D. Responses

    1. Criteria

The challenge, then, given our account of problems plaguing American democracy, is to find a way to reinvigorate civic and political norms and associations. That kind of reinvigoration likely requires a structural change in the way our government conducts its political affairs: "[i]t is not enough to call for an intensification of voluntary association without reimagining and remaking the institutional context in which voluntary association takes place."

A variety of solutions have been proposed to democracy’s perceived problems. A theme that informs this paper is that government cannot be effectively improved unless the social norms underlying political and civic activity are first cured of the problems just discussed. A proposed solution’s potential for success should accordingly be judged by how well it can foster, encourage, and reinvigorate those social norms.

The same concerns should be paramount when judging proposals for the governance of cyberspace. While it is difficult to say that social norms on the Internet are in decay, it is not difficult to predict that a structure of governance which does not foster political and civic norms will eventually suffer problems akin to those undermining American democracy.

What follows is a list of questions with which to examine a proposal for governance, whether in real space or cyberspace. It by no means exhausts the store of relevant questions, nor is the relative weight each question should be given in any context clear. Each category has a central, key question, as well as a number of subsidiary ones.

  1. Information: How successful is the architecture at informing the public?

    1. How much information is publicly accessible?
    2. Is the information conveniently accessible?
    3. Do people know the information is available?
    4. Do people regard the information as reliable?

 

  2. Participation: Does the architecture encourage and/or facilitate participation?

    1. Are people able to participate easily?
    2. Does the technology in question tend to motivate people to participate?
    3. Do people feel that their opinions matter, and that they’ll have an effect on the outcome?
    4. Do people feel they have a stake in the outcome, that is, that the outcome will materially affect them?
    5. Do people feel their voice will be heard and taken seriously?
    6. Does the technology operate to make issues more or less relevant than existing mechanisms?

 

  3. Deliberation: Does the architecture facilitate opportunities for substantive exchanges and meaningful reflection?

    1. Is there a space in which participants may discuss issues under consideration?
    2. Does the technology tend to foster individual reflection on the issues?
    3. Will participants regard deliberation as significant and worthwhile?
    4. Is the architecture consistent with the notion that individual views are not fixed and that minds can be changed through learning, discussion, and reflection?
    5. Does the architecture reinforce the importance of the decision under consideration?

 

  4. Equality: Does the architecture treat participants equally?

    1. Or, instead, does it privilege participants on the basis of:

       1. knowledge
       2. enthusiasm
       3. interest
       4. expertise
       5. or attention and participation?

    2. If an architecture does privilege some participants, is there a rational basis for the differential treatment that is justifiable on principled grounds?

 

  5. Feasibility: How feasible is implementation of the proposal?

    1. Can the architecture be implemented currently?
    2. Will the lack of universal Internet service be a problem?
    3. Is this architecture useful for informing representatives before they make decisions, or can it be used to allow people to make decisions directly?
    4. Is a representative sample:

    1. available
    2. and sufficient for the purpose?

    2. Deliberative Polling

One potentially promising response to the challenges democracy faces is deliberative polling. Developed by Professor James S. Fishkin of the University of Texas, deliberative polling "attempts to model what the public would think, had it a better opportunity to consider the questions at issue."

The structure of the poll is relatively simple. A random sample of an electorate is selected and polled on an issue. The sample is then brought together and immersed in the issue, "… with carefully balanced briefing materials, with intensive discussions in small groups, and with the chance to question competing experts and politicians." After this intense period of deliberation, the sample is polled, and the result is, hopefully, "… a representation of the considered judgment of the public."

Deliberative polling is explicitly aimed at fostering civic social norms to improve government: "[d]eliberative polling is premised on the convictions that democratic governance is improved by wide-spread and thoughtful dialogue about political issues." It fosters each of the social norms we have been discussing. It produces a well-informed, deliberating mini-populace, engaged and participating in a political process. The poll is also designed so that each participant has equal access to materials and equal opportunity to speak.

Fishkin designed deliberative polling as a "different form of opinion polling," a way of discovering an ideally participatory public’s opinion. But it does more than an ordinary opinion poll. Deliberative polling does not merely predict what public opinion would be in an ideal society; it also prescribes opinions and views in real societies. That is to say, a deliberative poll has "recommending force." It can be used before elections and referenda to inform the public at large and it can be used in other instances to inform elected officials of what truly informed public opinion would be. Participants have thus not been required to reach a consensus or a majority decision; the distribution of views in a sample has been informative, as has been the relative change between participants’ views before and after their immersion in the deliberative poll.
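The informative outputs just described, the distribution of views in a sample and its change across the poll, amount to simple before-and-after tallies. A minimal sketch (the function name and answer categories are our own illustration, not part of any published polling protocol):

```python
from collections import Counter

def opinion_shift(before, after):
    """Tally responses to one poll question before and after
    deliberation, and report the per-answer change in counts."""
    b, a = Counter(before), Counter(after)
    return {view: a[view] - b[view] for view in sorted(set(b) | set(a))}

# Hypothetical responses from a ten-person sample:
pre  = ["agree"] * 6 + ["disagree"] * 2 + ["undecided"] * 2
post = ["agree"] * 4 + ["disagree"] * 5 + ["undecided"] * 1
print(opinion_shift(pre, post))
# {'agree': -2, 'disagree': 3, 'undecided': -1}
```

A real deliberative poll reports such shifts per question; the point is that the output is a distribution and its movement, not a single winning option.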

Deliberative polling has not been used as a direct decisionmaking tool. If deliberative polling were to be used to decide issues rather than merely to inform decisionmakers (whether voters or elected officials), we might require a different sort of outcome, something closer to unanimity. A decisionmaking deliberative poll might, accordingly, closely model juries, in which the emphasis would be on building consensus among the participants.

    E. Deliberative Polling in Cyberspace

      Proposals for implementing deliberative polling in cyberspace have sprung up recently. Originally designed to take advantage of polling and television, deliberative polling seems poised to take advantage of the Internet’s powerful communications and information technologies. Indeed, cyberspace seems in some respects an ideal host for deliberative polling.

      The first advantage of cyberspace to deliberative polling is that a much greater array of information is available and more readily accessible on the Internet than in real space. Moreover, participants in cyberspace deliberative polling can break the bottleneck created by a real space deliberative poll’s presentation of a limited body of "neutral" information. Online technology allows participants to submit documents and links which can be made available to all participants in a hierarchical directory, allowing a deliberating group to actively seek information rather than have it fed to them.

      A second advantage is that cyberspace makes participation in deliberative polling easier and expands the possible pool of participants. People may participate from non-geographically contiguous locations and may do so at a lower cost, in terms of time and effort, than would be required in a real space, face-to-face deliberative poll. This relative ease does, however, create a concern that participants will feel less like, well, participants; they may, that is, feel less invested in the deliberative polling process.

      In other respects, the desirability of the consequences of conducting deliberative polling on the Internet is even less clear. The first and perhaps most radical of these consequences is that cyberspace makes it feasible to use deliberative polling as an actual decisionmaking tool rather than as a merely informative or prescriptive one. Every member of a particular voting community could conceivably be involved in a deliberative poll on an issue. Or perhaps voters could be grouped into many small deliberative polls, with votes somehow tallied by groups. Whether the use of deliberative polls actually to make decisions is a wise idea will be discussed more fully below.

      Second is the change that deliberation will undergo when translated into cyberspace. "Voting, [when] separated from a social context that makes … face-to-face deliberation possible" claims James Fishkin, "becomes less meaningful." Individuals may, as a result, not take deliberation in cyberspace as seriously as they would its real world counterpart. The relative ease with which participants may participate may exacerbate this problem, making participants feel less invested in the process.

      Third is the possibility of anonymity in cyberspace. Anonymity may allow for more frank discussion and encourage the espousal of unpopular views because anonymous participants will not suffer the usual social consequences of their frankness and minority views. At the same time, however, anonymous participants’ diminished accountability may make them less hesitant to "flame" other participants, rather than treat them with respect.

      As discussed in the first section of this paper, we may, when designing a cyberspace architecture, require as much or as little digital identification as we wish, or at least as much as our users are willing to divulge. The deliberative poll provides a clear example of how a choice we make about code will affect a political structure we superimpose over that code. The difficult questions, addressed below, focus on which kind of social and political environment we want to create, and what code we need to create it.

      So far, deliberative polling in cyberspace is a largely speculative idea. Our group conducted a small cyberspace deliberative poll experiment with the members of the Law of Cyberspace course as participants, which produced mixed feedback and results. The experiment, results, and feedback are discussed in-depth below.

      Deliberative polling has also been suggested as a form of participation for Internet users wishing to be involved with the Internet Corporation for Assigned Names and Numbers ("ICANN"). It is still too early to determine whether deliberative polling, if used at all, would be a tool to inform the ICANN board or an actual method of decisionmaking. This topic will also be more fully discussed below.

    F. Caveats

Two caveats are important to keep in mind before embarking on the discussion of democracy in cyberspace – reference points, really, to help limn the borders of a somewhat murky topic.

    1. Universal Service

      Clothes make the man. Naked people have little or no influence on society.

      -- Mark Twain

       

      We need universal service. If the Internet is ever to serve in the political and legal context as more than a challenging lens through which to reexamine age-old issues, if it is, instead, to become a vehicle for ameliorative social change and a means to a more robust government of whatever sort we choose, everyone or nearly everyone must have the chance to exploit its capabilities.

      Universal service is not merely a necessary condition for political equality in a wired democracy, though it certainly is that. It will also be essential for protection of minority rights, for full participation, and for rich deliberation. Just as clothes have been a necessary precondition for social influence, computers will be. It may not be far from the truth to say that, in 20 years or even much sooner, a person without a computer will "have little or no influence on society."

      So, we may need universal service, but we have not got it. At the recent conference entitled "Legal/Technical Architectures of Cyberspace," Michael Dertouzos of MIT reported that, of the 6 billion inhabitants of this planet, only ten percent – 600 million people – have telephone access. In turn, of those ten percent, only another ten percent – a mere 60 million people – were interconnected via computer network. Thus, a scant one percent of the world’s population could currently participate in any democratic structure which may be set up on the Internet.

      Joseph Lockard forcefully articulates the political implications of our lack of universal service:

      A few excepted classes exist, but a middle-class income is the basic password to Internet access. Nonetheless, cyberspace has arrived virtually unchallenged as a democratic myth, a fresh field for participatory citizenship. Standard celebratory phrases include "a new Jeffersonian democracy" and "an electronic Agora." Aside from the historical ignorance incorporated into these comments, they all leave unspoken the hard fact that access capital is the poll tax for would-be virtual citizens.

       

      The government is aware of the problem, although there is, of course, no solution as of yet:

      A major objective in developing the NII will be to extend the Universal Service concept to the information needs of the American people in the 21st century. As a matter of fundamental fairness, this nation cannot accept a division of our people among telecommunications or information 'haves' and 'have-nots'. The administration is committed to developing a broad, modern concept of Universal Service - one that would emphasize giving all Americans who desire it easy, affordable access to advanced communications and information services, regardless of income, disability or location.

    2. Limitations

Cyberspace may allow us to translate and reshape democratic structures; it will almost certainly spur the creation of new methods and forms of democracy. But it is, obviously, special terrain, with its own distinct features, with which people interact very differently than they do with the features of real space. Some of these ways of interacting are less desirable and more limiting than their real world counterparts.

One obvious limitation of cyberspace is the lack of face-to-face interaction available in the "real" world, a feature considered by some to be vital to a successful democratic experience. This loss may become less noticeable, less important, if voice and video transmission become the norm in cyberspace, but, even then, cyberspace may only be a powerful yet imperfect substitute for physical interpersonal relations.

Other cyber-limitations surely exist; there are many ways in which the experiences in cyberspace are different from, and sometimes inferior to, those available in real space. This point is important to remember, to avoid thinking of the Internet as an unqualified cure for our social ills. And with that warning in hand, we proceed to a discussion of democracy on the Internet.

  III. Decisionmaking on the Internet

    A. Introduction

      Since the early days of the Internet, in its inception as ARPAnet, professional and academic users have taken advantage of the global nature of this vast network to communicate with others not physically nearby. Amid the global network, some carved out smaller communities for themselves. "While the Internet community was evolving into something analogous to a ramshackle Roman Empire of the entire computer world, numerous smaller, independent colonies and confederations were also developing." The non-geographic nature of the Internet loosens the tie of town or city, but provides an opportunity for new communities to form along non-spatial lines for the discussion of mutual interests.

      This "non-geographic gerrymandering" along lines of common interest allows individuals to participate in multiple groups, and may create different sorts of diversity among the members of virtual communities. "No longer limited by geographical happenstance to the interactions that might develop in a town or neighborhood or workplace, individuals can free themselves from the accidents of physical location to create their own virtual places." The online communities have functioned as alternatives to "real life," perhaps acting as havens for network users who seek a specialized knowledge or interest group to converse with, a group that might not be so easily collected in real space.

      There are multiple ways of grouping these communities, the most common of which is by the architecture or medium of cyberspace through which the community is conducted. Thus, one often hears of communities such as Usenet, a loose system of newsgroups, or Multi-User Dungeons or Domains ("MUDs"), real-time forums which tend to consist either of an adventure game or of a single location hosting multiple chatrooms.

      Taking into account the composition of the original users of these communities -- members of the Cold War Defense Department, computer programmers, and academics -- and the organization, or lack thereof, of these communities, we can examine how smaller communities of the Internet have had the opportunity to put democracy into practice. If not wholesale models of democratic governance, these groups have often exhibited some of the elements of democracy, such as ideals of equality or non-tyranny of the majority.

      Most immediately, the amount of personal interaction within these communities already gives participants a greater sense of social obligation. According to Alexis de Tocqueville, a citizen who does not have reason to talk with other citizens will retire to the insular society of friends and family and cease to care about the larger society to which he or she belongs. By opening the society of the individual, by exposing him or her to an open community that only grows larger, the Internet forces people to acknowledge the presence of others. The citizen of this connected world must at least knowingly and consciously abandon to others around him or her the civic tasks of making the society work. The citizen better realizes that he or she, too, is a part of the society, as are the other members of the open community the citizen participates in.

      Additionally, by enabling the formation of communities of users who may not have gathered or been able to gather so easily in real space, the Internet embraces the democratic ideal of a "right to assembly." The assembly of far-flung "communities" in turn provides a groundwork for minority rights, an ideal not necessarily intrinsic to non-democratic forms of government. "The right of assembly, which has always been a legal guarantee, becomes more consequential as the constraints of localization give way to the unfettered opportunities of virtual association." The goals of democracy are furthered when the communities of cyberspace serve as convenient gathering points for citizens, assembling members unimpeded by their physical locations. Through the medium of the Internet, democracy can battle many tyrannies with the words of the people.

      Brian A. Connery compares the Internet to the 17th and 18th-century British coffeehouses. Charles II did not approve of the coffeehouses that existed across the country; they were places "where the public gathered to discuss politics or, as many feared, to hatch plots and conspiracies." Connery states that "users of the Internet are afforded the opportunity to forge new identities for themselves in a public space that regards all participants as roughly equal until proven otherwise" and that "like the denizens of coffeehouses, [they] frequently have access to information . . . not disseminated by the ‘official’ media." Those coffeehouses were "reincarnations of classical ‘marketplaces of ideas’ like the agora and the forum," and knew of the citizens frequenting them only what those citizens chose to tell about themselves. On the Internet, as in the coffeehouses, there is no authoritative leader sanctioning the conversations; members have a chance to use the forum to discuss heated topics of the moment. It is difficult for an authority to suppress people’s engagement in conversation in either forum, as demonstrated by Charles II’s inability to shut coffeehouses down completely.

      The Internet’s lack of a controlling authority makes it a unique laboratory in which to examine the development of governance mechanisms. Two specific types of electronic community, Usenet and MUDs, have exhibited particularly interesting evolutions of self-governance.

    B. Usenet

    1. History of Usenet

      Created in 1979 by two graduate students at Duke University, the (Unix) User Network, commonly referred to as "Usenet," is a hierarchy of organized newsgroups propagated through servers that communicate with one another to exchange news articles. Their original plans called for three worldwide hierarchies, or main branches of newsgroups: "net.*" for unmoderated groups, "mod.*" for moderated groups, and "fa.*" for ARPAnet. The original design purpose of Usenet was to provide the geographically dispersed Unix-using community with a means of communication.

      Usenet continued under this structure until 1986, when its growing population prompted administrators’ proposals for reform. The three-branched Usenet was difficult to administer: new groups were created haphazardly and named inconsistently, transmission costs were high, and site administrators clamored for a talk.* hierarchy "for the high flame groups," so that newsgroups would be easier to filter. The so-called "Backbone Cabal," a group of server administrators who worked together to ensure rapid, reliable news propagation, took charge of the discussion. This seizure of power disturbed some users, who thought a group of mostly homogeneous "male computer experts in their 20s and 30s" should not be deciding the naming schema for the entire, diverse Usenet. Input from the Usenet populace at large was ultimately heard. "[A]s the Great Renaming discussion progressed, a current list of proposed new newsgroups was posted to net.news several times along the way. However, protests by a few vocal people forced changes (this is Usenet after all)." In March of 1987, the renaming was complete. Eight new hierarchies were created ("comp.*," "misc.*," "news.*," "rec.*," "sci.*," "soc.*," "talk.*," "setup.*") to replace the existing three.

      Within a few months, however, three administrators who disliked even the new organization proposed and implemented another hierarchy, "alt.*," for "alternative." Soon, alt.* was providing an outlet for newsgroup subjects that had been protested among or barred from the Big 8. The alt.* hierarchy could hold alt.drugs or alt.sex, as well as alt.fan.your-favorite-unknown-singer.

      The changing nature of the technology which connected Usenet hosts and users caused the demise of the Backbone Cabal, and prompted the emergence of a user-governance structure to take over the only administrative duty, that of naming and creating new newsgroups. The democracy of this structure is evident. Any user is allowed to create a newsgroup in any of the "Big 8" hierarchies (all but "setup.*" from the 1987 renaming, plus "humanities.*"); the rules for newsgroup creation in alt.* are even more lenient; other hierarchies will generally allow users to suggest new newsgroups, at the very least. Usenet users trust each other.

      The "Guidelines for Usenet Group Creation" for newsgroups in the Big 8 hierarchies minimizes its own governance role. The document is "NOT intended as guidelines for setting Usenet policy other than group creation," it states. Big 8 newsgroup creation begins with a post to relevant newsgroups: the proponent sends a request for discussion of the new newsgroup to news.announce.newgroups and news.groups, as well as other newsgroups which may be related to the subject (all subsequent discussion is directed to news.groups). The group is given a 30-day discussion period in which to reach consensus on a group name, charter, and whether it will be unmoderated or moderated, and if moderated, by whom. If no consensus is reached, then the discussion moves off news.groups and onto e-mail. If structural issues are resolved, a 21- to 31-day call for votes is announced to news.announce.newgroups and newsgroups relevant to the subject. Voting administration is handled by the Usenet Volunteer Votetakers (UVV), a neutral, third-party group. At the end of the voting period, a tally is posted, and there is a 5-day waiting period for corrections and error-checking. To win approval, the proposed group must gain at least 2/3 of the total vote, and must win the vote by at least a 100-vote margin. If these requirements are not met, then there is a 6-month period before the proposal can be brought up for discussion again.
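      The two numerical thresholds in the procedure above can be captured in a short check (a sketch only; the function name and interface are our own, not part of the Guidelines):

```python
def big8_vote_passes(yes: int, no: int) -> bool:
    """Apply the Big 8 approval rule described above:
    at least 2/3 of the total (yes + no) vote must be in favor,
    and yes must exceed no by at least a 100-vote margin."""
    total = yes + no
    return total > 0 and yes * 3 >= total * 2 and yes - no >= 100

print(big8_vote_passes(200, 100))  # True: exactly 2/3, exactly a 100-vote margin
print(big8_vote_passes(150, 30))   # True
print(big8_vote_passes(120, 30))   # False: only a 90-vote margin
```

      Note that both conditions bind independently: a lopsided but tiny vote fails the margin test, while a large vote with a thin majority fails the 2/3 test.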

      Within the alt.* hierarchy, the procedure for creating a new newsgroup differs significantly. According to the document "So You Want to Create an Alt Newsgroup" by David Barr, the alt.* hierarchy was created for the "people who felt that there should be a provision for a place where people could create groups without having to go through any discussion or votes." Anyone can create an alt.* newsgroup simply by sending a control message -- giving even more power to the average user than is evident within the Big 8. Since each news administrator decides independently whether to carry individual newsgroups, however, mere creation of a newsgroup does not guarantee its automatic propagation to other sites. A randomly created alt.* group will likely have only limited distribution, and hence limited readership. Thus, a set of social norms has developed to suggest guidelines that would-be creators of alt.* newsgroups should follow to maximize distribution. Among these guidelines are suggestions for appropriate names, for posting information about the new group, and for which messages to post (and not to post) to alt.config.
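      Concretely, an alt.* group comes into being through a single Usenet control message, along roughly the following lines (the address, group name, and descriptive text here are hypothetical illustrations; the "Control: newgroup" header is the part news servers act upon):

```
From: proposer@example.org
Newsgroups: alt.config
Subject: cmsg newgroup alt.example.topic
Control: newgroup alt.example.topic
Approved: proposer@example.org

alt.example.topic is for discussion of an example topic.
```

      Whether any given site honors such a message remains each news administrator’s independent decision, which is why sensible naming and a prior announcement to alt.config matter so much for propagation.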

      However, as Mark Weber says in the Epilogue of the same document, alt.* "is the last remaining refuge away from the control freaks, namespace purists and net.cops that maintain and enforce the mainstream newsgroup guidelines." The anarchic tendencies this sort of system is capable of exhibiting are held in democratic check by existing social norms; users interacting with other users in a mentoring fashion help to ensure the propagation of these social norms, which function as a kind of law.

    2. Social Norms as a Means of Governance

      Since the administrative structure of Usenet has prevented any one group from dominating its governance, Usenet users themselves have had to ensure that the informal rules and regulations that govern desired habits of users are properly maintained. The subtle social pressure on new users of newsgroups has manifested itself as a system of social norms varying from individual newsgroup to newsgroup. Most groups generally include a periodically posted Frequently Asked Questions list (FAQ), which not only answers common questions about the group’s subject, to cut down on the rehashing of old arguments by new users, but may also describe social practices of the newsgroup or provide an introduction to the social community and informal "rules" of the newsgroup.

      Usenet as a whole exhibits the expectation that users will observe proper rules of "Netiquette" when posting to newsgroups. Netiquette, "a code aimed at the creation of a civil community," is a voluntary set of standards generally referring to guidelines such as proper topics to post messages about, and what to include in the message. For instance, Netiquette discourages newbies from posting a message consisting solely of the words "Me, too!" to a newsgroup; Netiquette frowns upon newsgroup posts which are 93 lines of quoted text, and 4 lines of new information or content. The rudeness conveyed by a user disregarding a group’s Netiquette reflects badly on him. He is reminded of the social norms through gentle correction by senior Usenet posters or sterner reprimand for repeat offenses.

      On many newsgroups, there is no "authority" who authorizes the posting of a message, although in moderated groups the moderator is obviously an authority, and Connery argues that lists devoted to specific authors or texts also develop an authority, as discussion is heavily influenced and guided by these texts. In the majority of cases, however, newsgroup participants themselves "constitute the authority of the group." According to Connery, by reference to FAQs and group norms, messages can be classified as either "pertinent" or "impertinent." Posters of impertinent messages (which may include askers of questions explicitly handled in the FAQ) receive responses that can discourage others from posting similarly impertinent messages, which only serve to annoy regular users of the newsgroup.

      It is instructive to examine the culture of one particular Usenet newsgroup, alt.folklore.urban. Known colloquially as "AFU," alt.folklore.urban is a community devoted to the discussion and debunking of urban legends. The community of AFU does not have a specific governmental or organizational structure. Long-time users, however, obtain respect and personal status within the community, and are definitely heeded more often than the newbies. Gaining respect within the AFU community is incentive enough for many users to pay attention to the informal rules.

      The non-tyrannical nature of the newsgroup is evidenced by the lack of a "group leader"; AFU is an unmoderated newsgroup. Admittedly, AFU is not a democracy per se – there is no government, nor, really, any issues that would require an organized structure to discuss and resolve. However, AFU espouses the democratic ideal of all users being equal. Since "AFU has to maintain a certain openness to new posters, since new stories and new versions of old stories are always needed," the AFU community does not immediately discriminate against newbies.

      Rather, Tepper claims that older members of AFU test new users by engaging in the pastime of "trolling," which The New Hacker’s Dictionary defines as "[uttering] a posting on Usenet designed to attract predictable responses or flames." The practice of trolling on AFU itself has certain "rules" attached to it. The typical AFU troll is a post containing deliberately incorrect information, but (to regulars on the newsgroup) comically so. Long-time members know not to respond with corrections, or else they respond with a post to the group saying "That’s a good troll." Newbies who have not read the FAQ, or who have not spent enough time reading or understanding the newsgroup before sending a first post, will respond inappropriately, thereby branding themselves as, bluntly, "outsiders." Trolling separates the clueful from the clueless, and thereby serves to protect the AFU "group culture."

      It is important to note, however, that the users respected on AFU are not respected because of who they are in real life or what address they are posting from (even the much-maligned aol.com user can be given a chance). Rather, users are respected based on the level of "cluefulness" – manifested by not falling for trolls, for example, or demonstrating intelligence or research in posts – they display. It is the user’s personality, mind, and potential as an information source which allow her to move up and down the AFU social scale.

    3. The End of Democracy?

      An interesting phenomenon which has occurred in the AFU community recently is the move of discussion from the open, public forum newsgroup to one of two invitation-only mailing lists, "Old Hats" and "Young Hats." (An "Old Hat" is AFU slang for a member who has been on the newsgroup for an especially long time or who is particularly respected.) While this is not immediately undemocratic – it is not difficult to imagine friends meeting through a real space community choosing to retire from the community to speak amongst themselves – it is significant that the users who are being invited to join the mailing lists are the users who are earning respect in the newsgroup, and hence have the most to invest in the newsgroup and are more likely to stay in AFU. To quote Tepper:

      As the number of people posting to AFU continues to increase and the public spaces of Usenet become more anarchic and less communal, it may well be that in the future these very successful semi-private lists will serve as the center rather than the margins of the AFU community . . . these lists . . . could serve as a physical boundary when, and if, the participants feel that cultural boundaries have completely failed.

       

      If the influx of AFU newbies causes an overwhelmingly large number of posts to be impertinent or irrelevant, respected users may have further incentive to communicate within the private mailing lists, further reducing the content of the newsgroup discussion. The eventual disappearance of those users who had the most reason to stay in AFU might destroy the community, since other users would not yet have learned the community’s social norms. The disappearance of old members would leave the group without teachers who could inculcate the norms in newbies, breaking the continuity of the group’s unique culture. Stability is lost, and cooperative equality gives way to unauthoritative anarchy.

      Curiously enough, this move of certain key newsgroup users from the "public space" to a more private forum parallels a trend that occurred in Connery’s 17th-century British coffeehouses. Warns Connery:

      Regular denizens of particular coffeehouses, presumably becoming self-satisfied and uninterested in the views and news of less-regular participants in discussion, began to withdraw into backrooms. As early as 1715, according to William Thomas Laprade, "the bluff democracy of the public rooms in earlier coffeehouses [was gone as] men of note withdrew and did not court the common crowd. Selected assemblies gathered in the private rooms." Such "select" assemblies led to the rise of the private club.

       

      The parallel of the coffeehouse backroom to the private mailing list should be obvious. Connery goes on to argue that the remaining coffeehouse communities began to develop an explicit, authoritative leader – akin to a newsgroup moderator – who dominated conversations and gave authority for people to speak. The attitude in coffeehouses, and perhaps beginning to manifest itself on newsgroups (as evidenced with AFU and the Old Hats) as posts become more and more frequent and impertinent, seems to be, "If you don’t like it, go create your own group."

      In fact, a group of users, frustrated by any number of things, has gone and created not only its own group, but its own version of Usenet. However, in a return to democracy, one can argue that the propagation of multiple hierarchies on Usenet only serves to further emphasize the democratic nature of Usenet. Traffic on Usenet is, after all, dependent on users. Because of the decentralized nature of Usenet control – in particular, because individual news-carriers determine which newsgroups they do and do not wish to carry – a set of host server-owners can collaborate to create a system they administer among themselves. This is how the original alt.* hierarchy was begun. A more recent example is the formation of Usenet II.

    4. Usenet II

      In 1997, frustrated with the current system, several users created what they viewed as an alternate news network, calling it Usenet II. Resurrecting the net.* hierarchy, Usenet II has tried to solve some of the problems with today’s Usenet and recreate the aura of the original Usenet system by limiting spam and unnecessary, off-topic posts. Whereas the alt.* hierarchy’s naming schema was designed to circumvent the mainstream voting structure by allowing any user to create an alt.* newsgroup, Usenet II’s new group creation scheme does exactly the opposite; in Usenet II, sub-hierarchies (net.aquaria.* and net.subculture.* are examples) are controlled by "czars," who determine the creation and naming of all sub-newsgroups, and moderate all groups and posts for their entire sub-hierarchy.

      Usenet II is certainly not democratic. Rather, the Usenet II attitude seems to be that for newsgroups to function well, somebody (or a group of trusted somebodies) needs to be in charge. In a post to news.admin.net-abuse.usenet, Russ Allbery, one of the main creators of Usenet II, said of the old hierarchies before Usenet II’s creation: "Somewhere along the line, those newsgroups broke." In an attempt to fix this, Usenet II gives control to the self-labeled czars; the assumption is that the tedious administration associated with the actual running of newsgroup hierarchies is something the ordinary user should not have to worry about. If a user wants to see a particular Usenet II newsgroup created, he or she can contact the czar of the sub-hierarchy in which the newsgroup would be located, and the czar can then consider whether or not to create it.

      The Usenet II system of czars, in this fashion, might actually be interpreted as functioning not unlike a "representational democracy." Unlike a direct democracy, in which every individual votes, or has the opportunity to vote, on everything, here a few representatives are selected to take care of the day-to-day administration (so as to leave individuals unshackled). The first major difference, of course, is that in Usenet II the czars were hand-picked by the original Usenet II creation team. In a way, that, too, might be seen as consistent with democratic principle, as there were not yet any users whose votes could be counted. It is doubtful, too, that many users of Usenet II have concerns with the way the czar system of newsgroup creation is set up.

      Not everybody can access Usenet II. Although anybody can read posts, only users reading news through specific sites that have been deemed "sound" are allowed to post to a Usenet II newsgroup. The soundness clause is related to Usenet II’s goal of getting rid of irrelevant posts, as well as keeping Usenet II’s community to itself; the idea behind soundness is that a site will only accept Usenet II articles from other sound sites, and never accept articles from unsound sites. If a site wishes to be part of the Usenet II community, it must be sound: "Usenet II only exists as a result of cooperation among sites, and this cooperation can not be coerced, so there is no countering mechanism by which a site can be ‘voted in’." On the other hand, if a site is suspected of being unsound, there is a complicated voting mechanism whereby the remaining sound sites may vote to kick the other site out. With the implicit web of trust built into the soundness doctrine, users are expected to be posting only relevant, on-topic messages, and sites are responsible for making sure that posts through the site are sound. The rules of usage for Usenet II clearly indicate that if you do not follow the creators’ rules, you will be severely restricted.
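
      The soundness doctrine can be modeled as a small web of trust. In the toy sketch below (site names and the voting threshold are invented for illustration), articles are accepted only if every site in their relay path is sound, and the remaining sound sites can vote an unsound site out:

```python
# Toy model of Usenet II's "soundness" doctrine: a site accepts an
# article only if its entire relay path consists of sound sites, and
# a majority of the other sound sites may vote to expel a site.

sound_sites = {"news.alpha.example", "news.beta.example", "news.gamma.example"}

def accept_article(path: list) -> bool:
    """Accept only articles whose entire relay path consists of sound sites."""
    return all(site in sound_sites for site in path)

def vote_out(site: str, votes_for_removal: int) -> None:
    """Expel a site if a majority of the other sound sites votes to remove it."""
    others = len(sound_sites) - 1
    if site in sound_sites and votes_for_removal * 2 > others:
        sound_sites.discard(site)
```

Note that, as the Usenet II rules quoted above observe, there is no corresponding mechanism to vote a site in; joining depends entirely on the voluntary cooperation of existing sound sites.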

    5. Short Analysis of the Hierarchical Newsgroup Creation Structures

In light of the varying ways of creating and naming newsgroups within the different hierarchies, a comparison among the three systems (mainstream, alt.*, and net.*) is instructive. It appears that the somewhat deliberative democratic method of newsgroup creation currently implemented by the mainstream, or Big 8, Usenet newsgroups functions most effectively.

The alt.* hierarchy’s system of naming relies too heavily on social norms for success. Because anybody can create whatever newsgroup he or she wishes, the value of creating any particular newsgroup is severely diminished, and newsgroup-carrying sites are given the burden of weeding the signal from the noise. On the other hand, Usenet II’s czar-led naming structure severely restricts users’ powers; in addition, users have raised concerns regarding "‘free speech without censure’ issues" and the overwhelming power of the czars.

The Big 8 system initiates a deliberation among affected and interested members of the larger Usenet polis to decide upon new groups. This period of discussion followed by voting is probably the most effective approach. Because the voting algorithm for determining which newsgroups "pass" and can be created is not based on any percentage of "total Usenet users" (which would, in itself, be difficult enough to determine), and because an appropriate post is sent to groups whose readers may be interested in the creation of a particular new newsgroup, users can vote on the newsgroups that interest them without being bogged down by the numerous other newsgroup votes taking place at any time. The discussion period gives other users who may be interested in the administrative aspects of newsgroup creation, but relatively ignorant about the proposed newsgroup, the chance to inform themselves better, so that they can make an informed decision. The discussion period also gives the proposers of the newsgroup the opportunity to justify its creation. Additionally, the vote lends some authority to the creation of a newsgroup, making site administrators more willing to carry it.

    C. MUDs, MOOs, and Other Rural Connotatives

      Named for the first game of its kind, the "Multi-User Dungeon," the term "MUD" is used to denote any of a group of interactive, multi-player computer environments. The first MUD, written by Roy Trubshaw and Richard Bartle, was completed in 1980. Originally mere adventure games with explicit goals and objectives, MUDs have developed to include online societies where members can talk and discuss topics of interest. The MUD is an extension of the normal Internet chatroom that may allow users to "move" around and perform virtual acts, as well as to create their own rooms and objects. A MOO, or "MUD, Object Oriented," is a particular type of MUD that tends to have more of these advanced options.

      The anonymous nature of MUDs furthers the equality ideal of democracy. Users are less likely to face discrimination because nobody really knows who else is online – just because a MUD character is described as being male does not mean that the character’s creator is male. Anonymity also gives users the opportunity to say things and do things they otherwise might not. While this may mean that users can tend to be more violent or sexually uninhibited through their characters than in face-to-face interaction, anonymity also means that, in a more serious situation, users can feel at liberty to express their thoughts. In particular, anonymity can encourage the expression of minority opinions.

      Governance structures of the communities on these social MUDs, where players are not involved in an adventure game but log in to chat with others, vary from MUD to MUD, but there tend to be a few constants. At the top of the social hierarchy of a MUD is the class known as "Wizards." A MUD’s wizards are generally equivalent to the administrators of the site; on a MOO called WolfMOO, "the wizards have special powers and controls that other players do not have, which allow them to successfully run and administer the MOO." The range of powers granted to non-wizard users varies from MUD to MUD. Wizards may grant users powers including increased disk quotas, the ability to program and create nonstandard objects on the MUD, or the ability to "boot" or kick off guest users who are being obnoxious.

      Throughout MUDs, however, some level of democracy remains. Analogous to the Usenet "democracy" in which users ultimately decide which newsgroups to frequent and hence which newsgroups "live," MUD users determine which of the plethora of available MUDs they will play in, encouraging MUD administrators to pass policies that players like lest their players leave to find a different, more satisfactory MUD. After all, "the common wisdom is that simple economics will make it unrewarding for a Wizard . . . to treat players badly, and so most successful holders of those positions will by necessity treat their players reasonably well."

      Often, to relieve their workloads, wizards will designate several experienced players as helpers. These helpers may be given tasks of answering questions and helping new players, or they may be given more substantial power. On a MOO called LambdaMOO, an Architecture Review Board ("ARB") was created to shift the burden of determining which players could be granted a requested quota increase (so that the player had more room to store files on the MOO, for programming or other purposes), from the wizards to the community itself.

      In general, decisions about MUD policy are made by the wizards. On WolfMOO, arguments between players are also resolved by the wizards; in fact, "to prevent the MOO from falling into complete anarchy, the wizards expect players with complaints or problems with other players to bring the problems to them . . . not to deal with them by themselves." WolfMOO wizards have gone so far as to punish vigilante players who try to exact revenge on their own without informing the wizards. On the other end of the decision-making spectrum, however, is LambdaMOO, whose head wizard determined that the residents of LambdaMOO would make all of the policy decisions about the MOO, and that the wizards would only serve in the technical role of carrying out the wishes of the MOO’s community.

    D. Case Study: LambdaMOO

    1. The Democratic Dream

      On December 9, 1992, the wizard Haakon posted a message to LambdaMOO announcing that

      As the last social decision we make for you, and whether or not you independent adults wish it, the wizards are pulling out of the discipline/manners/arbitration business; we’re handing the burden and freedom of that role to the society at large. We will no longer be the right people to run to with complaints about one another’s behavior, etc. The wings of this community are still wet (as anyone can tell from reading *social-issues), but I think they’re strong enough to fly with.

      Several issues related to the sudden onslaught of democracy on LambdaMOO still needed to be dealt with, however. In the same post, Haakon asked what should be done with the ARB, for instance. Additionally, although the community of LambdaMOO had been granted the power of democracy, it apparently had no immediate occasion to exercise it.

      An incident a few months later changed all that. Julian Dibbell discusses the incident in greater detail in his famous article "A Rape in Cyberspace, or How an Evil Clown, a Haitian Trickster Spirit, Two Wizards, and a Cast of Dozens Turned a Database Into a Society." In March 1993, a LambdaMOO player using the character name of Mr. Bungle programmed a "voodoo doll" which the player then used to engage several characters in obscene acts against their users’ wills. Some other residents of LambdaMOO realized what was happening and put Mr. Bungle into a box, but another well-meaning character freed him.

      The LambdaMOO community was outraged. In heated discussion, many members demanded that Mr. Bungle be punished; the question was how. A "town meeting" was called, and about thirty characters showed up to participate in the discussion – including Mr. Bungle himself, who made a few statements before withdrawing. The meeting created no general consensus, however. Despite the lack of resolution, one of the wizards enacted punishment on Mr. Bungle unilaterally, by deleting the player from the database.

      When Haakon returned from a trip to discover what had happened, he determined to ensure that, should an occasion like this arise again, the LambdaMOO community would itself be able to convey to the wizards what punishment should be imposed or rule enacted. Therefore, a system of petitions and ballots was created. LambdaMOO users could vote on issues, and their vote bound the wizards. "The awkward gap between the will of the players and the efficacy of the technicians would be closed." And for some years LambdaMOO was administered by the players and not the wizards.

    2. The Empire Strikes Back

In May of 1996, however, Haakon posted another message to LambdaMOO rescinding the 1992 promise that wizards would not administer LambdaMOO. "We Are Reintroducing Wizardly Fiat," he said. Haakon admitted that the wizards were forced to make social decisions after all, and he bluntly stated that the wizards would "no longer attempt[] to justify every action we take," and that

The wizards will no longer refrain from taking actions that may have social implications. In three and a half years, no adequate mechanism has been found that prevents disruptive players from creating an intolerably hostile working environment for the wizards. The [original proposal’s] ideal that we might somehow limit ourselves solely to technical decisions has proven to be untenable.

This turn of events provides a stark reminder that in cyberspace, authority ultimately rests with the owner of the real space machine – in this case, the programmers who alone have the ability to change the world. LambdaMOO tried a democratic system, with equal representation for every user and limited involvement by wizards, and it didn’t work. What must originally have been a system designed to lessen the work of wizards by distributing the decision-making process apparently turned into something which only served to heighten the wizards’ stress. And the wizards responded to the abuse of the system by taking the system away.

Perhaps, in light of the governance structure changes on LambdaMOO, and in light of the creation of alternative Usenet hierarchies such as Usenet II, an explicitly democratic governance of communities on the Internet will never be effective. Nonetheless, democratic ideals will always be valued, as long as market norms (the existence of multiple groups gives users the opportunity to pick which groups to visit) force administrators to enact policy which is pleasing to users. Even as Haakon says in the reversal-of-fortunes post, "Your input is essential."

  IV. Governance of the Internet

    A. Introduction

      This section moves from a focus on small online communities such as MUDs and MOOs to a larger-scale discussion of the problems and challenges which any scheme to govern the Internet as a whole (whatever that exactly means) must address. First, we begin with a brief historical overview of Internet architecture and the Domain Name System (DNS), with a focus on how technology policy has been developed in allocating IP addresses. The DNS controversy serves as a case study highlighting the similarities and differences between democracy in real space and cyberspace. Second, we will discuss the evolution and formulation of other Internet governance structures. Issues of intellectual property, such as copyright and trademark ownership, figure prominently in these discussions. Third, we will analyze the conflicts and struggles among those parties with stakes in the current, on-going debate over how the Internet will be governed. Finally, we will explore how the discussion of these topics reveals that democracy in cyberspace requires complex and delicate tradeoffs between the sometimes conflicting values of participation, representation, deliberation and feasibility.

    B. Internet architecture - a 'democratic' protocol

      The forerunner of the current Internet was first developed as a U.S.-based research vehicle by the U.S. Defense Advanced Research Projects Agency (DARPA) in conjunction with the National Science Foundation (NSF) and other U.S. research agencies. Not surprisingly, this first network was thus called the ARPANET and, later, the NSFNET. The main purpose of ARPANET/NSFNET was to connect academic institutions and military networks across the United States to enhance research productivity.

      Technologically, the design of the ARPANET was innovative and untraditional, especially when compared to the traditional X.25/X.75 protocols. The Transmission Control Protocol (TCP) runs on top of the Internet Protocol (IP), which connects heterogeneous groups of networks. The notion of running IP over everything embraces heterogeneity and accommodates multiple service types. The protocol design adopts the end-to-end argument, fate sharing, and a soft-state approach to maintain the reliability, robustness and stability of a network. Most importantly, network failures are assumed to be common. In essence, the Internet, which runs on TCP/IP, is a technologically democratic protocol suite that allows a network of any quality to participate.
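
      The layering just described can be illustrated concretely. In the following sketch, application data is wrapped in a drastically simplified TCP-style segment, which is wrapped in turn in a simplified IP-style packet that any underlying network could carry; the real header formats are defined in RFC 791 and RFC 793, and only the encapsulation principle is shown here:

```python
# Minimal illustration of TCP-over-IP encapsulation. Header fields are
# drastically simplified: real TCP and IP headers carry many more fields
# (checksums, sequence numbers, TTL, and so on).
import struct

def tcp_segment(src_port: int, dst_port: int, payload: bytes) -> bytes:
    return struct.pack("!HH", src_port, dst_port) + payload

def ip_packet(src: int, dst: int, segment: bytes) -> bytes:
    return struct.pack("!II", src, dst) + segment

packet = ip_packet(0x0A000001, 0x0A000002,          # 10.0.0.1 -> 10.0.0.2
                   tcp_segment(12345, 80, b"GET /"))

# The receiver peels the layers off in reverse order:
src, dst = struct.unpack("!II", packet[:8])
sport, dport = struct.unpack("!HH", packet[8:12])
assert packet[12:] == b"GET /"
```

Because the underlying network sees only an opaque packet, any medium that can carry bytes can carry IP – the technical basis for the "IP over everything" claim above.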

    C. The Rise of DNS

      As the number of users on the NSFNET grew, the Internet community saw a need to design a mechanism to assign IP addresses conveniently. DNS, which maps unique domain name identifiers to specific IP addresses, was proposed. DNS provides a mechanism for naming resources in such a way that names are usable across different hosts, networks, protocol families and administrative organizations. It was designed as a distributed system, compatible with existing transport protocols, that allows parallel use of different data types and address formats. A DNS transaction is independent of the communications system that carries it, and thus it is useful across a wide spectrum of host capabilities. The design philosophy of DNS thereby aligns with the properties of the Internet – it is distributed, independent, reliable and robust.

      The Internet can be seen as a democratic architecture because it accommodates heterogeneity and its control is distributed. Conversely, the Domain Name System is a 'manager' protocol, because it requires a single point to collect and disseminate all information. This section gives a brief outline of the mechanism of the protocol and explains the technical architecture resulting from its properties and functionality. DNS has a dual existence. It is a naming system that runs in parallel with the existing TCP/IP suite together with other families of network protocols. At the same time, it is a universal address system by which every node in the network shares a common source of information. Historically, this master source of information has been a database stored in the 'Root A' server that is managed and maintained by an organization known as IANA. The political importance of this organization will be discussed later in this section.

      When a user wants to retrieve a piece of information from the Internet, he uses a local agent – a resolver – to retrieve the associated information using a domain name. The resolver itself does not have the entire knowledge of the network’s topology or the information stored in the original 'Root A' database. What it does know is that the domain name database is distributed among various name servers. Different name servers store different parts of the domain space, with some parts of the database being stored in multiple redundant servers. For example, suppose a user would like to request a piece of information on the web. His machine, which acts as the resolver, starts with knowledge of at least one name server and asks for the requested information from (one of) its known server(s). The server in return will either send the information or refer the request to another name server if it does not have the requested information. By searching back towards the source, resolvers learn the identities and contents of other name servers.
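
      The referral process described above can be sketched with a toy in-memory set of name servers. Each server either knows the answer or refers the resolver to a more specific server; the server names, zone layout, and address below are fictitious:

```python
# Toy model of referral-based DNS lookup: the resolver starts at the
# root and follows referrals until a server answers authoritatively.
# All data here is invented for illustration.

SERVERS = {
    "root":       {"referral": {"edu.": "edu-server"}},
    "edu-server": {"referral": {"mit.edu.": "mit-server"}},
    "mit-server": {"answer": {"www.mit.edu.": "18.0.0.1"}},  # fictitious address
}

def resolve(name: str, server: str = "root"):
    data = SERVERS[server]
    if name in data.get("answer", {}):
        return data["answer"][name]           # authoritative answer
    # Otherwise follow a referral whose zone is a suffix of the name.
    for zone, next_server in data.get("referral", {}).items():
        if name.endswith(zone):
            return resolve(name, next_server)
    return None                               # name does not exist
```

In the real protocol the resolver, not the servers, drives this walk, and answers are cached along the way; the sketch shows only the referral chain.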

      Name servers try to manage data in a manner ensuring that their databases are up-to-date and transactions are efficient. It has not been technically feasible or efficient for each name server to keep the most up-to-date master file all the time; accordingly, name servers typically keep two types of data. The first type of data is called authoritative data, which is the complete database information for a particular domain space. One of the jobs of a name server is to check periodically whether its authoritative data is valid and complete; if the data is aged, the server will obtain an updated copy from the master or another name server. The second type is cached data, which a local resolver has acquired in the course of earlier lookups. This data may be incomplete but improves the performance of the retrieval process. This functional structure isolates failures of individual name servers, and it also isolates database update and refresh problems in name servers.
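
      The treatment of cached data can be sketched as a small table of entries with expiry times; once an entry has aged past its time-to-live it is discarded, and the name must be fetched again from an authoritative source. The interface below is our own illustration, not an actual name-server implementation:

```python
# Sketch of aging cached data: entries expire after a time-to-live (TTL),
# after which the name must be re-fetched from an authoritative source.
import time

class NameCache:
    def __init__(self):
        self._cache = {}                       # name -> (address, expires_at)

    def put(self, name, address, ttl_seconds):
        self._cache[name] = (address, time.time() + ttl_seconds)

    def get(self, name):
        entry = self._cache.get(name)
        if entry is None:
            return None                        # never seen: ask a name server
        address, expires_at = entry
        if time.time() >= expires_at:
            del self._cache[name]              # aged data: refresh needed
            return None
        return address
```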

      The domain name architecture requires a central source to maintain the master database, and this role is essential to maintain completeness of the naming space in the Internet. Name servers can keep a portion of the database of which they have knowledge, but they can always refer back to the master database when they see discrepancies. This architecture thus requires a single point of operation, and it moves the Internet from a totally distributed, flat hierarchy model to a centralized model. Since IP is run independently of DNS, however, the flat hierarchy of the Internet still remains and is merely constrained by the centralized requirements of DNS.

    D. Interconnectedness of networks – The Market and the Internet

      Positive network externality is the force driving the growth of the Internet. No one would care about the Internet if only one person were on the network. Each person's utility of being in the network increases tremendously when more and more people participate in using the network. A manifestation of this phenomenon is that the most popular applications on the Internet involve interactions between people. Email, chat rooms, newsgroups and html web pages gain positive network externalities when they are more readily available and accessible to the public.
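
      This externality is often summarized by observing that a network of n users contains n(n-1)/2 possible pairwise connections, so the potential value of joining grows roughly quadratically with the number of participants. A back-of-the-envelope illustration:

```python
# Back-of-the-envelope illustration of positive network externality:
# the number of possible pairwise connections grows as n(n-1)/2.

def possible_connections(users: int) -> int:
    return users * (users - 1) // 2

# One user can reach no one; growth is quadratic thereafter.
for n in (1, 2, 10, 100):
    print(n, possible_connections(n))   # 1->0, 2->1, 10->45, 100->4950
```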

      In the early days of the ARPANET, no one could imagine a use for the network other than remote login. When email was introduced, it quickly became the killer application, that is, a wildly popular application, among almost all groups of Internet users. As more people use the Internet, more users are willing to develop new applications for it. New applications attract new users, and the number of users on the net expands rapidly.

      This positively reinforcing force, a symbiosis really, is only possible if the barriers to entry to the Internet market are low. The TCP/IP protocol is an open standard, which is to say that it is given to the public without cost. The equipment required to get online (at present, a personal computer and a modem) is affordable for some, but by no means most, Americans. The most important element for participating in the net is access to an Internet point of presence.

      The Internet is a decentralized system, so centralized deployment strategies do not apply. Other communications systems, such as telephone networks, were deployed under strong government influence. The expansion of the Internet, in contrast, depends almost exclusively on market forces. Commercial networks interconnect with each other and thereby gain positive network externalities, because they benefit from the extra traffic that their customers generate across the networks. Thus it has historically been common practice for carriers to carry each other's traffic without paying settlements to each other.

      Sometimes networks interconnect by joining an Internet exchange point such as the Commercial Internet Exchange (CIX). Members of the CIX pay fixed membership fees to connect, and they agree to carry other members' traffic voluntarily. Another Internet exchange point is the Metropolitan Area Ethernet-East (MAE-East), started by AlterNet, PSI and SprintLink, in which ISPs work out a set of bilateral agreements among themselves. It was unofficially reported that there were no monetary settlements between ISPs at MAE-East: each ISP carries the other ISPs' traffic according to the agreements between them, without additional monetary compensation. Thus far the same settlement practice is still used between networks. An Internet connection can be obtained by paying a fixed sum to a backbone provider, or in some cases, networks are connected without any exchange of money.

      The Internet market is structured such that each additional participant increases the utility of the network. Costs are settled on a lump-sum basis, and the marginal cost of sending an extra packet is zero. This cost structure creates strong incentives to encourage use of and participation on the Internet. As long as regional ISPs are able to recover their costs from their customers, they are willing to buy Internet connectivity from backbone providers and enter the business. Since ISPs settle their interconnection costs up front, the risks of entering the business are low. Qualities like low barriers to entry, low operational risk, and the continual expansion of the market have attracted and will continue to attract many service providers.
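The lump-sum settlement argument reduces to simple arithmetic: with a flat interconnection fee (the figure below is hypothetical) and zero marginal cost per packet, the average cost of a packet falls toward zero as traffic grows, so an ISP has no reason to discourage use.

```python
def average_cost_per_packet(lump_sum_fee, packets):
    # The flat fee is spread over all packets sent; no per-packet
    # charge is ever added, so average cost shrinks with volume.
    return lump_sum_fee / packets

FEE = 1000.0  # hypothetical flat monthly interconnection fee, in dollars
for packets in (1_000, 1_000_000, 1_000_000_000):
    print(packets, average_cost_per_packet(FEE, packets))
```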

    E. Universal service of the Internet - 'The last mile'

The Internet’s market structure has created sufficient incentives for ISPs to invest in and expand the Internet. Competition will tend to drive the price of Internet service down toward its marginal cost, so Internet access will continue to become more affordable. There will nevertheless continue to be a significant portion of the population unable to afford the service, and the government should therefore consider intervening to promote universal service.

§254 of the 1996 Telecommunications Act introduced the goal of "universal service" into United States law for the first time. The old concept of universal access, which applied to telephone service and implied only that each citizen have access to a phone, has become obsolete. New technologies, such as wireless phone services, cable television and the Internet, have pushed regulators to adopt the concept of universal service. In the information age, universal service is not merely a matter of connectivity, that is, access; it also embodies concerns about the content of available information, knowledge of computers, and the costs of connection. The FCC has characterized the term "universal service" with the following six principles:

    1. Quality and Rates: quality services should be available at affordable rates.
    2. Access to Advanced Services: access to advanced telecommunications and information services like the Internet should be available to all citizens of the nation.
    3. Access to rural and high cost areas: All consumers in the nation, including low income consumers and rural consumers, should have access to telecommunications services at reasonable rates.
    4. Equitable and non-discriminatory contributions: all providers of telecommunications services should contribute to the preservation and advancement of universal service in an equitable and non-discriminatory fashion.
    5. Specific and predictable support mechanisms: there should be sufficient Federal and State mechanisms to preserve and advance universal service.
    6. Access to advanced telecommunication services for schools, health care facilities and libraries.

The FCC and the US government have an apparently strong commitment to providing advanced services, such as the Internet, to all citizens of the nation. While the government’s specific policies and programs are outside the scope of this paper, it is important to remember universal service as one area in which the law may have a tremendous impact on how and by whom the Internet will be used.

    F. Domain Name Policy - Internet Governance

    1. Current Governing Institutions and Social Norms

      One may question why the Internet needs governance at all. It seems that the very idea of governance is opposed to the independent philosophy of the Internet. Certain functions of the Internet must, however, be coordinated. Most notable among these are the management of the root database and the assignment of IP addresses. The Internet architecture was created wholly by engineers, and its design process was free from explicit political considerations. It is therefore unsurprising that the Internet was designed without levels of hierarchy and points of monitoring and control. The design philosophy of the Internet has always aimed at accommodating heterogeneity, which has led to its distinct, flat architecture, completely unlike an earlier network – the public switched telephone network – in which calls must go through central switches. Technologically, the Internet can "run itself" and there is no real need to have a governance structure at all. There are several organizations that take care of the development and operation aspects of the Internet, but they do not have absolute authority to impose rules on netizens.

      Participation on the Internet is a freely made, individual decision. There are no strict social norms pushing anybody to use the Internet, though in certain contexts, such as among MIT students or the members of 6.805, Internet access has real importance. The Internet is distinctly unlike the telephone system, in which it is assumed that everyone should have access to a telephone. In developed countries, a telephone number is also a piece of identification essential for applying for credit cards, jobs or even a supermarket discount card. In America, owning an email account is no longer a privilege enjoyed only by academics or advanced-technology organizations, but it is still not a method of communication adopted universally by the general public. Public accessibility of the Internet is worse in developing countries, where access is confined to groups of elites and government officials. During the deployment of telephone systems, countries like the US had tremendous government involvement in both policy design and technical implementation. The old AT&T operated as a regulated monopoly whose policies hinged tightly on government decisions. The US government fostered deployment of the telephone network by subsidizing individuals who could not afford a telephone, ensuring that all households would have access. The idea was to provide universal access to the general public.

      The Internet differs from the PSTN telephony network both in terms of policy and technology. Historically, in the development of the Internet, the US government has not played much of a decisionmaking role. There have been several organizations that separately take care of developing the Internet.

      The Internet Engineering Task Force (IETF) designs and develops protocols for the Internet, and it has been the technical arm of the Internet. The IETF works tightly with the Internet Engineering Steering Group (IESG), which is responsible for technical management of IETF activities and the Internet standards process. There is also the Internet Architecture Board, which is responsible for defining the overall architecture of the Internet and providing guidance and broad direction to the IETF. As mentioned before, participation on the Internet is entirely voluntary; these organizations merely set standards for the community and ensure the interoperability of the Internet.

      The Domain Name System is managed by the Internet Assigned Numbers Authority (IANA). Until his death in October 1998, Dr. Jon Postel of the Information Sciences Institute (ISI) at the University of Southern California ran IANA. IANA manages the root of the DNS to promote stability and robustness. This role primarily consists of making decisions about the location of root name servers, as well as considering the qualifications of applicants seeking to manage country code top level domains.

      These core organizations maintain and ensure the stability of the Internet. They are rarely involved in political decisions concerning the Internet. While they have thus far been the governance group of the Internet, they have little power outside of their tasks. This has been a form of minimal governance.

      This mode of minimal governance has existed since the beginning of the Internet. The relative absence of governance has fostered a cultural norm of freedom on the Internet. Moreover, most net users have not even noticed the existence of what minimal governance structures do exist. The only organization noticeable to users has been the company that contracts out domain names to the general public.

      One may wonder why the Internet, such an enormous set of interconnected networks, has so little central power and so little bureaucracy and governance. To answer this question, we have to trace back to the creators of the Internet: Internet engineers. Bureaucracy, power struggles and control over people are rarely priorities in engineers’ decisions. They were tackling a technical problem, which in their minds was a question of how to increase participation in a fair and equal way. The idea was to maximize each individual user’s utility by letting more users join the network, thereby taking advantage of network externalities. The Internet was not originally designed as a profit-making vehicle or with any explicit political purposes. There were no coded constraints on participation, and the costs of participation were low.

    2. Open standards

In the early days of the Internet, the TCP/IP protocol, which is required to connect to the Internet, was given out free. The only prerequisites were the ability to install the protocol on one's computer and network access; hence, the cost to participate was low. The French Minitel system provides a starkly contrasting model of participation.

The Minitel network was developed by the French government to provide digital directory services. In many respects it is very similar to today's Internet, though it was first deployed in 1982.

The design process of Minitel was different from that of the Internet. The deployment of Minitel was centralized and controlled completely by the French government. Its network standard is highly proprietary, and it required a special terminal to run the service. In the early days of Minitel, terminals were given out for free by the government, but only to elite, targeted groups. Minitel had been in use for almost a decade before web technology was invented. Both the web and Minitel offer similar kinds of services and interfaces, but Minitel never grew into a common standard, especially outside of France. The French government protected the technology and did not share it freely with other countries. As mentioned before, the prerequisites for joining the Internet were a copy of TCP/IP and a networked computer. These low barriers to entry allowed the Internet to spread rapidly through academia and out into the general public in the past decade.

The US Government decided to commercialize the Internet in 1992, without altering the concept of open participation. The TCP/IP protocol can still be obtained at no cost. Nowadays, the only requirement for joining the Internet is a personal computer with network access from an Internet service provider. The installation process has also been simplified greatly since it was first developed. In most developed countries, the cost of buying an Internet connection is relatively low; therefore, the Internet is widely used as a medium for communications.

We have come to take this freedom of joining the Internet for granted, and most people do not realize that it would not be the same if the Internet were controlled by a government. This is a cultural and social norm that has been shaped by technology and the people who created the technology.

Most Internet users who use the web every day hardly notice the governance structure of the Internet. In this web culture, there are no governments and no restrictions, and freedom of expression is limited only by the Internet’s social norms. So, in addition to the ease of participation, this absence of restriction attracts many people to join the Internet. The Internet is a space where one can gain a freedom not available in real space. It is a space in which anyone can meet, exchange ideas and form a community.

    G. A case study: Domain Name Reform

Keep in mind these social norms and expectations that the Internet community has established over the years. Against that backdrop, the domain name reform process is an example that demonstrates how individuals have struggled to form a governance structure which represents all interests of the Internet community.

    1. Background

      The Internet was funded from its inception by DARPA and the NSF, and it was thus developed with United States money. The US government unsurprisingly claims the Internet as an outgrowth of US government technology.

      Historically, the root server and the Domain Name System have been maintained by IANA. After the United States commercialized the Internet in 1992, the government was no longer in a position to participate in Internet policy directly. In January of 1998, the NTIA proposed a Green Paper to address the future of Internet governance. The motivation for restructuring the governance of the Internet is to promote a self-governance structure for the Internet community and to remove the United States government from Internet policy.

      The domain name system has proven indispensable to the Internet. No alternative system providing the same function across the entire world has yet been proposed. The domain name system provides stability and flexibility for Internet users. The DNS maps hierarchical names to IP addresses, providing an extra layer of indirection that enhances stability. It also makes names portable across changes in the underlying addresses.

      The Internet community could have adopted other ways to address and access the Internet. For example, it could have adopted an address search engine like the yellow pages, in which customers look up categories of services to find what they need. The community could also avoid domain names entirely by using raw IP addresses.

      These options, however, are inconvenient. The number of hosts on the Internet is nearly 40 million and increasing. Given that scale, DNS provides the most user-friendly interface for navigating the Internet.
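The portability DNS provides can be made concrete with a toy lookup table (all names and addresses below are hypothetical): clients hold only the name, so an operator can move a host to a new address by updating a single record, and every client keeps working.

```python
# Toy name-to-address table standing in for the DNS database.
dns_table = {"server.example.edu": "18.0.0.42"}

def lookup(name):
    # Clients always resolve through the table rather than storing addresses.
    return dns_table[name]

client_bookmark = "server.example.edu"
print(lookup(client_bookmark))             # address before the move

# The host moves to a new network; only the DNS record changes.
dns_table["server.example.edu"] = "128.30.0.7"
print(lookup(client_bookmark))             # same name now yields the new address
```

Had the client stored the raw IP address instead, the move would have silently broken it.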

      There are several factors that triggered the call for DNS reform. First is the scarcity of names under the generic top level domains (gTLDs), which has led to domain name conflicts and behavior such as domain name ‘hijacking’ and ‘reverse hijacking.’ Second, there have been misuses and abuses of top level domains, due to the lack of an organization for allocating domain names to the public. Third and finally, the contract between Network Solutions Inc. and the United States government had been set to expire in September 1998. These factors pushed the United States government to structure a new organization to oversee Internet policy.

    2. Evolution in governance structure

      The Internet community came to realize that a governance structure had to be formed, and there have been a number of attempts to reach consensus on this issue among the various parties. This section describes the two most fully elaborated governance structures on which the Internet community has reached agreement. The two differ greatly in the mechanics by which they were formed, but they share similar principles and goals. In essence, each aims to maintain the stability and security of the Internet, promote competition in the domain name registrar business, create a mechanism for resolving domain name conflicts, and form an international representative structure. One of the structures is a proposal by the POC/CORE coalition, which was formed under the gTLD-MoU. The second is a proposal of the United States government, created as it sought to step out of Internet governance. The latter structure has evolved into the ICANN board, the present formal governance board of the Internet.

    3. POC/CORE governance structure

      The Internet community has realized that a more consolidated governance structure is required to oversee Internet policy. Foreseeing the withdrawal of the United States government from Internet policy, and before the government proposed its non-profit governance board, several groups had already reached consensus on Internet governance issues. They realized that there had been a lack of central planning in the development of the Internet. Through a consensus-based process, the Internet community formed an oversight committee to manage the Internet. The Generic Top Level Domain Memorandum of Understanding (gTLD-MoU) is the international governance framework drafted by the International Ad Hoc Committee (IAHC), under which policies for the administration and enhancement of the Internet's global DNS are developed and deployed. This includes the addition of new generic top level domains (gTLDs) to the root of the DNS, the selection of new domain name registrars, and the development of equitable mechanisms for resolving conflicts between parties over rights to domain names. In anticipation of the US government's departure from domain name policy, the Council of Registrars (CORE) was formed to register domain names. The Policy Oversight Committee (POC) is an eleven-member committee formed by representatives from CORE, IAB, IANA, WIPO, INTA, ISOC and ITU. The POC defines policy and oversees its implementation. These policies are developed in cooperation with IANA.

    4. Voluntary multilateralism

      The idea behind voluntary multilateralism is to bring all interested parties together. They agree voluntarily that a problem has to be solved, and they let the market decide whether their solution is suitable. The International Ad Hoc Committee took this idea and formulated the memorandum of understanding, in which all interested parties agreed that the DNS had to be reconstructed. The gTLD-MoU is a very broad framework on which most interested parties can agree. The beauty of this approach is that a rough consensus can be reached easily.

      The gTLD-MoU framework began with the formation of the International Ad Hoc Committee. The IAHC is a coalition of participants from the Internet community working to improve the DNS. The ITU held a three-day meeting in Geneva, the outcome of which was eighty organizational signatories to the gTLD-MoU agreeing to restructure the Internet. The signatories represent a wide range of private-sector interests across the developed world and some developing countries. Neither the US government nor the European Union, however, has recognized the gTLD-MoU. The gTLD-MoU is drafted around a set of principles which the Internet community in general recognizes. In the gTLD-MoU, the signatories agreed to introduce competition into the domain name registry business, create CORE to implement technical details, and establish the POC to oversee CORE and other Internet policies.

    5. POC/CORE Representation

      The POC/CORE structure is by nature inclined toward the private sector. This reflects the parties represented in the process of developing POC/CORE. Most of the gTLD-MoU signatories were major Internet service providers and backbone providers. The structure is also endorsed by the ITU, the international authority in telecommunications, and by other important Internet organizations such as IANA and ISOC. The aim of the gTLD-MoU is to maximize consumers' interests; hence, it establishes CORE and the POC on a non-profit principle.

      The CORE, the operational arm of the gTLD-MoU, is a non-profit organization registered in Geneva under the laws of Switzerland as a Swiss Association governed by Articles 60 - 79 of the Swiss Civil Code. The POC, Policy Oversight Committee, is yet another non-profit, non-governmental organization that is registered in Delaware. It includes international participation from ISPs, telcos, the IP community and interested citizens, and separates policy from its operations. It has a flexible structure and it is designed to evolve.

      The structure is designed to represent consumers' interests; therefore, it operates on a cost-recovery basis. As drafted in the CORE-MoU, CORE operates in a way that discourages cyber-squatting, forbids registrars from trading in gTLDs, runs a non-discriminatory service and supports the dispute-resolution scheme proposed by WIPO. CORE also tries to eliminate speculation on domain names and arbitrage opportunities. It will also implement a technical structure to maintain the stability of the DNS, separate the registration function from the backend registry, and create competition among registrars. In early 1998, there were 88 registrars distributed globally: about one-third in Europe, one-third in America and one-third in the rest of the world. The Policy Advisory Body (PAB) is open for the Internet community to participate in, and it elects members of the POC.

      In analyzing the representation proportions of the POC, it is important to realize that the POC is a board of functional groups in the Internet community. CORE, IANA, and IAB are responsible for the technical functions of the Internet such as domain name registration, IP number allocation, and protocol design and implementation. WIPO and INTA are responsible for resolving civil issues on the Internet such as domain name and trademark conflicts. ITU and ISOC are representatives from the telecommunication community and the Internet community.

      On its face, the POC has no strong direct representation of end users, which is, oddly, the group whose interests it claims to represent. One can argue that once competition is introduced into a market, the equilibrium price will favor consumers. There is, however, no direct channel for end users to express their views to the POC. The Internet Society can be regarded as a representative of Internet users, but it by no means represents the overall views of all Internet users. This structure, however, has an important feature: there is no governmental participation in the structure or in the process that formulated it. The philosophy behind the POC/CORE coalition thus aligns with the social norms that the Internet community has developed over the years. It is open to participation and ideas, it is non-discriminatory toward the community, and it makes decisions based on rough consensus.

    6. The US NTIA proposal - Green Paper

The NTIA proposal is the document the United States government drafted as a request for comments from the general public on ways to reconstruct the domain name system. Instead of letting the Internet community form a self-governance organization on its own, the United States government decided to set up a governance structure and let that structure govern the Internet. The United States government also recognized the problems in the existing domain name system and its method of allocating names. The theme of its proposal is the introduction of competition into the registration process, along with a way to resolve domain name disputes and a gradual withdrawal from Internet policy making.

In both the Green and the White Papers the United States government proposed the creation of a non-profit organization, which would take over the functions of IANA in allocating domain names. In terms of governance, the main theme of the proposal is to maintain the stability and operation of the Internet, introduce competition into the market, and coordinate the Internet in a bottom-up fashion. The papers propose a private, not-for-profit corporation that will replace the existing IANA in coordinating the Internet. As quoted directly from the Green Paper: "The new corporation would have the following authority:

    1. To set policy for and direct the allocation of number blocks to regional number registries for the assignment of Internet addresses;
    2. To oversee the operation of an authoritative root server system;
    3. To oversee policy for determining, based on objective criteria clearly established in the new organization's charter, the circumstances under which new top-level domains are added to the root system; and
    4. To coordinate the development of other technical protocol parameters as needed to maintain universal connectivity on the Internet."

 

The first two functions listed above were performed by IANA. The third hinges on policy decisions which the POC is seeking to take over. The last is jointly coordinated by IANA, the Internet Engineering Task Force (IETF) and its steering group (the IESG). It seems that the new corporation will enjoy powers far beyond the mere operation of the DNS.

The papers also propose that "the new corporation will be headquartered in the U.S., and incorporated under U.S. law as a not-for profit corporation. It will, however, have and report to a board of directors from around the world." The fifteen-member board of directors includes representation from the private and public sectors as well as the international community.

    7. The Structure of the NTIA Proposal

The United States shaped the governance structure proposed in the NTIA proposal single-handedly. The United States will step out of the arena of Internet policy, but it will remain in the background to oversee the new non-profit organization. The board of the non-profit organization will be appointed. The Green Paper recommends that the new corporation hire a chief executive officer with a private sector background, in order to bring a more rigorous management style to the organization than was possible or necessary when the Internet was primarily a research medium. The government also recognized the diversity of backgrounds among members of the Internet community, but decided that the new corporation should represent the interests of key stakeholders in addition to the functional blocks of the corporation. As quoted from the Green Paper the composition of the board should follow the following principles:

The board of directors for the new corporation should be balanced to equitably represent the interests of IP number registries, domain name registries, domain name registrars, the technical community, and Internet users (commercial, not-for-profit, and individuals). Officials of governments or intergovernmental organizations should not serve on the board of the new corporation.

Seats on the initial board may be allocated as follows:

    8. Representation under the NTIA Proposal

      The board members are designated to represent the interests of Internet users and of the Internet technical community. ARIN, RIPE and APNIC are the regional registries for North America, Europe and the Asia-Pacific respectively; they allocate blocks of IP addresses within their regions. The IAB oversees Internet technical development, representing the interests of the Internet's technical community. Two members from the domain name registries and registrars represent the business of allocating domain names. These seven members represent the functional structure of the Internet. Two seats are designated for end users, both individual and non-commercial users of the Internet. The five remaining seats will be filled by private-sector representatives such as network operators, ISPs and telcos; trademark holders could also be included within these five seats. Finally, there is the CEO of the new corporation, who has private-sector experience but does not directly represent any constituency.

      This governance structure deliberately includes seats for not-for-profit Internet end users, in an attempt to collect opinions from end users; it is questionable, however, how representative these members can be. The structure gives five seats to commercial users, who in aggregate have voting power strong enough to veto decisions. The United States government allocated no board seats to governmental or intergovernmental organizations, a decision that effectively excluded the ITU and WIPO from participating.

    9. Reactions to the Green Paper

      The Green Paper sparked strong reactions, discussions and debates among the Internet community and governments. The debates among the United States, EU and Australian governments and the POC/CORE coalition strongly influenced the shape of the White Paper. The United States, the EU and Australia have significant economic and political stakes in the Internet, and their opinions are crucial in shaping it.

      Governance of the Internet is not about the power of users but about the power of the organizations that represent different users. The European states want to protect their interests, such as electronic commerce over the Internet. The European Community and its member states see the Green Paper as US-centric. The EU considers the Internet a global communication medium, and its member countries argue that they have a responsibility to ensure that communications networks are interoperable and are developed in a way that promotes economic and social cohesion and economic competitiveness. They refer to point 4.v of the EU-US joint statement on Electronic Commerce, which recorded agreement on the need for "The creation of a global market-based system of registration, allocation and governance of Internet domain names which fully reflects the geographically and functionally diverse nature of the Internet." The EU also requested the opportunity to enter into full consultations with the United States before certain features of these proposals were implemented.

      The EU has stressed the importance of an international framework for the long-term organization of the Internet, which underlines the need to associate a wide range of international interests with future policy in this area. It disagrees with the United States’ suggestions on jurisdiction and trademark resolution, and supports the WIPO procedure for resolving domain name conflicts without resorting to a court system. Having commercial interests similar to those of the United States, the Europeans would like to include representatives from their own private sectors in Internet governance. Finally, the EU has emphasized the existing POC/CORE structure, arguing that it aligns with the international framework the Green Paper suggested.

      The European governments did not participate actively in formulating the POC/CORE structure. The EU, however, realizes the importance of electronic commerce, and its members would like to protect their legal sovereignty. The European Union member countries would lose a portion of their national sovereignty if Internet disputes were adjudicated under United States control.

      The United States and the EU have to resolve this sovereignty issue. The EU considers the WIPO dispute resolution mechanism politically neutral, and its members strongly oppose resolving Internet conflicts in a United States court of law.

      The Australian government also recognizes the jurisdictional implications of the Green Paper. Like the EU, it feels that national sovereignty would be jeopardized if multinational conflicts were resolved in a US court. The Australians hold the same commercial interests as the Americans and Europeans, and they want those interests included in the decision process. In response to the Green Paper, CORE felt that the proposal had ignored the gTLD-MoU, the existing governance framework, and that the benefits reaped through its multilateral agreements had been heedlessly destroyed by the United States. CORE proposed a ten-point plan aimed at Internet self-governance, suggesting the use of a Shared Registry System (SRS) that can handle multiple gTLDs and ensures that the DNS is operated in the public trust.

    10. A step forward - the White Paper

      The proposals for Internet governance in the White Paper synthesize comments on and reactions to the Green Paper. In the White Paper, the US government retains the board of directors for the new corporation and makes more concrete suggestions about how the board should function.

      The White Paper reaffirms that the board of directors should represent both the functional and geographical diversity of the Internet. Accordingly, the Board should be chosen through an open election process, so that it represents the private sector and Internet users across the world. Governments may not participate as institutions, although individuals within them may participate as users.

      In terms of governance, the decisionmaking process is designed to be transparent to the public, to prevent the process from being captured by factions. Competition is very important to ensure that the market is efficient, and the White Paper stresses again that the new registry service should be fair and pro-competitive.

      Throughout the request-for-comments period, the US government opened the entire discussion to the public, including foreign institutions and governments. The White Paper emphasizes the importance of global representation of the Internet. As the United States steps away from dominating Internet governance, it simultaneously disapproves of any other governmental or intergovernmental body stepping into the vacuum it leaves behind. The United States still adheres to the principles set out in the Green Paper, such as fairness and transparency, aimed at protecting individual users. The board of directors is likewise structured under a set of democratic principles including openness, fairness, and representativeness.

    11. The ICANN board

      After the White Paper was published, the DNS debate was carried on behind closed doors. The first ICANN board members were chosen under heavy pressure from the US government, and the board made its first public appearance at a meeting in November 1998.

      At present, there are nine initial directors on the board. According to the White Paper, nine more members will be elected from the supporting organizations. ICANN is registered under United States law in California and operates under its own set of rules, its Bylaws. The Bylaws authorize a board of between nine and nineteen directors, with a chair elected from among them. Each board after the Initial Board shall have three directors nominated from each of the Address, Domain Name, and Protocol Supporting Organizations.

    12. International Representation

      As indicated in section six of the Bylaws, no more than one-half of the directors may come from any one geographic region, and no more than two of the directors elected by each supporting organization may be residents of any one geographic region. Of the nine directors on the ICANN board, four come from America, three from Europe, one from Japan, and one from Australia. So far, the representation structure shows little balance among geographic regions. The Far East and Pacific region is represented only by a lone member from Japan, while Africa, the Middle East, and Southern Asia are not represented at all. This distribution, however, maps roughly to the geographic density of Internet users. At present, countries like China and India have a very small stake in the Internet, and the representational structure could be expected to change to include their interests.

      The purpose in forming the ICANN board is to introduce a bottom-up approach to Internet domain governance; hence, the president of the ICANN board should have experience in the private sector to promote competition. Many of the nine ICANN directors are involved with the private sector, and some have strong ties to academia. One could argue that the ICANN board represents the Internet community at large; equally, it might represent no one in that community.

    13. Representation structure of the two proposals

Before the United States stepped into the reform of Internet policy, POC/CORE had reached a wide consensus among various parties on the Internet about how to restructure the Domain Name System. In a democratic system, a structure should facilitate participation in such a manner that every participant has an equal opportunity to be heard. The architecture of a system is more democratic if more people are allowed to participate in its formulation; however, this does not imply that a democratic structure must be constructed by a democratic process. The POC governance structure and the ICANN board are two very different representation structures. The POC is regarded as the more democratic outcome because it was structured through an agreement process; conversely, the ICANN board is considered by the Internet community a manipulation of power, since it was created under the influence of a government.

Governance of the Internet can be seen in two ways. First, we can focus on the representations within the governance structure and how individuals’ views can be heard. Second, we can focus on the formation process of the representation structure, and how the structure is designed so that individuals are not left out of consideration. In forming any governance structure for the Internet, policymakers soon encounter the problem of including stakeholders in the structure. The Internet crosses different sovereignties and jurisdictions; it includes billions of stakeholders and users, and there is no identifier by which to locate or distinguish them. It is therefore not feasible for everyone to participate in Internet governance, and it is difficult to formulate a representation structure. A user could belong to multiple community groups with similar or conflicting interests. The governance structure could let users self-select their identity so that only stakeholders would participate and deliberate about Internet policies; such a structure enhances deliberation but constrains participation.

Neither the ICANN board nor the POC achieves a governance structure for the Internet that encompasses all the democratic qualities. Both structures include representatives said to speak for the Internet community at large, while leaving out values like participation and direct representation; both have chosen to include a small number of members so that deliberation can occur. In real space it would be tremendously difficult to build a democratic structure that combines large numbers with deliberation, participation, and feasibility. The Internet, however, enables possibilities that are not feasible in real space. The next chapter explores such technical possibilities and their implementations.

  V. Government by the Internet

    A. Introduction

      After surveying Internet communities as current examples of online democracies and discussing how the Internet community can act as a connected global democracy, we ask whether the Internet’s advancing technology can facilitate offline democracy. We examine voting structures as the key participatory moment and decision mechanism of modern democracy. Finally, we examine how the ICANN membership can use the Internet to further its governance ends.

      Online democratic participation and voting raise many new issues. Their very possibility challenges the foundation of representative democracy: if the Internet enables all of a country’s citizens to participate directly in legislation, we must justify representative against direct democracy. We can no longer rest on the assertion that it is difficult to convene the entire citizenry to discuss and vote on specific issues; the Internet removes the barrier of geographic distance.

    B. Current Online Voting Architectures

      Some websites have already prototyped or implemented online voting schemes. One, a Java applet "deliberation space," is designed to facilitate "asynchronous group decision-making on the web." The site displays proposals and shows the current vote on a particular proposal, permitting voters to visit at different times to post discussions or amendments, signal their readiness to vote, or vote. Different users may be given authority to propose, comment, or vote in the deliberation space.

      This site utilizes one of the advantages that the Internet has over real space voting: not only may users vote at different times, but they need not live in the same neighborhood, city, or state. In addition, the Internet provides easy access to a wealth of information that can be easily stored and searched. Although the "consideration" section permits deliberation of proposals, the interface is confusing and a bit overwhelming. In addition, this particular implementation does not provide any synchronous discussion – if several people were logged on at the same time, they would have to wait for one another to post and then respond.

    C. The Deliberative Poll Goes Online

This beginning of a deliberation space raises interesting questions. Its design might be expanded into a deliberative polling implementation. The deliberative poll, as developed by James Fishkin, "builds on important work in encouraging citizen deliberation." Deliberative polling aims to build a more informed electorate; to

take[] us a step closer to a more deliberative and engaged society by providing thousands of citizens with the opportunity and the occasion to think through current issues, to confront trade-offs, and to grapple with the hard choices facing our society. In short, these forums help move a subsection of the country in the direction of public judgment rather than public opinion.

 

In Fishkin’s realspace deliberative poll, voters are required to read articles on a particular issue and then discuss the issue with one another before they are permitted to vote. The voters are thus encouraged to become more informed about the topic and required to put more thought into their decision.

The deliberative poll concept translates well to cyberspace. Articles that are read in real space can be posted online, and the post-reading discussion can be implemented as a live online chat among voters. A significant barrier to implementing deliberative polling online is the difficulty of ensuring that voters actually read the assigned articles. Though a user might have his web browser pointed to a particular article, he might not even be looking at the screen. In real space, a person overseeing the group can walk around a roomful of participants and visually check that everyone is reading; even without a proctor, the pressure of others’ presence may be enough. To simulate this proctor, one might try "attention discounting," whereby a participant’s vote would be weighted less if it appeared he was not paying attention. A short quiz following each reading could ensure that the reader had at least skimmed the article’s content. Attention discounting could easily be biased, however; at the extreme, it could constitute discrimination akin to the literacy laws used to keep former slaves from voting in the South following the Civil War.

Additional problems with the online implementation of deliberative polling mirror those faced by its realspace counterpart. One such issue is the structural bias of the person who chooses articles for the participants to read. Another is that deliberative polling increases the depth of people’s participation only by requiring greater effort – and so reducing the number of participants. With voters as busy and apathetic as they are today, this time requirement may skew the sample of the population who would take part, toward a particular subgroup of the population, such as those already very active for the cause. The deliberative poll is not so effective if it does not represent a good cross-section of the population. While it might produce accessibility of information and increased deliberation over a particular topic, the trade-off of decreased participation is real.

If the focus is narrowed to a smaller, less complex community than the national scale, deliberative polling might be more feasible. Deliberative polling might be useful, for example, in a corporate stockholder forum. Not all the stockholders can be in the same room at the same time, as they are spread across the country and around the world. However, they might prefer to vote on issues such as electing the corporate board of directors, rather than giving their votes to a proxy. An online deliberative poll would allow users to get background information regarding the company, and provide a forum for stockholders to confer with each other on the issues at hand.

Another example in which deliberative polling might work online is in the form of an online jury in a specific MUD, MOO, or similar group. The users involved might feel that in order for justice to be served in their community, it is important that they take part in the activity as part of their civic duty to the group. As in a realspace jury, the users might feel that their input matters, because of their decision’s effect on other online users. Similarly, they might be obligated to participate in the jury based on the community’s rules, as jury duty is required by law in real space.

These two examples might work in cyberspace because of the stake involved for the participants. Stockholders might think that taking the time to deliberate on an issue is important because they have a stake in the company’s business, and they want the company to do well so that their financial assets improve. An online juror might be interested in deliberating because his decision bears on someone else’s fate, or that of the online community of which he is a part.

    D. The Deliberative Poll Experiment

To test the feasibility of an online deliberative poll, an experiment was conducted on December 1, 1998 at the Massachusetts Institute of Technology, in a class consisting of 30 students from the Harvard Law School and 30 students from MIT. The students were asked to log onto an experimental deliberative poll website, write their initial thoughts on the topic at hand (the Microsoft antitrust case), then begin the deliberative poll experience. First, they read three articles from different perspectives on the topic and answered a few questions about each article as a form of attention discounting. The participants discussed the issue in a chat forum for about ten minutes, and then completed individual ballots.

The students were instructed to sign on with non-identifying usernames so that when they interacted with each other online, no one would know who was making which comments. This was a test to see how the anonymity of cyberspace compared with the face-to-face discussion of a realspace deliberative poll. For example, people who were introverted or in the minority in real space might feel bolder and more outspoken in cyberspace, so that the majority would not so easily overwhelm the voice of the minority.

The anonymity was met with mixed feelings. Although it may have been a good idea to protect the identity of individuals, it seemed to degrade the level of conversation in the chat room. Specifically, some people did not take the post-reading chat seriously: they were faceless, nameless, and therefore unaccountable even if they made inappropriate comments.

Here, articles were picked more for brevity than for the depth and scope that would be chosen for an actual deliberation. One major complaint, however, was that the articles were limited in scope and their choice appeared to reflect a partisan bias.

As mentioned previously, one problem with bringing deliberative polling online is that there is no proctor checking a roomful of voters to make sure that everyone is reading the assigned articles; it is difficult to determine whether online participants are truly deliberating or merely passing the time. This was the rationale behind the attention-discounting questions listed at the end of each article: the user’s eventual vote would be scaled according to the correctness of his answers, as a measure of his comprehension of and attention to the given articles. Attention discounting was meant to stand in for the social norms of a face-to-face group, giving readers an incentive actually to read the information provided. The questions themselves were accordingly fairly simple.
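The weighting rule described above can be sketched in a few lines. This is a hypothetical reconstruction, not the experiment’s actual code; the function names and the ballot format are illustrative assumptions.

```python
# Hypothetical sketch of attention discounting: a ballot's weight is the
# fraction of comprehension-quiz questions the voter answered correctly.

def attention_weight(answers, answer_key):
    """Fraction of quiz questions answered correctly (0.0 to 1.0)."""
    correct = sum(1 for given, right in zip(answers, answer_key) if given == right)
    return correct / len(answer_key)

def tally(ballots):
    """Sum weighted votes per option; each ballot is a (choice, weight) pair."""
    totals = {}
    for choice, weight in ballots:
        totals[choice] = totals.get(choice, 0.0) + weight
    return totals
```

Under this rule, a voter who answered two of three questions correctly casts a vote worth two-thirds of a fully attentive voter’s.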

The attention-discounting portion of the experiment met with the most criticism. Participants complained that the vote of any reader might be discounted against his or her will, based on an arbitrary quiz. Some noted the questions tested test-taking skills more than understanding of the issues of the case. The similarity to old-South literacy tests would have rendered such a provision unconstitutional in a state-sponsored election. Though the general consensus was to remove the attention discounting from the experiment, no one had any convincing suggestions for its replacement to ensure that participants were indeed reading and paying attention to the articles.

The chat forum was intended to simulate the post-reading discussions that took place in Fishkin’s deliberative poll. After reading the articles, voters were to discuss the readings with each other to clarify facts and to attempt to persuade one another. In addition to the main chat room, smaller rooms were available for private chats if a user wanted to lobby for a specific argument.

The chat room was generally perceived as an interesting concept, but a failure in practice. As mentioned earlier, because users were logged in under pseudonyms rather than their real names, and thus not directly accountable for what they said, the conversation in the chat room was diminished in quality and less serious. In addition, the general ‘Netiquette’ of online chat rooms is relaxed and colloquial. There were further problems with the implementation of the chatroom. With slow connections, some users missed a great deal of the early discussion. Others tried to ‘spam’ the chat room by typing repeated strings of nonsense, though these users were automatically bumped from the chatroom. Scrolling presented a further problem: with many users typing at once, it was difficult to follow the conversation, especially where several threads overlapped. In real space, conversants usually focus on the discussion closest to them, or move to another; in cyberspace, with no physical proximity to the conversation, filtering out different levels of text to find one’s own conversation becomes difficult. This led to a suggestion for a bulletin-board mechanism where users could post and display comments, so that the voters would not all have to be online at the same time.

The post-chat vote itself was conducted through a simple online CGI form. The page also had a calculated weight for the vote, based on the earlier attention discounting. The first few questions were multiple choice, to which the voter simply clicked on "Yes" or "No." The first questions asked the voter’s opinion on some of the fundamental conflicts of the case. The last was an open-ended question that invited the voter to supply a more detailed response as to the appropriate resolution of the case.

Overall, the students involved in the online deliberative poll experiment were critical of the poll. Many complained of the possible bias involved. Others noted the low level of discussion in the chat, with one person commenting, "This is a miserable substitute for a real conversation." The attention discounting was probably the most heavily criticized of all, especially since those familiar with a deliberative poll knew that this was not normally part of the realspace deliberative poll. Nevertheless, there was some positive feedback on it. Some participants liked the idea behind the deliberative poll as a means to a more informed electorate, but said the implementation problems had to be better resolved.

    E. Some Further Possible Architectures

      The deliberative poll seeks to create a more informed, thoughtful electorate, but does so at the cost of decreased participation. Alternative architectures could instead increase the number of participants in the electorate, at the cost of decreased information and deliberation. One such voting scheme, extending a concept currently used by political parties, is the "express check-box list." As the voter in a mechanical voting booth can often pull a single lever to select all of a given party’s candidates for office, so might he press a button on a CGI form to select a whole slate of choices. Online, unlimited by the mechanics of the voting machine, the voter might be given a multitude of such slates. The effort of choosing among slates would be less, and minority voters might more easily propose their own competing slates.

      Putting the ballot online would add information through the ability to hyperlink relevant background material to candidates and initiatives. Further, it increases ease of access to those who have computers and Internet service. It might also be more feasible online as compared to deliberative polling, since it does not require everyone to be online at the same time to discuss the issue; voters participate at their leisure and are not constrained by someone else’s schedule.

      Taken to its furthest extreme, we could enable effortless voter participation with "political collaborative profiles." "Vote-bots," akin to the intelligent agents on the Amazon.com or Firefly websites, could analyze a web user’s past activity to determine his political preferences. In this way, citizens could choose, with a single mouse click, the recommendations of another who had visited similar sites. Just imagine: "if you liked the Christian Coalition’s website, try candidate X!"

      These extremes return us to the tradeoff between deliberative information and participation. Although a deliberative poll might not be perfect in its present state, it might be a more reasonable approach to online voting than express check-boxes or profiling. Even when technology advances and makes things like the express checkboxing and political profiling ideas much more feasible, do we want new architectures like these two voting schemes just because they are new and different?

    F. Feasibility Issues

Any implementation of online voting will raise architectural issues. For instance, if a vote held online has implications that might extend to real space, then there must be universal online access to a computer for all potential realspace voters. Voting should be both anonymous and private, but some form of digital identity must be established to ensure the eligibility and uniqueness of voters.

Any voting implementation, online or off, must guarantee fairness, that only eligible voters are allowed to vote, and only once. In addition, votes must be correctly tabulated, and results accurately verified and announced. The tabulation and verification of the results are actually easier online than in realspace, since much of the calculation can be automated by computer. The questions of fairness and anonymity, involving digital identity, are more difficult.

One of the ways authentication problems might be solved is with various encryption algorithms. These encryption algorithms are the current methods of ensuring anonymity and privacy. For instance, there is a self-adjudicating protocol developed by Michael Merritt in 1982. In this protocol, several layers of encryption and computation are used to allow participants to vote without a third party’s involvement. The basic scheme works like this:

1. Each voter attaches a random string X to his vote V.

2. Each voter encrypts his vote with the public keys of voters 1 through N, in that order.

3. Each voter repeats step 2, but includes a random string within each layer of encryption. The result is:

EN(RN, EN-1(RN-1, ... E1(R1, EN(EN-1(... E1(V,X) ...))) ...))

4. All votes are passed from voter to voter, starting with voter N and ending with voter 1. Each voter decrypts one layer, then strips off and validates the random string.

5. Each voter scrambles the order of the votes before sending them on to the next voter. After the last pass, each vote looks like EN(EN-1(... E1(V,X) ...)).

6. Each voter then decrypts his layer of the remaining encryption, checks the signatures accumulated so far, and signs.

7. A vote now looks like: Si+1(Ei(... E1(V,X) ...)).

8. All voters confirm the signature of voter 1 and check the list of votes for their initial random string to ensure their vote was counted.

This method is computationally intensive, and inappropriate for a large scale vote, as the encrypted votes circulate among all the users. However, the extra layers of encryption make this a very safe method.
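The nested encryption at the heart of this protocol can be illustrated with a toy cipher. The sketch below is not Merritt’s actual construction: a repeating-key XOR with a key tag stands in for public-key encryption, purely to make the layering and its required ordering visible, and all names are illustrative.

```python
# Toy illustration of onion-layered votes. Each "encryption" is a
# repeating-key XOR with a 4-byte key tag prepended, so that layers
# must be stripped in the reverse of the order they were added.

def _xor(data: bytes, key: bytes) -> bytes:
    """Repeating-key XOR; involutive, so it both adds and removes a layer."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def enc_layer(msg: bytes, key: bytes) -> bytes:
    """Add one voter's layer: tag the message with the key, then scramble."""
    return _xor(key[:4] + msg, key)

def dec_layer(ct: bytes, key: bytes) -> bytes:
    """Strip one layer; raises if the wrong key (or wrong order) is used."""
    pt = _xor(ct, key)
    if pt[:4] != key[:4]:
        raise ValueError("wrong key or layers stripped out of order")
    return pt[4:]

def onion_wrap(vote: bytes, keys) -> bytes:
    """Encrypt with the keys of voters 1..N in order."""
    msg = vote
    for k in keys:          # keys[0] becomes the innermost layer
        msg = enc_layer(msg, k)
    return msg

def onion_unwrap(ct: bytes, keys) -> bytes:
    """Voters N..1 each strip their own layer in turn."""
    msg = ct
    for k in reversed(keys):
        msg = dec_layer(msg, k)
    return msg
```

A real implementation would use genuine public-key encryption and signatures, along with the random-string validation and vote scrambling the protocol prescribes.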

In a central vote repository (CVR) scheme, less computation is involved, but responsibility is given to a third party to administer and count the ballots. Each voter has a public/private key pair {k, d}. The scheme works as follows:

1. The CVR asks each voter whether or not he will participate.

2. A list of all participants is made public.

3. Each voter receives an ID number using an ANDOS protocol.

4. Each voter anonymously sends the CVR his ID along with the encrypted vote.

5. The CVR publishes all encrypted votes Ek(ID,V).

6. Each voter anonymously sends {ID,d} to CVR.

7. All votes are decrypted and their values published alongside them.

The major issue with this CVR method is that it relies entirely upon the validity and integrity of the third party voting authority. If the third party were easily corrupted, then the votes could be miscounted or falsified.

Finally, there is a multiple-authority structure, the F.O.O. protocol of Fujioka, Okamoto, and Ohta. It is similar to the CVR scheme, except that instead of one voting authority there are two separate parties, one to administer and one to count the ballots. The basic implementation follows this structure:

1. The voter selects his candidates and commits to this ballot.

2. The committed ballot is then blinded, signed by the voter, and sent to the administrator.

3. The administrator verifies the right of the voter to vote and the signature on the blinded vote. If the signature is valid, the administrator signs the committed, blinded ballot, returns this signed ballot to the user, and publishes its log.

4. The user unblinds the ballot and verifies the administrator's signature.

5. The ballots, now signed by the administrator, are then sent through an anonymous channel to the counter. The counter publishes the ballot along with an index number.

6. The voters send in the private keys to decrypt the vote along with the index.

7. The counter counts the votes.

This F.O.O. protocol is probably the most likely to be implemented, because it is the most easily scalable to the level of a state or national election. In addition, it is less likely to be corrupted than a CVR-type implementation.
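The blind-signature exchange in steps 2 through 4 is the heart of the protocol, and can be sketched with textbook RSA. The parameters below are toy values for illustration only; a real election would need full-size keys, proper padding, and the bit-commitment and anonymous-channel machinery the protocol specifies.

```python
# Sketch of the FOO blind-signature step with textbook RSA and toy
# parameters: the administrator signs a ballot it never sees.

import random
from math import gcd

# Toy administrator RSA key (illustrative small primes).
p, q = 1000003, 1000033
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def blind(ballot):
    """Step 2: the voter blinds the committed ballot with a random factor r."""
    while True:
        r = random.randrange(2, n)
        if gcd(r, n) == 1:
            break
    return (ballot * pow(r, e, n)) % n, r

def sign(blinded):
    """Step 3: the administrator signs without ever seeing the ballot."""
    return pow(blinded, d, n)

def unblind(blind_sig, r):
    """Step 4: the voter strips the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(sig, ballot):
    """Anyone can check the administrator's signature on the unblinded ballot."""
    return pow(sig, e, n) == ballot % n
```

Because the administrator sees only the blinded value, it can certify a voter’s eligibility without learning his vote, which is what allows the counting authority to remain a separate party.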

  VI. Theory and Practice of Internet Democracy

    A. Introduction

      "The Internet is different" is a common claim of those who analyze its structures in comparison with offline models. Some take this to mean that nothing translates, that everything must be relearned in this new context. John Perry Barlow, declaring the "Independence of Cyberspace" in opposition to the Communications Decency Act, proclaims that the world’s ‘local’ governments and their structures have no place online: "We have no elected government, nor are we likely to have one." More recently, discussions on mailing lists and at the ICANN board’s initial public meeting have often proceeded as if not only the Internet but also its questions of governance were new and without useful precedent. Listening to them, one would think that the form of American constitutional government has no bearing on the cyber-sphere. It would be unfortunate to take this claim too far — both because it implies we must start from scratch in thinking about online governance, and because it ignores what cyberspace can teach us about and contribute to offline democracy.

      The Department of Commerce White Paper states that the "U.S. Government policy [set out in the document] applies only to the management of Internet names and addresses and does not set out a system of ‘governance.’" Yet, whether termed governance or management, functional control over the Internet’s infrastructure involves more than deciding who gets what domain name, or how many global top-level domains will exist; it is making rules to apply to the entire group of applicants and defining structurally what an owner can or cannot do with his domain name. Beyond assigning an unused port number to the mail protocol for sending and receiving email, the Internet’s technical architects set conditions for the standard delivery of mail: whether a message must or should include a return address, whether the email address must be bound to a physical identity, whether a host will receive mail from all sources and send or route to and for all senders.

      John Perry Barlow’s complaints that his metaphor has been taken beyond its original meaning notwithstanding, architecture matters to politics. The principles set out in one protocol affect the way we Internet actors behave and think about other protocols, even as the individual protocols and their entire framework remain open to change. Even if they are only an improvised governance, not a planned and theorized one, the protocols, rules, and norms of RFCs compose a considerable "law of the Internet."

    3. ICANN
    4. The abstract questions of governance in the real world and online combine concretely in the debate surrounding the formation of ICANN and the transfer of Internet management from the United States government. We face the task of creating a governance mechanism acceptable in both the real world and cyberspace, a structure that must stand up to technical and theoretical scrutiny.

      The structure to be constructed for Internet governance is intermediate in scale – larger in population than the "communities" of newsgroups and MUDs, and more limited in scope than a territorial government. It needs a "citizenship" that is broad-based but focused, as it must represent the interests of all Internet users, but only where the Internet is concerned. We face the challenge of uniting users from different locales and different political traditions, yet we can assume that they share a common interest in Internet stability and development (though they may have widely differing ideas of development). While we have much less of a status quo to fall back on, we are less constrained by historical traditions, and so are freer to experiment with new representation and voting mechanisms. Though we may be persuaded by principles from the American and other constitutions, we are not bound to the same degree a state or federal body would be. We thus attempt to incorporate the developing political culture of the Internet alongside that of the real world, to apply the tools of the Internet toward a democracy uniquely structured for its citizens and sphere of influence/control.

    5. Technology of democracy on the Internet
    6. Under the large-scale governance of network architecture and the engineers who design it, small groups of Internet users have created their own sets of rules. These communities and rule-sets may themselves be seen in a Darwinian competition, perhaps producing a democracy among groups, some of which are not themselves democratically run. As we saw in Part III of this paper, germs of democratic structures are already sprouting online, in the charters of Usenet newsgroups, the FAQs posted on listservs, or the governance experiments of MOOs. Often without giving formal attention to their structures, many of these groups have evolved the civic spirit of community that commentators find lacking in our modern real world democracies.

      These groups are the culture-dishes of cyberspace. Like the state of nature Rousseau and Hobbes envisioned, the Internet’s collection of newsgroups, MOOs, and listservs demonstrates how authority develops and is exercised. When only loosely tied, if at all, to real-world identity, cyber-status may attach to a showing of particular knowledge, an ability to synthesize threads of a debate, or a willingness to create and organize an archive or FAQ. A community’s participants may choose to accept or reject the offered leadership, to propose their own rules, or to challenge points set out as fact. They may offer to set up competing FAQs. In moderated newsgroups or lists, democratic elements are less obvious, but present nonetheless among the groups. If denied voice on one list or newsgroup, erstwhile participants will leave it in favor of a new one. The relative ease of setting up a new list – founding a new country on one’s own model – allows the communities themselves to compete for loyal citizens.

      The relative ease of movement among cyber-communities may be seen as either a positive or a negative. On the one hand, individuals are not bound by arbitrary, unchosen characteristics such as birthplace or signifiers such as race. They may freely choose to associate with others who share their interests or convictions, and just as freely leave when their interests no longer match. Local minorities may meet across geographic distance to form a majority in a self-created community. Individuals may meet in other groups without acknowledging any minority status, to share experiences common along a different axis. In sharing one interest, they may later recognize their diversity in others.

      On the other hand, cyber-communities’ impermanence may detract from the groups’ meaning. Some would argue that without any connection to deeper identity, group endeavors cannot be deeply fulfilling. Even without adopting such an essentialist view, however, one may fear that online communities lack a larger goal to inspire in their members a commitment to cooperate even through intermediate disagreements. The investment in creation of an identity, whether as a known character in a MOO or as a respected "Old Hat" on a newsgroup, might create such a commitment. The technological ease of setting up a new mailing list or creating an alt.* newsgroup is not necessarily matched by the ease of attracting members or regaining the status one had in a prior community.

      Some groups erupt into flame wars and die out from too low a signal-to-noise ratio; others are able to adopt and enforce charters that screen out most of the noise; still others close their doors to new members or devolve into the cyber-equivalent of gated communities. Further complicating the picture is the extent to which communication inside community groups may exclude conversation among them. The more people gather in specialized discussion fora, the less they may hear from other viewpoints, or discuss their issues in a larger context. The influx of new members may prompt them to see a broader picture, but the countervailing normative pressure against newbies’ repetition of old questions may stifle this debate.

      Johnson and Post suggest that the disaggregated decisionmaking by a collection of groups is itself the most effective form of governance. Theirs is an end-to-end argument writ large: each group knows best how to meet its own goals, so a minimum of power and discretion should be left to the space among them. Yet even a system designed under end-to-end principles requires coordination. Though we differ with them on the extent of the governance required in between, it seems even they could not eliminate it altogether and still preserve the ability of separate groups to communicate with one another over common protocols.

    7. Theory: Membership and Representation

    1. The problem of scale
    2. The Internet gives us greater capacity for a direct democracy, in which referenda on budget items, foreign policy, or education standards could be sent to the inbox of every connected citizen and posted to the web in glorious detail. It offers the potential for one-to-one and many-to-one communications that contrast with the one-to-many format of the typical election campaign. Simultaneously, the Internet adds arguments against the direct democracy it could facilitate. The communities of Part III of this paper do not themselves scale to a cyber-democracy. Multiplied by 10, a congenial mailing list becomes tumultuous; multiplied by 10,000, it is either cacophonous or relegated to a background drone, causing participants to drop out or lose interest in either case. The more restrictive Usenet II is ostensibly designed to combat the problem of spam, but, like the AFU Old Hats’ creation of a private mailing list, it may also be a response to an overwhelming volume of even pertinent material. Even when technical bandwidth is enormous, we are limited by a scarcity of attention among its recipients.

      Both online and off, we are being deluged with information: data smog, as one commentator calls it. Our storage mechanisms are growing faster than our capacity to analyze and use the information we retrieve, giving us more data, but not necessarily more information. Yet the same problem of scale arises in real world governance: local debate does not scale to national elections. Some New England towns still hold annual town meetings, but it is hard to imagine national issues decided thoughtfully in an "electronic town hall," as Ross Perot proposed, or by referendum.

      The Federalists realized at the time of the American Constitutional Convention that representation was a response not merely to the physical size of the country, but to the range of issues to be tackled and the types of divisions likely to arise among the people. Madison warned of the problems of "faction," or tyranny of the majority, in a direct democracy. Where the population is too large or diverse to reach a unanimous consensus on issues, majorities will often be able to turn a slight advantage at the polls into complete control of policy. Particularly where interest alignments are stable, direct majority rule has the effect of disenfranchising the numerical minority. The Federalist ideal of representation had the representative standing for his entire constituency, not merely the majority that had elected him. Some critics argue that this trust in representatives is unfounded, and look to alternative systems such as proportional representation or cumulative voting to give minorities actual representation.

    3. Membership and Citizenship

Two crucial questions face us as we try to create a democracy for the Internet: Who will be members? and How will they participate in governance? In democratic theory, the citizens hypothetically create their government, but as Locke remarks, the real consent of the governed is more often tacit – citizens manifest their consent to governance by staying in its jurisdiction without revolt. The founding of a constitutional government is necessarily outside of the Constitution, and may well be antidemocratic. The founders face a chicken-and-egg problem in describing citizens of a to-be-defined government, and the method of creating a procedural framework may well differ from that for enacting substantive rules. We need not move all the way to Plato’s "noble lie" to refuse to condemn the ICANN process because early drafts of its Articles and Bylaws were produced behind closed doors. We must, however, fill the blank Article II of the Bylaws with a meaningful and inclusive membership.

The members to be added to the ICANN structure fill the place of citizens to a geographic sovereign. Instead of stockholders, whose primary objective is generally to see their corporation earn money, the Internet and its governing corporation have stakeholders, people who use the Internet in different ways but have interests in how it develops. The businesses that link offices with Internet email, the academics who share research through websites, the students who use the Net for research or surf for fun, the interest groups who share passions, and the advertisers will all expect something from ICANN — not just to be left alone, but to be left alone with the types of structures that support their needs. Yet the members offer something necessary to ICANN in return.

In creating substantive rules and endorsing the procedural frame, members serve the crucial functions of providing legitimacy, expertise, and checking. Members will give ICANN its legitimacy through their participation in decisionmaking and endorsement of the results. Involvement in what they perceive to be a fair process will give them a stake in making its outcomes succeed. The expertise of some members comes from engineering backgrounds or long experience with the technology. Equally important, others bring less focused concepts of what they want to do online and where, from a non-technical perspective, the Internet should be developing. Finally, members serve as the organization’s check, watching that it develops and implements policies as it has agreed to, in ways that serve their needs.

    1. Real World Meets the Net: ICANN as a Test of Both

    1. ICANN Representation
    2. These membership functions direct us toward a broad-based membership of users, builders, and maintainers of the Internet. Yet that choice is closely connected to our choice of structures by which the members interact to participate in governance. None of the functions is most effectively performed by returning an aggregate of yes and no answers to complex questions. As in realspace, we look to modes of representation.

      Realspace representation in the United States is most often tied to federalism’s division of power along geographic lines. Federalism allows us to divide responsibility and scale solutions. We contact town hall about a cracked sidewalk, State Representatives about smoke from the factory downstate, and Senators about foreign policy concerns. It also helps us to find the right groupings in which to deliberate. Though the scope of its governance power will be limited by national sovereigns’ unwillingness to give up power over citizens when they go online, the ICANN representation need not be geographically tied. An ISP may be more likely to share concerns with another ISP across the continent than with the school next door.

      ICANN, the Internet Corporation for Assigned Names and Numbers, is currently led by an Initial Board of nine at-large directors and a president. The organization has committed to shifting to a bicameral structure, whereby nine directors will be elected by an at-large membership and nine more will be elected, three each, by the Supporting Organizations ("SOs") for names, addresses, and protocols.

      The SOs appear particularly directed at the membership function of expertise. They are to serve as advisory bodies to the corporation, and "shall be delegated the primary responsibility for developing and recommending substantive policies and procedures regarding those matters within their individual scope (as defined by the Board in its recognition of each such Supporting Organization)." Yet the SO structure may also be seen as an early attempt to describe an interest-based federalism. Members of these technical communities are thus given votes for distinct directorships on the assumption that each group will have a particular deep interest it will want to discuss and protect.

      This division raises problems of political equality, however. While geographic federalism guarantees that citizenship in one state precludes citizenship in another, affiliation with one technical community is not exclusive, nor are technical concerns distinct from those of the at-large users. How do we understand the "one person, one vote" axiom of American democracy in Internet governance? Systems of cumulative voting, as is common among corporate boards, offer an alternative. Voters each get n votes for n directorships, which they may cast separately or pool as they choose. Thus minorities and majorities may form their own interest groups to pool votes for a candidate, rather than being gerrymandered into the groups another finds significant. Alternatively, each member could apportion votes among predetermined interest groups depending on the strength of his affiliations.
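The pooling mechanic described above can be made concrete with a small sketch. The following is an illustrative tally, not any procedure ICANN has adopted; the candidate names, ballot counts, and validity rule (a ballot may spend at most n votes for n seats) are assumptions for the example.

```python
# Hypothetical cumulative-voting tally: each voter receives one vote per
# open directorship and may split or pool those votes across candidates.
def tally_cumulative(ballots, seats):
    """Each ballot maps candidate -> votes; a ballot is valid only if its
    votes total at most `seats`. Returns the `seats` highest vote-getters."""
    totals = {}
    for ballot in ballots:
        if sum(ballot.values()) > seats:
            continue  # discard ballots that overspend their votes
        for candidate, votes in ballot.items():
            totals[candidate] = totals.get(candidate, 0) + votes
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return [candidate for candidate, _ in ranked[:seats]]

# Six majority voters split their three votes across a slate; three
# minority voters pool all three votes on one candidate.
ballots = [{"A": 1, "B": 1, "C": 1}] * 6 + [{"D": 3}] * 3
winners = tally_cumulative(ballots, 3)
print(winners)  # D (9 votes) takes a seat despite minority support
```

The example shows why the text links cumulative voting to minority representation: by concentrating rather than spreading its votes, a cohesive minority can guarantee itself a seat that district-style or winner-take-all voting would deny it.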

    3. Theory of the Deliberative Poll
    4. John Rawls suggests that people of widely differing beliefs can unite to support a government acting on an "overlapping consensus of reasonable comprehensive doctrines." Each person will support the policy based on his or her unique deep views, so the consensus will only cover an abstracted policy subset. Citizens reach this narrower consensus through public discourse and attempts at persuasion, as in Mill’s marketplace of ideas. In the discussion of fundamental questions, however, Rawlsian citizens are to rely on "public reason"; while their own convictions may be based on private values, their public attempts at persuasion must be made in general terms, on reasoning that does not depend on their particular doctrines. To win support for a position, a citizen must present it in terms others of differing background convictions can accept.

      James Fishkin’s deliberative poll offers one implementation of the Rawlsian discourse. It first makes available to all participants a common background of information, lessening the real or feared ignorance of some and the perceived status or greater knowledge of others. This preliminary stage may thus bring citizens closer to a democratic equality by making other characteristics less relevant to the discussion at hand. The informational sessions also build the foundation for public reasoning: they suggest that arguments based on this common information, not from private religious or cultural belief, will be persuasive.

      The deliberation sessions provide further support for these intuitions. People find themselves facing others with different backgrounds and different starting points, but see that they nonetheless share common goals. Their differing baselines may be more amenable to compromise than they expected. Further, without structural constraints on their deliberation – if they are allowed to craft their own answers rather than voting yes or no – the participants may find ways to create value in positive-sum solutions.

      As an intermediate, informative stage of the process, the deliberative poll has the most promise. It allows people to hone the arguments that will persuade those of different convictions, and gives them the facts to bring back to those who share their convictions. It makes them ambassadors of a sort, helping to give legitimacy to the outcome of the final vote.

      Further, the process of deliberation is as important as its content, Putnam and Tocqueville would say. Tocqueville wrote that the American jury’s educational function was far more effective than its role in the "good administration of justice." Serving on a jury, by putting real control into the hands of jurors, taught the American citizen cooperation, deliberation, and compromise in group decision-making.

      It is possible that the deliberative poll, like any discussion forum, could reinforce stereotypes rather than inducing people to reevaluate them. As a decision mechanism, it is subject to further criticisms that it could give undue power to holdouts, or, on the contrary, that it pushes everyone toward a weak compromise. The enforced "deliberation" may downplay the value of common sense. In addition, the architects of the poll itself manifest conscious or unconscious bias in creating the structure and choosing the "informational material" that they present to the participants. If the structure appears partisan, it will hinder productive discussion and fail to legitimate its outcomes. We therefore recommend the deliberative poll not as a conclusive voting mechanism, but as an informative stage in the decision process. All members would be given an opportunity to participate in polls and all would be apprised of the polls’ results. Each individual would determine for him or herself what value to place on the outcome. We suggest, however, that those who had deliberated would emerge with a sense of having learned from the process, and would share those new insights and persuade others.

    5. Technology of the Poll

The Internet offers several new elements to Fishkin’s deliberative poll. Although ICANN cannot offer the television audience or soap opera set Fishkin gave his British volunteers, it can provide them a forum through which to make their voices heard. An Internet deliberative poll seems particularly apt for the membership of ICANN, as members may have little in common beyond an interest in or use of the Internet. Moreover, the Internet deliberative poll enables a more personal contact among members spread around the globe, for whom face-to-face meetings would be prohibitively expensive.

We propose that ICANN, through the administration of a board committee, run polls of its membership before resolving important policy questions. The poll would take place on a public website, and participants would post their conclusions publicly. The board would vote only after reading these conclusions and any responsive comments from the membership at large. In particular, we recommend this approach for decisions the board is authorized to make without a membership vote. As well as building consensus behind board decisions, public deliberative polls would serve the transparency requirements of the Bylaws’ Article III.

The Internet deliberative poll builds on the architectures of the World Wide Web, and does not require a complex technological superstructure. The poll itself can operate through CGI scripts and Java applets viewed on a simple web browser. A basic web server with space for members to post their own pages would suffice to host the poll.

As one condition of membership, ICANN members might be asked to provide an email address or other means through which they would allow the organization to contact them. When issues arose, members could be invited by email to participate in a deliberative poll. Members could respond to these requests through anonymous remailers and voting protocols. The poll itself requires only continuity of identity, not its linkage to a physical identity. Those who opted to participate would be divided into groups, either randomly, by time-zone availability for real-time chat, or aligned by relevant interest. If trying to build consensus, the ICANN committee might attempt to bring diverse interests together in a single group.
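The requirement of "continuity of identity, not its linkage to a physical identity" can be sketched in a few lines. This is a minimal illustration under assumed mechanics, not a description of any actual ICANN system: the registrar mails each member an opaque token, and within a poll every post is keyed to a pseudonym derived from that token, so a participant presents one consistent persona per poll without revealing who they are.

```python
# Illustrative sketch: pseudonymous but continuous poll identity.
import hashlib
import secrets

def issue_token():
    """Random, unguessable membership token, delivered to the member
    (e.g., by email) and never published."""
    return secrets.token_hex(16)

def poll_pseudonym(token, poll_id):
    """Stable per-poll pseudonym: the same token always yields the same
    pseudonym within one poll, but unlinkable names across polls."""
    digest = hashlib.sha256(f"{poll_id}:{token}".encode())
    return digest.hexdigest()[:12]

token = issue_token()
# Continuity within a poll, unlinkability across polls:
assert poll_pseudonym(token, "poll-7") == poll_pseudonym(token, "poll-7")
assert poll_pseudonym(token, "poll-7") != poll_pseudonym(token, "poll-8")
```

The hash binds all of a participant's contributions in one deliberation together, which is what gives arguments and reputations continuity, while the token's secrecy keeps the persona unlinked from a physical identity.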

Participants would be presented an initial question and hyperlinked introductory reading materials. They would also be given the opportunity to add their own materials, limited in number and ordered by priority, either hyperlinks to material elsewhere on the web or documents they wrote and posted to the server. Participants might also add commentary alongside existing materials. The deliberation itself would occur through a real-time chat, an asynchronous message board, and email exchanges. All events would be archived on the website so others could trace the course of deliberations.

ICANN could choose from among several methods of resolving the poll. It could simply set a time period for deliberation, and ask participants to record their individual conclusions after that time. This would reveal a snapshot of "the considered judgments of the public," but still a collection of individual thoughts. Alternatively, ICANN might ask members to reach consensus in their deliberation groups. They might be asked, jury-style, to deliberate until they reached a single conclusion, and the technology might permit them to post a consensus outcome only when it was signed by all members. In this case, though each member would have veto power, each would also be aware that the group would have no voice if it did not present a consensus. In this scenario, the greater weight of a consensus proposal would give all participants an incentive to resolve differences.
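The jury-style option above, posting a statement only when every member has signed it, reduces to a simple rule that can be sketched directly. The function and names below are hypothetical illustrations of that rule, not part of any proposed ICANN software.

```python
# Minimal sketch of unanimous-consent publication: a group's statement
# is released only once every member has endorsed it. Each member thus
# holds a veto, but withholding endorsement also silences the group.
def consensus_outcome(members, endorsements, statement):
    """Return the statement if all members endorsed it, else None."""
    if set(members) <= set(endorsements):
        return statement
    return None  # no unanimity: the group posts nothing

members = ["m1", "m2", "m3"]
print(consensus_outcome(members, {"m1", "m2"}, "proposal"))        # None
print(consensus_outcome(members, {"m1", "m2", "m3"}, "proposal"))  # proposal
```

The rule captures the incentive structure the text describes: because a lone holdout costs the whole group its voice before the board, every participant has a reason to keep negotiating toward a statement all can sign.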

The ICANN board would commit to reading the conclusions of each deliberative group, and directors could refer to these statements in explaining their votes. Their decisions would thus be informed by the archived deliberations and the links posted by members of the deliberative groups. While directors would not be bound by the deliberative conclusions, they could be expected to justify their disagreements with a posted consensus. These public statements in turn would allow members to judge the directors’ fitness for reelection.

The ICANN membership organization offers an opportunity to meld new technologies to a new electorate. We propose the Internet deliberative poll to realize these opportunities and to revitalize the democratic process of governance online.