Legal/Technical Architectures of Cyberspace

Protecting Children on the Web:
An Analysis of Solutions to the Problem of Content Control

Team Members
Monisha Merchant
Jennifer Murphy
Jonathan Pearce
Emily Sexton
Jonathan Soros
Adam Welsh
Executive summary by Jonathan Pearce
Oral presentation by Emily Sexton


Compared with the other issues addressed in today's conference, content control on the Internet is a relatively mature one. While there will always be those in government who seek to restrict the flow of certain content in cyberspace, the courts have found it unconstitutional for the government to keep speech protected by the First Amendment from adults. At the same time, however, it is perfectly legal for corporations or individuals to restrict the flow of such material. Employers have every right to filter out non-work-related sites to improve worker productivity, and web space providers may remove users' web pages that are objectionable or irrelevant to an agreed-upon topic.

Between these two extremes lies the gray area of children's interests. Most people believe that children should be protected from material regarded as harmful or indecent, such as pornography. In addition, the government has the power to limit the distribution to children of material that it could not constitutionally keep from adults. Even so, current and proposed solutions have proven unsatisfactory when applied in public schools and especially in public libraries. To varying degrees, they either block non-harmful content from children, fail to block harmful content from children, block constitutionally protected content from adults, or fail to make clear when content is being blocked.

To weigh the merits of various methods of content control, it is important to define the values on which they will be judged. We will adopt constitutional protections as the baseline for our analysis: ideally, neither the government nor private companies controlling content should restrict constitutionally protected speech from adults. We will also consider practicality, political feasibility, and impact on the individual user. Based on these values, we put forth the following criteria with which we will analyze possible solutions and submit a recommendation: technical feasibility and cost, implications for privacy, flexibility, transparency of blocking, problems of enforcement, problems of jurisdiction, locus of control, constitutionality, satisfaction of interest groups, general political viability, applicability to other areas of content control, and other, secondary concerns.

The goal of this paper is to analyze possible solutions, and recommend one whose characteristics best meet these criteria. The alternatives analyzed are the World Wide Web Consortium's PICS specification, a corralling system, a child ID system, commercial URL and keyword filters, zoning, and an adult ID system.

The strength of PICS lies in its flexibility. It can be implemented for general Web browsing, or for more specific tasks, like a teacher labeling certain sites as relevant to a topic and only allowing access to those sites. Any party can label pages with PICS, and users can choose which labels to trust. However, since much of PICS is based on recommendations, not on actual code, it has no enforcement mechanisms for sites that mislabel themselves.
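As a concrete illustration of that flexibility, any party can attach a PICS label to a page, and the user's software decides which labeling services to honor. The sketch below uses our own simplified dictionary form of a label rather than the full PICS-1.1 syntax; the RSACi service URL is a real rating service of the period, but the filtering function is only an assumption about how a user agent might apply trust choices.

```python
# Sketch of the PICS trust model: any party may publish labels, and the
# user agent honors only labels issued by services the user has chosen
# to trust. A label embedded in a page looks roughly like this
# (abbreviated; full syntax is defined in the PICS-1.1 specification):
#
#   <META http-equiv="PICS-Label" content='(PICS-1.1
#     "http://www.rsac.org/ratingsv01.html"
#     l r (n 0 s 0 v 0 l 0))'>

TRUSTED_SERVICES = {"http://www.rsac.org/ratingsv01.html"}

def usable_labels(labels: list[dict]) -> list[dict]:
    """Keep only labels issued by a rating service the user trusts."""
    return [lab for lab in labels if lab.get("service") in TRUSTED_SERVICES]
```

Because trust is chosen per service rather than fixed in the software, the same mechanism serves a parent relying on RSACi ratings or a teacher publishing her own labels for a class.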

The corralling system that was analyzed involves a network of corrals sharing a common content decency standard created by a private or government-endorsed organization. Sites in the corrals would not be allowed to link to sites outside the corrals, and the list of approved sites would be made public. The corrals would be patrolled to make sure that content providers were adhering to their contracts with the corrals.

A third possible solution, the child ID system, calls for a browser option toggling child-protection mode, protected by a password known to the librarian, the teacher, or, in the case of home use, the child's parent. Sites with indecent material would have to refuse to send such material upon finding that the option was set.

This option also requires a nationwide standard of what is harmful to children, yet it does nothing to stop children from accessing harmful material on international sites that would not be subject to that standard.
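A server-side check under this scheme might look like the following sketch. The "Child-Protection" header name and its on/off values are our invention for illustration only; the proposal does not specify a wire format.

```python
# Hypothetical server-side check for a child-protection flag sent by the
# browser. Under the child ID proposal, a site serving indecent material
# must refuse the request whenever the flag is set.

def may_serve_adult_content(request_headers: dict) -> bool:
    """Return True only if the browser did not set child-protection mode."""
    return request_headers.get("Child-Protection", "off").lower() != "on"
```

Note that enforcement falls entirely on the site: a foreign server outside the standard's jurisdiction can simply ignore the flag, which is the weakness noted above.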

Keyword- and URL-based filters, like CyberPatrol and NetNanny, have the advantage of already being widely implemented. However, these services simply cannot keep up with the new sites created every day, and so cannot block every harmful site. In addition, since this solution is market-driven, each service keeps its URL blacklist and filtering methodology secret; revealing them would surrender a competitive advantage to rival companies.
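The basic mechanism of such filters can be sketched as below. The blacklist host and keyword are invented placeholders, since vendors' actual lists are secret, and real products combine many more heuristics.

```python
# Minimal sketch of URL- and keyword-based filtering in the spirit of
# commercial products. The entries here are placeholders, not any
# vendor's actual (and secret) lists.
from urllib.parse import urlparse

URL_BLACKLIST = {"blocked.example.com"}
BANNED_KEYWORDS = {"exampleword"}

def is_blocked(url: str, page_text: str) -> bool:
    """Return True if the page should be withheld from the user."""
    host = urlparse(url).netloc.lower()
    if host in URL_BLACKLIST:
        return True
    words = page_text.lower().split()
    return any(word in BANNED_KEYWORDS for word in words)
```

The sketch also shows why transparency suffers: a user who is refused a page has no way to tell which list entry or keyword triggered the block unless the vendor discloses it.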

Another system analyzed was zoning of the web into a regulated area and an unregulated area. Sites must label their content according to an explicitly defined rating standard, devised by ICANN or another regulatory body, in order to be allowed in the regulated area. Content creators who do not wish to label their content would be relegated to the unregulated zone. The governing body could enforce penalties on mislabeled sites in the regulated area. Public schools and libraries could then modify the browsers on children's computers to only view pages with certain ratings in the regulated area, while leaving the adults' browsers unmodified.
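The browser-side check in this plan might be sketched as follows. The rating categories and numeric scales are hypothetical, since the paper leaves the rating vocabulary to ICANN or another regulatory body.

```python
# Sketch of the browser-side check in the zoning plan: a library or
# school configures maximum acceptable ratings, and a child's browser
# refuses pages whose labels exceed them. Category names and scales
# are our assumptions, not a defined standard.

COMMUNITY_LIMITS = {"violence": 1, "nudity": 0, "language": 2}

def page_allowed(page_rating: dict) -> bool:
    """Allow only regulated-zone pages that fall within every limit.

    Unlabeled pages belong to the unregulated zone and are refused
    on child-configured browsers.
    """
    if not page_rating:
        return False
    return all(page_rating.get(cat, 0) <= limit
               for cat, limit in COMMUNITY_LIMITS.items())
```

Because the limits live in the local browser configuration, each community can set its own thresholds without any change to the sites' labels, and adults' unmodified browsers see both zones.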

The final system considered was one in which all adults hold anonymous digital certificates that they can use to enter sites not available to children. Government agents would enforce compliance by searching the Web for harmful sites and attempting to view them without the adult ID. However, this suffers from the same international-site problem as the child ID system, since harmful international sites would have no reason to ask for the certificate.

Though all of these solutions have flaws, we recommend the zoning plan. It creates an enforceable rating system, while adhering to constitutional standards by not requiring that sites be rated. It also allows local communities to devise their own standards about what children should be able to view in public schools and libraries in those communities. Market forces would drive most sites to participate in the system, since doing so would increase their overall visibility. The same market forces would also preserve the integrity and objectivity of the rating system; if the system were too badly abused, sites would eventually stop opting into it. We urge ICANN to step up to this challenge, and develop such an infrastructure.






Last modified: December 2 1998, 10:35 PM