In 1997, the Supreme Court overturned the Communications Decency Act in Reno v. ACLU. An important factor in that decision was the newly developed Platform for Internet Content Selection (PICS) technology, which allowed parents to protect their children from unwanted Internet content without disrupting the transmission of that content to willing adults. PICS technology was believed to provide a less restrictive means of protecting minors. Opponents of the PICS standard, however, argue that labeling Internet content opens the door to widespread content-based censorship of Internet materials. Hence, they claim that the PICS standard may be a greater evil for the cause of free speech than even the CDA. Nevertheless, it is very unlikely that, if used responsibly, PICS will facilitate any private censorship of the Internet, at least in the United States, even though the technology can be abused. Likewise, if due care is exercised, using PICS for labeling will not violate the rights of content providers on the Internet. Hence, the PICS standard delivers a needed and very beneficial service to Internet users.
Over the past five or six years we have witnessed the unprecedented growth of the Internet from an academic network to the Wild West, to the Global Village, and now to the Global City. Today almost every public school and public library in the United States is connected to the Internet. Most businesses maintain a Web presence. Many households either are considering an Internet connection or already have one. Because of its global nature and the diversity of its members, the Internet has become host to a great number of controversial materials. Some of these materials are deemed inappropriate by some audiences, and some are even illegal in some countries. Hence, to survive as a public communications medium, the Internet needed a system that would prevent inappropriate content from being delivered to those who do not want to receive it.
The PICS standard, designed by the World Wide Web Consortium in 1995, provides a system for the creation and distribution of rich and diverse labels describing content on the Internet. Armed with these labels and appropriate software, Internet users can filter out content that does not satisfy their criteria for acceptability. In particular, parents can set up filtering software on their home computers to protect children from being exposed to pornographic and exceedingly violent materials. The PICS standard purposely specifies neither a labeling vocabulary (the categories for labeling Web pages), nor who should collect the labels, nor who should pay attention to which labels. Hence, PICS's goal is to provide a framework for the development of a variety of labeling schemes and to allow individual users to freely select filtering criteria according to their personal preferences and values.11
Critics of the PICS standard argue against promoting the labeling and filtering of Internet content. While most of them claim that PICS-based filtering is likely to result in the censorship of the Internet for adults as well as children, others are concerned about labeling even without filtering. We shall consider the issue of labeling first, and then turn to label-based filtering.
The PICS standard allows two kinds of labels: self-labels and third-party labels. A content provider can embed labels inside a Web document. Those can be any labels from any labeling vocabulary, although the vocabulary chosen will usually be pertinent to the contents of the page. For example, a commercial adult site might label its pages using common labeling systems that specialize in admissibility for children (currently RSACi's and SafeSurf's systems). When the author places labels in his or her own Internet documents, this is called self-labeling or self-rating.
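As a concrete illustration, the following Python sketch shows roughly how filtering software might read such an embedded self-label. The META-tag mechanism (http-equiv="PICS-Label"), the short "l ... r (...)" label form, and the RSACi category letters v/s/n/l come from the PICS and RSACi documentation; the sample page and the simple regular-expression parser are assumptions made for illustration, not a complete PICS parser.

```python
import re

# Sketch only: extract the embedded PICS self-label from a page, assuming an
# RSACi-style label in the short "l ... r (category value ...)" form.
PAGE = """
<html><head>
<META http-equiv="PICS-Label" content='(PICS-1.1
  "http://www.rsac.org/ratingsv01.html"
  l r (v 0 s 2 n 1 l 0))'>
</head><body>...</body></html>
"""

PICS_META = re.compile(r"content='(\(PICS-1\.1.*?\))'", re.S)
RATINGS = re.compile(r"\br\s*\(([^)]*)\)")

def embedded_ratings(html):
    """Return the page's embedded ratings as a dict, or None if it is unrated."""
    label = PICS_META.search(html)
    if not label:
        return None
    ratings = RATINGS.search(label.group(1))
    if not ratings:
        return None
    tokens = ratings.group(1).split()
    return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

print(embedded_ratings(PAGE))   # {'v': 0, 's': 2, 'n': 1, 'l': 0}
```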
However, sometimes one might not trust the author to label the pages correctly, or the author might choose not to label his or her content at all. In non-electronic media, one turns to other people's reviews in such cases. Such third-party labeling is also supported by the PICS standard: anyone can rate Web pages and publish labels for them. If users trust some rater's labels, they can use those labels instead of the labels supplied by the authors. A popular example of a third-party labeling agency is the Simon Wiesenthal Center, which tracks the activities of neo-Nazi groups. Those groups do not label their sites as containing hate speech, but the Center may publish labels for pages containing neo-Nazi propaganda anyway. Parents who trust the Simon Wiesenthal Center and do not want their children to see hate speech may instruct their software to check the Center's server for the labels.12
There are many questions that can be raised about third-party labeling. The first that comes to mind is whether giving any stranger the ability to criticize any Web site in any way he or she wants is the right approach. The answer to this question is an unconditional yes. After all, labels and reviews are themselves speech, protected by the First Amendment. The ability to commend and criticize other people's work is an integral part of any democratic society.10 However, what if someone does label your site incorrectly and unfairly? Such misrating certainly could cause serious problems for a site. The situation is equivalent to someone publishing a misleading review of your work. You can always try to convince the rating agency that it is mistaken, and it is in the agency's interest to correct all errors; otherwise the agency may become infamous for misleading labels, and people will stop using it.
The situation is not that simple if we do not have enough rating agencies with the same or similar labeling vocabularies. In that case, an agency that holds a monopoly on the labels would wield quite a bit of power, since people would have no choice in the source of their labels. This would be a very unfavorable situation for the many communities whose values do not coincide with the values of the labeling monopolist. Indeed, any labeling organization introduces some bias into the ratings through its choice of labeling vocabulary and the actual ratings assigned. It is in the interests of the community and the government to promote diversity in the marketplace for labeling services. Likewise, those agencies that enjoy the greatest trust of the community should be carefully monitored to ensure that people are well informed about the practices of the trusted organizations.13 The good news is that the European Union, Australia, Japan, and other entities are currently planning the development of local rating services. Once those services come online, they will probably be available to all Internet users.
As we noted before, the labeling vocabulary that a rating agency uses has a very profound impact on the usability of its labels. For example, a rating agency like RSACi that labels content for violence, nudity, sex, and language would be of limited use to a parent who is more concerned with drug culture and hate speech. Increasing the number of categories does not solve the problem completely; it is important to understand what those categories measure. Agencies that follow RSACi provide a simple questionnaire with very precise and focused questions requiring a yes or no answer. For example, "Does your content portray the death of non-human beings resulting from natural acts or accidents?" Or, "Does your content portray rape?" These are so-called "rules-based" systems. The problem with these labels is that simple rules might not take important factors like artistic or scientific value into account. Other agencies that follow SafeSurf provide a questionnaire with vague questions that require the use of community standards in supplying the answer. For example, "Is this page appropriate for younger teens?" Or, "Does this page have artistic value?" These questions are directly related to the criteria that parents use in deciding what content to allow into their homes. However, the answers to these questions may only make sense if the parent and the rater share the same cultural values.16
The solution to the problem above is, again, diversity of rating services. Indeed, if most major communities (countries or regions) and organizations (such as UNICEF or the Christian Coalition) provide labeling services, then people will be able to choose a set of labels that closely reflects their values. Such labeling services do not have to be very comprehensive: if the choice of label providers is broad enough, one can set up the software to consult the labels of the user's favorite service first, fall back to the second favorite if no label is found, and so forth.
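As an illustration of the fallback idea, here is a minimal Python sketch. The service names and label data are invented; a real client would fetch labels from PICS label bureaus over the network rather than from in-memory tables.

```python
# Sketch only: walk the user's labeling services in order of trust and use
# the first label found.  Service names and label data are invented.
LABEL_SERVICES = [
    {"name": "MyCommunityRatings",
     "labels": {"http://example.org/page": {"violence": 1}}},
    {"name": "SomeNationalBureau",
     "labels": {"http://example.net/page": {"violence": 4}}},
]

def lookup(url):
    """Return (service name, label) from the most trusted service that has one."""
    for service in LABEL_SERVICES:
        label = service["labels"].get(url)
        if label is not None:
            return service["name"], label
    return None, None               # no service has rated this URL

print(lookup("http://example.net/page"))    # ('SomeNationalBureau', {'violence': 4})
print(lookup("http://example.com/other"))   # (None, None) -- an unrated site
```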
Many critics of the PICS standard claim that it is very hard to rate their sites. They often cite the testimonials of "computer experts" who confess that rating Web sites is an extreme hardship.3 To verify that claim I tried to rate my Web home page with RSACi and SafeSurf. To my great surprise, I was able to obtain a rating on both systems in under two minutes each. I estimate that it might take about fifteen minutes to rate a site that actually contains some non-trivial content. Moreover, products such as Microsystems Software's CyberLabeler are being developed right now to ease PICS labeling even further.10
The great fear of the opponents of PICS labeling is that the government will mandate self-labeling (for example, by compelling Internet Service Providers to require their customers to rate their Web pages). One can get a good sense of the problem by considering two particular labels: the blue ribbon that many sites place on their Web pages to indicate their devotion to freedom on the Internet, and the yellow Star of David that the Nazis required all Jews to wear to indicate their religion and nationality. There is a difference between displaying a label voluntarily and being forced to display a label.9 Luckily, such a measure by the government would very likely not withstand judicial scrutiny. The Supreme Court has consistently ruled that mandating speech that a speaker would not otherwise make is unconstitutional. In Riley v. National Federation of the Blind the Court struck down a requirement that professional fundraisers disclose certain accounting details about charitable contributions. Likewise, in McIntyre v. Ohio Elections Commission, the Court held unconstitutional a requirement that persons distributing election materials include their names and addresses in those materials.16
To understand the place of labels for Internet content we need to take a step back and reexamine the fundamental reasons for the enactment of the CDA and the emergence of the PICS standard. When the CDA was enacted, it in effect likened the Internet to broadcast media, which are known for being intrusive. Indeed, by merely mistyping the address of the White House, one can inadvertently stumble upon a pornographic site at http://www.whitehouse.com. Similarly, an unlucky surfer may come across the same material at http://www.netparents.com. Finally, links to numerous Web sites with controversial information may be returned by an unintentionally broad query to a search engine. Therefore, Congress enacted legislation that imposed restrictions on Internet content similar to the restrictions imposed on other broadcast media. The PICS standard, however, gave users the ability to keep unwanted content from being delivered to them. Hence, for a user armed with a PICS-compliant browser, Internet content is no longer intrusive and deserves a higher degree of Constitutional protection. Therefore, provided that regulating intrusive media and mandating self-labeling there are Constitutional, the government may enact legislation mandating self-labeling of the kinds of Internet content that are regulated in intrusive media. Hence, the government has a Constitutional right to compel sites like http://www.hotbabes.com and http://www.whitehouse.com to label themselves, whereas sites like http://www.whitehouse.gov and http://www.mit.edu/~igorlord have a Constitutional right to refrain from self-labeling.
The prime use of the PICS standard is to provide a way for Internet users to block (or filter out) unwanted materials. In particular, parents need a way to protect their children from what they consider harmful content that is readily available on the Internet. Such content usually includes, but is not limited to, various forms of violent and pornographic imagery, hate speech, drug culture, and crude language. The PICS standard allows end users (like parents) to screen out what they consider inappropriate content on the basis of PICS labels. Parents can decide which labeling vocabulary and whose labels they are going to use (authors' self-labels or third-party labels). If parents decide to go with third-party labels, they also need to pick a rating agency that they trust to provide PICS labels. With the right software, users can set up a variety of filtering policies. For example, one can decide to consult third-party labels only if the document does not contain an embedded self-label. The virtue of PICS-based filtering is its flexibility and its ability to tailor filtering criteria to each parent's values. "No United States censorship law could give parents this range of control, nor could it reach content from around the world so effectively."17
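A hypothetical policy of the kind just described might look like the following sketch. The category names, the parent's limits, and the decision to block unrated pages are all assumptions chosen for illustration.

```python
# Sketch only: prefer the author's self-label when present, otherwise use the
# chosen third-party label, and compare against this family's limits.
PARENT_LIMITS = {"violence": 1, "nudity": 0, "language": 2}   # example values

def acceptable(self_label, third_party_label, limits=PARENT_LIMITS):
    """Decide whether a page may be shown under this family's policy."""
    label = self_label if self_label is not None else third_party_label
    if label is None:
        return False                # this family chose to block unrated pages
    return all(label.get(cat, 0) <= maximum for cat, maximum in limits.items())

print(acceptable({"violence": 0, "nudity": 0}, None))   # True
print(acceptable(None, {"violence": 3}))                # False: over the limit
print(acceptable(None, None))                           # False: unrated page
```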
The very first question that is usually asked in the PICS debate is whether the PICS standard enables censorship. Unfortunately, this question is not as straightforward as it seems. There are many definitions of censorship, ranging from "denial to publish" to "any denial of access." PICS does not have any mechanism to deny publishing. On the contrary, by moving filtering to the receiving end of Internet communications, PICS protects the ability of content providers to publish anything that is not illegal in their country. On the other hand, PICS does provide mechanisms for filtering out Internet content by whoever is in charge of the computer. Hence, one could consider a parent's setting of filtering rules to be an act of censorship. However, most people do not dispute such actions by parents and often even consider them appropriate. What opponents of PICS worry about is censorship by the government, by large public or private organizations, and possible censorship-like consequences of the use of filtering by individual users. We will discuss each concern below. But before we start, there is another notion in the original question that requires clarification. What does "enable" mean? Some may argue that if PICS can be used for something, then PICS enables it. However, the other view--and it is the view we are going to take--is to reserve the word "enables" for applications that could not be implemented easily without PICS technology.13
Usually, when we talk about censorship we mean censorship by the government. Government censorship has been a tradition of all totalitarian regimes, and hence it is the form of censorship most hated and feared by any society that considers itself democratic. Therefore, it is not uncommon to hear people criticize the PICS standard because it could be abused by the government to establish universal and mandatory filtering of Internet content. These concerns indeed seem very legitimate, especially considering that China and Singapore have implemented national "firewalls" to block their citizens' access to certain newsgroups and Web sites. There are several things that the government could possibly do.13
As we have seen above, PICS does not give government censorship of the Internet any significant leverage over existing technologies. In fact, it increases confidence that the Internet does not pose a serious problem to citizens and therefore reduces the likelihood of governmental censorship.
Governments are not the only entities that have the ability to control what people see on the Internet. Any organization that has an Internet connection can control what people who use its computers see. Four kinds of such organizations are often cited in discussions about PICS: schools, public libraries, private corporations, and Internet Service Providers.
Over the years the Internet has repeatedly been cited as an unparalleled learning tool. Many public and private schools today have Internet connections, which they hope to use extensively in the school curricula for a variety of classes. Incorporating Internet use into a school curriculum will also require children to do some research on the Internet without a teacher's supervision. In other words, teachers want children to access the Internet as easily and extensively as they access the school's library. There is only one problem with this undoubtedly beneficial proposal. A school board can ensure that the materials a child can find in the school library are appropriate for children, but it has no control over the information a child can stumble across on the Internet (even accidentally). There is no doubt that a school has a right and an obligation to ensure that whatever a child learns in school is "safe for kids" according to community standards and common sense.
Fortunately, PICS meets the requirements of educators almost perfectly. PICS-based filtering tools provide the needed protection while operating unobtrusively for the children. It is also quite easy for a school administration to agree on a particular set of filtering criteria, since a school usually serves a rather limited local community and a direct poll of the parents is possible. Finally, because lower, middle, and upper schools usually occupy distinct buildings, different filtering criteria may be applied in those schools to reflect the maturity level of the students. For example, provided an appropriate labeling vocabulary is used to filter the content, upper-school students may be allowed to access AIDS-prevention and safe-sex sites.
There are already many practical implementations of filtering software suited for school use. Even better solutions, based on proxy-server technology, are being developed by IBM and other companies. A proxy-server, for example, will significantly ease the administration and maintenance of Internet filtering. Instead of supporting PICS-based filtering software on every computer, for every platform, and for every Web browser, a network-level proxy can process labeling information on behalf of the user according to several predefined filtering criteria. Hence, a school's (or school district's) proxy-server may use a "student-type" filtering profile when processing a request from a computer in a computer lab and a "teacher-type" filtering profile (which probably allows unrestricted access) when processing a request from a computer in the director's office.
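A rough sketch of such a proxy arrangement appears below. The subnet addresses and profile contents are invented, and the actual fetching of pages and labels is omitted; the point is only how a profile could be chosen per request.

```python
from ipaddress import ip_address, ip_network

# Sketch only: pick a filtering profile according to which subnet a request
# comes from.  Addresses and profile contents are invented.
PROFILES = {
    "student": {"block_unrated": True,  "limits": {"violence": 1, "nudity": 0}},
    "teacher": {"block_unrated": False, "limits": {}},   # effectively unrestricted
}

NETWORKS = [
    (ip_network("10.1.0.0/16"), "student"),   # computer-lab subnet (example)
    (ip_network("10.2.0.0/16"), "teacher"),   # staff subnet (example)
]

def profile_for(client_ip):
    """Choose the filtering profile for the machine that sent the request."""
    addr = ip_address(client_ip)
    for network, name in NETWORKS:
        if addr in network:
            return PROFILES[name]
    return PROFILES["student"]      # default to the most protective profile

print(profile_for("10.1.42.7"))     # the student profile
```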
For libraries, the Internet is an indispensable resource because of the enormous amount of information it contains and the ease with which that information can be retrieved. Indeed, it is a lot more convenient to connect to the Encyclopaedia Britannica on the Web, type in a question, and receive several articles from different volumes of the encyclopedia in less than ten seconds. You can even print them out to take home! Hence, it is no wonder that children spend a significant amount of time in public libraries, a great number of which in the United States have an Internet connection. That is why family advocates claim that, just as schools have an obligation to protect children from inappropriate materials, libraries must install PICS-based content filters on their Internet terminals for the protection of children.
The issue with public libraries is much more complicated, though. Unlike schools, libraries exist not to educate children but to provide a wide range of information to the whole community, which is composed mostly of adults. Moreover, the Library Bill of Rights prohibits public libraries from restricting access to Constitutionally protected speech. The American Library Association (ALA) claims that "libraries are places of inclusion rather than exclusion."1 Hence, using filtering software to block or exclude Constitutionally protected speech "is inconsistent with the United States Constitution and federal law and may lead to legal exposure for the library and its governing authorities."1
Some may argue that by having control over library acquisitions, librarians have control over what is not included in the library collection. Hence, librarians have a right to refuse to include certain Internet content in the library's collection of online information. Traditional community standards can often prompt a local administration to adopt this argument and mandate the purchase of blocking software for the local public libraries. A prominent example of such a policy is the installation of content filters in Boston public libraries by order of the Mayor. There are several possible limitations to this seemingly compelling argument, however.
First, what exactly does an Internet connection mean for a library? (This point will surely excite many lawyers.) Is the library's Internet service provider a substitute for the publisher of the books? If so, a library computer stands in for the librarian by filing an acquisition order on the patron's behalf, and, like any librarian, the computer may refuse to file an order, exercising the right to control library acquisitions. On the other hand, one can view the Internet as an electronic extension of the library, with many shelves holding an immense number of virtual books (information). In that case, by acquiring an Internet connection, a library has automatically acquired all the materials the Internet has to offer. Therefore, by blocking access to some materials on the Internet, the computer acting as a librarian is not refusing to include the material, but is actually excluding it! And, as the Supreme Court ruled in Island Trees Board of Education v. Pico, excluding materials from a public school library's collection because of their content (and, by extension, from any public library's) is unconstitutional.15
The ALA's reservations about the power of librarians to restrict children's Internet access are consistent with the considerations above. The ALA insists that
A role of librarians is to advise and assist users in selecting information resources. Parents and only parents have the right and responsibility to restrict their own children's access--and only their own children's access--to library resources, including the Internet. Librarians do not serve in loco parentis.1
Hence, the ALA effectively claims that librarians do not have the right to force any content regulation on children (to say nothing of adults). Instead, librarians are there to advise parents in the selection of information resources for their children. Also, librarians can be expected to cooperate in setting up restrictions chosen by particular parents, if the library's technology allows it. We have to be careful here, though, to prevent overzealous conservative parents from placing overly restrictive filtering criteria even on their own children. According to the Supreme Court, teenagers have the right to obtain access to information about safe sex even without their parents' consent.3
This brings us to another serious problem with installing PICS-based filters in libraries: the absence of filtering software suitable for library use. None of today's software packages allows enough control over the filtering criteria or provides reliable compliance with those criteria. By installing a filter with limited control over what materials are blocked, libraries delegate their decision-making authority to the authors of the filtering software. Unfortunately, it is not uncommon for filtering software to block access to such sites as the Queer Resources Directory, the Electronic Frontier Foundation, and even sites that criticize the filtering product itself.
A good example of such software is X-Stop by LOG-ON Data Corporation, which uses its Web crawler engine to rate Web sites automatically. The company claims that it has produced a library version of X-Stop that blocks only obscene material illegal under the Supreme Court case of Miller v. California and does not deny library patrons access to controversial literary, artistic, or political sites. However, as First Amendment attorney James S. Tyre of Bigelow, Moore & Tyre in Pasadena, California put it, "LOG-ON is setting itself up as judge, jury and executioner when it makes unilateral decisions about what is obscene under the Miller standard -- and there is ample reason to believe that the owners of the company have little knowledge about how to apply the standard."5 In fact, a version of X-Stop distributed at the end of July was found to block a number of precisely such sites.
Hence, it is quite clear that until the quality of the software improves, installing such filters on Internet terminals in public libraries is likely to cause serious violations of patrons' rights.
Lastly, libraries must make sure that the safeguards placed on information content to protect children do not interfere with adults' ability to access information. It is completely unacceptable that, to gain access to "restricted" materials, an adult has to ask a librarian for a "key" or a password (as is done in Boston). Instead, one can conceive of a better solution in which a patron enters his or her name and password (or swipes a library card and enters a PIN) to load personalized filtering criteria. This scheme allows parents to set up filtering criteria for their own children while protecting the privacy of adults who wish to access information not suitable for children. While not difficult to develop, this technology is essential to protecting the rights of all library patrons. Only with the development of quality software that allows maximum configurability and flexibility can Internet filters in public libraries (including PICS filters) become acceptable.
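Such a scheme could be as simple as the following sketch. The card numbers, PINs, and profiles are invented, and a real system would of course keep credentials securely in the library's patron database rather than in plain text.

```python
# Sketch only: load a patron's personal filtering criteria at login; anyone
# who does not identify himself or herself gets a child-safe default.
PATRONS = {
    "1234567890": {"pin": "4321",
                   "profile": {"block_unrated": True, "limits": {"nudity": 0}}},
    "0987654321": {"pin": "1111",
                   "profile": {"block_unrated": False, "limits": {}}},   # an adult
}

CHILD_SAFE_DEFAULT = {"block_unrated": True, "limits": {"nudity": 0, "violence": 0}}

def load_profile(card_number, pin):
    """Return the patron's own filtering profile, or the child-safe default."""
    patron = PATRONS.get(card_number)
    if patron and patron["pin"] == pin:
        return patron["profile"]
    return CHILD_SAFE_DEFAULT

print(load_profile("0987654321", "1111"))   # this adult's unrestricted profile
```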
Many private corporations with Internet connectivity are also interested in selectively blocking Internet access from their computers. Their motivation is quite different from protecting children, though. Most corporations are interested in blocking access to entertainment sites to prevent their employees from wasting time surfing during work hours. One possible solution to that problem is PICS-compatible filtering software with a labeling vocabulary designed to identify entertainment content. Some opponents criticize PICS for enabling such content-based blocking of entertainment information by employers.
The simplest response to this criticism is that PICS does not enable such content-based filtering; it merely provides yet another way of implementing it. Indeed, most corporations use "firewalls" to control their Internet connections, and it is quite easy to reconfigure those firewalls to refuse Web data from a list of known entertainment sites. Hence, a labeling agency need only provide a list of entertainment sites instead of labels, and the blocking mechanisms are already in place at large corporations. For large corporations, the PICS standard simply provides an alternative to the firewall.
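A toy illustration makes the point: list-based blocking needs no labels at all. The host names below are made up.

```python
# Sketch only: content-based blocking without any labels, using a site list.
BLOCKED_HOSTS = {"games.example.com", "videos.example.net"}

def allowed(host):
    """A firewall-style check: refuse anything on the vendor-supplied list."""
    return host not in BLOCKED_HOSTS

print(allowed("games.example.com"))   # False -- refused by the list alone
print(allowed("www.example.org"))     # True
```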
The other argument is that an employer owns the corporation's Internet connection and all the computers with Internet access. Hence, an employer has a right to select what content is acceptable on the corporate network. One can claim that there is no reason why the blocking has to apply to use of the Internet after working hours. It is hard to disagree with the argument that "kind" employers should allow their people to browse the Net in any way they want during their leisure time. However, one cannot force employers to be "kind." After all, it is the employer who owns the computers, and it is the employer who has the right to choose what content is appropriate. If one disagrees with the corporate policy, one can always subscribe to an Internet connection at home and select content as one pleases.
There is only one issue that needs to be clarified, though it is not connected to the PICS standard and applies to more than just private companies. Many Internet filters not only block access to information but also keep a log of all sites a user has tried to access. Such logging violates users' privacy if done without explicit advance notification. Such logging per se and the further use of those logs are important issues that require careful consideration. However, they are beyond the scope of this paper.
While the harm to free speech arising from misguided content-based filtering in public libraries and private corporations can be very alarming, misuse of filtering software by ISPs could be dreadful. It does not take much imagination to picture ISPs swearing to deliver only "quality" content to American families. Those ISPs might decide to install PICS-enabled filters with filtering criteria that reflect the moral values of the company's CEO or board of directors. If many ISPs adopted a "Family Friendly Internet" slogan in the manner described, it would signify the end of free speech on what, in the opinion of the Supreme Court, is "a unique and wholly new medium of worldwide human communication." Fortunately, this is not likely to happen.
First, economic considerations will keep ISPs from censoring. The Internet Service Provider market is very competitive, with many ISPs offering services in the same area. In choosing an ISP, people will be very conscious of the censorship practices of the providers. Even if some people initially think that censorship by the provider might make life easier for them, competitors will make sure those erroneous beliefs do not last. Moreover, ISPs that practice censorship will automatically lose clients who want to keep their options open and enjoy everything the Net has to offer. Besides, non-censoring ISPs will be quite successful in promising subscribers the same protection through personalizable PICS-enabled filters without sacrificing the possibility of full access. Finally, ISPs can use newly developed proxy-server technology to provide ISP-managed filtering based on each subscriber's filtering criteria. Such a solution would give subscribers full control over what content is deemed acceptable while freeing them from maintaining the filtering software.
The second reason to believe that ISPs will not engage in content-based censorship is at least as compelling as the first. Currently, in providing Internet access, ISPs are legally treated as common carriers, which exempts them from responsibility for the content they transmit to subscribers. Hence, ISPs are legally liable neither for providing email service to someone engaging in criminal activity over email nor for transmitting files containing child pornography through a Web interface. On the other hand, common carrier status requires service providers to transmit all materials and prohibits any content-based censorship (absent an explicit request from a user). It is hard to believe that any ISP would give up its common carrier status and open itself up to a variety of lawsuits.
Having discussed how the PICS standard might be used (or misused) by the government and large organizations, we should recall the true intent behind PICS -- to allow individual users to selectively filter out information according to their values. Specifically, PICS is designed to help parents protect their children from undesirable materials while allowing willing adults to access those materials freely. It is important to discuss how well the PICS model is suited to its main application: helping parents protect children from unsuitable content.
The first question, which happens to be the easiest, is where an average family would obtain the software needed to filter Internet content. Fortunately, almost every ISP offers some filtering software, either free or for a nominal fee. Moreover, Microsoft's Internet Explorer already contains a PICS-compatible Content Advisor, and Netscape has promised to incorporate one into Communicator. While it is true that labeling organizations are not yet numerous and their labeling databases are far from complete, one has to remember that we are talking about a very new technology, created only a year ago. In that year, filtering software and labeling services have made significant progress, and they will continue to improve.
The dynamic nature of the Web, however, is the root of the largest dilemma for everyone who relies on PICS-based filtering software: the existence of unlabeled (unrated) sites. Because of the global, ever-growing, and ever-changing nature of the Internet, it is impossible for labeling agencies to keep their databases completely up to date. There will always be a great number of sites about which a labeling agency has no information. Naturally, one may assume that among unrated sites there are some that are not appropriate for children. What should a parent do about those sites? There is a decision that every parent has to make: should all unrated sites be blocked, or should they be allowed? One has to realize that only a tiny fraction of unrated sites are inappropriate. Whether the extra security gained from blocking all unrated sites justifies losing a wealth of potentially useful and interesting information is something for each parent to decide individually.
Critics of the PICS standard point out that if a majority of parents decide to block all unrated sites, numerous undesirable effects on what children see on the Internet are likely to follow. For one, administrators of questionable sites might figure out that not labeling their sites at all (and thereby saving the effort and money that labeling involves) will have the same effect as labeling them -- the sites will be blocked for most children.
The more important effect of blocking unrated sites, however, is that it may lead to the suppression of non-commercial communications accessible to children. As we know, labeling a site takes effort and sometimes money. One can be sure that large commercial sites will spend that effort and money to label their sites under all influential labeling vocabularies to avoid being blocked by filtering software. Smaller sites with less mass-market appeal that are not too concerned with their under-18 audience will be reluctant to label. Also, labeling bureaus will expend their labeling resources on large corporate Web sites that generate much traffic before they attempt to label sites with smaller volume. Hence, for the children whose parents have blocked unrated sites, "the Internet will function more like broadcasting, providing access only to sites with sufficient mass-market appeal to merit the cost of labeling."12
What can be done to prevent such undesirable effects? Most importantly, parents must be aware of the issues involved and the possible consequences of their choices. The manual of a filtering product is the best place for such information. Also, a screen that allows parents to configure filtering software (or to submit information to a proxy-server) must have rather extensive explanations of the pros and cons of each user-selectable option. The software should encourage parents to review all default filtering criteria to ensure that what children see and do not see on the Internet indeed reflects the parents' values. Likewise, software authors must make an effort to inform users of the different labeling vocabularies and labeling services available. Perhaps www.netparents.org could keep a special page collecting information about all labeling agencies around the world and providing a short review of each.
Parents who allow access to unrated sites are not completely defenseless against the worst content on the Net. In addition to PICS filtering, most current Internet filters offer "heuristic" filtering: they look through the content of a Web page for specific keywords or phrases that indicate extremely sexual or violent content. Heuristic filtering is a useful technology that can in many cases block access to the worst sites. However, as with any technology, such filtering is very easy to abuse: if one carelessly uses simple word recognition without any regard for context, problems are certain to surface. Unfortunately, most current filtering software uses keyword recognition carelessly, and many failures of simpleminded filters are well publicized. For example, "once America Online's software refused to let users register from the British town of 'Scunthorpe.'"16 In other, less comic cases, such software may block pages about breast cancer or pages with questionnaires that, among other things, ask for the reader's sex. These problems, nevertheless, can (and I believe will) be solved through proper software design.
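The problem is easy to reproduce with a deliberately naive filter like the sketch below. The word list is a tame stand-in for the lists real products use, and the whole-word variant only hints at the kind of context-awareness that proper design would require.

```python
# Sketch only: a context-free substring filter versus a slightly less naive
# whole-word filter.  The word list is a mild stand-in for real vendors' lists.
CRUDE_WORDS = ["sex", "xxx"]

def naive_block(text):
    """Block if any listed word appears anywhere, even inside another word."""
    lowered = text.lower()
    return any(word in lowered for word in CRUDE_WORDS)

def word_block(text):
    """Block only on whole words, so place names like 'Middlesex' pass."""
    words = set(text.lower().split())
    return any(word in words for word in CRUDE_WORDS)

print(naive_block("Please state your sex on the questionnaire"))   # True
print(naive_block("Welcome to Middlesex County"))                  # True (false positive)
print(word_block("Welcome to Middlesex County"))                   # False
```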
There is something that the government can, and should, do as well. As discussed before, sites whose content would not be appropriate for invasive broadcast media can, and should, be legally obligated to label themselves and provide self-ratings. With such regulation, the number of unrated sites that are not appropriate for children would decrease dramatically, at least in the United States. Fortunately, "most adult sites are more then happy to label and use filtering services, they want to attract those looking for the services they offer while avoiding the difficulties associated with angry parents."10
Finally, there are arguments that most parents are too "lazy" to change filtering profiles every time they use the computer. Hence, "parents may setup filters at levels appropriate for their children, and not disable them for their own use."16 To deal with this potential problem, the software must be designed to be more "proactive." Instead of being a silent and transparent "bug" in the network layer, the software must take on the role of an access manager. Consequently, when a user tries to access a blocked site, messages like "Connection failed" or "The server could be down or not responding" are completely unacceptable. Instead, the message must indicate that the connection to the site was blocked and for what reason (the site is labeled off-limits, the site is not rated, ...), and it must allow a password to be entered to gain access. After a valid password has been entered, the software should use the set of restrictions (if any) associated with that password until the end of the session. Filtering software that adheres to these recommendations frees the parent from worrying about restrictions and accounts until some site actually is blocked.
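A sketch of this access-manager behavior might look like the following. The message text, the session structure, and especially the plain-text password check are placeholders for illustration only.

```python
# Sketch only: every block carries an explanation, and a valid password
# switches the session to that password's own (possibly empty) restrictions.
SESSION = {"restrictions": {"block_unrated": True}}
PASSWORDS = {"parent-password": {}}        # the parent's profile: no restrictions

def explain_block(url, reason):
    """Build the message shown instead of a bare 'Connection failed'."""
    return (f"Access to {url} was blocked by your content filter.\n"
            f"Reason: {reason}.\n"
            "Enter a supervising adult's password to continue.")

def unlock(password):
    """On a valid password, apply its restrictions for the rest of the session."""
    if password in PASSWORDS:
        SESSION["restrictions"] = PASSWORDS[password]
        return True
    return False

print(explain_block("http://example.com/", "the site is not rated"))
unlock("parent-password")
print(SESSION["restrictions"])             # {} -- unrestricted until logout
```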
There is still one potential problem that needs to be considered. Most people regularly use search engines to locate information on the Net. It would be very beneficial to allow filtering software to communicate with search engines so that they do not return links that will be blocked when followed. Such communication is also useful for preventing excerpts from blocked pages from being displayed by search engines. The negative side of this approach is that an adult whose access to the Net is mediated by a filter set up for children will never see that the blocked sites even exist. Hence, the first time in a session that a link to a site is about to be blocked, the filtering software should inform the user of that fact and allow a password to be entered to disable filtering. One can think of many even less intrusive methods of notifying the user that some links have been removed from the search results. A red light on the status bar next to an input box for a password might do the trick just as well. The main goal is to make sure that the user is aware of the filtering taking place without disturbing the parent or the child.
One may see that the "make sure the user is aware of the filtering taking place" philosophy is diametrically opposed to the philosophy that some current Internet filters take. SyberSnoop, for example, is marketed as an undetectable Internet filter and logger. Like several other current software filters, it can log all information accessed by the user, including the content of all email messages sent and received. These programs clearly raise questions about protecting the privacy of users, and of minors in particular. That discussion, however, is not related to PICS and is beyond the scope of this paper.
There are more uses for the PICS standard than protecting children or preventing employees from wasting work time on entertainment sites. The PICS standard can be used not only for filtering but also for advisory and selection purposes, the opposite of blocking. For example, the Better Business Bureau could provide its ratings of online businesses in the form of PICS-compatible labels and host a labeling service to distribute them. Then, whenever people who have instructed their software to consult the BBB's labels visit the site of a company with a flaky reputation, a warning about the company's business practices would be displayed.
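For illustration, such an advisory service might be used roughly as in the sketch below, with an invented label vocabulary; the point is only that the same machinery that blocks pages can instead annotate them.

```python
# Sketch only: the same label machinery used to advise rather than to block.
ADVISORY_LABELS = {
    "http://shop.example.com": {"complaints": "many", "bbb_member": False},
}

def advisory(url):
    """Return a warning string for sites with poor ratings, or None."""
    label = ADVISORY_LABELS.get(url)
    if label and label.get("complaints") == "many":
        return "Warning: a rating service reports many complaints about this business."
    return None                     # no warning; show the page as usual

print(advisory("http://shop.example.com"))
```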
The PICS standard has the potential to provide enormous help to anyone searching for information on the Internet. For many sources of information "the PICS labels could contain a ... mix of factual (for example: author/ownership, type of corporate source, length, subject coverage, geographical coverage/relevance) and qualitative (spell check indicator, accuracy measurement, indication of peer-reviewing, timeliness, etc.) labels."2 A labeling service that provides quality labels of this kind could perform searches based on the label values and return a list of references that match a researcher's criteria. Alternatively, a researcher may use an ordinary Web search engine and, through the mechanisms mentioned above, ensure that the results contain no links to what he or she considers unreliable sources of information. Many more applications of PICS are possible. In fact, in any task where information about information (metadata) is required, PICS technology may provide a solution.
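A small sketch of searching over such quality labels, with invented label fields, might look like this:

```python
# Sketch only: filter search results by quality labels such as peer-review
# status and date.  The fields and thresholds are invented.
RESULTS = [
    {"url": "http://journal.example.edu/a1", "peer_reviewed": True,  "year": 1996},
    {"url": "http://homepage.example.com/x", "peer_reviewed": False, "year": 1994},
]

def meets_criteria(label, min_year=1995, require_peer_review=True):
    """Keep only references that satisfy the researcher's quality criteria."""
    if require_peer_review and not label["peer_reviewed"]:
        return False
    return label["year"] >= min_year

print([r["url"] for r in RESULTS if meets_criteria(r)])
# ['http://journal.example.edu/a1']
```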
When the PICS standard becomes globally accepted and used, people will obtain a very powerful tool for exercising control over their lives and experiences on the Internet. As is true of any powerful technology, PICS can be abused, and if the abuse is widespread, PICS can damage free speech on the Internet. However, the slippery-slope argument, which claims that any technology that can be abused by the government or by individuals should not be developed, is not acceptable. Instead, one should offer recommendations and lay out a framework for the proper use of PICS technology that minimizes the risks involved and allows people to reap the benefits PICS has to offer.