Universal Access and Universal Service:

Lowering the Barriers to Entry into Cyberspace


Matamba Austin

Peggy Chen

Jeff Doering

Hazel-Ann Mayers

Lars Oleson

Serrin Turner

Nadia Vinson



Submitted in fulfillment of the requirements for

The Law of Cyberspace — Social Protocols


December 10, 1998



"The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect."
— Tim Berners-Lee, W3C Director and inventor of the World Wide Web
"As the Internet becomes the preferred mode of political participation, lifelong learning, employment and commerce, as well as personal expression, non-access . . . could become tantamount to nonexistence."
— Allen S. Hammond, IV, The Telecommunications Act of 1996: Codifying the Digital Divide


The information revolution has opened up countless new opportunities for commerce, personal communication, education, dissemination of information, and political participation. Yet, for millions of disabled and economically disadvantaged persons, these opportunities have yet to be fully realized.

For those with disabilities, the usability of technology poses a barrier. For example, blind persons cannot access the information contained in web pages if the information is not formatted in a way that screen-readers for the blind can sensibly interpret. Deaf persons cannot enjoy multimedia content if the audio portion is not visibly captioned in some manner. Physically disabled persons cannot navigate through cyberspace without specially modified hardware.

For those who are economically disadvantaged, the affordability of technology poses a barrier. For example, the price of personal computers bars many from accessing the Internet. Even those who do have computers, particularly those living in rural or inner-city areas, often find that telecommunications links to the Internet are not readily available.

This paper explores the question of how to make cyberspace more usable and affordable, so that all persons may equally enjoy the opportunities the ongoing information revolution brings with it. The first question – how to make cyberspace more usable – falls under the heading of "universal access." Chapter 1 covers the technological aspects of universal access, while Chapter 2 focuses on legal issues. The second question – how to make cyberspace more affordable – falls under the heading of "universal service" and is covered under Chapter 3.


Chapter I
Universal Access – Technology

by Peggy Chen (§§B2-4, C2, D1-2)

Jeff Doering (§§A1-2, B1, C3, D5)

Lars Oleson (§§A3-4, D4)

and Nadia Vinson (§§C1, D3)

A. History of Universal Access

Although Internet-related technologies are relatively new, many of the functions they provide are not. For example, e-mail and Internet telephony services are similar in purpose to the traditional telephone and postal systems, while Internet audio and video technologies have real-space counterparts as well. While the new technologies change important factors such as speed of delivery, they remain comparable to the older systems. This comparison matters when considering universal access issues because society has already invested significant effort in ensuring the accessibility of traditional communication systems. Assessing how new technologies might affect existing efforts both highlights the degree to which Internet technologies require new accessibility initiatives and provides some suggestions for actually designing those initiatives.

1. Mail

The accessibility of the telephone and postal systems can be analyzed in two respects: the specific measures that affected accessibility and the overall accessibility achieved. The historical level of accessibility provides an important benchmark for new technologies: at the very least, new technologies should equal their predecessors. The historical measures that promoted accessibility provide valuable insight for those creating new initiatives.

Dating back over two hundred years, the postal system in the United States has had significant time to achieve accessibility. In fact, according to Title 39 of the U.S. Code:

The Postal Service shall have as its basic function the obligation to provide postal services to bind the Nation together through the personal, educational, literary, and business correspondence of the people. It shall provide prompt, reliable, and efficient services to patrons in all areas and shall render postal services to all communities.

This statement highlights the justification for universal access to person-to-person communication systems. In keeping with this purpose, historical advances in the postal system helped to promote accessibility. For example, the introduction of "free city delivery" in 1863 and "free rural delivery" in 1896 increased accessibility for many individuals. Free delivery refers to the system whereby mail is delivered directly to an individual's home rather than held at the post office for pickup. Free city delivery offers obvious advantages to the mobility-impaired, as it provides access to many postal services from home. Free rural delivery provides the same benefit and significantly improved accessibility for individuals without impairments as well. Because rural post offices were often located quite a distance from recipients' homes, getting one's mail might have required more than a day's travel; free rural delivery thus reduced the economic burden of using mail services. More recent measures, such as handicapped parking spaces and wheelchair ramps, address the accessibility of post offices themselves.

Economic features of the postal system help improve accessibility as well. The "Uniform Rate," introduced in 1863, creates a price structure based only on a package’s weight rather than depending on its destination as well. This kind of rate structure ensures that individuals living in remote areas have an equal opportunity to receive the benefits of the postal system. Special programs such as book-rate service promote educational use of the postal system.

The examples listed above illustrate measures that promoted postal system accessibility, but they do not directly address the most relevant question: how accessible is the current postal system? The answer to this question provides a direct benchmark to use when judging new technologies. Actual access to postal services is quite good. Economically, the services are priced to allow virtually everyone to participate in activities such as sending a letter. Physical accessibility of postal facilities is consciously pursued, and delivery to the doorstep reduces accessibility barriers as well. However, basic access to a technology does not guarantee useful access; technologies sometimes impose additional barriers for some individuals. What barriers, then, are fundamental to postal service? In fact, the postal system cares only about the physical form of its packages. It does not impose any restrictions on the format of the data transmitted. A letter could be written in any language, typed in ink or Braille; it could even encode information in some form meaningful only to a computer. This unrestricted information format results in a communication system that can be used for virtually any type of communication, provided that the parties at both ends agree on a common format. This final requirement is important to understand. The postal system's neutrality toward data representation does not impose accessibility constraints, but neither does it prevent them.

2. Telephone

Having examined the postal system and discussed important issues with regard to accessibility, it is now useful to turn to the telephone system. The telephone system provides many advantages over postal service, chief among them the speed of information delivery and the interactive nature of the medium. However, fundamental differences between the telephone and postal systems required that accessibility be addressed in new ways.

Basic accessibility to telephone services requires a connection to the telephone network. Universal service efforts were designed to minimize this hurdle. These efforts are described in the Universal Service chapter, below. However, as mentioned earlier, universal access requires that technologies provide services in a useful manner. Unlike mail services, the telephone system imposes significant constraints on the communications it supports. Providing useful telephone services to hearing-impaired individuals has required specific measures to address these constraints.

The Americans with Disabilities Act (ADA) mandates support for two technologies that provide telephone service access to the hearing impaired. Text telephone devices (TTY, sometimes known as TDD) support typed communication over the regular phone system, helping to overcome the telephone system's basic requirement of audio data formats. The ADA requires support for such devices in a variety of ways, including mandatory availability at public phone areas, hotels, and the like. However, TTY communication requires that both parties use the technology and exists as an add-on to the telephone system, so TTY availability alone does not guarantee that the hearing impaired can communicate effectively with other parties. The Telecommunications Relay Service (TRS) was created to overcome this constraint. TRS provides translation between audio communication and TTY; a live TRS operator facilitates this translation in real time. The ADA requires that telephone service providers offer these services at no charge. Together, TRS and TTY make many telephone services available to the hearing impaired.

Society has clearly addressed accessibility issues involving the telephone system. Its two biggest constraints, cost of entry and restriction to audio data, have both been addressed. While current measures do not completely solve all accessibility issues, they make significant progress. Yet, ever-changing technologies require a continual reevaluation of this progress. For example, the FCC is currently addressing new TTY issues created by mobile telephone communications. Internet technologies dramatically change communication possibilities. They too require a careful reevaluation of accessibility initiatives, goals, and milestones.

3. Audio and Video

Universal access accommodations have also been implemented in television. The most common, now available nearly everywhere and on every major network during prime time, is closed captioning. Closed captioning technology provides the option of having a text transcript of the dialogue appear on the screen. An estimated twenty million Americans have hearing loss severe enough that they cannot understand the dialogue on television. For these people, the access that closed captioning provides is not just about being able to use the most common form of mass communication; it is about being fully integrated into society. The captioning can be turned on and off and is embedded invisibly in the video signal, so that for viewers who opt not to display the captions there is no way to distinguish a television signal with closed captioning from one without. No loss of picture or sound quality is needed to accommodate the closed captioning information, because closed captioning fits into an otherwise unused portion of the television bandwidth.

Developed in the sixties and early seventies, captioning was first used to make the PBS show "The French Chef" accessible. Closed captioning was implemented long after the proliferation of even color television and was possible only through a clever engineering "hack." By taking advantage of "Line 21," which roughly corresponds to the black bar normally off screen but visible, for example, when a television's vertical hold is improperly adjusted, the developers of closed captioning were able to embed the digital signal carrying the captioning information. The retrofitting was successful, but only because of the chance existence of Line 21, an artifact of the broadcasting process. This is not to say that the implementation of closed captioning was cheap or easy.
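The Line 21 scheme carries two bytes of caption data per video field, each byte consisting of seven data bits plus an odd-parity bit. A minimal, modern sketch of the decoding step follows; the function names are our own, and real EIA-608 decoders must also handle control codes and extended character sets.

```python
def check_odd_parity(byte: int) -> bool:
    """Line 21 caption bytes use odd parity: the total number of 1 bits
    (7 data bits + 1 parity bit) must be odd."""
    return bin(byte & 0xFF).count("1") % 2 == 1

def decode_caption_pair(b1: int, b2: int) -> str:
    """Decode one Line 21 byte pair into printable text, dropping the
    parity bit from each byte. Control codes are ignored in this sketch."""
    chars = []
    for b in (b1, b2):
        if not check_odd_parity(b):
            continue  # parity error: discard the corrupted byte
        data = b & 0x7F  # strip the parity (high) bit
        if 0x20 <= data <= 0x7E:  # printable subset of the caption charset
            chars.append(chr(data))
    return "".join(chars)
```

For example, the letters "H" (0x48) and "i" (0x69) each have an even number of data bits set, so they are transmitted with the parity bit high, as 0xC8 and 0xE9.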

Originally, closed captioning of programs was funded through public money or private grants, a fact that greatly limited the availability of accessible programs. Today closed captioning is typically part of a program's postproduction and is included in the production budget. Next to the production cost of most television shows, the expense of adding closed captions is trivial. Captioning costs vary with the amount of dialogue in the program, but falling equipment costs and the availability of efficient captioners have allowed the FCC to mandate that all but the smallest television producers and broadcasters caption their material. Today, the sophistication and availability of captioning equipment are such that even live broadcasts can be captioned.

Widespread adoption of closed captioning is a relatively new occurrence. The Television Decoder Circuitry Act, a recent mandate requiring most new televisions sold in the US to include the circuitry for decoding and displaying the closed captioning signal, met with significant opposition but finally passed and took effect on July 1, 1993. Previously, users who wished to view the closed captioning signal had to purchase a separate set-top decoder. The FCC has a program designed to phase in universal closed captioning, with escalating requirements through 2006, when 95% of a broadcaster's programming will be required to be captioned.

Captions are not just about providing access to the deaf and hearing impaired. Beyond the importance of providing access as a democratic ideal, closed captioning has many positive side effects, many of which were cited as arguments for the recent passage of the Television Decoder Circuitry Act. Perhaps the most common such use of captions is to convey the dialogue of a television program in a noisy environment, such as a bar. Closed captioning can also be a boon to those trying to learn English. These positive externalities strengthen the arguments supporting its adoption.

4. Physical Space

Perhaps the most common accessibility accommodations are modifications to buildings, typically aimed at alleviating mobility impairments. Construction also provides an obvious example of an industry in which implementing accessibility accommodations after a project is complete entails significant extra costs. The costs of installing ramps and elevators, widening doorways, and redesigning lavatories are much greater when these modifications are made as an afterthought rather than included in the original design. Retrofitting existing buildings is expensive and disruptive and often results in only minimal access. As in construction, retrofitting telecommunications technology for accessibility is far more expensive, and less effective, than including accessibility accommodations in the original design.

B. Transition from Real-Space to Cyberspace: Cyberspace Changes Accessibility

"Unless a web site is designed in an accessible format, significant populations will be locked out as the World Wide Web rapidly advances from a text-based communication format to a robust, graphical format embracing audio and video clip tools."

1. E-mail & Internet Telephony

E-mail and Internet telephony were mentioned earlier as significant new Internet technologies. They are a particularly appropriate opening for a discussion of accessibility issues created by the Internet because they bear a close resemblance to the historical systems discussed above. Can one, then, simply equate e-mail with the postal system and Internet telephony with the phone system for the purposes of this discussion? Despite the obvious parallels, the answer is no. While e-mail and the postal system are both capable of delivering information in the form of a letter, the comparisons are not so simple. In terms of speed of delivery, for example, e-mail actually resembles telephone service; yet it does not support interactive communication.

Accessibility issues involving e-mail fall into several categories. As always, the basic question of access to the service is critical. A second important issue is accessibility to computers in general. Individuals with physical disabilities such as sight impairment might need assistance in using a computer. This need must be met if they are to access e-mail services. What additional constraints might e-mail itself put on communications? Much like the postal system, e-mail itself imposes very few constraints on the data transferred. E-mail can carry basic text or complex binary data. Again, the important constraint is that individuals must agree on common data representations. The accessibility of Internet data representations is relevant to e-mail systems in particular. Specific data formats such as HTML and video information are addressed later.

The specific accessibility issues of Internet telephony are quite like those of the traditional telephone system, because the main constraint on both systems is their reliance on audio information. TTY and TRS were created to address this constraint on the traditional telephone system. In fact, Internet telephony represents a threat to TTY and TRS: it is unclear whether the ADA provisions requiring these services would cover Internet telephony. Does this imply that Internet telephony requires its own version of TTY and TRS? A close look at the original reasons for introducing TTY answers this question. The traditional phone system is built on a foundation of audio information (while much of the system is now digital, the end-user connections typically maintain this analog heritage). Thus, the need for a non-audio mode of communication required that TTY be introduced on top of the audio foundation. The Internet is fundamentally different: it is based on a generic digital foundation. Like e-mail and the postal system, this digital foundation places few constraints on the data representations supported. Internet telephony sits as an audio layer on top of that foundation. To require TTY on top of this audio layer would be to miss the promise of the Internet. Instead, a text layer should be built directly on the digital foundation. Concededly, e-mail is a text layer built on the digital foundation, but e-mail is not interactive like Internet telephony and the telephone system. Perhaps systems like Internet Relay Chat, ICQ, or Unix talk will suffice.

This simple example highlights the incredible potential of the Internet in solving accessibility issues. A new data representation can always be introduced on the digital foundation. Accessibility concerns should be addressed by designing accessible data representation and products that use these representations.

2. Adaptive Technology for Computing in a Text-Based Environment

Cyberspace enhances communication and information transmitted to large audiences because there are no borders to cross and no time constraints. The Web hosts a unique medium in which all members of society may interact at their own pace. The potential for a convenient means of mass communication exists, but it will not be effective unless all members of society have access to it.

Mainframe computers have been around since the 1960s, and the personal computer is less than twenty years old. The invention of the spreadsheet and other applications helped businesses by facilitating accounting, payroll, inventory control, ordering, and billing. As computers gained popularity and infiltrated the workplace, universal access concerns were voiced: "As the work force ages, accessible information services must support the requirements of people who develop age-related limitations of vision, hearing, or mobility." People with disabilities and functional limitations were at a disadvantage because their impairments often prevented them from using features of the computer.

Once the IBM PC and the Disk Operating System (DOS) became standards, hardware manufacturers and software developers took great strides to design adaptive technologies that would make PCs accessible. Universal access initiatives produced many solutions in this text-based environment, due in large part to the simplicity and consistency of this environment.

a. Input Devices

People with physical impairments often experience difficulty with conventional keyboard designs and can benefit from a large selection of alternatives. For instance, keyguards can prevent two keys from being pressed simultaneously and mini keyboards with pressure sensitive pads can be configured for users with small ranges of movement. By gathering input from alternative movements or actions that are interpreted as keystroke substitutions, these mechanisms provide alternative keyboard systems.

Voice recognition provides users who cannot effectively use a keyboard an alternative means of inputting information to the computer by speaking into a microphone. These products currently cannot convert regular conversational speech into text, but they can do virtually anything the user can do with the keyboard if the user speaks articulately. Other alternative input mechanisms include pointing devices and infra-red eye-tracking.

People with cognitive disabilities may benefit from word prediction programs and spell checkers. Users with a reduced attention span may find large print displays beneficial, as well as macro software that expands abbreviations, which can reduce the necessity to memorize keyboard commands and ease the entry of commonly-used text. These adaptive solutions are particularly useful for slow typists, probe or pen users, and people with dyslexia.

b. Output Devices

The estimated 8.6 million Americans with visual impairments are at a severe disadvantage if they are not able to access the content on the screen. Low-vision users utilize screen magnifiers to enlarge the content displayed on the monitor, while the blind rely on alternative presentation mediums such as audio or tactile displays.

Scanners with optical character recognition (OCR) convert an image, such as a scanned paper document or an electronic fax file, into computer-editable text. The converted text can then be read aloud using voice synthesis or printed in large type for low-vision and blind users.

Screen readers are a type of voice output technology composed of software and hardware components. Special software programs "read" computer screens from left to right and top to bottom, and speech synthesizers then "speak" the text to the user.
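The left-to-right, top-to-bottom scan just described is straightforward in a text-mode environment, where the screen is simply a grid of characters. A minimal illustrative sketch, not any particular vendor's algorithm, of how such a buffer might be linearized before being handed to a speech synthesizer:

```python
def linearize_screen(screen: list[str]) -> str:
    """Flatten a character-cell screen buffer into the reading order a
    text-mode screen reader follows: top to bottom, left to right,
    collapsing runs of blank padding between fields into single spaces."""
    lines = []
    for row in screen:
        text = " ".join(row.split())  # collapse layout padding
        if text:  # skip blank rows entirely
            lines.append(text)
    # pause (here, a period) between rows so the synthesizer phrases them
    return ". ".join(lines)
```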

Refreshable Braille displays provide blind computer users another computing alternative. Screen text is scanned left to right and translated line-by-line into Braille on a display area composed of vertical pins.
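The line-by-line translation can be sketched with Unicode Braille patterns, which encode the same six-dot cells that a refreshable display raises as pins (dot n corresponds to bit n-1 of the offset from U+2800). The letter table below is a small illustrative fragment of uncontracted (Grade 1) Braille, not a complete translator.

```python
# Dot numbers (1-6) for a few letters of Grade 1 (uncontracted) Braille.
BRAILLE_DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    " ": (),  # a cell with no raised pins
}

def to_braille(text: str) -> str:
    """Render text as Unicode Braille patterns: dot n sets bit n-1 of
    the character's offset from U+2800."""
    cells = []
    for ch in text.lower():
        dots = BRAILLE_DOTS.get(ch)
        if dots is None:
            continue  # this sketch covers only a few letters
        offset = sum(1 << (d - 1) for d in dots)
        cells.append(chr(0x2800 + offset))
    return "".join(cells)
```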

Hearing impairments, among the most prevalent chronic disabilities, affect over 22 million people in the U.S. and prevent the reception of auditory information. This disability poses many challenges in the real world. Although it does not yet present as large an obstacle to computer use, it will in the future, as multimedia moves on-line and the telephone, television, and Web merge into one.

c. Externalities

Adaptive technologies were designed to benefit individuals with specific disabilities; however, many disability-free employees in "hands-busy," "eyes-busy," or noisy environments are benefiting today from these flexible interface alternatives. Applications with user interfaces that accommodate choice of alternative displays and keyboards are also being employed to minimize or prevent the visual fatigue and repetitive strain injuries associated with keyboard-intensive environments.

Many high-demand and high-performance environments – including governmental national security organizations – are using infra-red eye-tracking devices originally developed to assist people with extensive mobility limitations. Speech recognition was developed for quadriplegic individuals but also benefits people with repetitive strain injuries.

3. Hardships in Facilitating Graphical User Interfaces

Although a wide range of accommodation products is readily available, there remain areas of unmet need where accessibility cannot readily be achieved. Technology becomes its own enemy when universal design principles are not factored in. Technological advances will, one hopes, meet these needs in the near future, but manufacturers of adaptive technologies are struggling to keep up with constant evolutionary change.

During the late 1980s and early 1990s, software application programmers began designing more sophisticated screen layouts. The human user interface received a facelift with the birth of pull down menus, colored tool bars and special cursors. These features provided convenient functionality for many, but were not accessible by users of adaptive technologies. Fortunately, programmers have been able to overcome these barriers by modifying existing products to be compatible with these screen innovations.

However, the advent of the graphical user interface (GUI) in the late 1980s posed a serious challenge and threat to computer accessibility. At the time, many assumed there was no solution: people who could not see the screen would be unable to access the new interfaces because a graphics screen could not be read by a screen reader or Braille system.

Fortunately, these fears have not been fully realized. Advanced technologies are starting to produce speech systems that can translate graphics screen information into an accessible form, though so far they have had only limited success.

4. The Web: Encompassing Old Problems and Creating New Ones

As we proceed full speed into the Internet era, we are realizing that the Internet not only encompasses the problems GUI interfaces posed but also presents many new ones. "The Internet and technology have moved so quickly that assistive technology has not been able to catch up."

HTML opens the door to complex web sites with graphics and interactive content. It began as a simple, predominantly text-based application of SGML. However, innovative users pushed HTML beyond its structural assumptions to create aesthetic designs, failing to realize that their creations did not work with existing accommodation solutions, such as the screen readers and Braille printers developed during the text-only era. Although many users benefit from the improved functionality, many people with impairments can no longer access content under the new presentation formats.

Pictures, image maps, frames, columns, tables, and complex web designs all present obstacles. Improper HTML structure, which new GUI browsers can nevertheless parse successfully, and new dynamic elements such as Java applets and ActiveX controls all contribute to the erosion of the accessibility that users with disabilities, slower connections, and text-based operating systems previously enjoyed. Accessibility issues in cyberspace are neither simple nor straightforward. Real-space solutions are often not applicable in cyberspace, and although some have seen success through modification, such fixes will not last long.
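One concrete example of these obstacles: an image supplied without a textual alternative is simply invisible to a screen reader. A toy sketch of the kind of check an accessibility validator can perform, using Python's standard html.parser module (the class name and sample markup are our own):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags lacking the alt attribute a screen reader needs
    to describe the image. A toy illustration of one accessibility
    check, not a full validator."""
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with no alt text

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing.append(attr_map.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<p><img src="logo.gif"><img src="map.gif" alt="Campus map"></p>')
```

After the feed, `checker.missing` contains only the first image's source, since the second image supplies an alternative text.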

Preventing the need for retrofitting requires cooperation from technology makers, users, and the concerned community. As cyberspace is still in its developing stages, new developments must be continually reevaluated to ensure accessibility. Society must be educated about these changing needs.

Cyberspace mandates a new approach to dealing with universal access issues. Taking into consideration the inefficiency of merging old technologies with new technologies when there is no backward compatibility, the opportunity exists now to look ahead and make improvements for the future. Cyberspace is not real space, but the lessons learned from real space are valuable.

C. The Present: Retrofitting to Catch-Up

1. HTML and The World Wide Web

The Web began in the late 1980s at the European Laboratory for Particle Physics (CERN). Researchers needed an easy way to access information stored on the laboratory network, and by the early 1990s they had developed their own text-only browser along with an initial version of HTML, or HyperText Markup Language. The concept of an internet had been around for some time, but it was the desire for an easy method of information retrieval that drove the development of the World Wide Web. CERN researchers released the Web to the Internet community and in a short time it became the World Wide Web as we know it today, a simple yet effective source of information storage and retrieval that was designed to be accessed by computer users around the globe.

Initially, Web use focused mainly on research and government applications. The text-only interface was adequate for the transfer of research data and various forms of documentation that were needed by groups of people who did not always work in the same laboratory or office. Since it had a text-only format, the Web faced few standardization issues. Users at any location could easily browse and download information.

Another benefit of the text-only format was accessibility. Early third-party assistive technologies did not have many hurdles to cross in terms of compatibility since information was text-based and the devices were already designed to function under the text only environment of DOS. Devices such as screen readers simply had to process a batch of text and output that information as synthesized voice.

Today, the World Wide Web is no longer limited to a text-only format and is filled with graphics and other multimedia amenities. We use the Web for more than just the transfer of textual information; we use it for entertainment, for social interaction, and for commerce. The motivation for using the Web no longer comes mainly from research institutions or departments of the government since much of society now participates in cyberspace activities. This makes regulation of Internet practices more difficult.

The lack of centralized regulation combined with the use of multimedia has caused problems of accessibility on the Web. Developers of screen readers cannot count on the text-only format and must make accommodations for the multitude of graphics and page formats such as tables, frames, or cascading style sheets. Users who are unable to use a mouse or even a keyboard cannot depend on interface standards for their alternative devices. Those with cognitive disabilities may be overwhelmed by the multimedia, unable to process the content.

Many of the problems of accessibility on the World Wide Web stem from current standards, including browsers, related editors, and markup language formats such as HTML. HTML, the standardized language of the World Wide Web, was designed to handle the Web's hypermedia functionality. Web page authors use commands called tags to link to other pages, to describe the site for searching purposes, and to define the layout of the document. HTML derives from SGML, the Standard Generalized Markup Language developed in the 1980s, and has gone through several improved versions since that time.

Standardization for HTML was present from its onset. SGML was an ISO (International Organization for Standardization) standard, making it accessible to commercial organizations worldwide, and HTML became an official standard in 1994. Since major browsers such as Netscape Navigator and Microsoft Internet Explorer followed the same set of standards for pages, documents were rendered much the same regardless of a user’s browser.

Soon thereafter, Netscape and Microsoft began to add proprietary extensions to their browsers. This meant that tags supported by Netscape were not supported by Internet Explorer and vice versa; if a Web author wrote pages using tags supported by one browser, aspects of the page set by the nonstandard tags would not appear in other browsers. It was at this time that standardization moved out of the hands of a single organization or small group and became partially dependent upon the code behind Web browsers. Standardization existed but was subjective.

This continues to be the state of standardization on the Web today, though some organizations are attempting to establish centralized standards to make pages more compatible with all major browsers. Web authors must format their pages with a variety of tags to ensure that the pages are rendered consistently on all major browsers. Alternatively, many authors write pages without the proprietary tags and forfeit the added capabilities those browsers offer.

This system of writing according to browser is not only inconvenient for Web authors, but it also causes accessibility problems. The standardization of SGML and early HTML meant that all authors worked from a single set of tags, and third-party assistive technologies therefore worked from the same file formats. With tags whose function depends upon the browser, an assistive technology must now handle a new and ever-changing library of tags, recognize tags that do not function in some browsers, and compensate for that lack of functionality.

Standards that ensure uniform authoring are not the only requirement for accessible Web design; accessibility standards are needed as well. That is, certain aspects of HTML should be specified so that pages are accessible to all audiences regardless of impairment.

The World Wide Web, like many other forms of communication, has a large audience that includes members of the disabled population. Other communication technologies such as print, telephone, and television have all addressed accessibility issues with appropriate solutions. Newspapers or other documents can be published in Braille format for blind or visually impaired readers. Telephone has TDD technology such that information can be transferred between parties in text format when one or both users are deaf or hearing impaired. Television has closed captioning for hearing impaired users so that they can understand what is being spoken. These technologies have touched thousands of people, making the technology and the communication more accessible to everyone.

Blind or visually impaired members of our society are automatically cut off from the visual aspect of the Web. Much of the Web today is graphics-based, which means that visually impaired users need to access graphics-based information in alternative ways. Blind users have third-party assistive technologies such as screen readers and Braille displays to help them interpret Web pages; however, these assistive technologies depend upon the format of the Web documents. Adequate and standardized information must be present for the devices to correctly interpret a page’s layout as well as any visual information that may appear.
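A rough sketch of this dependence, assuming a hypothetical page: when the author supplies alternative (ALT) text for an image, a screen reader can speak it; when the author does not, the visual information is simply lost.

```python
from html.parser import HTMLParser

class AltTextReader(HTMLParser):
    """Sketch of a screen reader's handling of images: speak the ALT
    text when the author provided it; otherwise announce an unlabeled
    image -- the page's visual content is lost to the user."""
    def __init__(self):
        HTMLParser.__init__(self)
        self.spoken = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            self.spoken.append(alt if alt else "[image: no description]")

    def handle_data(self, data):
        if data.strip():
            self.spoken.append(data.strip())

reader = AltTextReader()
reader.feed('<p>Our logo: <img src="logo.gif" alt="Acme Corp logo"> '
            'and a chart: <img src="chart.gif"></p>')
print(" ".join(reader.spoken))
```

The labeled image is rendered sensibly; the unlabeled one becomes an opaque placeholder, which is exactly the gap that authoring standards aim to close.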

Many people have disabilities that make it difficult or impossible to use a pointing device to interact with Web pages. The default method of interacting with Web pages is with tools such as the keyboard, mouse, or trackball. Therefore, it is important for browsers to make it possible to use alternative devices for pointing and to maintain standards so that these devices will function on any system and with any browser.

Those with any of a large range of cognitive disabilities may find it difficult to process Web pages in some formats. Users with attention impairments may be unable to comprehend information that is combined with bright, distracting colors or graphics. Any nonessential information may interfere with the user’s comprehension of the main content. Extensive hierarchical formats between pages, resulting from file structure or the use of frames, may also be confusing to some users. It is therefore helpful to provide information in a way that is not only available in an alternative format, but also makes the organization of the site’s informational content clear.

Hearing-impaired individuals have only recently begun to face accessibility issues on the Web. With the advent of multimedia in the form of movie clips and other audio-visual formats, deaf or hearing-impaired users are being excluded from some Web information. Captioning such information is therefore necessary to ensure that the hearing-impaired population is not left out.

One group that has taken great interest in Web accessibility issues is the World Wide Web Consortium, or W3C. The W3C recognizes the difficulties that the disabled population faces when participating in cyberspace and provides information in the hope that accessible design will become more of a standard. According to the W3C’s Web site, its goal is "to lead the World Wide Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability."

In 1994, Tim Berners-Lee of CERN convinced MIT to establish the World Wide Web Consortium. Soon thereafter, CERN discontinued its participation in the W3C, which was later joined by INRIA (the French National Institute for Research in Computer Science and Control), Keio University in Japan, DARPA (the Defense Advanced Research Projects Agency), and the European Commission. Today, the W3C resides at MIT’s Laboratory for Computer Science in Cambridge, MA, where Berners-Lee manages the consortium’s activities.

As director, Berners-Lee is assisted by Jean-François Abramatic, Chairman of the Consortium and Director of Development and Industrial Relations at INRIA, as well as a long list of other personnel around the globe. The Consortium is funded by its more than 200 member organizations, which participate in committees, receive pre-public releases of W3C software, and gain access to W3C information and experts in the field.

According to the W3C, "the World Wide Web offers the promise of transforming many traditional barriers to information and interaction among different peoples. The W3C’s commitment to lead the Web to its full potential includes promoting a high degree of usability for people with disabilities. The Web Accessibility Initiative (WAI), in coordination with other organizations, is pursuing accessibility of the Web."

The Web Accessibility Initiative, or WAI, is a W3C project committed to universal access on the Web. It was created in February of 1997 and focuses on five Web-related activities in its pursuit of increased universal access: technology development, development of tools, guidelines for use of the technology, education of content creators, and research and advanced development. The WAI aims to combat accessibility problems by making those problems known to the public and by making available suggestions that can alleviate the consequences of previous poor design. It also aims to eliminate the need for extensive retrofitting by educating software developers and Web authors about accessibility issues prior to the release of software or Web documents. Not only will this provide greater accessibility to users, but it will also reduce development costs over time: by committing time and effort to increased accessibility now, developers can save considerable time and effort later on retrofitting existing software or documents.

The WAI is currently in the process of producing a series of Web accessibility guidelines. Since the project is a work in progress, some or all of the following information may change at any time. Once completed, the principles of the guidelines should be followed by user agent developers and vendors so that user agents will be more accessible to users with disabilities.

The WAI has developed a series of general principles of user agent design that includes the following:

  1. The user interface must be accessible to all users.
  2. The user agent must render information accessibly.
  3. The user agent must facilitate orientation, navigation, and searching on a page and between pages.
  4. The user agent must make information available to assistive technology software.
Under the heading of these general principles are the numerous guidelines developed by the WAI. The following is a brief discussion of some of the guidelines as they currently stand.

a. The User Interface Must be Accessible to All Users

An important part of accessibility is interactivity. Users should be able to interact with user agents in more than one way in order to ensure that users with disabilities requiring different third-party assistive technologies for interaction can easily be accommodated. All windows and controls such as menus or toolbars should be accessible, as should application installation procedures.

Another critical aspect of accessibility is the layout of controls. An interface to software configuration that is not coherent or consistent may leave many users with disabilities unable to use a product since they may not be able to configure the interface to their needs. A user agent should be configurable to different operating systems, to accessibility devices other than a mouse, and to keyboard controls at nonstandard settings.

Some user agent features may not only be inaccessible to disabled users, but may also interfere with functionality for that population. Visually impaired users who cannot access visual information such as images or video should be able to turn them off. Users with hearing impairments should likewise be able to turn off sound. Users should be able to turn off blinking text or animated images in favor of static representation. Support for scripting, style sheets, and frames should also have the capability to be toggled on or off.

Documentation is an important part of accessibility, since users must be informed about user agent functionality and options. Documentation should not only be available, but should be in an accessible format and should clearly state the available accessibility features of the user agent when that documentation is online. For example, information about keyboard commands should be provided so users are aware of alternative means of interaction.

Effective interaction not only includes the ability for users to send information to the user agent, but for the user agent to send information to the user as well. When a user triggers an event, whether by mouse, keyboard, or other device, he or she must receive some notification of that event. This is often done by a visual or auditory cue, but user agent developers must keep in mind that some users do not have access to that information and alternative forms of notification are necessary for effective interaction.

b. The User Agent Must Render Information Accessibly

Some users must have the capability to override certain page styles. User agents should therefore make this option available. Some necessary override capabilities are font style and size, color of the foreground or background, appearance of animations or background images, video frame rates, audio volume and playback rate, and window position and size.
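The override principle above can be sketched as a simple merge in which the user's settings always win over the author's. The style property names and values below are invented for illustration, not drawn from any particular browser.

```python
# Author's presentation choices for a page (hypothetical properties).
AUTHOR_STYLE = {"font-size": "10pt",
                "background": "animated.gif",
                "foreground": "grey"}

# Overrides set by a low-vision user in the user agent's preferences.
USER_OVERRIDES = {"font-size": "24pt",
                  "background": "white",
                  "foreground": "black"}

def effective_style(author, user):
    """Merge styles so that every user preference overrides the
    author's choice, while unopposed author choices are kept."""
    style = dict(author)
    style.update(user)
    return style

print(effective_style(AUTHOR_STYLE, USER_OVERRIDES))
```

The same precedence rule would apply to animation, volume, playback rate, and window geometry: the author proposes, but the user's needs decide.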

Users should also have access to alternative content representations. This includes different formats for visual information such as images or video, for alternatives to or additional formats for audio information, and for alternatives to scripts, frames, and tables. If there is no alternative available, the user agent should notify the user of the type of object that is unavailable to that user.

Some third-party assistive technologies such as screen readers have difficulty processing some formats of user agent information. It is therefore important that user agents make available alternative information formats. For example, it may be helpful to render information linearly when the default format is a table in which information scrolls from one line to another within a single cell. A page can also be rendered in a linear fashion based upon its headers and lists.
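A minimal sketch of such linearization, assuming a simple two-column data table: each cell is read out whole, in source order, rather than line by line across cell borders.

```python
from html.parser import HTMLParser

class TableLinearizer(HTMLParser):
    """Render a table one complete cell at a time, in source order,
    instead of one visual line at a time across cell borders."""
    def __init__(self):
        HTMLParser.__init__(self)
        self.cells, self._current, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self._in_cell, self._current = True, []

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self.cells.append(" ".join(self._current))
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._current.append(data.strip())

TABLE = ("<table><tr><td>Name</td><td>Jane Doe</td></tr>"
         "<tr><td>Phone</td><td>555-0100</td></tr></table>")
lin = TableLinearizer()
lin.feed(TABLE)
print("; ".join(lin.cells))   # each cell read whole, in order
```

Because the markup delimits cells explicitly, the linear rendering stays coherent no matter how the table happens to be laid out visually.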

c. The User Agent Must Facilitate Orientation, Navigation, and Searching on Pages

An important aspect of Web sites is the orientation within a page and between pages. There are several techniques used to orient users including site maps, consistency of visual layout, and frames. However, many users cannot take advantage of these techniques and need an alternative method for determining orientation at any time.

One helpful technique for orienting users is to provide supplemental information about the content and structure of a page. This supplemental information may include the number of links, form controls, tables, and view ports. It is also helpful to provide alternative means of notifying a user about loading status, such as whether a page is still loading or has stalled. Link information is helpful as well, such as a non-visual method of determining whether a link has been visited.

The relative position in a document is often lost for users who view documents with alternative means such as screen readers or Braille displays. A relative position may be the amount of the document already visited, the current cell in a table, or the position of the current document with respect to a series of documents. Therefore, it is helpful to provide users mechanisms for highlighting and identifying the current view, the user selection, and the current focus in the document, as well as options for maintaining the current focus as the view port changes.

The use of the keyboard for navigation is important to many users. Therefore, user agents should enable use of the keyboard to navigate sequentially from element to element, among headers within a document, and among elements such as paragraphs, lists, or table cells.

For users whose interactive device requires sequential navigation, it is helpful to perform a search for desired information rather than scrolling through the information as it is presented. Therefore, user agents should enable users to search for links based on various attributes (title, link values, position), to search for specific form controls based on various attributes (text content, attribute values), and to search through alternate text representation of elements.
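The idea can be sketched as attribute-based searching over a page's list of links; the record fields below are invented for illustration.

```python
def find_links(links, text=None, visited=None):
    """Search a page's links by attribute instead of tabbing through
    them one at a time. `links` is a list of dicts with hypothetical
    fields: the link text, its target, and whether it was visited."""
    hits = links
    if text is not None:
        hits = [l for l in hits if text.lower() in l["text"].lower()]
    if visited is not None:
        hits = [l for l in hits if l["visited"] == visited]
    return hits

links = [
    {"text": "Home", "href": "/", "visited": True},
    {"text": "Contact us", "href": "/contact", "visited": False},
]
print(find_links(links, text="contact"))   # jump straight to the match
print(find_links(links, visited=False))    # only unvisited links
```

For a user navigating sequentially, one such query replaces an arbitrarily long walk through every element on the page.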

d. The User Agent Must Make Information Available to Other Technologies

There are several languages that are supported by user agents. Therefore, user agents should provide accessibility for all supported languages. These languages include HTML, Cascading Style Sheets (CSS), and Synchronized Multimedia Integration Language (SMIL).

User agents cannot themselves support every accessibility feature, so it is important that they make relevant information available to other software applications in order to facilitate interoperability. User agents should therefore use operating system application programming interfaces (APIs) that support accessibility.

2. Updating Screen Readers to Translate Web Content

Page layout information is often key to a full understanding of text presented on a page. The minimum requirement to comprehend a page involves understanding all the words on the screen. However, the blind may not achieve complete understanding of the most important points because these points may only be obvious from the way the words have been physically presented and placed on the page.

Most adaptive technologies were designed in the text-based DOS era and do not work well with graphical user interfaces (GUIs) and Web browsers. Screen readers read the screen much as a sighted user does: left to right, top to bottom, one line at a time. Designed to translate text to speech, they work well with pure text content, but they cannot translate much of the content on the Web and in GUI applications when meaning is carried by page layout or an otherwise inaccessible presentation style.

One of the greatest shortcomings of existing screen readers is their inability to translate text formatted in tables or multiple columns. Because no syntactic information about the table is available to the screen reader, it must "guess" the boundaries of the columns or the table’s cell borders.


    The first        The second       The third
    sentence is      sentence is      sentence is
    in this          in this          in this
    column.          column.          column.

Most Windows screen readers do not permit the user to read columns of data. In the example above, the screen reader will read each line across all three columns before proceeding to the next line: the first line from column one, the first line from column two, and then the first line from column three. With the sound turned on, the user would hear "The first, The second, The third." Then it will read the second line from each column: "sentence is, sentence is, sentence is." And so forth, ending with "column, column, column."

The blind user can most likely deduce that the incoherent output is due to column formatting, but will not be able to understand the screen content. This problem can be resolved by presenting the information in an alternative format such as a list with three items in it, or by designing screen readers that can understand tables and columns.
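The failure mode can be sketched in a few lines. The column text below mirrors the example above; the naive reader walks visual lines across all columns, while a column-aware reader finishes each column before moving on.

```python
# Each inner list is one column of text; a visual line on screen is
# the i-th entry of every column, side by side.
columns = [
    ["The first", "sentence is", "in this", "column."],
    ["The second", "sentence is", "in this", "column."],
    ["The third", "sentence is", "in this", "column."],
]

def read_by_line(cols):
    """Naive screen reader: left to right, one visual line at a time."""
    return " ".join(cell for line in zip(*cols) for cell in line)

def read_by_column(cols):
    """Column-aware reader: finish each column before moving on."""
    return " ".join(cell for col in cols for cell in col)

print(read_by_line(columns))    # garbled, interleaved output
print(read_by_column(columns))  # coherent, sentence by sentence
```

The difference between the two outputs is precisely the difference between scraping pixels line by line and having structural information about the columns available.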

Many text-only screen readers are available, but very few are compatible with the Web. Jaws 3.2 for Windows is one of the only screen readers that can "reformat complex web pages and list links alphabetically in an easy-to-use list box." However, this feature only works when using Internet Explorer 4.01 and may not work with future browsers.

Compatibility of screen readers with different versions of applications and operating systems is difficult to maintain because screen reader manufacturers cannot foresee future software developments and many software developers are not providing accessibility features. Most DOS screen readers supported a wide variety of strategies for navigating a document based upon text blocks such as next and previous character, word, line, sentence, paragraph, and screen. Existing Windows screen readers offer a limited selection of these features, and none permit searches based upon next sentence, paragraph, or title. Many DOS screen readers provide users with the ability to change the pronunciation of a word or groups of characters, but most Windows screen readers lack pronunciation dictionaries.

As new technologies emerge and more adaptive technologies are required to resolve accessibility issues, ensuring backward and forward compatibility is one of the biggest obstacles to saving costs in the long run. Accommodation providers, application developers, and operating system designers must work together to bring equal access to all.

3. Accessibility APIs

As computing technology continues to advance rapidly, a new accessibility approach is needed. Accessibility application programming interfaces (APIs) represent one possible solution. Instead of requiring accessibility products constantly to adapt to new technologies, accessibility APIs offer a uniform method for new technologies to provide useful information to the accessibility products. Shifting the burden in this manner promotes accessibility-awareness in the development stage of new software products. Further, it reduces difficulties associated with adapting old accessibility solutions to new technologies.

Before exploring the details of accessibility APIs, it is useful to look briefly at computing accessibility without them. Although it is important to remember that many different accessibility technologies exist, a discussion of screen readers provides an excellent introduction to the topic in general. The preceding section introduced the basic operation of screen readers. Likewise, the problems involved in adapting screen readers to GUIs were described. A closer look at the technical details of those problems helps build the case for accessibility APIs.

As described in reference to Web browsers, screen reader developers have been quite aware of the GUI problems. They have customized their products to understand the unique features of various GUIs. For example, they might exploit operating system specific features in order to distinguish between foreground and background application windows. This would prevent the screen readers from inadvertently reading text associated with a background application. While such solutions help, they cannot completely solve the problem. Operating systems and applications often do not provide enough information for screen readers to translate visual data in a meaningful way. Even when information is available, having to customize screen readers for individual versions of applications and operating systems imposes a significant burden on developers. This burden passes directly to impaired users as they continually wait for accessibility solutions to catch up with current technologies.

What are accessibility APIs and how can they address the problems outlined above? In general, APIs are built into software products to facilitate cross-application communications. The APIs allow applications to communicate with the operating system and with other applications. This allows a single product to take advantage of features in other products rather than having to recreate all functionality. For example, an operating system might provide a network API to free applications from the burden of handling the explicit details of TCP/IP and other networking protocols. Likewise, a video API might allow applications to display information on a computer monitor without understanding all of the details of video hardware. In fact, an accessibility API has much in common with video APIs. While video APIs take various data formats and position them on the display, an accessibility API provides a standard way to present the same information in a textual or audible format.

The simplest parallel between video and accessibility APIs would be an application sending text to the video API for onscreen display and the same text to the accessibility API for audio output. This functionality basically duplicates the results of current screen readers. However, an accessibility API can go much further. Applications typically create display elements such as toolbars, menus, and windows as part of their GUI. An accessibility API could support the creation of equivalent elements with textual descriptions substituted for the typical visual data. This would allow creation of an application’s entire user-interface in a non-visual manner. Screen readers and other accessibility solutions would no longer need to guess about the interpretations of visual display elements. Some visual elements such as menus would require very little application level support. Typical menus already include text descriptions of their elements. Thus, an accessibility API could take the same information needed to create a visual menu and create a non-visual equivalent. Other screen elements would need additional support at the application level. For example, if an application displays images as part of its user-interface, it would explicitly have to add textual descriptions of that portion of the interface. This extra burden exists because an accessibility API cannot assume anything about the purpose of the image. In general, an accessibility API could automatically support most generic display elements while applications would have to provide specific information regarding custom display elements. In the second case, the accessibility API’s main purpose is to provide a consistent communication channel between the application and other accessibility solutions.
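The architecture described above can be sketched as a tree of user-interface elements that each expose a role and an accessible name through one uniform interface, so an assistive tool queries the application instead of guessing at pixels. All class and method names here are invented for illustration; they are not Microsoft's actual API.

```python
class AccessibleElement:
    """A user-interface element as exposed through a toy
    accessibility API: a role, an accessible name, and children."""
    def __init__(self, role, name, children=()):
        self.role, self.name, self.children = role, name, list(children)

def describe(element, depth=0):
    """Walk the accessibility tree, yielding a textual description --
    roughly what a screen reader would speak for each element."""
    yield "%s%s: %s" % ("  " * depth, element.role, element.name)
    for child in element.children:
        for line in describe(child, depth + 1):
            yield line

app = AccessibleElement("window", "Mail", [
    AccessibleElement("menu", "File", [
        AccessibleElement("menuitem", "Save"),
    ]),
    # A custom visual element is only usable because the application
    # explicitly supplied a textual name for it:
    AccessibleElement("image", "Company logo"),
])
print("\n".join(describe(app)))
```

Generic elements like menus carry their names for free; the image illustrates the extra, application-level description that custom visual elements require.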

While this discussion has presented accessibility APIs as a potential solution to some accessibility issues, it has done so in terms of general possibilities. What is the current status of this technology? Microsoft’s Active Accessibility program represents the most significant implementation of this technology to date. The newest version of the software, Active Accessibility 1.2, provides some accessibility solutions for Windows 98 and Windows NT 4.0 based software. However, the addition of this technology to existing operating systems and applications has introduced many of the problems associated with retrofitting. For example, even as Microsoft worked to add accessibility features to its Internet Explorer (IE) product, it inadvertently shipped IE 4.0 without accessibility support. A patch was subsequently issued to correct this problem. However, this patch still failed to include full support for accessibility. Only with the release of IE 4.01 did Microsoft finally meet its own accessibility goals.


D. The Future : Ensuring Everyone Access to Cyberspace

1. Universal Design

"Universal Design calls for the development of information systems flexible enough to accommodate the needs of the broadest range of users of computers and telecommunications equipment, regardless of age or disability." This concept prevents retrofitting costs that would otherwise be incurred to accommodate inaccessible technologies. The seven Principles of Universal Design, as defined by the U.S. Access Board, apply to all products and environments, but are especially important as they provide guidelines to ensure universal access in cyberspace. They are listed and elaborated upon below.

a. Equitable Use - The design is useful and marketable to any group of users.

Guidelines: Provide the same means of use for all users: identical whenever possible; equivalent when not. Avoid segregating or stigmatizing any users. Ensure usability by visual and mobility impaired individuals. Limit physical dependency.

b. Flexibility in Use - The design accommodates a wide range of individual preferences and abilities.

Guidelines: Provide choice in methods of use. Offer breadth of solutions. Facilitate the user’s accuracy and precision. Provide adaptability to the user’s pace.

c. Simple and Intuitive Use - Use of the design is easy to understand, regardless of the user’s experience, knowledge, language skills, or current concentration level.

Guidelines: Eliminate unnecessary complexity. Design intuitive applications and simple, straightforward Web sites. Accommodate a wide range of literacy and language skills. Arrange information consistent with its importance.

d. Perceptible Information - The design communicates necessary information effectively to the user, regardless of ambient conditions or the user’s sensory abilities.

Guidelines: Use different modes for redundant presentation of essential information. Maximize "legibility" of essential information in all sensory modalities. Provide verbal and tactile versions of information. Provide compatibility with a variety of techniques or devices used by people with sensory limitations.

e. Tolerance for Error - The design minimizes hazards and the adverse consequences of accidental or unintended actions.

Guidelines: Arrange elements to minimize hazards and errors: most used elements, most accessible; hazardous elements eliminated, isolated, or shielded. Provide warnings of hazards and errors. Provide easy navigation and fail safe features. Accommodate lack of precision.

f. Low Physical Effort - The design can be used efficiently and comfortably and with a minimum of fatigue.

Guidelines: The less the user has to do, the better. Minimize repetitive actions. Minimize sustained physical effort. Automate procedures.

g. Size and Space for Approach and Use - Appropriate size and space are provided for approach, reach, manipulation, and use.

Guidelines: Accommodate variations in body size and mobility. Provide adequate space for the use of assistive devices or personal assistance.

2. Design-Conscious Web-Authoring Tools

Currently there are many initiatives that promote Universal Accessibility. Unfortunately, lack of awareness and time constraints prevent these efforts from being effective. Only limited HTML knowledge is required to be a Web author, and people do not enjoy repetitive, time-consuming activities that may be viewed as unimportant. Therefore, the key to increasing the accessibility of the Web is to harness the power of computers to do repetitive, guideline-intensive work instead of asking people to scrutinize and change their own authoring habits.

One emerging solution to this problem is the development of access-aware HTML authoring tools such as HoTMetaL 4.0 from SoftQuad. Such tools can educate their users by providing information regarding accessibility requirements. Wizards that automate certain procedures, such as the creation of a text-only version and the inclusion of ALT tags, allow designers to incorporate accessible design elements quickly and with little effort. Seamlessly integrating these features into the product provides new hope for accessibility.
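One such wizard pass can be sketched in a few lines; the placeholder text and function name are invented, and a real authoring tool would use a full HTML parser rather than a regular expression.

```python
import re

def add_missing_alt(html, placeholder="IMAGE NEEDS DESCRIPTION"):
    """Authoring-tool pass: give every <img> lacking an ALT attribute
    a placeholder, prompting the author to fill in a real description.
    A sketch only -- not how any particular product works."""
    def fix(match):
        tag = match.group(0)
        if re.search(r'\balt\s*=', tag, re.IGNORECASE):
            return tag  # already accessible; leave it alone
        return tag[:-1] + ' alt="%s">' % placeholder
    return re.sub(r'<img\b[^>]*>', fix, html, flags=re.IGNORECASE)

page = '<p><img src="a.gif"> <img src="b.gif" alt="Logo"></p>'
print(add_missing_alt(page))
```

The tool does the repetitive, guideline-intensive work automatically, which is exactly the division of labor the paragraph above argues for.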

"If support for accessibility is built directly into the authoring tool, it has a better chance of reaching designers who might not otherwise seek out these guidelines," says Jutta Treviranus of the Adaptive Technology Resource Center (ATRC). In cooperation with market forces, the potential exists for compliance with published guidelines. The burden lies on the Web accessibility community and the popular Web authoring toolmakers to ensure that their products offer informative and easy-to-use authoring support.

3. XML

XML, or Extensible Markup Language, is a dialect of SGML. It was developed under the direction of the W3C’s XML Working Group in 1996 with the following design goals in mind.

    1. XML shall be straightforwardly usable over the Internet.
    2. XML shall support a wide variety of applications.
    3. XML shall be compatible with SGML.
    4. It shall be easy to write programs which process XML documents.
    5. The number of optional features in XML is to be kept to the absolute minimum, ideally zero.
    6. XML documents should be human-legible and reasonably clear.
    7. The XML design should be prepared quickly.
    8. The design of XML shall be formal and concise.
    9. XML documents shall be easy to create.
    10. Terseness in XML markup is of minimal importance.

The W3C believes that XML will play an increasingly important role in the exchange of a wide variety of data on the Web by

    1. Enabling internationalized media-independent electronic publishing
    2. Allowing industries to define platform-independent protocols for the exchange of data, especially the data of electronic commerce
    3. Delivering information to user agents in a form that allows automatic processing after receipt
    4. Making it easy for people to process data using inexpensive software
    5. Allowing people to display information the way they want it
    6. Providing metadata -- data about information -- which will help people find information and help information producers and consumers find each other.

XML is not a new form of HTML, nor is it a replacement for SGML. XML is a simplified subset of SGML that makes it easier to define document types and easier for programmers to write applications to handle those documents. Since HTML is an application of SGML, XML and HTML are somewhat similar in appearance and share a limited amount of functionality.

a. Why HTML Should Be Replaced

HTML has become nearly obsolete. It was not designed to handle all types of documentation, and many document formats that now appear on the Web are designed with a type of HTML "fudging." This means that certain aspects of the HTML tags are being applied in ways not originally intended. A prime example of this practice is the use of tables for improving page layout. HTML was designed for text formatting, not for control of the positioning of graphics and white space. However, Web authors must often resort to the use of tables in order to have a user agent render the page in the way that is intended.

This is just one example of many illustrating HTML’s lack of functionality in today’s Web authoring environment. User demands have pushed HTML to its limit, which explains the need for the numerous HTML versions that have been developed over the past several years. Is HTML truly being improved or is retrofitting out of control?

Currently, the very nature of HTML seems to create a necessary tradeoff between aesthetics and functionality. In the example of tables, aesthetics are improved when tables are used to control formatting. However, functionality is lost since tables are not being used for their intended purpose. Third-party assistive technologies cannot distinguish between tables containing delineated information and tables used to control placement of graphics and white space. If a Web author wishes to maintain functionality for all users, he or she must forfeit control of image placement in favor of standardization of content. In today’s environment of the GUI (graphical user interface) and bigger and better graphics, this tradeoff is unacceptable. New versions of HTML can only go so far before the need for a new language makes HTML an outdated format for Web authoring.

b. Why XML Should Replace HTML

XML is not merely another fixed set of tags like HTML; it is a framework for defining markup languages of one’s own, capable of describing an unlimited variety of information. According to the W3C’s XML Special Interest Group, "HTML is at the limit of its usefulness as a way of describing information, and while it will continue to play an important role for the content it currently represents, many new applications require a more robust and flexible infrastructure."

The essence of XML is flexibility, and flexibility is a strong attribute for accessible design. Flexible design techniques mean no "fudging" of available functionality to produce a desired effect, whereas HTML all but requires such fudging to produce desired aesthetic effects. Because XML uses functions for their intended purposes, there is less chance that third-party assistive technologies will have trouble interpreting Web documents.

Accessibility is a tradeoff under the HTML regime; by contrast, it is a costless option under XML. Web authors who currently use HTML must often choose between aesthetics and functionality: a page that contains fancy graphics and a complicated arrangement is often inaccessible to assistive technologies such as screen readers. With XML, however, Web authors can use separate mechanisms designed specifically for aesthetics and for textual content. The tradeoff is effectively eliminated, and assistive technologies can better access and render the information.
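A brief sketch illustrates the difference (the element names here are invented for illustration, since XML lets each author define them): the markup labels content by what it is, leaving appearance to a separate style sheet that an assistive technology is free to ignore or replace.

```xml
<!-- Hypothetical XML: the tags describe the content itself, so a
     screen reader or braille display can render it sensibly,
     while a visual style sheet handles layout separately. -->
<catalog-entry>
  <product-name>General-purpose widget</product-name>
  <price currency="USD">19.95</price>
  <description>A widget suitable for everyday use.</description>
</catalog-entry>
```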

Many of the improvements in successive versions of HTML have been the result of retrofitting. HTML was not designed to address accessibility, nor was it designed with the flexibility to handle the growing demands of Web users in terms of content and design. XML was not designed specifically for accessibility either, but its flexibility makes it far more amenable to accessibility needs.

c. Web Accessibility Issues

W3C’s work on providing guidelines for Web accessibility will not lose its value with the advent of XML. The guidelines are not wholly specific to HTML; most of the available guideline information can therefore be applied to any successor language. In other words, the guidelines for accessibility are already in place; it is simply a matter of applying them to XML and its associated user agents.


With the explosion of the popularity of the World Wide Web and the expanding capabilities of modern personal computers, digital multimedia now represents a real and growing portion of the total media market. Already, record labels and musicians have become concerned with the threat of rampant piracy posed by the MP3 standard for computer audio. Online video, from the original Apple QuickTime to today’s abundance of competing standards, is also a part of the growth of online media. Motion picture companies now regularly release clips and teaser trailers online to whet the appetites of moviegoing audiences and promote their films. The growth of online media, and the predictions of future growth, raise the same important issues of accessibility as broadcast media.

Just as it is important to society that television be equally accessible to all, so is it important that these new media for communication be equally accessible to all. Currently this is not the case: no widely used standard for online video includes the capacity to carry closed captioning. The accessibility accommodations won over the last twenty-five years by advocates for the deaf and hearing-impaired community now threaten to become obsolete. The move to cyberspace has reduced the universality of closed captioning and set back the movement toward universal access.

Unlike in television broadcast or on VHS, closed captioning does add a burden to the carrying system in cyberspace. Digital video contains no "Line 21"; as an artifact of analog television broadcast, it has no purpose and is not included in online video captures. Including caption data would therefore require additional bits. If the size of the video file is increased to include the closed captioning, downloading will take longer or require more bandwidth. For an individual end user this increase may be insignificant, but administrators of servers that regularly transmit large numbers of video streams may feel that the extra size is not justified when the vast majority of users would have no use for the added closed captioning signal. Reducing signal quality by squeezing bytes out of either the audio or video portion of the existing signal to fit the closed captioning may not be an acceptable alternative under all circumstances. Additionally, even those content providers dedicated to the ideal of universal access have no standard protocol to use to accommodate closed captioning.

A project started at the National Center for Accessible Media (NCAM) at Boston’s WGBH public television station seeks to expand the access to television available to persons with disabilities. Descriptive Video Service (DVS) provides an audio description of television for people who are blind or have low vision. Like a kind of narration, DVS describes the action visible on the screen but not otherwise perceivable, inserting this narration into the natural pauses between the characters’ dialogue. DVS grew out of a program started in playhouses that provided live audio narration of the action on stage over a set of earphones for patrons with vision impairments. Currently DVS, where available, is broadcast over the Second Audio Program (SAP) of normal television. Like closed captioning, DVS fits into spaces already existing in the analog television signal and results in no loss of signal quality in either audio or video. Also like closed captioning, DVS is transparent to those who choose not to use it.

DVS was developed in 1990 and has expanded since its first broadcasts over public television. Today many programs on PBS, as well as on the cable station Turner Classic Movies, are broadcast with the special DVS audio program. Libraries nationwide and the commercial video rental chain Blockbuster carry videotapes of major movie releases with DVS. At IMAX theaters across the country, optional DVS descriptions accompany many of the large-format films shown. But DVS remains far from achieving the widespread availability of closed captioning.

The progress of the adoption of DVS closely parallels the early years of the implementation of closed captioning. Like closed captioning, DVS is currently added to existing products as a separate service after postproduction. Currently the majority of DVS programs are funded through private donations or public funds. DVS got its start, and has its most widespread availability, on public television. In time, Descriptive Video Service is likely to become a standard, widely available and expected, much the way closed captioning is today.

The future is on the Web. The race to get online, for both users and content providers, has left behind many people with a variety of disabilities. Among the problems is the lack of protocols for formatting multimedia materials in a universally accessible way. New protocols should include accessibility not as an afterthought but as an integral part of system development. For online video, three such protocols are under development or newly available.

Apple’s QuickTime version 3.0 was the first computer video standard to include the capability of displaying closed captioning and of playing alternate or additional sound tracks. In QuickTime the video track controls the timing of playback; audio and text are synchronized with the picture. Captions can be turned on and off, as can additional audio tracks carrying DVS or alternate-language dialogue. One advantage of using captions with online video is that, unlike with conventional television, captions do not appear superimposed on the picture. This is true of QuickTime as well as the other major standards for closed-caption-capable computer video.

The second such protocol is Synchronized Multimedia Integration Language (SMIL), which is being developed by the World Wide Web Consortium. SMIL works by incorporating multiple information tracks as they are played, be they in video, audio, text, or another multimedia format. The playback is controlled through a text script that is not a part of any of the individual tracks. SMIL can synchronize each track on whatever time scale is desired; there is no single controlling track as in QuickTime. This is most useful when incorporating DVS-style narration into the playback. Unlike conventional video or QuickTime, SMIL could allow the video and audio tracks to be paused while the DVS narration catches up with the action on the screen; when the narration is complete, the program can continue. SMIL would also accommodate other alternate audio tracks, for example dialogue translated into any number of languages. The versatility of SMIL is not just about universal access; it positions SMIL to serve as the multimedia protocol for future forms of media.
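A minimal SMIL document (the file names are invented for illustration) shows how the text-script approach works. The par element plays its child tracks in parallel, and the SMIL 1.0 test attribute system-captions causes the caption track to be rendered only for users whose players request captions:

```xml
<smil>
  <body>
    <par>
      <video src="movie.rm"/>
      <audio src="dialogue.rm"/>
      <!-- extra DVS narration track, ignored by users who do not select it -->
      <audio src="dvs-narration.rm"/>
      <!-- caption track shown only when the player requests captions -->
      <textstream src="captions.rt" system-captions="on"/>
    </par>
  </body>
</smil>
```

Note that the script itself carries no media data; each track remains a separate file, which is what allows captions or narration to be added or corrected after the fact.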

All this is completely invisible to the user. SMIL video can be played just like any other online video, requiring only the RealNetworks G2 plug-in. To the user who chooses only to view the standard video and audio tracks, the presence of DVS, second audio programs, and closed captioning is irrelevant; no manifestation besides the bytes of the extra tracks appears. SMIL thus provides the same accessibility without compromise of quality or artistic integrity embodied by the closed captioning of television, while extending that ideal to a new medium.

The third protocol for computer video is Synchronized Accessible Media Interchange (SAMI). Like SMIL, SAMI can accommodate numerous tracks of multimedia information. Unlike SMIL, SAMI was developed by Microsoft specifically to address issues of accessibility of multimedia online. All SAMI requires to run is the Microsoft Media Player. SAMI allows the user or developer to select the controlling track: each audio, video, text, or other multimedia data source includes timing information that Media Player uses to decide when to display the signal. Both SMIL and SAMI allow tracks to be developed independently, providing greater flexibility in adding accessibility features or alternative languages later and facilitating correction of errors in the original tracks.
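A SAMI caption file uses an HTML-like syntax (the timings and caption text here are invented for illustration; Start values are in milliseconds). Media Player displays each SYNC block when the playback clock reaches its Start time:

```html
<SAMI>
<HEAD>
  <TITLE>Sample captions</TITLE>
  <STYLE TYPE="text/css"><!--
    P { font-family: Arial; }
    .ENUSCC { Name: "English Captions"; lang: en-US; }
  --></STYLE>
</HEAD>
<BODY>
  <SYNC Start=1000>
    <P Class=ENUSCC>[door slams]</P>
  </SYNC>
  <SYNC Start=4000>
    <P Class=ENUSCC>Hello? Is anyone there?</P>
  </SYNC>
</BODY>
</SAMI>
```

Because the caption file is separate from the audio and video sources, captions in additional languages can be added simply by defining further style classes and caption files, without touching the original media.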

Both SMIL and SAMI, and to a lesser degree QuickTime, provide an opportunity not just to continue the level of accessibility available today, but to expand and extend the ideals of universal access. Through services like DVS, QuickTime, SMIL, and SAMI can provide accommodation not only to the deaf and hearing-impaired but also to the blind and sight-impaired. The growth of cyberspace is supposed to improve quality of life and advance democratic ideals as a place where all can exchange information equally. But how true would this be if large portions of society were left behind? The slow, plodding adoption of closed captioning by broadcast television demonstrated the difficulty of retrofitting a large installed base; it also demonstrated the multitude of unexpected positive externalities that closed captioning produced. Right now there is a unique opportunity to adopt standards for media that do not yet have large installed bases, and including considerations of universal access from the beginning will avoid the large retrofitting costs that would otherwise eventually be incurred. SMIL and SAMI represent exactly the kind of designs needed: excellent architectures for providing universal access to multimedia on computers and through the Web.

5. Built-In APIs

The success of accessibility APIs depends on several critical factors. First, software developers must embrace the technology. If only a few applications support accessibility APIs, the technology cannot offer a significant accessibility improvement to the visually impaired. Truly accessible computing requires that the impaired have access to a wide range of tools comparable to those available to other users. Further, the creators of accessibility solutions would have little incentive to support APIs that did not significantly increase the utility of their products.

Assuming that the first requirement is met and software developers clamor to adopt accessibility APIs, successful implementation of the technology still faces important hurdles. While this discussion has referred to the technology as APIs in the plural, the technology will prove most beneficial if the number of different APIs is very small. While asking that all software developers agree on a single API is a considerable request, such an effort would help the technology achieve its greatest potential. A single accessibility API would help ensure that a wide range of accessibility devices and computer applications could interact successfully. Because most applications and many hardware devices are designed for a specific family of operating systems (e.g., the Windows family or the Unix family), a single accessibility API for each operating system family would probably suffice.

With accessibility API technology introduced and some important goals identified, it is important to consider what steps society can take to make these proposals a reality. Who are the important players in the accessibility API arena? Clearly, companies currently offering accessibility solutions have an important role in supporting accessibility APIs. However, they already have a vested interest in the accessibility issue. In fact, these companies have already pushed for operating system and application level support for accessibility concerns. Thus, efforts to promote accessibility APIs can probably count on the support of these companies rather than needing to motivate such support.

Operating system vendors and application vendors are probably the most useful parties to recruit. Based on the need for standard accessibility APIs, operating system vendors are the logical parties to enforce such standards. Operating systems already define numerous APIs for applications and application vendors are already accustomed to working with the constraints of operating system APIs. This kind of support fits with the Microsoft Active Accessibility efforts already described. If operating systems support accessibility APIs, the second half of the battle is to get application vendors to use these APIs. Microsoft has started to incorporate some accessibility support requirements into its Windows logo program. However, to date these requirements are fairly minimal and stop short of requiring use of the accessibility API.

How can society promote or even require the continuation of this trend? Historical efforts at market regulation have shown that requiring specific features in all government-procured products creates a strong incentive for companies to adopt those features. More direct regulation, such as the Americans with Disabilities Act, by contrast, simply requires certain measures rather than creating economic incentives. Because large companies produce most commercial operating systems, economic incentives would probably prove quite effective in promoting accessibility APIs; presumably these companies have a strong interest in selling to the government. On the other hand, the burden of such APIs on these large companies would probably be small enough that mandating the creation of accessibility APIs might be reasonable.

Promoting application support for accessibility APIs is a more complicated issue because many applications are created and their producers vary in size from single individuals to huge companies. Economic incentives focused on government procurement would still work well, since such incentives burden only those trying to sell to the government; presumably companies trying to capitalize on that market would be willing and able to support accessibility APIs. On the other hand, a direct mandate for accessibility APIs might impose a significant burden on small-scale developers. A system similar to current FCC regulations, which define criteria such as "readily achievable," might solve the problem. However, experience indicates that defining such criteria in a workable manner is a significant challenge. Perhaps reliance on industry self-regulation, such as Microsoft’s voluntary logo program, would suffice to promote the technology.

While accessibility APIs are only a small part of the "Universal Access" puzzle, they hold significant potential for solving important problems. As this discussion has illustrated, the technical feasibility of such APIs is quite high. The real challenge rests in making the technology an important part of software development. Using economic incentives tied to government procurement seems the most obvious way to initiate this trend. More drastic measures such as outright mandates should wait until economic incentives have had a chance to demonstrate their effectiveness. However, this kind of accessibility technology is important enough that the failure of one approach should be taken as an indication that new measures are required, rather than simply dismissed as a nice try.


Chapter 2
Universal Access – Legislation

by Hazel-Ann F. Mayers (§§ A-C) and Matamba Austin (§§ D-E)

A. Introduction

With the development of the Internet, a new medium has been created by which the disabled community could easily be included in mainstream America. For those for whom mobility is difficult, the Internet provides the opportunity to obtain information in a fast and efficient manner. In addition, new and improved software allows those who are disabled to perform their professional and personal responsibilities with an ease previously unavailable. However, along with these new opportunities have come new obstacles to accessibility. Therefore, it is of little surprise that the disabled community has sought the extension of civil rights laws to the context of cyberspace in order to ensure that they will not be excluded as technology advances.

In this section of the paper, we will discuss the statutes that address universal access in cyberspace: the Americans with Disabilities Act (ADA), sections 255 and 706 of the Telecommunications Act of 1996, and section 508 of the Rehabilitation Act of 1973. We will examine the ways in which these statutes attempt to guarantee usable products and services in cyberspace and highlight the ways in which they are effective or ineffective. We will also consider what these statutes suggest about the role that the government will play in making cyberspace accessible and the roles industry must play in balancing the legal trade-offs outlined in the statutes.

B. Americans with Disabilities Act of 1990

In real space, the accessibility of structures, equipment, services, and the workplace is regulated by the Americans with Disabilities Act (ADA), passed by Congress in 1990. It was the first comprehensive piece of national legislation to address the needs of the disabled community. At that time, Congress acknowledged the need to remove barriers to access in real space, and it realized that it would be necessary to force covered entities to provide access to disabled individuals because, until then, many of these entities were not doing so voluntarily. Through its regulation of real space, the ADA provides a baseline with which to compare new civil rights legislation.

The ADA prohibits discrimination on the basis of disability in employment, in programs and services provided by state and local governments, and in commercial facilities. The ADA regulates the construction and renovation of buildings and facilities, and it requires the provision of accessible goods and services by private entities. State and local governments are also required to provide access to public programs. In addition, the ADA provides for accessible telecommunication devices and services.

In passing the ADA, Congress was able to use laws directly to regulate the marketplace. It changed the norms of the construction, architecture, telecommunications, and public service industries by requiring the consideration of accessibility issues. Architects designing new buildings or renovating existing structures must consider access in the design phase of a project. It is a cost which must now be factored into all design and construction projections. The ADA also changed the way in which goods and services are provided by commercial and governmental entities. Covered entities must provide reasonable accommodations for the disabled, including providing alternate means of supplying products and services. Congress clearly defined the roles of federal enforcement agencies, as well as the roles of architects, building owners, and commercial entities in order to ensure that accessible products and services would be provided as soon as possible.

Thus, by altering norms and market considerations, Congress was able to establish the importance of providing access to the disabled. Society’s concern for the disabled in the realm of real space, as demonstrated by the community’s willingness to comply with the ADA, should extend to the context of cyberspace. Although the ADA was passed before the rise of cyberspace, theoretically it should apply to cyberspace. In 1996, the Department of Justice issued an opinion stating that Titles II and III do indeed encompass the issue of web accessibility. The opinion notes that those entities which use the Internet to provide information regarding their programs, goods, or services must be prepared to deliver this information in an accessible format. However, the force of the Department of Justice opinion is not clear, as no cases alleging ADA violations due to the inaccessibility of websites have yet been decided. In addition, the Department of Justice has not spoken to the issue of how to address accessibility problems whose solutions require the involvement of multiple parties, such as manufacturers and final assemblers.

Given that Congress did not contemplate cyberspace when it enacted the ADA, it is questionable whether the statute should be extended to cover online content on the Internet. Such an extension would retrofit old legislation to cover a new, unanticipated medium. Consequently, it might result in inefficiency, as it is more costly to adapt existing technology to include accessible features than to create products that include such features from the start. Nonetheless, the ADA is an example of regulation of accessibility in real space that provides a basis of comparison for regulation of accessibility in cyberspace, by highlighting the extent to which a law can shape code, norms, and markets in order to achieve its goals.

C. Section 255 of the Telecommunications Act of 1996

Just as the ADA established the importance of accessibility in real space, Congress has developed a statute that addresses such issues in cyberspace. With the passage of the Telecommunications Act, "Congress…recogniz[ed] that the telecommunications revolution had brought about a new type of public space: a virtual space to which access was just as critical as any physical structure." Section 255 of the Telecommunications Act addresses the accessibility of all products used in this virtual space. Some scholars and policy makers view Section 255 as the legal progeny of the Americans with Disabilities Act. Although it is the first attempt by Congress specifically to address the needs of the disabled in cyberspace, it derives from earlier civil rights legislation ensuring equal protection of the rights of disabled individuals. Section 255 should be viewed as the first step in a developing legal doctrine that will ensure the accessibility of telecommunications products in the cyberspace context.

Section 255 requires that "a manufacturer of telecommunications equipment or customer premises equipment shall ensure that the equipment is designed, developed, and fabricated to be accessible and usable by individuals with disabilities, if readily achievable." The definitions of "disability" and "readily achievable" used in the Americans with Disabilities Act (ADA) are adopted by the statute. The statute further mandates that telecommunication service providers must ensure that their services are accessible to disabled individuals, where readily achievable. Where such access is not readily achievable, manufacturers or providers must ensure that their products are compatible with existing devices commonly used by disabled individuals. Finally, the section explicitly prohibits private rights of action and grants exclusive jurisdiction over enforcement to the FCC.

The FCC shares its section 255 rulemaking authority with the Architectural and Transportation Barriers Compliance Board ("Access Board"). To date, these two entities are still in the early stages of developing section 255 rules and regulations. In its first formal rulemaking act pursuant to section 255, the Access Board issued its Telecommunications Act Accessibility Guidelines in February 1998. Subsequently, in April 1998, the FCC issued its Notice of Proposed Rulemaking (NPRM), which incorporated a substantial portion of the Access Board’s Guidelines. Although the notice and comment period has been completed, the FCC has not yet issued a final rule; one is anticipated in the near future.

As demonstrated by the text of section 255, Congress intended to provide the FCC and the Access Board with a broad but practical mandate in developing accessibility guidelines and regulations. The text of the statute provides for accessibility but leaves the details concerning implementation to the agencies. Accordingly, the FCC has opted to develop a flexible rule that does not provide industry with extensive details or implementation standards. It is believed that flexibility will encourage industry to develop efficient processes that will ensure the consideration of accessibility during the product design phase. Industry should "focus its resources on achieving the end goal: getting as many accessible products in the hands of consumers as possible." Flexible guidelines allow companies to be innovative and "innovation will yield the most accessible products."

1. Proposed Rule - Statutory Definitions

With this idea as its central tenet, the FCC developed a proposed rule that allows industry to achieve accessibility in virtually any manner that it chooses, as long as it meets the minimum requirements outlined by the agency. The FCC has clearly defined the scope of section 255, stating that the term "manufacturer" applies to a company or vendor that sells to the public, as well as final assemblers and the company whose name appears on the product. The FCC particularly favors the "final assembler" approach advocated by the Access Board because it reduces the complexity caused by requiring accessibility by every link in the manufacturing chain. Instead of embracing this latter, "counterproductive" approach, the FCC hopes that "clearly fixing responsibility at the assembly stage would give [final assemblers] the greatest incentive to specify accessible components from their suppliers, and to negotiate private arrangements for allocating compliance." In addition, the agency proposed that retailers and wholesalers be excluded from the "manufacturer" definition.

The proposed definition of the term "telecommunications providers" includes "all entities offering, whether by sale or resale, telecommunication services to the public, in addition to the service provider who initiates the offering." Like the definition of the term "manufacturer," this definition outlines who is required to ensure that accessibility is achieved. In delineating what services must be accessible, the agency adopts the Act’s definitions of "telecommunications services" and "information services." Telecommunications services include those services previously classified as "adjunct-to-basic" services (such as speed dialing and repeat dialing); information services include services defined as "enhanced services" (such as email and voice mail). Although telecommunications services fall within section 255’s mandate, information services are not within the scope of section 255’s regulation authority. The ambiguity created by these definitions also affects the implementation of section 706, which will be discussed later in this paper.

Moreover, the definition of "telecommunications equipment" is particularly important to the issue of accessibility. The Act defines "telecommunications equipment" as "equipment, other than customer premises equipment, used by the carrier to provide telecommunications services, and includes software integral to such equipment (including upgrades)." The FCC also proposes to include software integral to the use of customer premises equipment within the scope of section 255, although the statutory definition does not explicitly refer to such software. The agency has reached this definition as a result of its belief that the "focus of the section should be on functionality." Thus, it would appear that these definitions would include telecommunications hardware, as well as software such as Internet browser programs.

The text of the Act explicitly requires the adoption of the ADA definition of "disability." In addition to the ADA’s definition, the FCC proposes to adopt the Access Board’s delineation of common disability categories. In terms of the accessibility that manufacturers are required to provide, the FCC again incorporates the Access Board’s findings: equipment must include accessible input, output, display, and control functions (including the availability of information in visual and auditory formats, with the ability to manipulate these functions). Equipment usability requires that disabled individuals have access to the full functionality and documentation of the product, including product information and technical support.

Finally, the agency’s definition of the term "readily achievable" is a critical factor in the success or failure of the statute. Although the statute calls for the adoption of the ADA definition, the Commission recognizes that the architectural barriers which arise in real space present different considerations than the technological barriers presented in the telecommunications context. Thus, the Commission developed a definition which delineates "factors that are true to the letter and spirit of the ADA definition and the objective of Congress in enacting Section 255" while allowing the consideration of case-specific factual considerations. The Commission will consider the feasibility of the feature, the expense of providing the feature, and the practicality of the feature in light of its expense. This balancing approach involves examination of market considerations, timing, entity resources, and cost recovery. The Commission’s "readily achievable" standard is critical, as the statute explicitly provides for circumstances where the Commission concludes that accessibility is not feasible. In those instances, the standard becomes one of compatibility: the product or service must simply be compatible with existing devices commonly used by disabled individuals.

2. Proposed Rule - Enforcement Mechanisms

As Congress authorized no private rights of action pursuant to the statute, the Commission’s enforcement process will be the primary mechanism to ensure that manufacturers comply with the mandate of section 255. The Commission has proposed a fast track complaint process. This informal process is designed to resolve complaints efficiently by encouraging communication between consumers and manufacturers or service providers. The Commission, during these early stages, essentially is to act as an intermediary between consumers and industry, by attempting to ensure that legitimate accessibility issues within the scope of section 255 are resolved as quickly as possible. The Commission would require consumers to contact manufacturers directly with accessibility concerns before becoming involved in the problem-solving process, thereby conserving its resources for those matters requiring its expertise. Once this prerequisite is met, if the problem is still not resolved, a consumer would then be free to file a complaint with the Commission.

At this point in the process, the Commission would forward a copy of the complaint to the manufacturer or provider (respondent) so that the entity could attempt to resolve the problem through its own internal efforts. The respondent would be granted a certain period of time (the Commission has proposed a period of five business days) in which it would develop a report outlining its attempts to resolve the problem. The entity would subsequently forward a copy of its report to the Commission for evaluation. At the end of this fast track process, the Commission expects "the manufacturer or service provider to report to the Commission regarding whether the complainant has been provided the access sought, and if not, why it has not been provided." The Commission believes that this informal process will encourage "the most critical element of the fast track process" which is "the sharing of information between complainant[s] and respondent[s]."

Subsequently, the Commission would begin its evaluation process in order to determine whether the respondent provided a solution to the accessibility problem and whether there was an underlying section 255 compliance problem. This would involve examining whether the product is covered by the section and whether there was a readily achievable solution to the problem. Matters determined to be outside of the parameters of the section, as well as those matters that could not be resolved in a readily achievable manner, would be closed. Otherwise, further proceedings would be undertaken, which could include testimony by industry experts, disability groups, or the Access Board. After the completion of these early phases, the Commission proposes to use informal, investigative procedures that are more flexible than formal procedures. However, the Commission may initiate formal proceedings at its discretion and at the complainant’s request. The Commission sought comment on the circumstances in which such proceedings should be initiated.

Finally, the Commission outlines the penalties for non-compliance with section 255. Although the statute does not explicitly delineate any penalties that can be imposed by the Commission, the Commission proposes to use the penalties available under other sections of the Communications Act. These penalties include the revocation of station licenses or construction permits, the issuance of orders outlining corrective measures, the issuance of cease and desist orders, the award of damages, or the possible issuance of retrofitting orders, requiring manufacturers to retrofit accessibility features into inaccessible products.

3. Evaluation of the Proposed Rule - The Future of Section 255

As the FCC is in the final stages of its rulemaking process and has not yet issued its section 255 rules, the degree to which the final rules will mimic the proposed provisions is presently unclear. However, several analytical observations are relevant to any discussion of the future impact of section 255 on accessibility in cyberspace. It may even prove necessary to amend section 255 itself, as it is the text of the statute that defines the parameters of the agency’s rulemaking activity.

The Commission has decided to adopt a flexible approach to achieving accessibility, for fear that stringent implementation guidelines will stifle the creativity of industry. The telecommunications industry traditionally has been slow to embrace accessibility requirements because many entities believe that such requirements force non-disabled users to use products tailored to the lowest common denominator. Unlike accommodations in real space, accommodations in cyberspace can result in products that do not reflect the latest technological advances. Thus, many industry representatives have argued in favor of self-regulation, maintaining that industry, to date, willingly has provided access on its own and to a greater extent than required by law. In this manner, they hope to prevent, delay, and minimize government involvement in this sphere.

Another reason that the Commission may be hesitant to exert greater pressure in this sphere is that it views the current legislation as the first step towards full accessibility. The ADA, which represents the first example of civil rights legislation specifically created to address the needs of the disabled, has only been in existence for eight years; section 255 was enacted as recently as 1996. Accordingly, Congress and the Commission may be advocating flexible and informal regulation in the hope that industry will realize the critical importance of accessibility issues on its own. By requiring industry to consider such issues during the design and development phases, the Commission hopes to establish an industry norm. This is an example of the way in which law can be used to affect the norms of manufacturers and telecommunications service providers.

Nevertheless, it is not clear that the government’s approach will lead to an increase in the number of accessible telecommunications products. The hands-off approach advocated by Congress and the Commission can be viewed as implicitly permitting industry to circumvent accessibility. Although "the prospect of requiring industry to achieve accessibility might be frightening at this juncture, . . . if the Commission requires accessibility, and makes it clear that it will move strongly to resolve complaints, access will be provided." Thus, arguably the Commission needs to develop a more aggressive stance in its enforcement of section 255. Requiring the development of industry compliance plans may be the first step in ensuring that accessibility issues are considered in the product design and development phases. The government must make it clear to industry that the interests of accessibility cannot be sacrificed in favor of better technology in the short term. This may be a better means of ensuring that industry adopts the norms that the government seeks to establish by its proposed rules.

Furthermore, the enforcement mechanisms outlined in the proposed rule may not be forceful enough to ensure resolutions of accessibility problems, especially given section 255’s prohibition on private rights of action. As the Commission’s enforcement machinery will be the only means of ensuring compliance with section 255, it is extremely important for the agency to implement an investigative and adjudicatory framework that adequately protects the needs of complainants while respecting the right of industry to develop new products. It seems unreasonable to place the burden upon the complainant to contact respondents before filing a complaint with the Commission. It is too easy for a complainant to become lost in the bureaucratic hierarchy of a corporate manufacturer when attempting to report complaints. In some instances, it might even be difficult to determine who is responsible for resolving a certain problem. Moreover, as the complainant awaits a response from the various entities which may be involved in the manufacture or sale of a product or service, technological advances will continue to occur; the disabled community would thereby continue to be denied access to products, requiring a greater amount of retrofitting at a later date.

Thus, the Commission should permit complainants to file complaints with the FCC without having to contact potential respondents first; FCC employees would then be responsible for forwarding the complaints to respondents, as currently required in the proposed rule. Perhaps the current hope is that requiring contact before an official complaint filing will reduce the number of complaints received by the Commission as some problems will be resolved quickly (without the Commission’s involvement) and some complainants will choose not to pursue their complaints due to frustration or other factors. Notwithstanding this interest in the efficient allocation of the Commission’s resources, the agency’s early involvement in these actions is necessary, especially as there are no private rights of action authorized pursuant to section 255. Such a system would force respondents to address accessibility complaints from the beginning, rather than using the complaint process to frustrate complainants.

In addition, some commentators highlight other flaws in the proposed enforcement framework also caused by the prohibition against private rights of action. They argue that the option of initiating formal procedures against respondents must not be left entirely to the discretion of the Commission, particularly as the informal procedures "provide very limited protection for the complainant . . . and no apparent opportunity for participating beyond the issuance of a Commission report." In essence, such a restriction denies complainants due process, and the Commission should be required to institute such actions where requested by the complainant. This provision of the proposed rule demonstrates that private rights of action may be necessary in the future. Congress may have sought to prevent such actions because the courts are already burdened by civil rights lawsuits. However, given the informal nature of the enforcement process and the fact that industry may view accessibility as an expensive technological consideration in a world where products are constantly changing, Congress may need to amend section 255 in the future to permit the filing of such suits. At the very least, the Commission should begin the formal resolution process where requested by complainants with legitimate concerns.

The definitions outlined by Congress in the text of the Telecommunications Act, as well as the definitions developed by the Commission, highlight another way in which the statute and proposed rule may inadequately address accessibility issues. The Commission’s proposed definition of the term "readily achievable" is vague and ambiguous, and the factors to be considered in the evaluation require further refinement. As the text of the statute requires the Commission to incorporate the ADA’s definition of the term, the Commission is limited in the degree to which it can develop a new calculus that balances the relevant factors. As previously mentioned, the Commission has proposed considering the feasibility, expense, and practicality of including accessible features in order to determine whether entities can provide accessibility in a manner that is readily achievable. In considering the feasibility of including accessible features, the Commission must make it clear "that there is a responsibility to define the [accessibility] problem, and to consider alternatives." Instead, the proposed standard "may lead a manufacturer or service provider to believe that its obligations . . . are satisfied if it can be shown that a particular accessibility option is technologically infeasible given the design, development and implementation decisions the manufacturer or service provider has chosen to make." This distinction demonstrates how a manufacturer or service provider can circumvent providing accessible features. The latter standard does not force manufacturers to consider alternatives and would permit them to use technological infeasibility as an excuse in circumstances where they affirmatively chose to ignore an accessible alternative in favor of other interests.

Section 255 and the other statutes addressing this issue are the first steps towards allowing the disabled community to utilize media that will ease some of the communication burdens that traditionally have been difficult to overcome. Section 255 represents an extension of civil rights law and could potentially revolutionize the telecommunications sphere. Congress has embraced an informal approach to regulation, hoping that industry will begin to develop research and development norms that include consideration of accessibility issues. As a result, the FCC has chosen to forego the establishment of rigid implementation rules in favor of flexible rules which it believes will lead to an increase in the number of accessible products available to the public.

This approach, however, is flawed because it does not force industry to grapple with accessibility issues. The need to provide accessibility is not new; our society has been dealing with such issues for decades. Thus, Congress needs to take a more aggressive stance in order to ensure that the disabled community will not be left behind in the technological revolution. At a minimum, Congress and the Commission must pressure industry to ensure that products developed in the future are accessible to all. The Commission must assist complainants during the early complaint stages, rather than assuming a role as an intermediary as proposed. Finally, the Commission must ensure that industry is not allowed to escape its responsibilities through flexible standards. The government must clearly define its role as regulator and enforcer if industry is to comprehend the importance of providing accessible technology and complying with legislative mandates. Congress must not allow industry to sacrifice accessibility in favor of short-term technological advances. This trade-off carries a high price that may result in the exclusion of a huge sector of our society and a loss of the beneficial externalities that frequently accompany accessibility in the long term.

D. Section 508 of the Rehabilitation Act

Another approach that the government is taking to make the market take accessibility issues into account is through section 508 of the Rehabilitation Act of 1973 (as amended, most recently in 1992 and 1998). Section 508 required the Secretary of Education and the Administrator of the General Services Administration to develop and establish guidelines for Federal agencies for electronic and information technology accessibility, and required that such guidelines be revised, as necessary, to reflect technological advances or changes. This piece of legislation is designed to accomplish at least two explicit goals and, as argued below, perhaps another more implicit, indirect goal.

The first direct goal is to ensure that federal employees with disabilities have the same opportunity as other federal employees to produce and access information. The second goal is to ensure that members of the public seeking information and data from federal agencies have comparable accessibility. The section does this by imposing equal access requirements on all electronic and information technology that the federal government develops, procures, maintains, and uses.

The initial focus of section 508 was purely on equipment, but as argued by John Clark of the Strategic IT Analysis Division, Office of Information Technology, the focus has shifted and expanded from just "Electronic Equipment Accessibility" to a broader concept of "Electronic Information Technology (IT)." The latter term is thought to include any equipment, software, interface systems, operating systems, or interconnected system or subsystem of equipment used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information. Clark also asserts that "comparable access" in section 508 should mean that individuals with disabilities should be able to produce and have access to the same or equivalent information processing resources, with or without special peripheral equipment, as individuals without disabilities.

The specific scope of section 508 will not be determined until the Access Board has completed its work on establishing guidelines on what technology is covered by the section, and how that technology must come into compliance. As the Access Board works on the specifics, it is estimated that this provision will affect about $27 billion of federal purchasing of information technology used in federal government offices or workplaces, representing slightly more than a quarter of all information technology that is purchased annually in the U.S.

Section 508 exhibits many of the ambiguities that characterize legislation aimed at enforcing an accessibility standard on the market. These ambiguities could either be characterized as results of poor draftsmanship that undermine enforcement of the section, or as compromises that purposefully limit the role that the law plays in implementing accessible technology.

One example of a potential compromise in section 508 concerns its enforcement provisions. The original version of the section contained no serious enforcement provisions. Although the current version is an improvement in this regard, individuals with complaints are expected to file those complaints with the administrative agency alleged to be in noncompliance, instead of filing a claim directly in district court. This means that the courts cannot take an immediate role in interpreting the ambiguities and limits of the section.

An examination of the text of section 508 reveals several such ambiguities. The first and perhaps most important ambiguity in the section is its scope, which is described as including all electronic and information technology. The legislation provides that this term is to be given substance by the Architectural and Transportation Barriers Compliance Board (Access Board), which will publish standards setting forth a definition of electronic and information technology and the technical and functional performance criteria necessary to make such technology accessible. These standards must be published by February 7, 2000. Furthermore, the Access Board will be required periodically to review and, as appropriate, amend the standards to reflect technological advances or changes in electronic and information technology. The scope of the legislation, therefore, will depend on whether the Access Board can develop standards that are comprehensive and clear, while still being flexible enough to keep pace with advances in technology until the next periodic review.

Another clause in section 508 which could benefit from interpretation by the courts is a loophole that could potentially undermine the urgency and importance of the legislation: federal agencies and departments are not required to follow its provisions if an undue burden would be imposed on the department or agency. Although those departments are still required to make the information available to the person requesting it, such a loophole could be interpreted in ways that delay the implementation of new technology that is more accessible.

E. Section 706 of the Telecommunications Act of 1996

In addition to focusing on the systems used by the end-users, there is legislation aimed at making sure that the telecommunications network itself is universally accessible. One example of this legislation is section 706 of the Telecommunications Act of 1996. This section is a mandate to the FCC to encourage the deployment, on a reasonable and timely basis, of advanced telecommunications capability to all Americans. The section is designed to give the FCC tools by which to "encourage" the market to provide telecommunications infrastructure for everyone. Specifically, the FCC should take measures that remove barriers to infrastructure investment and promote competition in the telecommunications market.

One way to describe this section is as an attempt to modernize the mandate of the Communications Act of 1934, which was intended to ensure that all Americans had telecommunications services, which at that time meant mostly telephone and telegraph. In trying to keep pace with the technological advances in telecommunications, the new definition of what is to be made universally accessible tries to de-emphasize technology.

"Advanced telecommunications capability" is defined in the statute "without regard to any transmission media or technology, as high-speed, switched, broadband telecommunications capability that enables users to originate and receive high-quality voice, data, graphics, and video telecommunications using any technology." "Telecommunications" is defined in the statute as "the transmission, between or among points specified by the user, of information of the user’s choosing, without change in the form or content of the information as sent and received." "Telecommunications service" is defined in the statute as "the offering of telecommunications for a fee directly to the public, or to such classes of users as to be effectively available directly to the public, regardless of the facilities used." "Information service" is defined in the statute as "the offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications, and includes electronic publishing, but does not include any use of any such capability for the management, control, or operation of a telecommunications system or the management of a telecommunications service."

Although the legislation attempts to provide a clear definition of what needs to be provided, it is not clear on who should be providing it. The legislation tries to categorize players by the services they provide – telecommunications services vs. information services. A closer examination of these terms, however, suggests that they are merely newer names for old categories that were conceived before the rise of the Internet and reflect a relatively static world of service providers.

Title II of the Communications Act of 1934 gave the FCC power over players "engaged as a common carrier for hire, in interstate and foreign communication." In 1980, recognizing that technology was making that definition obsolete, the FCC created two categories of services. Basic services were defined as "pure transmission capability over a communications path that is virtually transparent in terms of its interaction with customer supplied information." Enhanced services were defined as "any offering over the telecommunications network which is more than a basic transmission service." Specifically, the Commission defined enhanced services to include services, offered over common carrier transmission facilities used in interstate communications, which employ computer processing applications that act on the format, content, code, protocol or similar aspects of the subscriber’s transmitted information; provide the subscriber additional, different, or restructured information; or involve subscriber interaction with stored information. The telecommunications service providers and the information service providers in the Telecommunications Act of 1996 embody these categories.

These categories are important because providers placed in the telecommunications service category may be made to contribute to universal service solutions, while those in the information service category are not. Furthermore, these provisions are considered to be mutually exclusive by the FCC, meaning that anyone placed in the information service provider category can completely avoid any obligation to contribute under this legislation.

This could undermine the effectiveness of the legislation because the Internet has blurred the distinction between those that provide "pure transmission capability", and those that do not. Consequently, since the categories delineating who is covered by the legislation depend on what they are doing, and since what they are doing is being rapidly changed by the Internet, the telecommunications service category is not as comprehensive as originally designed.

To further complicate matters, the FCC created a distinction between providers of telecommunications and providers of telecommunications services. Providers of telecommunications services were placed in a mandatory contribution category, while providers of telecommunications were put in a permissive contribution category, where it was up to the FCC whether they had to contribute. The FCC derives "permissive" authority from section 254(d) to require "other provider[s] of interstate telecommunications to contribute" based on a finding that the public interest requires these entities to contribute "to the preservation and advancement of universal service." Examples of entities that fall into that category include pager service companies, private network operators that lease excess capacity on a non-common carrier basis, and payphone aggregators.

The FCC also found that entities that "provide telecommunications solely to meet their internal needs" are subject to permissive contribution authority. The Commission concluded, however, that those entities "should not be required to contribute to the [universal service] support mechanisms at this time, because telecommunications do not comprise the core of their business." The Commission recognized that "it would be administratively burdensome to assess a special non-revenues-based contribution on these providers because they do not derive revenues from the provision of services to themselves."

Based on the statutory definition of the term "telecommunications," the FCC adopted the following list of services that satisfy the definition of "telecommunications" and are examples of interstate telecommunications: cellular telephone and paging services; mobile radio services; operator services; PCS; access to interexchange service; special access; wide area telephone service (WATS); toll-free services; 900 services; MTS; private line; telex; telegraph; video services; satellite services; and resale services.

The FCC also included among the contributors those entities providing, on a common carrier basis, video conferencing services, channel service, or video distribution services to cable head-ends. It expressly excluded entities providing services via open video systems (OVS), cable leased access, or direct broadcast satellite (DBS) from contributing on the basis of revenues derived from those services.

This complicated and increasingly outdated tangle of categories creates the incentive for businesses to exploit the ambiguities in such a way as to avoid having to contribute to any solution that the FCC may try to implement.

Having looked at these pieces of legislation, if one were to ask who manages the trade-off between creating new technologies and accessibility, it appears that Congress has told the market that it must manage the trade-off without sacrificing accessibility. In theory, putting these legislative attempts in their best light, they appear to be an attempt by the government to use the law and the market to change the norm of how new technology is created. Instead of building a product and then considering accessibility issues after the fact because some interested group has voiced a grievance, these laws may have the indirect effect of forcing industry to consider accessibility issues in the product development phase in order to be in compliance. Ideally, this would create an industry norm of making accessibility a fundamental code-design issue, as opposed to a retrofitting exercise. Furthermore, if these laws are interpreted broadly, all the parts of the cyberspace chain would be responsible for these issues, whether they are hardware manufacturers, software developers, or network providers.

There are various possible explanations for the ambiguities in these pieces of legislation, combined with their limited enforcement provisions. One theory is that, since cyberspace issues have only recently come to light, and efforts to make civil rights legislation apply to those issues are also recent, the US government is trying to de-emphasize the role that law plays in making cyberspace more accessible. Rather than forcing the market to manage accessibility with law, as mentioned before, the government is trying to establish general parameters in the hope that changing market incentives and norms will bring accessibility concerns to the forefront. This would mark a departure from the government’s aggressive stance in implementing universal service policy goals.

One reason for this departure may be that implementing universal service provisions often involves the regulation of existing technology, with less direct effects on the pace of innovation, while implementing accessibility requirements may be perceived to have greater negative effects on innovation. In other words, by mandating universal service provisions, the government is simply requiring that technology that is becoming, or has become, a standard be made available on a broader basis. Enforcing accessibility requirements in the ways suggested above, however, requires a change in how technology is developed, since in effect it would require the creation of new standards.

Another possible explanation for the limitations in the legislation is that there is not yet a full understanding of what the scope and mandate of such legislation should be. Instead of rethinking the goal of laws covering cyberspace, there has been an effort to import real space laws into cyberspace, with varying results because the assumptions underlying real space legislation often do not apply in cyberspace. Section 706 provides an example of how real space definitions are difficult to import into cyberspace.

A third possible explanation for the limitations in the legislation is that the text of these sections represents a compromise between the various players involved. The new enforcement provisions of section 508 reflect the goals of those who believe that accessibility concerns are paramount in the development of new technology. Vague definitions and broad exceptions in several of these sections reflect the concerns of manufacturers and developers that seek to minimize regulatory constraints on how and what they choose to develop and manufacture.

The extent to which any of these explanations apply will not be determined until after the government has determined how it plans to implement these sections. The foregoing analysis of these sections suggests that any successful proposal for universal access must not rely too heavily on the law. It is difficult to craft legislation that is both comprehensive enough to make sure all the relevant players are covered and flexible enough to keep pace with technological change, without unduly stifling innovation. However, the government does have a role to play in making sure that the trade-off between introducing new technologies and maintaining accessibility is not forsaken, as well as a role in changing norms so that society more fully appreciates the positive externalities generated by accessible technology.


Universal Service

by Serrin Turner

A. Introduction: The "Digital Divide"

The information age is already upon us, and millions of Americans have quickly come to take for granted the opportunities it brings. However, entry into the information age comes at a price, which thus far only the well-off have been able to afford. The chart below illustrates the large gap in computer ownership between rich and poor.

Of households earning more than $75,000, 76% own a computer; by contrast, of households earning between $5,000 and $10,000, only 10% own a computer. Similarly, whereas 50% of the former group also have Internet access, only 2% of the latter group do. The poorest households are not the only ones being left out of the current technological revolution; households closer to the median are being left out as well. Of all households earning $35,000 or less, the percentage of those who own computers and have access to the Internet is lower than the national average (36% and 26%, respectively).

As the importance of computer technology in our society continues to grow, the result of this "digital divide" will be to entrench the economic divide that currently separates the haves from the have-nots in real space. Those without access to computers will suffer a substantial disadvantage in searching for jobs, educating their children, and exercising political power; climbing the economic ladder will become that much more difficult for these households as a result.

The preceding sections of this paper have addressed the question of how to make cyberspace more accessible, that is, more usable, for persons with physical and cognitive disabilities. This section addresses the question of how to make entry into cyberspace more affordable for persons with low incomes. Whereas the former problem is one of "universal access," the latter is one of "universal service."

The two problems, while distinct, are nonetheless intertwined. First, the goal of universal access and the goal of universal service are essentially the same: to eliminate the barriers that threaten to exclude certain groups from reaping the benefits of the information age. Second, because many disabled persons are also poor, universal service stands as a necessary complement to universal access, in the sense that, for these doubly disadvantaged persons, accessible technology is of no benefit unless it is also affordable.

The discussion of universal service below proceeds as follows. First, the history of universal service is reviewed, starting with its origins in the telephone industry and ending with the FCC’s creation of the universal service fund under the authority of section 254 of the 1996 Telecommunications Act. Second, the question of how to carry over the concept of universal service into the context of cyberspace is examined in detail. The problem is broken down into three components: making computers affordable; making telecommunications services affordable; and making online content affordable. For each component of the problem, current policy trends are examined and, where appropriate, proposals for change are put forth.

B. History of Universal Service

The concept of universal service originated near the turn of the century. At this stage in its history, the telephone industry was largely unregulated, and numerous companies freely competed against one another. However, phone customers were ill-served because rival companies' networks were not interconnected, so that customers of one company could not easily be patched through to customers of another company. In response to this problem, Theodore Vail, president of Bell Telephone, argued for the monopolization of the telephone system, invoking the slogan, "one system, one policy, universal service." The idea was that a monopoly system would be most efficient at connecting as many people as possible to the same network, making for a thoroughly interconnected society. Vail’s arguments prevailed, and by the 1930s Bell held a monopoly on the nation’s telephone system.

The Communications Act of 1934 enshrined Vail’s concept of universal service into law, emphasizing the need not only for an interconnected telecommunications network, but also for a telecommunications network affordable to all. The preamble of the Act stated that the Federal Communications Commission was created

for the purpose of regulating interstate and foreign commerce in communication by wire and radio so as to make available, so far as possible, to all the people of the United States, a rapid, efficient, Nation-wide, and world-wide wire and radio communication service with adequate facilities at reasonable charges.

Though arguably intended as a mere policy statement, the goal expressed in the preamble served as a guidepost for the FCC and the telephone industry for the next 60 years, and stood as a public affirmation of the importance of universal service.

From the 1930s until the break-up of Bell in 1982, Bell "funded" universal service through a system of internal cross-subsidies and rate-averaging. By charging artificially high rates for business and long-distance services, Bell subsidized artificially low rates for residential, local service. And by setting rates according to the average costs of providing service across diverse geographic areas, Bell was able to charge rural and urban customers the same rates, even though the cost of providing service in rural areas was considerably higher than average. In this fashion, Bell promoted the goal of universal service by making residential, local service available to all at relatively affordable rates. As a consequence, however, long-distance service was made somewhat of a luxury.

Importantly for the history of universal service, this system of cross-subsidies and rate-averaging was only feasible within the confines of a monopolized telephone industry. Thus, as rivals to Bell began to emerge in the 1940s and 1950s, Bell continued to rely on universal service as a defense for its monopoly. Bell argued that universal service would not be possible in a competitive environment: competition would force Bell to charge businesses and urban customers competitive rates, and thus its source of subsidies for universal service would vanish.

In 1982, after an eight-year antitrust lawsuit, the Bell monopoly finally came to an end, as Bell agreed to a settlement requiring it to divest its local telephone companies. With the national telephone monopoly no longer intact, a new mechanism was required for funding universal service. Consequently, as part of the 1982 settlement, local telephone companies were authorized to levy charges on a long-distance carrier every time it accessed the local telephone network in order to place a long-distance call. The local telephone companies, in turn, were to draw upon these "access charges" to subsidize affordable local rates. Thus, in place of the intra-company cross-subsidies that Bell used to fund universal service, after the Bell break-up access charges functioned similarly as intra-industry cross-subsidies: the effect of both was to raise the price of long-distance service in order to subsidize lower prices for local service. Similar cross-subsidies were created which artificially raised rates for business and urban customers while artificially lowering rates for residential and rural customers.

This system of intra-industry cross subsidies persisted until the passage of the Telecommunications Act of 1996, which worked the most comprehensive reform of the telecommunications industry since 1934. The overriding purpose of the Act was to deregulate the telecommunications industry in order to promote competition; and, as with the Bell break-up, the Act’s reorganization of the industry to make it more competitive again required a retooling of universal service. The worry, once again, was that competition and universal service were incompatible. Even prior to the Act’s deregulation of the industry, what competition among telecommunications companies there was had been distorted by universal service in several ways. First, access charges inflated the cost of providing long-distance service, preventing long-distance companies from offering service at prices close to actual cost and thus reducing the general competitiveness of the industry. Second, access charges put traditional long-distance companies at a competitive disadvantage relative to emerging, non-traditional rivals. By making use of new technologies such as cellular phone equipment and fiberoptic networks, these rivals were able to provide long-distance services without going through local telephone networks. By bypassing local telephone networks, these companies avoided paying the access charges incurred by their traditional counterparts. As a result, meaningful price comparisons between the two groups were made impossible, as the rates charged by traditional long-distance providers were artificially inflated whereas those charged by the emerging rivals were not.

The 1996 Telecommunications Act attempts to resolve these problems by spreading the cost of universal service evenly across the entire telecommunications industry. Rather than passing the costs of the program exclusively onto traditional long-distance providers, section 254 of the Act mandates that "[a]ll providers of telecommunications services should make an equitable and nondiscriminatory contribution to the preservation and advancement of universal service." In other words, rather than requiring one sector of the telecommunications industry to subsidize another — the regime effected by access charges — the Act requires all sectors of the industry equally to contribute to the cause of universal service. Essentially, these mandatory contributions are deposited into a centralized universal service fund, from which the subsidies required to pay for universal service are drawn. How well this funding mechanism resolves the traditional conflict between universal service and competition within the telecommunications industry is discussed below, in the next section.

Besides changing the way in which universal service is funded, the 1996 Telecommunications Act changes the scope of universal service as well. The Act defines the term "universal service" in statute for the first time; and the definition makes clear that the scope of universal service is no longer necessarily limited to basic telephone service, as it traditionally has been. Rather, section 254 of the Act defines universal service as an "evolving level of telecommunications services" to be set by the FCC from time to time. In considering what specific telecommunications services are to be included in this "evolving level," the FCC is to consider the extent to which a particular service is (A) "essential to education, public health, or public safety;" (B) subscribed to by a majority of consumers; (C) presently being deployed in public telecommunications networks; and (D) "consistent with the public interest, convenience, and necessity." Thus, as new technologies eventually become mainstream and take on important societal functions, the FCC can, at its discretion, revise the scope of universal service to include these technologies and, accordingly, establish programs aimed at making these technologies affordable to all.

While the Act leaves the task of defining the precise contours of universal service largely to the FCC, it nonetheless includes one provision that immediately expands the scope of universal service beyond its traditional borders. Section 254(h) requires telecommunications providers to offer services to schools, libraries, and health care providers at discounted rates; in addition, it directs the FCC

to enhance, to the extent technically feasible and economically reasonable, access to advanced telecommunications and information services for all public and nonprofit elementary and secondary school classrooms, health care providers, and libraries.

Under color of this provision, the FCC has since created the highly controversial "e-rate" (education-rate) program, providing vast discounts to schools and libraries for Internet-related telecommunications technology, paid for through the universal service fund. The controversy surrounding the program is discussed in the next section.

In sum, the primary lesson to be drawn from the history of universal service is that the funding mechanism underlying universal service has its roots in a system of cross-subsidies sustainable only within a monopolized industry. Consequently, whether universal service is compatible with the deregulated, competitive industry of today is, preliminarily at least, subject to doubt.

C. Analysis of Current Trends and Future Possibilities

As the historical review above makes clear, the concept of universal service is expanding. No longer is its goal limited merely to making local telephone service available to all at reasonable prices. Rather, abstracted from its history and its treatment in statute, the concept of universal service can be thought of broadly as the goal of making important conduits of information and communication affordable to all. Increasingly, cyberspace is becoming an important conduit of information and communication. This section considers what should be done to make it affordable to all.

The problem has three components, corresponding to the three potential price barriers faced by any entrant into cyberspace:

  1. the affordability of end-user equipment, such as PCs;
  2. the affordability of telecommunications links to relevant networks, such as the link to the Internet provided by a phone line and ISP service; and
  3. the affordability of online content, such as online news magazines and informational databases.

For universal service to succeed in the context of cyberspace, each of these components of cyberspace must be made affordable – whether through market forces, governmental intervention, or a combination of the two. Below, policy recommendations are presented as to each component.

1. End-User Equipment

Today, the end-user equipment typically used to access the Internet is a PC equipped with a modem and a browser. The cost of such equipment has fallen dramatically in recent years. In just the one-year period from January 1997 to January 1998, the average price of a personal desktop computer dropped 29%; low-end models are now available at less than $500. Assuming these trends continue, the average PC should eventually cost approximately the same as a television.

Moreover, as PC prices continue to fall, simpler and cheaper alternatives to PCs continue to be developed. So-called "Internet appliances" include network computers, palmtop computers, TV set-top boxes, and Internet-ready telephones and pagers. For consumers who lack the money to purchase a fully-equipped PC, but who, for example, already own a television, an Internet appliance such as a TV set-top box may provide an affordable alternative.

Thus, market forces and the advance of technology promise eventually to make end-user equipment affordable for most American households. Nonetheless, for a substantial number of low-income households, prices for PCs and Internet appliances will likely remain beyond their budgets. Without some sort of governmental intervention, these households will not be able to enjoy the benefits of the information age.

A sensible solution to this problem – the solution envisioned by the "e-rate" program – is to make Internet-ready computers available for free public use at the nation’s schools and libraries. Such "community access centers" could serve as a sort of "safety net" for those unable to afford computers of their own at home. Ideally, community access centers would provide reference staff who could help citizens use the Internet to find information they need, e.g., news; literature; medical information; job listings; or information on government programs such as Medicaid or workfare. In addition, community access centers would also serve as community learning centers, where citizens could take classes on how to use computers and the Internet, skills that would help many low-income persons to find jobs.

Many communities have already begun establishing community access centers, drawing from a mix of public and private funds. This effort shows promise toward fulfilling its goals, makes efficient use of existing community resources, and should continue along its current course.

2. Telecommunications Links

As discussed in the historical review above, the affordability of the telecommunications services required to connect to cyberspace is partially covered by the universal service provisions of the 1996 Telecommunications Act. In establishing an "evolving definition" of universal service, the Act leaves the FCC free to determine the extent to which these services should fall within universal service’s scope. Thus far, apart from creating the e-rate program, the FCC has extended universal service no further toward cyberspace than guaranteeing all households affordable basic telephone service. In considering the matter, the FCC found that

to the extent customers find that voice grade access to the public switched network is inadequate to provide a sufficient telecommunications link to an Internet service provider, we conclude that such higher quality access links should not yet be included among the services designated for support pursuant to section 254(c)(1).

While the FCC’s conclusion is reasonable at this time, it is worth pausing to note the important role wireless networks may play in providing "higher quality access links" to residential consumers in the future. Two groups of residential consumers in particular are at risk of being deprived of high quality links to cyberspace. Rural customers are at risk because of the high cost of laying down wire in sparsely populated areas. Customers in inner city areas are at risk, some commentators argue, because their communities are likely to be "electronically redlined" by telecommunications companies, who are likely to fear that the investments in infrastructure required to serve these communities will not generate profit. Wireless technology can solve both these problems by eliminating the need to invest in new telecommunications wiring and infrastructure in order to serve these customers.

"Internet radios" offer one example of a potential wireless solution to this universal service problem. Currently being developed by Rooftop Communications, Internet radios are high-speed digital radios able to send, receive, and forward Internet traffic to other Internet radios in the surrounding area. Only one radio in this network (the "air-to-Internet-router") needs to be connected to a physical link to the Internet for all the other radios in the network to connect to the Internet by proxy. Thus, one way the government could ensure Internet access to customers in rural and inner city areas is by operating an air-to-Internet router, which the surrounding community, equipped with Internet radios, could use as its relay to the Internet. Internet access would thus be free for these customers; the only cost they would bear is the initial cost of an Internet radio.

Thus, as the definition of universal service continues to "evolve," the FCC should consider the role wireless technology could play in extending residential Internet access to areas traditionally under-served by telecommunications companies.

Returning to the present, however, the FCC’s efforts at extending Internet links to low-income individuals have thus far been limited to its creation of the "e-rate" program, under color of section 254(h) of the 1996 Telecommunications Act. In creating the e-rate program, the FCC quite broadly interpreted the Act’s vaguely worded instruction that the FCC "enhance" schools and libraries’ access to "advanced telecommunications services": the program guarantees schools and libraries discounts of up to 90% for a wide range of telecommunications services and equipment, including dial-up Internet access, ISDN modems, T-1 lines, internal network servers, routers, switches, hubs, and wireless and wireline LANs. The FCC also boldly interpreted its authority to raise funds under section 254. Originally, the FCC decided to raise $2.25 billion for the e-rate program. However, faced with the threat of rate hikes by the industry and complaints by politicians, the FCC scaled back the program and cut funding to $1.275 billion.

Controversy has surrounded the e-rate program since its inception. No one challenges the worthiness of the goal the program aims to achieve – extending access to cyberspace to all citizens via the nation’s schools and libraries; rather, criticism is focused solely on the mechanism by which the program is funded. The criticism is made on two fronts: to borrow the paraphrase of one commentator, critics argue that the program amounts to "an unconstitutional tax, or a costly mess even if it’s constitutional."

FCC Commissioner Harold W. Furchtgott-Roth himself, outnumbered by the rest of the Commission, has testified before Congress on behalf of e-rate critics that the program amounts to an unconstitutional tax. In his testimony, the Commissioner argued that "in enacting a sweeping new welfare program for schools and libraries," the FCC exceeded its authority under section 254(h), essentially enacting a new tax on telecommunications providers when such is the prerogative of Congress under the Constitution. Rejecting e-rate proponents’ argument that the mandatory contributions used to fund the e-rate program are legitimate regulatory fees, the Commissioner argued that "a tax confers no special benefit on the payee, is intended to raise general revenue, or is imposed for some public purpose"; by contrast, a regulatory fee "is a payment incident to a voluntary act," in return for which the payer receives an individual benefit – e.g., a payment to construct a house in return for which the payer receives the benefit of construction and sale. In the case of the e-rate program, the Commissioner concluded,

all these factors point toward the category of a tax: the [universal service] fund, which creates internet access for schools and libraries, confers no particular advantages upon telecommunications carriers in exchange for their contributions, such as a license or permit; the funds have not, as far as I can tell, been segregated from other government monies . . . ; the purpose of the fund is a broad, social one, purportedly to improve education for all Americans; and the payment requirement is not triggered by a voluntary act on the part of telecommunications carriers, such as the filing of an application, but is a flat mandate.

Constitutional challenges along these lines are currently pending in the courts and in Congress; they may spell a premature end for the e-rate program.

The second criticism of the e-rate program – that it is a "costly mess" even if constitutional – is, more precisely, that the way in which the program is funded is inconsistent with competition, as has historically been the charge levied against universal service. The funding mechanism for the e-rate program conflicts with competition in two ways. First, it distorts prices. All telecommunications providers are required to contribute to the universal service fund from which subsidies for the e-rate program are drawn. Although this mechanism does not grossly distort the prices of one sector of the telecommunications industry, as access charges grossly distorted long-distance telephone rates prior to the 1996 Telecommunications Act, it still distorts prices across the industry. Put another way, although universal service no longer greatly distorts the prices of a few providers, it now significantly distorts the prices of many. Telecommunications providers largely pass on the costs of their universal service contributions to customers, raising prices above what they would be in a competitive market, and thus artificially lowering the quantity of their services demanded. Such an effect is inconsistent with the overriding aim of the 1996 Act, which is to make the telecommunications industry more competitive.

The second way in which the e-rate program interferes with competition is by creating opportunities for "bypass." Recall that prior to the Telecommunications Act, certain long-distance providers were able to bypass "access charges" by using new technologies such as cellular and fiberoptic networks; the result was that these providers enjoyed an unfair competitive advantage over their rivals. Although the Telecommunications Act attempts to avoid this problem by requiring all telecommunications service providers to contribute to the universal service fund, a significant opportunity for bypass still exists, in that providers of what essentially are telecommunications services may avoid falling under the legal definition of a "telecommunications service provider," and thus avoid having to pay contributions to universal service. Internet service providers are a perfect example. Using "I-phone" software, persons can use their ISP connection as a substitute for traditional long-distance phone services. Yet ISPs are legally classified as "information service providers" and "enhanced service providers," not as "telecommunications service providers." As such, they are not required to pay contributions toward the universal service fund. Consequently, consumers have an artificial incentive to use I-phones for long-distance communication, for in so doing they avoid paying long-distance telephone rates made artificially high because they include the costs of universal service. Such a result is inconsistent with competition.

The critics of the funding mechanism for the e-rate program are right. While the program’s goals are admirable, there is simply no policy basis for requiring telecommunications providers to internalize the costs of subsidizing it. In the days of the Bell monopoly it may have made sense for Bell to internalize the costs of universal service in exchange for reaping the benefits of holding a monopoly; but no such quid pro quo exists in today’s competitive market. Thus, requiring telecommunications providers to pay for the e-rate program, as if it were a cost of doing business, is a policy without a rationale, a holdover from a bygone era. To dissolve both the constitutional and efficiency problems surrounding the program, Congress ought to fund it just as it does any other general welfare program: by drawing directly from general tax revenues.

3. Online Content

The last price barrier to entry into cyberspace is the price of online content. Once a person has access to a computer linked to the Internet, he or she still must be able to afford the content found therein for the goal of universal service to be achieved. Presently, of course, most content found online in cyberspace is free, consisting of either non-commercial websites or commercial websites funded through advertising revenues. However, there are already some exceptions. Some online magazines and news sources – such as Slate, the Wall Street Journal, and ESPN – have already begun requiring users to pay subscription charges; online databases and archives, such as Westlaw and Lexis-Nexis, also presently charge users for access, at very high rates. Many commentators believe that increasing numbers of commercial sites will soon follow this lead, making charges for online content the rule rather than the exception. Moreover, as broadband cyberspace becomes a reality, and cyberspace increasingly becomes the primary medium of distributing not just text, but audio, video, and software as well, fees for services are likely to become substantial. If this is indeed the case – if the price of online content will in the future become a significant barrier to entry for some users – it is worth considering now how this barrier might be lowered.

Facilitating price discrimination offers one possible solution. Theoretically, if providers of online content were able to measure individual consumers’ willingness and ability to pay for content, providers would respond by voluntarily offering content to all consumers at a price each could afford. This hypothesis is explained below, but first note that, if true, it implies that the barrier to entry posed by the price of online content could be eliminated without the government lowering prices artificially or subsidizing the purchases of low-income consumers. The solution would be driven instead by a unique combination of market forces and code.

First, what is price discrimination? Price discrimination occurs when a seller charges different prices to different consumers for the same product. For example, software companies often price discriminate by charging lower prices to educational institutions than to home consumers. Airlines often price discriminate by charging lower prices to vacationers than to business travelers. Movie theaters often price discriminate by giving discounts to students and senior citizens.

Why is price discrimination desirable? For sellers, price discrimination is desirable in that it enables them to make money off every potential consumer of their product. For buyers, price discrimination is desirable in that it helps to ensure that no one is priced out of the market. To illustrate, take the above example of a movie theater. Suppose a movie theater owner knows that her theater will not be filled if she charges everyone the normal price of $7.50: only 80 consumers, say, will be willing and able to pay $7.50 to see a movie, when there are 100 seats in the house. In order to fill those 20 remaining empty seats, it makes sense for the owner to price discriminate. Thus, suppose the owner offers discounts to those consumers less willing and able to pay the normal price: specifically, suppose she sells tickets for $4 to students and senior citizens – typically low-income consumers. The owner is then made better off, because now no seats are left empty; she’s now making $4 apiece on the 20 seats that would have brought her no revenue had she not price discriminated. Students and senior citizens are also made better off, because now they are able to afford going to the movies.
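The arithmetic of the theater example can be checked with a short sketch. All of the figures (100 seats, a $7.50 regular price, and 20 discount tickets at $4) are taken from the example above:

```python
SEATS = 100
FULL_PRICE = 7.50
DISCOUNT_PRICE = 4.00

full_price_buyers = 80   # patrons willing and able to pay $7.50
discount_buyers = 20     # students and seniors willing to pay only $4.00

# Without price discrimination, the 20 discount-only patrons stay home
# and their seats sit empty.
revenue_uniform = full_price_buyers * FULL_PRICE

# With price discrimination, every seat is filled.
revenue_discriminating = (full_price_buyers * FULL_PRICE
                          + discount_buyers * DISCOUNT_PRICE)

print(revenue_uniform)         # 600.0
print(revenue_discriminating)  # 680.0, i.e. $80 more for the owner
```

The $80 difference is precisely the revenue from the 20 seats that would otherwise have gone unsold, which is why both the owner and the discounted patrons come out ahead.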

If price discrimination is thus desirable for both sellers and buyers, why does it not occur more often? There are several reasons. First, it is difficult for sellers to distinguish customers according to their willingness and ability to pay. Student and senior citizen IDs, for example, serve only as very rough indicators of willingness and ability to pay. Price discrimination would occur more frequently if sellers could estimate willingness and ability to pay with more precision.

But, second, even if buyers could present relatively reliable signals of their willingness and ability to pay, they might be unwilling to do so for fear of stigma or privacy invasion. A similar phenomenon occurs in the context of government subsidies for the poor: many low-income consumers are reluctant to use food stamps or welfare checks at grocery stores for fear of the stigma that comes with having to disclose that they are poor.

Third, the threat of arbitrage often discourages sellers from price discriminating. A seller considering whether to sell his product to group A for $10 and to group B for $5 will not do so if nothing prevents group B from buying all his products and re-selling to group A for less than $10. Our movie theater owner, for example, would not price discriminate if she had no way of controlling the risk of ticket scalping by those customers receiving discounts.

Fourth, and finally, sellers are limited in the extent to which they can price discriminate by the marginal cost of production. If the marginal cost of producing a product is $10, a seller will not sell the product for less than that, no matter how many consumers are only willing or able to pay for the product if it is priced less than $10. A related problem is fixed capacity. Returning to the movie theater example, once the owner fills the theater, there still may be consumers who are priced out of the market – consumers who are not willing or able to pay even $4 for a ticket, but who would pay, say, $2. Had the owner any seats left, she would sell tickets to these consumers for $2. But there are no seats left; for her to accommodate these consumers, she would have to install additional seats, which, let us assume, would cost her more than the revenue she stands to gain.

Now, returning to the original question: why should we believe, given these obstacles to price discrimination in real space, that price discrimination would be especially feasible in cyberspace? The reason is that code could dramatically lower each of the above four obstacles. First, code offers a variety of solutions to the problem of enabling consumers to signal their willingness and ability to pay. Digital certificates offer one way of discriminating among customers. Not only could digital certificates certify student or senior citizen status, as a student ID or driver’s license does in real space, but they could also certify more directly relevant information – such as what tax bracket a consumer falls into, what his income level was last year, or whether he is a recipient of a government aid program. Other, less intrusive, means of discriminating among customers are available as well. For example, content providers could employ "metering" as a means of distinguishing customers according to willingness and ability to pay. By charging for access on a per-page or per-download basis, content providers could distinguish between high-volume users – who demonstrate by their usage that they are more willing or able to pay – and low-volume users – who similarly demonstrate that they are less willing or able to pay. Upon separating customers thus, content providers could price discriminate by charging high-volume users at a higher rate than low-volume users.
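A minimal sketch of how metering might translate usage into a bill. The rates (5 and 10 cents per page) and the 100-page threshold are invented for illustration and do not come from any actual service:

```python
def metered_charge_cents(pages_viewed, low_rate=5, high_rate=10, threshold=100):
    """Hypothetical per-page metering, billed in cents.

    Users whose volume stays at or below `threshold` pages pay the low
    rate; pages viewed beyond the threshold reveal a higher willingness
    and ability to pay and are billed at the higher rate.
    """
    if pages_viewed <= threshold:
        return pages_viewed * low_rate
    return threshold * low_rate + (pages_viewed - threshold) * high_rate

print(metered_charge_cents(50))   # 250 cents: a low-volume user
print(metered_charge_cents(200))  # 1500 cents: 100*5 + 100*10
```

The point of the sketch is only that usage itself serves as the signal: no certificate or identity check is needed, because the consumer's behavior sorts him into a price tier automatically.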

Code also could reduce – indeed, arguably eliminate – the problems of stigma and invasion of privacy that can accompany the act of signaling willingness and ability to pay. Because purchases in cyberspace are not publicly exposed – especially if they are encrypted – information about a consumer’s willingness and ability to pay, such as information regarding income level, is shared only between the consumer and the seller. There is a risk that a seller will pass this information on to third parties, but this risk could be mitigated, e.g., by legally prohibiting sellers from passing such information on to third parties, or by enabling consumers to transact business in cyberspace anonymously, so that signals of willingness and ability to pay need not ever be attached to names.

The most significant barrier to price discrimination in cyberspace is the threat of arbitrage. Presently, information may be easily copied and redistributed in cyberspace, and as long as this remains the case, the risk of arbitrage may dissuade most sellers from price discriminating. However, trusted systems are already being devised that would allow sellers of information to protect their copyrights. Should such systems prove effective, they would enable sellers of information to guard against arbitrage and thus to price discriminate without fear.

Finally, the most important reason why price discrimination would work in cyberspace is that, in cyberspace, the marginal cost of distributing information is essentially zero. To compare to real space: A bookseller in real space incurs a significant marginal cost for each book he sells, since selling each book requires that he pay certain costs for printing, packaging, shipping, and so forth. By contrast, selling the same book in electronic form in cyberspace entails no such costs; the seller can sell a million copies of the book just as easily as he can sell one. To draw again on the analogy of the movie theater, distributing online content in cyberspace is like operating a movie theater in which there is always an empty seat left to fill: there are no significant limitations on a seller’s capacity for distribution. Consequently, in theory, as long as a content provider can determine with some accuracy how much a consumer is willing and able to pay, the provider will charge just that price, no matter how low it is; for while the seller may have little to gain, he has nothing to lose.
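The contrast between the real-space bookseller and the cyberspace content provider reduces to a single decision rule, sketched below with hypothetical figures: a profit-maximizing seller accepts any offer above marginal cost, so when marginal cost is zero, no positive offer is refused.

```python
# Illustrative sketch (hypothetical numbers): with zero marginal cost,
# a seller profits from any positive price, so no buyer need be priced
# out of the market; with a $10 marginal cost, offers below $10 lose
# money and are refused.

def will_sell(offer: float, marginal_cost: float) -> bool:
    """A profit-maximizing seller accepts any offer above marginal cost."""
    return offer > marginal_cost

# Real-space bookseller: printing, packaging, and shipping cost $10 per copy.
print(will_sell(2.00, marginal_cost=10.0))  # False: the offer is refused

# Cyberspace content provider: another electronic copy costs nothing.
print(will_sell(2.00, marginal_cost=0.0))   # True: even $2 adds profit
```

This is the sense in which, as the paragraph above puts it, the seller "may have little to gain, [but] has nothing to lose" by serving the lowest-paying customers.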

Price discrimination, then, offers a way to ensure that no one will be priced out of the market for online content. Again, it should be emphasized that this solution would be arrived at voluntarily by both sellers and buyers. Sellers would not need to be forced to lower prices for those less willing and able to pay; they would do so voluntarily in order to maximize profits. Similarly, buyers would not need to be forced to signal their willingness and ability to pay; they would do so voluntarily in order to receive discounts for purchases. For example, buyers need not be forced to carry digital certificates providing information about their willingness and ability to pay. Rather, the appropriate digital certificates could be made available to those who wanted them, and businesses would respond by giving these consumers discounts. Well-off consumers who wished not to divulge information about their willingness and ability to pay would not carry the certificates, and thus they would still pay the "normal" price, ending up no worse off than they would be without price discrimination.

How precisely price discrimination would be implemented in cyberspace raises numerous questions: What kinds of digital certificates would consumers use to signal their willingness and ability to pay? Would these certificates directly reveal information about their income? If so, what privacy concerns would be raised, and how could they be addressed? Would well-off consumers balk at sellers charging them one price while charging low-income consumers another, even though they are made no worse off thereby?

This paper leaves these questions for others to answer. It concludes merely that, as the price of online content becomes an increasingly significant barrier to entry into cyberspace, facilitating price discrimination through market forces and the resources of code offers a promising means of lowering this barrier. The implications of this conclusion should be further explored by businesses, consumer groups, and policy-makers.

D. Summary of Policy Recommendations