"The computer, with its insatiable appetite for information, its image of infallibility, its inability to forget anything that has been put into it, may become the heart of a surveillance system that will turn society into a transparent world in which our home, our finances, our associations, our mental and physical condition are laid bare to the most casual observer."
A long-standing goal of computer scientists, particularly those specializing in Artificial Intelligence, has been to create computer systems that simulate human intelligence. At the same time, researchers have continually sought to improve the identification and authentication methods used for access to computer systems and networks. Biometric authentication systems are a natural extension (to computers) of the recognition methods that human beings have used since the beginning of time. In these systems, physical or behavioral characteristics of the person to be authenticated determine whether he is indeed who he declares himself to be. This is analogous to how people recognize one another: they identify others, and verify that a person is who he appears to be, by examining physical features that are essentially unique to that person, such as his face.
In the past, biometrics was used primarily to control physical access to high-security facilities and to identify crime suspects. The CIA and the U.S. Department of Defense began using fingerprint-recognition systems in the late 1970s to control admission to nuclear plants. As early as 1903, prisons in the State of New York were using fingerprints to identify criminals. Three primary obstacles slowed the spread of biometric authentication systems: cost, reliability, and accuracy. Today, advances in biometrics technologies have greatly improved the performance of these systems, and along with significant drops in price, this has led both the public and private sectors to make much greater use of biometric systems. Indeed, in 1998 the biometrics market was estimated to be worth roughly $250 million. Biometric authentication systems are no longer used only by law enforcement agencies; the technology has branched into the automobile industry and the financial services sector, and the range of applications in which biometrics is used continues to grow.
As with many rapidly expanding technologies that affect social life, biometrics has come under attack by civil libertarians. Privacy advocates argue that biometrics will lead to an even deeper erosion of personal privacy in both real and cyber-space. In this paper, we study the privacy concerns that have been raised as biometric systems have grown in use and popularity for identification and authentication in digital and physical environments. We will argue that, contrary to critics' arguments, biometrics can in fact be tailored (by engineers) to be minimally invasive of personal privacy. Further, we hold that if biometric systems are used in conjunction with existing security mechanisms (such as public-key algorithms), they can provide almost fool-proof protection for electronic transactions and other operations in smart environments. The key element, however, is government intervention: a set of standards for how any collection of biometric information can be used is an absolute necessity for complete privacy protection.
Our goal is to demonstrate how biometric systems can themselves be advocates of privacy. We do so by attempting to answer the following questions: 1) How can biometric systems be designed to minimize intrusiveness into personal data sets? 2) How can government intervention through legislation guarantee privacy protection for users of biometric authentication and identification systems? 3) In the absence of government regulation, how much reliance can users of biometric systems place on self-regulation for privacy protection? We start by examining the authentication and identification requirements of networked digital environments, as well as the privacy requirements of such environments. This is followed by a discussion of how biometric systems can be made compatible with the privacy requirements outlined. We close by looking at the possible implications for today's digital world of regulation of the biometrics industry by both government and the technical community.
Before launching into a discussion of the privacy issues associated with biometric authentication systems, we first need to define what we mean by authentication in the context of digital environments. A natural starting point is to look at the security requirements of networked computer systems, since it is digital networks such as the Internet which have the potential to be the largest platform for privacy intrusion, simply because they facilitate highly rapid dissemination of information.
Security is a fundamental requirement of any digital environment. One key security principle that must be provided for in any security policy of a system in such an environment is accountability - someone must be responsible for each action that takes place in the digital space. Accountability, therefore, necessitates identification. Furthermore, the system must be able to verify a user's claim to Identity X. In other words, identification necessitates authentication.
There are three classic bases for authentication: (1) something the user knows (a password), (2) something the user has (a key, a smartcard), (3) something the user is or does (biometrics).
Knowledge-based authentication is the most commonly used method for verifying a user's identity to a computer system. Indeed, authentication by knowledge has several advantages: it is easy to implement, users can protect their knowledge - typically a password - easily, the knowledge is portable, and it can be easily changed if it is ever compromised.
At the same time, however, authentication based on knowledge of a password is often insufficient to prevent unauthorized access to computer systems. Password-based authentication systems are vulnerable to offline dictionary attacks and exhaustive-search attacks. In an offline dictionary attack, the attacker steals a password file containing a number of encrypted passwords, then encrypts each word in a dictionary to see whether any result matches an encrypted password in the file. In an exhaustive-search attack, every possible password up to a given length is encrypted and compared against the encrypted password in the system. Another problem with password-based authentication schemes is that it is difficult for users to come up with strong passwords. "A good password is easy to remember and hard to guess ... Something is easy to remember if it is meaningfully related to other things one knows. These same relationships make it easy to guess."
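The offline dictionary attack just described can be sketched in a few lines of Python. This is an illustrative toy under our own assumptions (an unsalted SHA-256 digest stands in for the password file's encrypted entries, and the function and variable names are hypothetical), not a working tool against any real system:

```python
import hashlib

def crack(stored_digest, dictionary):
    """Offline dictionary attack: hash each candidate word and
    compare it against a digest stolen from a password file."""
    for word in dictionary:
        if hashlib.sha256(word.encode()).hexdigest() == stored_digest:
            return word
    return None  # the password was not in the dictionary

# An entry stolen from a hypothetical password file.
stolen = hashlib.sha256(b"sunshine").hexdigest()

# The attacker's wordlist; an exhaustive-search attack would instead
# enumerate every possible string up to a given length.
wordlist = ["password", "letmein", "sunshine", "dragon"]

print(crack(stolen, wordlist))  # prints: sunshine
```

Because the comparison happens entirely offline against the stolen file, rate-limiting at the login prompt offers no protection; this is why weak, guessable passwords are so dangerous.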
Biometrics blurs the line separating identification methods from authentication methods. There are two main phases in biometric authentication. In the enrollment phase, the user's characteristic is measured. This might be a physical characteristic such as his fingerprint, hand geometry, retina vein configuration, iris pattern, face, or DNA, or a behavioral characteristic such as his voice or signature dynamics. The data collected in the enrollment phase is then analyzed to build a template. To authenticate a person with identity X, the characteristic must be measured again in the same manner and compared with the template. The person is authenticated depending on how closely the fresh measurement matches the template. Two things can go wrong at this point: an impersonator may be accepted by the system (false acceptance), or a person with a true claim to an identity may be rejected by the system (false rejection).
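The verification step can be sketched as a simple threshold comparison. The feature vectors, Euclidean distance metric, and threshold values below are our own illustrative assumptions (real matchers are specific to each biometric), but the accept/reject logic, and the tension between false acceptance and false rejection, look essentially like this:

```python
import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def authenticate(template, sample, threshold):
    """Accept the identity claim iff the fresh measurement is
    close enough to the template built at enrollment."""
    return distance(template, sample) <= threshold

enrolled = [0.82, 0.40, 0.57, 0.91]   # template from the enrollment phase
genuine  = [0.80, 0.43, 0.55, 0.93]   # same user, measured again (noisy)
impostor = [0.21, 0.95, 0.10, 0.48]   # a different person

assert authenticate(enrolled, genuine, threshold=0.1)       # accepted
assert not authenticate(enrolled, impostor, threshold=0.1)  # rejected
# Tightening the threshold too far causes false rejections:
assert not authenticate(enrolled, genuine, threshold=0.01)
```

Loosening the threshold lowers the false rejection rate but raises the false acceptance rate, and vice versa; choosing this operating point is the central tuning decision in any biometric system.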
In general, the collision rate of biometric measures from different persons is extremely low. In other words, it is almost impossible for Eve to try to identify and authenticate herself as Alice because they are different people - Eve would have to find a way to recreate Alice's fingerprint template, her iris-scan template, her retina-scan template, or her voice. The probability of her succeeding in this (without Alice's help) is quite minimal. It is important to note, however, that biometrics are not keys. That is, they are not suitable in applications that need the characteristics of a key, namely, "secrecy, randomness, the ability to update or destroy" .
The uniqueness of biometric identifiers, the fact that they are non-transferable, and that they cannot be lost or forgotten give biometric authentication systems an edge over knowledge-based systems. In both public and commercial sectors, biometrics has found its way into many varied applications, especially in areas where there is the threat of fraud. For example, biometrics systems are being used for physical area security, and computer/network security. Also, some banks use fingerprint recognition systems for access to ATMs. States have used biometric authentication systems to control welfare distribution; certain nations (e.g. Mexico) use biometrics for voter identification. Law enforcement agencies have long used biometrics for criminal identification schemes, and prisoner identification. More recently, the Immigration and Naturalization Service (INS) has developed INSPASS, which incorporates biometric identifiers into an ID card for use by frequent international travelers. As technology progresses, we can expect even more applications of biometrics to surface.
Privacy and security are not the same. A guarantee of security does not ensure a guarantee of privacy. The relationship between these two subjects, often a confusing one, is rather that privacy protections typically include provisions for the physical security of that which is to be protected.
Many countries are dependent on electronic data storage mechanisms. As this reliance continues to increase, the question becomes one of safeguarding electronic information against misuse. There are thousands of databases of information about people on computers, often on servers connected to the Internet. Names, addresses, and credit card and bank account numbers are just some of the personally identifying information being stored by independent information traders, including state and federal governments. Since the advent of the Internet, the information trade has become an enormous business. The recent estimated $1.7 billion merger of Abacus Direct, a market research company, and DoubleClick, possibly the largest existing Internet advertising company, is a telling example. Their merger "brings together data on Web surfing habits obtained from the 5 billion ads DoubleClick serves per week and the 2 billion personally identifiable consumer catalog transactions recorded by Abacus".
Computer users judge the guarantee of privacy to be an essential requirement of any digital environment. Privacy is a basic expectation in networked environments from the user's standpoint. What, then, do we mean by privacy? Several definitions of the term have been proposed over the years, especially since it is never actually defined (or even mentioned) in the Constitution of the United States. Prof. Larry Lessig defines the architecture of privacy as that which is "left after one subtracts ... the monitored, and the searchable, from the balance of social life". Lessig's view is that a life less monitored and less searchable is a life more private. To be a bit less abstract, we expand on Lessig's definition by considering the privacy issue as one of defining boundaries. People want to be able to draw a boundary circle around themselves, information about themselves, and how they behave. They feel entitled to control all that falls in the interior of this circle, and they want to be able to regulate how, to whom, and for what reasons the information within the circle is disseminated.
Roger Clarke of the Faculty of Engineering and Information Technology at the Australian National University explains privacy as "the interest that individuals have in sustaining a 'personal space', free from interference by other people and organizations". Clarke defines several dimensions of this interest. The two most relevant to our discussion are: 1) Privacy of personal communications. "Individuals claim an interest in being able to communicate among themselves, using various media, without routine monitoring of their communications by other persons or organizations." 2) Privacy of personal data. "Individuals claim that data about themselves should not be automatically available to other individuals and organizations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use."
In other words, users of computer systems (especially those in networked environments) expect that those who store their personal information will not abuse it. They also expect that wherever their personal information is stored, it is kept safe, so that even if a hacker were to succeed in breaking into the computer or server on which the data reside, the data themselves would remain protected. Users expect, too, to be able to communicate anonymously. This is especially important for those who want to criticize the Government, or an employer, without having to worry about victimization.
In the context of biometrics, privacy is a central issue because any biometric information about a person necessarily falls within the boundary of the privacy-circle. Hence, individuals are concerned about how any biometrically identifying information about them is controlled.
The U.S. Constitution does not explicitly guarantee a right to privacy. Privacy of personal data has traditionally been protected in two ways: through self-regulatory codes and through laws.

Privacy Protection Through Self-Regulation
Privacy law and policy in the United States have been derived from a code of fair information practices developed in 1973 by the U.S. Department of Health, Education and Welfare. This code is "an organized set of values and standards about personal information defining the rights of record subjects and the responsibilities of record keepers". The Code highlights five principles of fair information practices:
1) The Privacy Act of 1974
The first response by the Federal Government to the many concerns about its power to use and misuse personal information was the Privacy Act of 1974. This Act covers federal databases and is based on the Code of Fair Information Practices defined above. In 1977, a Privacy Protection Study Commission rejected the idea of a similar privacy law for the private sector. This means that individuals' privacy with respect to databases of information stored and maintained by private organizations is not protected; in the private sector, total reliance is on the fair information practice codes. This is a serious problem.
2) Constitutional Provisions
Though there is no clearly defined right to privacy in the Constitution, privacy rights are implied in several of the amendments. The right to privacy is rooted in the 4th Amendment, which protects individuals from unreasonable search and seizure; the 5th Amendment, which protects individuals from self-incrimination, and the 14th Amendment, which gives the individual control over his personal information.
What remains to be determined is the following: Is taking a biometric measure a search? If so, is it a reasonable search? Can the biometric information collected be used for criminal as well as non-criminal searches, and for suspicionless searches? Even if these questions remain unanswered for some time, one fact remains: there are no legal restrictions on biometrically identifying information or on biometric authentication systems.
Critics argue that biometric authentication methods present a serious threat to privacy rights. Their arguments fall into three categories: loss of anonymity; tracking and surveillance; and data matching and profiling.
Privacy advocates argue that individuals lose their anonymity in any system or digital environment that uses biometric authentication methods. Many people claim the option of anonymity in the marketplace (for electronic purchases) and in the political arena (for voting) as part of their expectation of privacy. Critics of biometrics feel that if this technology were to gain widespread acceptance and proliferate further into daily life, then much of our anonymity when we use different services, and move from place to place will fade.
Privacy advocates envision biometrics as fostering "Big-Brother" monitoring of citizens by the State. This idea stems from the fact that biometric measures can serve as universal identifiers for individuals, because each biometric measure is unique. Consider, say, a driver's license with a magnetic strip that stores one's fingerprint. One could imagine being pulled over for a trivial traffic violation and being subject to very harsh treatment because, after scanning the fingerprint, the police officer has access to one's entire criminal record and knows all of one's past offenses. The Government has used technology to intrude into the interior of individuals' privacy-circle, and critics of biometrics argue that there is no reason to expect that the State will use biometrics technologies any differently.
Isolated identifying and non-identifying information in different databases can be used to create extensive records that profile people's shopping and spending habits. The biggest danger of biometrics, according to privacy advocates, is that biometric identifiers can be linked to databases of other information that people do not want dispersed. The threat to privacy arises from "the ability of third parties to access this data in identifiable form and link it to other information, resulting in secondary uses of the information, without the consent of the data subject". This would be a violation of the Code of Fair Information Practices, since the individual would no longer have control over the dissemination of his personal information.
People have generally frowned on biometrics, in particular fingerprints, because of its long-time association with criminal identification, and more recently because of its use in state welfare schemes to prevent recipients from making double claims on their benefits. The argument is that people are reduced to mere codes and subjected to impersonal, unjust treatment. A similar argument against the use of biometrics is that biometric identifiers are an "example of the state's using technology to reduce individuality". This type of identification corrupts the relationship between citizen and state because it empowers the State with control over its citizens.
Religious groups argue that biometric authentication methods are "the mechanism foretold in religious prophecy (e.g. the Mark of the Beast)". Further religious objections are based on the premise that individuals must give up themselves, or part of themselves, to a symbol of authority which has no spiritual significance.
Though there are no documented cases of biometrics technologies causing actual physical harm to users, certain methods are considered invasive. For example, retina scanning requires the user to place his eye as close as 3" from the scanner so that it can capture an image of his retina pattern. Fingerprint recognition devices are also deemed invasive because they require the user to actually touch a pad.
Biometrics cannot be blamed for the loss of anonymity in today's world; larger social and technological forces have caused it. If a single advancement had to be blamed for the erosion of anonymity, it would be the computer. The computer, and computer networks like the Internet, make it incredibly easy to collect and store information about people, and to disperse this information to a large number of people. The Internet hosts a vast wealth of resources about many people, and its search capabilities make it relatively simple for adversaries to obtain personal information about anyone. The Internet provides many resources for identity theft (e.g. search engines, genealogy databases). In the physical world, people have access to others' credit reports, and for a small fee employers can perform checks on their employees through services provided by companies like Informus (http://www.informus.com/) and Infoseekers (http://www.infoseekers.net/). There is no need for a universal identifier in order to link identifying and non-identifying information from separate databases. Similarly, there is no need for biometrics in order for "Big-Brother" surveillance to take place. There are already satellites that can track a person's movements in extreme detail. Video surveillance cameras in department stores, online electronic transactions, and email sniffing are just three means by which others can keep track of one's digital identity.
In "Biometrics: Privacy's Foe or Privacy's Friend?" John Woodward proposes three arguments that establish biometrics as a friend of privacy. Woodward's first argument is that biometrics protects privacy by safeguarding identity and integrity. Biometric authentication systems provide very secure protection against impersonators. Criminals in both real and cyberspace commonly exploit weaknesses in token-based and knowledge-based authentication systems in order to break into a user's bank account, for example. Using a biometric identifier for access to systems makes it much more difficult for such compromises to occur. Second, Woodward argues that biometrics is a friend to privacy because it can be used to limit access to information. Finally, he proposes that biometrics is a privacy-enhancing technology. Many current biometric algorithms use biometric characteristics to construct a unique code that can be reconstructed only with the particular biometric identifier. This means the person's actual physical characteristic is not stored by the system. These types of biometrics systems can be used to create PINs for users, thus providing a sort of anonymous verification mechanism.

HOW DO WE MAKE BIOMETRICS SYSTEMS COMPATIBLE WITH PRIVACY CONCERNS?
There are many forces acting on biometrics, the two most important being industry and law. Biometric systems can address privacy concerns only if these two forces together propose and implement a mechanism that simultaneously accomplishes the following:
Separate efforts by each of these forces will not work, because they have conflicting interests. In industry, engineers want to design biometric systems with ever lower false rejection rates. Policy makers are concerned with wider public interests; it would not be surprising if they laid down laws that banned the use of biometric systems outright (at least in the private sector).
In March 1999, the International Biometric Industry Association (IBIA) announced a set of principles to protect personal information collected by biometric authentication systems. In this announcement, the IBIA stressed that it is very concerned with the issues of privacy and personal information use. The principles it proposes as guidelines to manufacturers, integrators, customers and users are:
This is a first step toward privacy protection for users of biometric systems, but it is lacking. First, it suggests only self-regulation for the private sector, which means there would be no legal way to punish companies for misuse of biometric information; the current state of affairs would remain as is. It is imperative that database managers be accountable for how they handle people's information. Second, it is hard to keep track of who is adhering to these principles and who is not. Many companies do not audit how information is used and disclosed, and businesses commonly sell stores of information to one another in order to apply data mining algorithms that discover consumer trends and target advertising material. Third, it makes no mention of what technological solutions can be used to deal with the privacy problem. Engineers need to devise ways of giving users more control over their personal information. Biometric technologies that use the biometric identifier to create a PIN are an example of how industry can design such systems.
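The PIN-creating design mentioned above can be sketched roughly as follows. This is a simplified illustration under our own assumptions (coarse quantization stands in for a real error-tolerant template encoding, and SHA-256 for the one-way function; all names are hypothetical): the system stores only a code derived from the biometric, never the raw measurement, and a per-service salt keeps codes held in different databases unlinkable.

```python
import hashlib

def derive_code(features, service_salt, digits=6):
    """Derive a numeric code from biometric features; only the code
    need ever be stored, and the raw measurement is discarded."""
    # Coarse bins absorb small measurement noise between readings.
    quantized = bytes(int(f * 16) for f in features)
    digest = hashlib.sha256(service_salt + quantized).digest()
    return int.from_bytes(digest[:8], "big") % 10 ** digits

features = [0.82, 0.40, 0.57, 0.91]   # one user's (hypothetical) measurement

# Different salts yield unlinkable codes from the same biometric,
# so two databases cannot be matched on the identifier alone.
bank_code = derive_code(features, b"bank-salt")
store_code = derive_code(features, b"store-salt")
assert bank_code != store_code

# A noisy re-measurement that falls in the same bins reproduces the code.
assert derive_code([0.84, 0.42, 0.58, 0.93], b"bank-salt") == bank_code
```

Because the hash is one-way, a compromised database reveals only the code, not the fingerprint or iris pattern itself; and re-salting lets a service issue a fresh code if one is compromised, which partially answers the objection that biometrics, unlike keys, cannot be updated or destroyed.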
Government policy-makers and engineers in industry need to collaborate to ensure:
Industry and government also need to set up and fund a research organization (or extend the research scope of the Government-funded Biometric Consortium) to design biometric authentication systems that fall within the realm of privacy-enhancing technology. Such a collaboration could solve the privacy problems raised by security solutions that use biometric identifiers. It would also provide a model for how to approach the wider privacy issue that is a consequence of the ubiquitous presence of computers and the wealth of information available on the Internet.
The digital evolution that we are witnessing today is leaning ever more strongly toward smart environments in which humans and computers are in symbiosis. Biometric identification and authentication schemes are a first step toward this, as they blur the line between a person's claim to an identity and his means of verifying that claim. Privacy advocates worry that the sensitive biometric information used for authentication will provide yet another opportunity for both private and public sector information traders to exploit individuals. Their stance is that biometrics will lead to an even deeper erosion of personal privacy in both real and cyber-space; that it will foster Big-Brother monitoring of citizens by the Government; and that individuals will lose their anonymity whenever they use biometric devices to authenticate themselves. In the absence of adequate legislation to regulate how such information is deployed and used, some of the predictions of the critics of biometrics may well materialize. What is needed is for policy makers (who represent the ethical interests of individuals) and engineers of biometric systems (who represent the technological interests of individuals) to collaborate to establish a well-defined legal framework within which biometric technologies can safely operate and advance. The computer science and electrical engineering research and industry communities have already begun designing and implementing biometric systems tailored toward giving the user as much control as possible over his information; it is now time for policy makers to look more closely into what contributions they can make so that individuals' privacy interests are accommodated.

REFERENCES
 Prof. Arthur Miller. "Statement to Sub-Committee of US Senate on Administrative Practice and Procedure" March 14th, 1967.
 Summers, Rita C. "Secure Computing: Threats and Safeguards" (McGraw-Hill, 1997) p. 349
 The History of Fingerprints http://onin.com/fp/fphistory.html
 Loizos, Constance. "The identification that you'll never leave home without." The Red Herring magazine (Sep 1998) Available at http://www.redherring.com/mag/issue58/biometrics.html
 See  p. 341
 Schneier, Bruce. "Biometrics: Uses and Abuses." Available at http://www.counterpane.com/insiderisks1.html
 Macavinta, Courtney "DoubleClick, Abacus merge in $1.7 billion deal" (Nov. 24, 1999) Available at http://home.cnet.com/category/0-1005-200-1463444.html
 Lawrence Lessig. The Architecture of Privacy. Presented at the Taiwan Net '98 Conference in Taipei, March 1998. Available at http://cyber.law.harvard.edu/works/lessig/architecture_priv.pdf
 Roger Clarke, "Introduction to Dataveillance and Information Privacy, and Definitions of Terms" Available at http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html
 See .
 See .
 Gellman, Robert. "Does Privacy Law Work?" Technology and Privacy: The New Landscape, edited by Philip E. Agre and Marc Rotenberg. MIT Press (1997)
 See  p. 29
 "Biometric Applications: Legal and Societal Considerations." Available at http://www-engr.sjsu.edu/biometrics/publications_consideration.html
 "Privacy and Biometrics" Available at http://www.ipc.on.ca/web_site.eng/matters/sum_pap/papers/pri-biom.htm
 Davies, Simon G. "Touching Big Brother. How biometric technology will fuse flesh and machine." Available at http://www.pclink.com/sarakawa/files/biometric.htm
 "IBIA Announces Privacy Principles" March 25, 1999. Available at http://www.ibia.org/press3.htm