A. Michael Froomkin

Document information and copyright notice

To table of contents

Notes for Part I, Sections A and B

22. Cryptography remains of paramount importance in guarding military and other national-security-related secrets during both peacetime and wartime. These uses of cryptography are outside the scope of this Article, although it bears mentioning that to date the government remains by far the largest producer and consumer of cryptography in this country. See ACM Report, supra note 15, at 12 (noting that the private market for cryptography remains a niche market in which a handful of companies gross only a few tens of millions of dollars annually). Back to text

23. [**PAGE 719**]But see infra part I.C.1.b (discussing the NSA's traffic analysis); infra note 405 and accompanying text (discussing the use of voice recognition as a surveillance tool). Back to text

24. See, e.g., Scheppele, supra note 1, at 302 (commenting on the importance of secrecy at the individual level). Back to text

25. Of course, cryptography will not protect against all breaches of security. There are many other ways to lose one's privacy, including trusting the wrong people, leaving things in plain sight, and, of course, simple stupidity. Electronic listening devices also make people vulnerable. See, e.g., Kent Greenfield, Comment, Cameras in Teddy Bears: Electronic Visual Surveillance and the Fourth Amendment, 58 U. Chi. L. Rev. 1045, 1047-48 (1991) (describing miniature video cameras and other surveillance technologies); see also High-Tech Tools for Police Will "See Through" Clothes, Int'l Herald Trib., Dec. 19, 1994, at 4 (reporting that police may soon carry electromagnetic wave imagers that detect guns concealed under clothing). Back to text

26. See Gilles Garon & Richard Outerbridge, DES Watch: An Examination of the Sufficiency of the Data Encryption Standard for Financial Institution Information Security in the 1990s, Cryptologia, July 1991, at 177, 177 (stating that since its adoption in 1977, DES "has become the most widely used cryptographic system in the world"); Lance J. Hoffman et al., Cryptography Policy, Comm. ACM, Sept. 1994, at 109, 111 (noting that the Clearing House Interbank payment system currently moves an average of $1 trillion each day via wire and satellite). Both systems use the U.S. Data Encryption Standard (DES), which arguably has reached or will soon reach the end of its useful life for high-value security. See infra part I.B (noting that existing methods of encryption are beginning to look dated and vulnerable). Back to text

27. [**PAGE 720**] See Gerald Murphy, U.S. Dep't of the Treasury, Directive: Electronic Funds and Securities Transfer Policy--Message Authentication and Enhanced Security, No. 16-02, § 3 (Dec. 21, 1992). Back to text

28. See American Nat'l Standards Committee on Financial Services, X9 Secretariat, American Bankers Ass'n, American National Standard for Personal Identification Number (PIN) Management and Security 9 (1982) (describing proper ATM encryption standards); see also Beth E. Secaur, Note, Automated Teller Machines Under the New York Banking Law: Do They Serve Community Credit Needs?, 37 Syracuse L. Rev. 117, 120-23 (1986) (discussing the technology and development of ATMs). Back to text

29. See E-mail from Ross Anderson, University of Cambridge Computer Laboratory, to Michael Froomkin (Feb. 14, 1994) (on file with author) (discussing the technology and development of ATMs). Back to text

30. See Schneier, supra note 12, at 221 (citing cryptographic standards for bank transactions). Banks rely primarily on the U.S. Data Encryption Standard (DES). See infra part I.B.1 (discussing how DES became the standard and why that standard is becoming increasingly vulnerable). Nevertheless, consider this disturbing boast: "`Give me $1 billion and 20 people and I'll shut America down. I'll shut down the Federal Reserve, all the ATMs; I'll desynchronize every computer in the country.'" Technology as Weaponry, Info. Wk., Jan. 10, 1994, at 48, 50 (quoting futurist Alvin Toffler's recollection of an unidentified intelligence official's statement). Back to text

31. See Digital Privacy and Security Working Group, Electronic Frontier Found., Privacy, Security, and the National Information Infrastructure 2 (1993) ("Without strong cryptography, no one will have the confidence to use networks to conduct business, to engage in commercial transactions electronically, or to transmit sensitive personal information."); Hoffman et al., supra note 26, at 111 ("One of the consequences of an increasingly electronics-oriented economy will be the need to provide some amount of anonymity and privacy for users of such a digital cash system in order to ensure that electronic money remains anonymous and untraceable . . . ."). For a discussion of digital signatures, see infra Technical Appendix, part C. Back to text

32. [**PAGE 721**] See infra text preceding note 798 (noting that digital signatures uniquely identify the sender and connect the sender to the message). Back to text

33. A properly generated digital signature copied from one message has only an infinitesimal chance of successfully authenticating any other message. See infra note 799 and accompanying text. Back to text

34. See infra text accompanying note 798. Back to text

35. See Inquiry on Privacy Issues Relating to Private Sector Use of Telecommunications-Related Personal Information, 59 Fed. Reg. 6842, 6842 [hereinafter Inquiry on Privacy Issues] ("As the [National Information Infrastructure] develops, Americans will be able to access numerous commercial, scientific, and business data bases . . . [and] engage in retail, banking and other commercial transactions . . . all from the comfort of their homes."); see also Microsoft and Visa to Provide Secure Transaction Technology for Electronic Commerce, PR Newswire, Nov. 8, 1994, available in WESTLAW, PRNews-C database (announcing plans to provide secure electronic bankcard transactions across global public networks using RSA encryption). Back to text

36. Inquiry on Privacy Issues, supra note 35; cf. Jeffrey Rothfeder, Privacy for Sale: How Computerization Has Made Everyone's Private Life an Open Secret 28 (1992) ("As the population grows more computer literate and databanks become more prevalent and sophisticated, long-distance, invisible assaults on privacy will occur more frequently."). Back to text

37. See infra part I.A.5 (discussing use of cryptography by criminals). Back to text

38. [**PAGE 722**]See Economics and Statistics Admin. & Bureau of the Census, U.S. Dep't of Commerce, Statistical Abstract of the United States 1993, at 596 [hereinafter 1993 U.S. Statistical Abstract]. This sum includes all research and development conducted outside the government, regardless of whether funded by industry or government. Back to text

39. See GAO Communications Privacy, supra note 15, at 12. Back to text

40. See, e.g., ACM Report, supra note 15, at 1 (describing electronic industrial espionage against an Alaskan oil company and by British Airways against Virgin Atlantic Airways). Back to text

41. See James Daly, Laptop Thefts Spur Security Efforts, Computerworld, Oct. 12, 1992, at 1, 12 (discussing theft of laptops to obtain corporate plans, and the ways devised by firms to deny access to information on laptops). Back to text

42. See Key Escrow: Its Impact and Alternatives 6 (May 3, 1994) (testimony of Dr. Whitfield Diffie, Distinguished Engineer, Sun Microsystems, Inc., before the Subcommittee on Technology and Law of the Senate Judiciary Committee) (on file with author) (discussing factors making security more essential and more difficult to achieve). Back to text

43. [**PAGE 723**]According to FBI Director Louis Freeh, the governments of at least 20 nations are "actively engaged in economic espionage." Louis J. Freeh, Address at the Executives' Club of Chicago 8 (Feb. 17, 1994) (transcript available at the FBI) [hereinafter Freeh Speech]; see also ACM Report, supra note 15, at 1 (describing Soviet electronic surveillance of the IBM corporation in the 1970s); id. at 24 (describing the U.S. as the "greatest potential prey" of communications intelligence); David Silverberg, Spy Charges Fray Ties Between U.S., France; French Officials Refute Espionage Accusations, Def. News, May 3, 1993, available in LEXIS, News Library, Curnws File (describing U.S. accusations of industrial espionage by the French government allegedly aimed at 49 U.S. manufacturing companies, 26 financial institutions, and various U.S. government laboratories). Back to text

44. See Freeh Speech, supra note 43, at 11 (urging private businesses to be vigilant in protecting valuable information such as research and development results, marketing plans, and corporate negotiating positions). Back to text

45. CIA Director James Woolsey described economic and industrial espionage by the CIA as "the hottest current topic in intelligence." Ross Thomas, Industrial Espionage: The CIA's New Frontier, L.A. Times, July 18, 1993, at M2. The suggestion that the CIA diversify into industrial espionage received some support. See, e.g., Gerard P. Burke, Economic Espionage: Government Help Is Needed, Gov't Executive, Nov. 1992, at 56 (noting that the U.S. government should attend to the "intelligence being directed against American business . . . without pause for philosophical agonizing"). It was also criticized because it was unclear how the CIA proposed to define a "foreign" corporation and how it proposed to decide which "domestic" corporations would enjoy the spoils. See William T. Warner, Economic Espionage: A Bad Idea, Nat'l L.J., Apr. 12, 1993, at 13. Later in 1993, Director Woolsey stated that for "ethical and legal reasons" the CIA had decided not to embark on the project. Tim Kennedy, Men and Matters, Moneyclips, Dec. 12, 1993, available in LEXIS, News Library, Moclip File (quoting Director Woolsey's statement on Larry King Live). But see Robert Dreyfuss, Company Spies: The CIA Has Opened a Global Pandora's Box by Spying on Foreign Competitors of American Companies, Mother Jones, May-June 1994, at 16, 16 (suggesting that the "CIA has already begun a clandestine effort to help the American auto industry"). Back to text

46. [**PAGE 724**]A 1993 survey by the American Bar Association Legal Technology Resource Center found that almost 75% of attorneys have a computer assigned to them. See Betty Cline, Protecting Electronic Confidences, Legal Times, June 20, 1994, at S30, S30; see also David P. Vandagriff, Opening the Computer Door, A.B.A. J., Aug. 1994, at 92, 92 (describing a law firm which relies on e-mail to communicate with and attract clients). This does not, of course, prove that all lawyers use their computers. Back to text

47. Communications between a client and her attorney, made in confidence by the client when seeking legal advice, are privileged, unless this protection is waived by the client or her representative. This privilege can be waived by unintentional disclosure. See 81 Am. Jur. 2d Witnesses § 379 (1992) ("The presence of a third person indicates a lack of intention that the communications . . . are meant to be confidential."). But see 2 B.E. Witkin, California Evidence § 1074, at 1019 (3d ed. 1986) (discussing California Evidence Code § 954, which permits the holder of the privilege to prevent disclosure of privileged communications, and extends the privilege to communications which are overheard by an "eavesdropper, finder or interceptor"). Back to text

48. See Jeffrey I. Schiller, Secure Distributed Computing, Sci. Am., Nov. 1994, at 72, 72 (suggesting an increasing frequency of "passive attacks"--eavesdropping--on the Internet). For an assessment of the security of commercial services, such as CompuServe, as a medium for attorney-client confidences, see Ronald Abramson, Protecting Privilege in E-mail Systems, Legal Times, Aug. 15, 1994, at 29.

Although the Internet grew out of the Defense Department network, it is now insecure. As a result, the U.S. intelligence community has created the "Intelink," a secure alternative network for communications too important to be entrusted to the Internet. See William F. Powers, Cloak and Dagger Internet Lets Spies Whisper in Binary Code, Wash. Post, Dec. 28, 1994, at A4 (noting that the intelligence community created the "Intelink" because the "very public, very uncontrollable global mesh of computer networks [(the Internet)] was too risky a place to do business"). Back to text

49. See Peter H. Lewis, Computer Snoopers Imperil Pentagon Files, Experts Say, N.Y. Times, July 21, 1994, at A1, B10 (reporting that there "`are probably no secure systems on the Internet'" (quoting Peter G. Neumann, principal scientist at SRI International, a think tank formerly known as the Stanford Research Institute)); see also Terri A. Cutrera, Comment, The Constitution in Cyberspace: The Fundamental[**PAGE 725**] Rights of Computer Users, 60 UMKC L. Rev. 139, 140-42 (1991) (surveying "hackers' skirmishes with the law"). Back to text

50. As the ACM Report states:

[T]here has been a migration of communications from more secure media such as wirelines or physical shipment to microwave and satellite channels; this migration has far outstripped the application of any protective measures. Consequently, communications intelligence is so valuable that protecting its flow . . . is an important objective of U.S. national security policy.
ACM Report, supra note 15, at 24. Back to text

51. See, e.g., Cline, supra note 46, at S30 (describing the risks and ethical concerns of the increased use of technology in the legal field as well as the possible ways to protect confidential information). Back to text

52. See Vincent M. Brannigan & Ruth E. Dayhoff, Medical Informatics: The Revolution in Law, Technology, and Medicine, 7 J. Legal Med. 1, 48-50 (1986) (noting that there are several different approaches in the law to protect patient privacy, including tort litigation and violation of state or federal privacy acts). Back to text

53. See, e.g., 81 Am. Jur. 2d Witnesses § 448 (1992) (describing physicians' evidentiary privileges). Back to text

54. The recent theft of a laptop computer from the hypnotherapist who treated the Princess of Wales illustrates the dangers to doctors and their patients. After the theft, the British press began an orgy of speculation about the revelations that might emerge, see Edward Pilkington, Theft from Princess's Bulimia Therapist Raises New Privacy Fears, Guardian, Aug. 1, 1994, at 3, although none did. Back to text

55. See generally infra Technical Appendix, part C (discussing digital signatures). Back to text

56. [**PAGE 726**] See Gustavus J. Simmons, Subliminal Communication Is Easy Using the DSA, in Advances in Cryptology-EUROCRYPT '93: Workshop on the Theory and Application of Cryptographic Techniques 218, 219 (Tor Helleseth ed., 1994) (describing the various types of information that can and may be digitized onto an ID card). Back to text

57. In the 1970s the Pentagon admitted that the Army was stamping discharge papers with 530 different "SPN" code numbers that gave savvy employers derogatory information about servicemen, including some with honorable discharges. The codes did not appear on discharge papers issued to servicemen but were available to employers who asked for more detailed records. Classifications included "drug abuse," "disloyal or subversive security program," "homosexual tendency," "unsuitability--apathy, defective attitudes and inability to expend effort constructively," and "unsuitability--enuresis [bed wetting]." See Dana A. Schmidt, Pentagon Using Drug-Abuse Code, N.Y. Times, Mar. 1, 1972, at 11. Receipt of antiwar literature sufficed to be classified as disloyal or subversive. See Peter Kihss, Use of Personal-Characterization Coding on Military Discharges Is Assailed, N.Y. Times, Sept. 30, 1973, at 46. In response to public pressure, the Pentagon abandoned the program and reissued discharge papers without the codes. See Pentagon Abolishes Code on Discharges of Military Misfits, N.Y. Times, Mar. 23, 1974, at 64; Uncoded Discharge Papers Are Offered to Veterans, N.Y. Times, Apr. 28, 1974, at 33. Back to text

58. [**PAGE 727**]U.S. paper money is not completely anonymous, however, because each (authentic) bill carries a unique serial number and bills can be marked to facilitate tracking. Back to text

59. For example, when my spouse and I purchase surprise gifts for each other, we tend to pay in cash because we have joint checking and credit card accounts. Back to text

60. See David Chaum, Achieving Electronic Privacy, Sci. Am., Aug. 1992, at 96, 96-97 (discussing electronic cash). See generally infra Technical Appendix, part C (describing digital signatures). Back to text

61. A "perfect crime" works as follows: The criminal commits an act of extortion, for example, blackmail or kidnapping, which does not require face-to-face contact with the victim to make the demand for money. Instead of demanding small unmarked bills, the extortionist demands that the victim publish the digital signatures of a large quantity of E$ in a newspaper. Because the "payoff" occurs via publication in a newspaper, there is no danger of being captured while attempting to pick up a ransom. And because the E$ is untraceable, the extortionist is able to spend it without fear of marked bills, recorded serial numbers, or other forms of detection.
[**PAGE 728**] Currently, this strategy would require a sophisticated criminal, because the extortion demand would have to include the result of computations based on large random numbers, but not the random numbers themselves. These computational results would be used by the digital bank as inputs for its production of the verified E$ and would not only ensure the untraceability of the E$ but also prevent anyone but the criminal--who is the only one who knows the large random numbers--from using the E$ whose digital signatures are published in the newspaper. See Sebastiaan von Solms & David Naccache, On Blind Signatures and Perfect Crimes, 11 Computers & Security 581, 582-83 (1992) (describing the mathematical steps that must be followed in order to effectuate a "perfect crime"). If, however, digital money becomes commonplace, all the necessary functions will be built into easily available software. This may not be too far away. See Peter H. Lewis, Attention Shoppers: Internet Is Open, N.Y. Times, Aug. 12, 1994, at D1 (describing the purchase of a compact disc via the Internet by using digital signatures and high-grade cryptography to encrypt a credit card number). Back to text
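The blinding arithmetic that makes this scheme work (due to David Chaum, cited in note 60) can be sketched in a few lines of Python. Everything here is illustrative: the primes, the message value, and the function names are toy assumptions, and real digital-cash systems use far larger keys plus a full payment protocol around the signature.

```python
# Sketch of a Chaum-style RSA blind signature, the primitive behind the
# untraceable E$ described in note 61. Toy key sizes for illustration only.
import math
import secrets

# Bank's RSA key pair (small demonstration primes -- hypothetical values).
p, q = 104729, 1299709
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def blind(m, n, e):
    """Customer blinds note m with a random factor r before sending it to the bank."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        if math.gcd(r, n) == 1:
            break
    return (m * pow(r, e, n)) % n, r

def bank_sign(blinded, n, d):
    """Bank signs the blinded note; it never sees m itself."""
    return pow(blinded, d, n)

def unblind(signed, r, n):
    """Customer strips the blinding factor, leaving an ordinary signature on m."""
    return (signed * pow(r, -1, n)) % n

def verify(m, sig, n, e):
    return pow(sig, e, n) == m % n

m = 123456789  # the "serial number" of one digital coin
blinded, r = blind(m, n, e)
sig = unblind(bank_sign(blinded, n, d), r, n)
assert verify(m, sig, n, e)           # the signature validates this coin...
assert not verify(m + 1, sig, n, e)   # ...and is useless for any other coin
```

Because the bank signs only the blinded value, it cannot later link the coin it sees in circulation to the withdrawal, which is precisely the property the "perfect crime" exploits.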

62. See, e.g., Dan Lehrer, Clipper Chips and Cypherpunks, 259 Nation 376, 376 (1994) (describing William Steen's use of PGP to encrypt what sheriff's deputies claimed was potential evidence of traffic in child pornography). Similarly, the defendant in Commonwealth v. Copenhefer would have benefitted from encryption. See 587 A.2d 1353, 1355-56 (Pa. 1991) (holding that an additional warrant was not required to retrieve incriminating data "deleted," but still recoverable, from the defendant's computer's hard disk). Back to text

63. See infra text accompanying note 761. Back to text

64. See Hoffman et al., supra note 26, at 111. Back to text

65. See 18 U.S.C. § 2511 (1988) (providing that "any person who . . . intentionally intercepts . . . any wire, oral, or electronic communication . . . shall be fined . . . or imprisoned not more than five years, or both"). Back to text

66. See John Markoff, Electronics Plan Aims to Balance Government Access with Privacy, N.Y. Times, Apr. 16, 1993, at A1, A18 ("[C]ellular phone calls can be monitored by anyone with an inexpensive scanner."). Back to text

67. [**PAGE 729**] See ACM Report, supra note 15, at 10 (referring to "the government's STU-III secure telephone system, which is inaccessible to the general public"). Back to text

68. See supra note 50. Back to text

69. See Hoffman et al., supra note 26, at 111. Back to text

70. New AT&T Security Device Targets Spying by Fax, PR Newswire, June 13, 1994, available in LEXIS, News Library, Curnws File. Back to text

71. See Microsoft At Work Fax Software Debuts in Windows for Workgroups 3.11, Business Wire, Oct. 5, 1993, available in LEXIS, News Library, Curnws File. Back to text

72. See Hoffman et al., supra note 26, at 111. Back to text

73. See Schneier, supra note 12, at 437. The most popular program is Phil Zimmermann's Pretty Good™ Privacy (PGP™), currently in MIT freeware version 2.6.2 for noncommercial use only, and for commercial use in PGP Viacrypt version 2.7. See Philip Zimmermann, PGP™ User's Guide Volume I: Essential Topics (Oct. 11, 1994), available online [hereinafter PGP™ User's Guide]. PGP is available to U.S. and Canadian residents via file transfer protocol[**PAGE 730**] (FTP). Ftp is the file transfer program by which documents and program files can be retrieved from any computer on the Internet, where they have been placed for public copying.

In order to comply with U.S. export restrictions, however, the Massachusetts Institute of Technology requires that would-be downloaders read the warnings located in a file whose name changes every 30 minutes before obtaining the short-lived access code needed to download the program. Foreign residents, or those with less patience, can download the file by connecting to an English server:, or a German server:, and then selecting the appropriate sub-directory for the operating system and PGP version of their choice. For an excellent introduction to PGP, see Simson Garfinkel, PGP: Pretty Good Privacy (forthcoming Jan. 1995). Back to text

74. See infra part I.C.1.a (discussing law enforcement's view of the importance of electronic intelligence gathering). Back to text

75. The Alien and Sedition Act made it a crime to publish "false, scandalous and malicious writing" against the United States government, Congress, or the President, with intent to excite "hatred" against them. 1 Stat. 596, 596 (1798). The Act, supported primarily by the Federalist Party, did not make it a crime to excite hatred against the Vice President or publish falsehoods about him because the Vice President at the time was Thomas Jefferson, who was not a Federalist. See generally James M. Smith, Freedom's Fetters: The Alien and Sedition Laws and American Civil Liberties (1956) (discussing the background, enforcement, and implications of the[**PAGE 731**]Alien and Sedition laws). Back to text

76. See William Preston, Jr., Aliens and Dissenters: Federal Suppression of Radicals 1903-1933, at 208-37 (1963) (describing the secret, mass roundups in the 1920s of some 10,000 immigrants and others active in the labor movement, the Socialist Party, the Communist Party, and other dissident groups; their interrogation without access to counsel or bail; the illegal seizure of their records; the attempts to extort confessions from them; and the FBI investigations of government officials who sought to ensure due process for these arrestees). Back to text

77. See Exec. Order No. 9066, 7 Fed. Reg. 1407 (1942) (authorizing internment camps); see also Korematsu v. United States, 323 U.S. 214, 217-18 (1944) (rejecting several constitutional challenges to the internment of U.S. citizens of Japanese descent pursuant to the Act); Hirabayashi v. United States, 320 U.S. 81, 100-01 (1943) (rejecting an equal protection challenge to a curfew order pursuant to the Act); Act of Mar. 21, 1942, Pub. L. No. 77-503, 56 Stat. 173 (criminalizing the refusal to comply with internment orders of a military commander). See generally Eugene V. Rostow, The Japanese American Cases--A Disaster, 54 Yale L.J. 489 (1945) (describing and criticizing the treatment of Japanese aliens and U.S. citizens of Japanese origin during World War II). Back to text

78. See Frank J. Donner, The Age of Surveillance: The Aims and Methods of America's Political Intelligence System 20 (1980) (arguing that COINTELPRO, an FBI counterintelligence program, is a form of punishment directed at individuals or readily identifiable groups for past actions without trial, and is thus an attainder). Back to text

79. Robert Michels, Political Parties 15 (Eden Paul & Cedar Paul trans., 1962) (arguing that "oligarchy . . . is an intrinsic part of bureaucracy or large-scale organization"). Back to text

80. National security is defined as the "national defense and foreign relations" of the United States. Exec. Order No. 12,356, 47 Fed. Reg. 14,874 (1982), reprinted in 50 U.S.C. § 401 (1988). Back to text

81. S. Rep. No. 755, 94th Cong., 2d Sess., pt. 2, at 4 (1976) [hereinafter Church Committee Report]. Back to text

82. Id. at 4, 5. Back to text

83. [**PAGE 732**]Id. at 7, 54-57. By 1958, the FBI had whittled down the list to only 12,870 names, but the FBI placed the names it removed from the round-up list on its "Communist Index" (renamed the "Reserve Index" in 1960) for "priority consideration" for "action" after the first group had been detained. Id. at 55-56.

By 1972, the FBI had access to the fingerprints of more than 85 million U.S. residents. See Morris D. Forkosch, Freedom of Information in the United States, 20 DePaul L. Rev. 1, 97 n.347 (1971). Back to text

84. See Sanford J. Ungar, FBI 137 (1975) (describing wiretapping of the Black Panthers). Back to text

85. See Church Committee Report, supra note 81, at 6 (noting that the FBI had 500,000 domestic intelligence files, many with more than one name included). Back to text

86. See id. at 6. Back to text

87. See id. Back to text

88. See id. at 6, 58-59. Back to text

89. Id. at 5. Back to text

90. [**PAGE 733**]See David Burnham, The Rise of the Computer State 20, 23-25 (1983). Although the IRS Code, 26 U.S.C. § 6103 (1988 & Supp. V 1993), provides for the confidentiality of tax returns, one commentator has described this restriction as "quite permeable." Steven A. Bercu, Toward Universal Surveillance in an Information Age Economy: Can We Handle Treasury's New Police Technology?, 34 Jurimetrics J. 383, 429 (1994). Back to text

91. See Church Committee Report, supra note 81, at 56-59 (discussing domestic CIA activities). Back to text

92. For example, a former FBI officer stated, "We never gave any thought to [whether proposed actions were legal] because we were just naturally pragmatists. The one thing we were concerned about was this: will this course of action work, will it get us what we want . . . ." Id. at 968 (quoting testimony of former FBI Assistant Director for Domestic Intelligence William Sullivan). Back to text

93. Imagine that a new client comes to consult you and says that she is about to form a new political action group. This group will organize demonstrations and will encourage a general strike to support an unpopular political or social opinion. The group intends to consult you frequently so as to stay within the bounds of the law, but it intends to use every politically and socially disruptive tactic that the law allows in order to gain the maximum audience for its platform. Your client also believes that at times some group members may conclude that extreme cases of injustice require peaceful civil disobedience. On the way out, your new client asks if you think the group should worry about having its telephones tapped by the police. What do you say? Back to text

94. Pub. L. No. 90-351, tit. III, § 802, 82 Stat. 197, 211-25, reprinted in 1968 U.S.C.C.A.N. 237, 253 (current version at 18 U.S.C. §§ 2510-2521 (1988 & Supp. V 1993) (Electronic Communications Privacy Act of 1986)) [hereinafter Title III]. Back to text

95. The volume of successful suppression orders demonstrates that the police wiretap more than the courts believe is justified; most suppression orders, however, involve cases where a court issued some process, if only on the basis of an inadequate foundation. Exposure of illegal, bad-faith, and warrantless wiretaps seems to have become rare, although clearly not completely a thing of the past. See, e.g., Eric Schmitt, Suffolk Officers Testify They Wiretapped Illegally, N.Y. Times, Jan. 14, 1988, at B3 (reporting narcotics officers' testimony that they conducted illegal wiretaps with[**PAGE 734**]the approval of their supervisor and the chief of the District Attorney's narcotics bureau). Back to text

96. Robert D. Hershey, Jr., I.R.S. Staff Is Cited in Snoopings, N.Y. Times, July 19, 1994, at D1. Back to text

97. See Office of Technology Assessment, Congress of the United States, Information Security and Privacy in Network Environments, 2-3 (1994) (OTA-TCT-606) [hereinafter OTA Information Security]. Back to text

98. Elaine Viets, And the Winner Is: The Trashiest Ever, St. Louis Post-Dispatch, May 29, 1990, at 3D; see also Mark Miller, Vermont Writer Wins PEN Award, Wash. Times, Apr. 21, 1993, at E4, available in LEXIS, News Library, Curnws File (quoting ex-postal worker E. Annie Proulx as saying, "I worked in the post office, and I know perfectly well that everyone loves to read postcards. . . . Not only do [the postal workers] read them, but if they get a particularly juicy one, they show it to their co-workers" (alteration in original)). Back to text

99. See Sam J. Ervin, Jr., The Whole Truth: The Watergate Conspiracy 133-37 (1980) (discussing Nixon's electoral dirty tricks); Jonathan Alter & Richard Sandza, When Tricks Turn Dirty, Newsweek, July 18, 1983, at 18, 18 (reporting that Jimmy Carter agreed not to bug television networks at the 1976 Democratic Convention only after a media advisor raised the specter of Watergate). See generally Bruce L. Felknor, Dirty Politics (1966) (discussing the history of dirty tricks in American politics). Back to text

100. See, e.g., Bell of Pa., Consumer Yellow Pages: Philadelphia 276-77 (1994). Back to text

101. See Tom Seibel, The Spy Shop: Now Anyone Can Play James Bond, Chicago Sun-Times, May 16, 1993, at 4, available in LEXIS, News Library, Majpap File. Back to text

102. [**PAGE 735**]See Cryptographic Algorithms for Protection of Computer Data During Transmission and Dormant Storage, 38 Fed. Reg. 12,763, 12,763 (1973) ("The increasing volume, value and confidentiality of these records regularly transmitted and stored by commercial and government agencies has led to heightened recognition and concern over their exposure to unauthorized access and use. . . . The need for protection is then apparent and urgent."); Encryption Algorithms for Computer Data Protection, 39 Fed. Reg. 30,961, 30,961 (1974) ("Because of the significant value or sensitivity of communicated and stored data, the need for adequate protection of this data from theft and misuse has become a national issue."). Back to text

103. See Encryption Algorithm for Computer Data Protection, 40 Fed. Reg. 12,134, 12,134 (1975) ("In order to insure compatibility of secure data, it is necessary to establish a data encryption standard and develop guidelines for its implementation and use."). Back to text

104. The algorithm the NBS ultimately selected did not meet all these criteria[**PAGE 736**]because other agencies considered it too strong to export. See infra part I.C.1.c.i (describing U.S. export control of encryption software under the International Traffic in Arms Regulations (ITAR)). Back to text

105. Bamford, supra note 17, at 347; see U.S. Senate Select Comm. on Intelligence, Unclassified Summary: Involvement of the NSA in the Development of the Data Encryption Standard, reprinted in IEEE Comm., Nov. 1978, at 53, 53-55 (discussing the debate over the NSA involvement in the development of the algorithm). Back to text

106. DES, issued as FIPS 46 in January 1977, was reviewed, slightly revised, reaffirmed for federal government use in 1983 and 1987, and reissued as FIPS 46-1 in January 1988; on September 11, 1992, NIST announced a third review of FIPS 46-1, DES, and reaffirmed it for another five years as FIPS 46-2. See Revision of Federal Information Processing Standard (FIPS) 46-1 Data Encryption Standard (DES), 58 Fed. Reg. 69,347, 69,347-48 (1993) [hereinafter FIPS 46-2].

DES is identical to the ANSI standard Data Encryption Algorithm (DEA) defined in ANSI X3.92-1981. See Eric Bach et al., Cryptography FAQ (05/10: Product Ciphers Cryptology) § 5 (Aug. 30, 1993), available online URL cryptography-faq/part05. Back to text

107. See FIPS 46-2, supra note 106, at 69,348. Back to text

108. See Garon & Outerbridge, supra note 26, at 179. Back to text

109. See FIPS 46-2, supra note 106, at 69,348. Back to text

110. See Bamford, supra note 17, at 346 (stating that the key was originally 128 bits long); Schneier, supra note 12, at 221 (stating that the key was originally 112 bits). Back to text

111. See Bamford, supra note 17, at 348 (noting that if the IBM 128-bit key had been used, "[a]s opposed to the moderate $5000 price tag, each solution would have cost an unimaginable $200 septillion, or $200,000,000,000,000,000,000,000,000"); infra text accompanying notes 776-80. Back to text

112. "Back doors" are sometimes inaccurately called "trap doors," although [**PAGE 737**]technically a "trap door" function is one which is computationally easy in comparison to its inverse. For example, multiplying large prime numbers is much, much easier than factoring a very large number whose only factors are two large primes. See Brian Hayes, The Magic Words Are Squeamish Ossifrage, 82 Am. Scientist 312, 312 (1994); see also 19 Authors, Essay, The Law of Prime Numbers, 68 N.Y.U. L. Rev. 185, 188 n.14 (1993) (I had to cite it). Back to text
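The asymmetry described in this note can be illustrated with a short sketch (the toy numbers and naive trial-division routine are mine, not the article's): multiplying two primes is a single cheap operation, while recovering them requires searching many candidate divisors.

```python
# Toy illustration of the "trap door" asymmetry: the forward direction
# (multiplication) is one operation; the inverse (factoring) is a search.

def multiply(p, q):
    # The "easy" direction: a single multiplication.
    return p * q

def factor_by_trial_division(n):
    # The "hard" direction: naive search for the smaller prime factor.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n  # n itself is prime

n = multiply(10007, 10009)          # instant
print(factor_by_trial_division(n))  # (10007, 10009), found only by search
```

Real trap-door systems use primes hundreds of digits long, where even the best known factoring algorithms (far better than trial division) remain impractical.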

113. See Bamford, supra note 17, at 347. For a full description of the "S-box" technique, complete with mind-numbing diagrams, see Schneier, supra note 12, at 224-41, or Encryption Algorithm for Computer Data Protection, supra note 103, at 12,134. Back to text

114. See Schneier, supra note 12, at 240. Back to text

115. Exports are controlled pursuant to the ITAR, 22 C.F.R. §§ 120-130 (1994). See infra part I.C.1.c.i. Back to text

116. [**PAGE 738**]ACM Report, supra note 15, at 5. Back to text

117. See, e.g., Schneier, supra note 12, at 224-41. Back to text

118. See Letter from William B. Robinson, Director, Office of Defense Trade Controls, U.S. Dep't of State, to Phil Karn (Mar. 2, 1994), available online URL EFF/policy/crypto/ITAR_export/Karn_Schneier_export_case/book_1st.response (noting that a book is "not subject to the licensing jurisdiction of the Department of State since the item is in the public domain"). Back to text

119. In contrast to DES, the government has classified the SKIPJACK encryption algorithm used by the Clipper family of chips. This should prevent the SKIPJACK algorithm from being exported. See infra note 187 and accompanying text (discussing the SKIPJACK algorithm). Back to text

120. See FIPS 46-2, supra note 106, at 69,347. Back to text

121. See generally Garon & Outerbridge, supra note 26, at 177-82 (arguing that DES is becoming increasingly vulnerable to attack as the cost of breaking it decreases exponentially). Back to text

122. Michael J. Wiener, Efficient DES Key Search § 10 (Aug. 20, 1993) (unpublished manuscript, on file with author). The attack requires that the attacker have access to the plaintext as well as the ciphertext of a single message. Cryptologists call this a "known plaintext" approach. Such attacks are easier than one might suppose, because one does not need a long, known plaintext and it is often possible to infer something about the contents of the message one seeks to cryptanalyze. Back to text
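The "known plaintext" search described in this note can be sketched in a few lines. This is a toy stand-in (a 16-bit XOR "cipher" of my own invention, not DES, whose key space is 2^56): given one plaintext/ciphertext pair, the attacker tries every key until one reproduces the observed ciphertext.

```python
# Minimal sketch of a known-plaintext brute-force key search, using a
# toy 16-bit XOR cipher as a stand-in for DES.

def toy_encrypt(key, block):
    # Toy invertible "cipher": XOR with a 16-bit key. Not DES.
    return block ^ key

def brute_force(plaintext, ciphertext):
    # Try all 2^16 keys; return the one consistent with the known pair.
    for key in range(1 << 16):
        if toy_encrypt(key, plaintext) == ciphertext:
            return key
    raise ValueError("no key found")

secret_key = 0xBEEF
pt = 0x1234
ct = toy_encrypt(secret_key, pt)
print(hex(brute_force(pt, ct)))  # recovers 0xbeef
```

The structure is the same for DES; only the size of the loop changes, which is why Wiener's estimates turn on hardware speed rather than on any cleverer idea.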

123. See id. § 4. Back to text

124. See Garon & Outerbridge, supra note 26, at 181. Back to text

125. In fact, there is more to a successful electronic funds transfer attack than breaking the code. Banking protocols contain other safeguards designed to thwart a "known plaintext" attack, making the calculations in the text more of a theoretical possibility than a likelihood. Letter from Dorothy Denning, Professor and Chair, Computer Sciences Department, Georgetown University, to Michael Froomkin 2 (Sept. 17, 1994) (on file with author). Back to text

126. Garon and Outerbridge estimate a one in 200,000 chance for 512 linked machines running under 20 MIPS each, at which speed they are capable of 15,000 DES operations per second. See Garon & Outerbridge, supra note 26, at 187. Pentiums should be at least seven times as fast. See John Blackford, The Promise of the P6, Computer Shopper, Aug. 1994, at 146 (noting that a 100 MHz Pentium is rated at 150 MIPS and that successor chips will be twice as fast). The estimate is probably low. Phil Karn recently reported using an optimized DES code on a 50 MHz 486, and achieving more than 38,000 DES encryptions per second. See Posting from Phil Karn to Cypherpunks Mailing List (Aug. 6, 1994) (on file with author). Pentiums operating at 100 MHz would probably run more than 2.5 times faster than this. Back to text

127. A MIPS-year is the work performed by a computer capable of executing one million instructions per second (1 MIPS) operating for a year. Five thousand MIPS-years is approximately equal to the power of thirty-three 100 MHz Pentiums running for a year. See Blackford, supra note 126, at 146. Back to text
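The arithmetic behind the figure in this note is simply 5000 MIPS-years divided by the 150-MIPS rating cited for a 100 MHz Pentium:

```python
# MIPS-year arithmetic from the note: how many 150-MIPS machines are
# needed to deliver 5000 MIPS-years of work in one year?

mips_years = 5000
pentium_mips = 150                # rating cited from Blackford, supra
machines = mips_years / pentium_mips
print(round(machines))            # about 33 Pentiums running for a year
```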

128. See Hayes, supra note 112, at 312; Gina Kolata, 100 Quadrillion Calculations Later, Eureka!, N.Y. Times, Apr. 27, 1994, at A13. The problem, known as RSA-129, was first publicized in the Mathematical Games column of Scientific American in 1977. The challenge was to factor 114,381,625,757,888,867,669,235,779,976,146,612,010,218,296,721,242,362,562,561,842,935,706,935,245,733,897,830,597,123,563,958,705,058,989,075,147,599,290,026,879,543,541. See Martin Gardner, Mathematical Games: A New Kind of Cipher That Would Take Millions of Years to Break, Sci. Am., Aug. 1977, at 120, 123. The problem remained unsolved for 17 years. The answer, 5000 MIPS-years later, was the factors 3,490,529,510,847,650,949,147,849,619,903,898,133,417,764,638,493,387,843,990,820,577 and 32,769,132,993,266,709,549,961,988,190,834,461,413,177,642,967,992,942,539,798,288,533. See Hayes, supra note 112, at 313. The message encoded with the 129-digit key said: "The magic words are squeamish ossifrage." See id. at 312. Back to text
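One pleasing feature of the RSA-129 result, noted by Hayes, is that anyone can verify it instantly, even though finding the factors took 5000 MIPS-years: multiplying the two reported factors back together takes a fraction of a second.

```python
# Verifying the RSA-129 factorization quoted above: the two factors
# reported by Hayes multiply back to the original 129-digit challenge.

n = int(
    "114381625757888867669235779976146612010218296721242362562561842935"
    "706935245733897830597123563958705058989075147599290026879543541"
)
p = 3490529510847650949147849619903898133417764638493387843990820577
q = 32769132993266709549961988190834461413177642967992942539798288533

assert p * q == n   # verification is cheap; finding p and q was not
print(len(str(n)))  # 129 digits
```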

129. For a definition of RSA, see Paul Fahn, RSA Laboratories, Answers to Frequently Asked Questions About Today's Cryptography § 2.1 (Sept. 20, 1993), available online URL [hereinafter RSA Cryptography Today FAQ]. Back to text

130. Brute-force attacks on RSA keys can use shortcuts, because an RSA key is cracked by factoring a large number into its two large prime factors. Shortcuts exist for this problem so that not every possible number needs to be tested. "A rule of thumb suggests that adding 10 decimal digits to the length of a number [used as an RSA key] makes it from five to 10 times as hard to factor." Hayes, supra note 112, at 316. Adding the same number of decimal digits to a DES key would result in a far larger increase in the computational complexity of a brute-force keysearch. Back to text
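The contrast in this note can be made concrete with a back-of-the-envelope calculation (my arithmetic, not the article's): ten extra decimal digits multiply a raw key space by 10^10, so an exhaustive search becomes about 10^10 times harder, while under Hayes's rule of thumb factoring becomes only 5 to 10 times harder.

```python
# Growth comparison: exhaustive search vs. factoring shortcuts,
# per 10 added decimal digits of key length.

extra_digits = 10
exhaustive_growth = 10 ** extra_digits  # brute force must cover the whole space
factoring_growth = 10                   # upper end of Hayes's 5-10x rule of thumb

print(exhaustive_growth // factoring_growth)  # a billion-fold gap
```

This is why RSA keys must be so much longer than symmetric-cipher keys to offer comparable resistance.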

131. See Wiener, supra note 122, § 9 (providing a schematic for triple-DES encryption). Back to text

132. Some versions of triple-DES use a different key for each of the three stages, while others reuse the first key for the final stage. The two-key version is estimated to be 10^13 times as resistant to a brute-force attack as single DES; the three-key version is estimated to be even more secure. See id. Back to text
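The two-key construction described in this note is conventionally written "EDE": encrypt with K1, decrypt with K2, encrypt with K1 again. A sketch of the structure follows, with a toy invertible cipher of my own standing in for real DES (which is far too long to reproduce here); a three-key version would substitute a third key in the final stage.

```python
# Sketch of two-key triple-DES (EDE mode), with a toy modular-addition
# "cipher" standing in for DES.

def toy_e(key, block):
    # Toy invertible "cipher": add the key modulo 2^16. Not DES.
    return (block + key) & 0xFFFF

def toy_d(key, block):
    return (block - key) & 0xFFFF

def triple_encrypt(k1, k2, block):
    # E-D-E with keys K1, K2, K1 -- the two-key variant.
    return toy_e(k1, toy_d(k2, toy_e(k1, block)))

def triple_decrypt(k1, k2, block):
    # Inverse: D-E-D with keys K1, K2, K1.
    return toy_d(k1, toy_e(k2, toy_d(k1, block)))

ct = triple_encrypt(0x1111, 0x2222, 0xABCD)
print(hex(triple_decrypt(0x1111, 0x2222, ct)))  # 0xabcd round-trips
```

A convenient property of EDE is backward compatibility: setting K1 = K2 collapses the construction to a single DES encryption.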

133. The alternatives are legion. The NSA might not want to certify a cipher it knew to be insecure, although it also might not wish to let it be known that a cipher considered secure by others was in fact vulnerable. The most likely explanation, however, is that triple-DES is so secure as to be computationally infeasible to break.

The hypothesis that the NSA opposes triple-DES because it is too hard to break gains support from the NSA's lobbying campaign to discourage the X9 secretariat of the American Bankers Association from undertaking a standards development process that might lead to the adoption of triple-DES as an approved option for domestic and international financial transactions. The NSA, which is a member of the X9 group, gave several reasons for its opposition, including:

* The government's commitment to EES and DES is inconsistent with this objective.
* Triple-DES is not exportable.
* "[F]urther proliferation of triple-DES is counter to national security and economic concerns."
NSA Reasons for Negative Vote (Oct. 18, 1994) (circulated with X9 letter ballot) (copy on file with author). Ultimately, after a reballoting of its executive committee, the X9 membership decided to undertake the standards development process for triple-DES by forwarding the issue to its X9F Subcommittee on Data and Information Security. See Letter from Cindy Fuller, Associate Director X9 Secretariat, to Michael Froomkin (Jan. 31, 1995) (on file with author). Back to text

134. Indeed, NIST has stated that no publicly available cipher was suitable because such a cipher could be used without escrow. See National Institute of Standards and Technology, Key Escrow Initiative Questions and Answers 3 (July 29, 1993) (on file with author) [hereinafter Key Escrow Initiative Q&A]. NIST selected SKIPJACK in order to "assure[] no one can use the algorithm in non-escrowed systems." Id. The idea is to maximize the possibility that no one will be able to use SKIPJACK with a key shielded from the government.

Because the SKIPJACK algorithm at the heart of the Clipper Chip is classified, it has not had the benefit of almost 25 years of determined attack by academic cryptologists. Even though the NSA spent 10 years developing SKIPJACK, see id. at 2-3, this lack of publicity leaves open the possibility that it has an intentional or unintentional back door, something that seems very unlikely with DES. Back to text

135. Chaining, in which each secure message contains the key to the next message, will work only where the same two parties send each other a continuous stream of messages and are certain that the delays between messages will be short. If there are any significant delays, the stream becomes vulnerable to probabilistic attack. See supra text accompanying notes 124-25. Back to text

To table of contents