A. Michael Froomkin


Notes for Part I, Section C

136. Turner Broadcasting Sys., Inc. v. FCC, 114 S. Ct. 2445, 2451 (1994). Back to text

137. See, e.g., Freeh Speech, supra note 43. Back to text

138. On October 25, 1994, President Clinton signed the digital telephony bill. See Communications Assistance for Law Enforcement Act, Pub. L. No. 103-414, 108 Stat. 4279 (1994). Both houses of Congress passed the bill shortly before the end of the legislative session. See 140 Cong. Rec. S14666 (daily ed. Oct. 7, 1994) (reporting the Senate's passage of the bill by voice vote); 140 Cong. Rec. H10917 (daily ed. Oct. 5, 1994) (reporting the House's passage of the bill by two-thirds vote); see also Sabra Chartrand, Clinton Gets a Wiretapping Bill That Covers New Technologies, N.Y. Times, Oct. 9, 1994, at A27.

On the digital telephony bill, see generally Jaleen Nelson, Comment, Sledge Hammers and Scalpels: The FBI Digital Wiretap Bill and Its Effect on Free Flow of Information and Privacy, 41 UCLA L. Rev. 1139 (1994) (arguing that legislation requiring communication through only certain media and also requiring communication providers to affirmatively assist the government in wiretapping is inconsistent with rights of free flow of information and privacy in U.S. and international law). Back to text

139. William Shakespeare, The Third Part of King Henry VI act 4, sc. 7, l. 37 (Andrew S. Cairncross ed., Methuen & Co. Ltd. 1964). Back to text

140. It is clear that plans for EES have been gestating for years in the national security bureaucracy. Because, however, the Clinton Administration has adopted them wholeheartedly, this Article refers to the plan as the Administration's proposal. Back to text

141. "With hindsight, the intelligence community might consider the public disclosure of the DES algorithm to have been a serious error and one that should not be repeated." ACM Report, supra note 15, at 25. Back to text

142. See Stewart A. Baker, Don't Worry, Be Happy: Why Clipper Is Good for You, Wired, June 1994, at 100, 100 (debunking seven "myths" about key escrow encryption). Back to text

143. Freeh Speech, supra note 43, at 13; see also OTA Information Security, supra note 97, at 9-10 (noting increasingly frequent portrayals of cryptography as a threat to domestic security and public safety). Back to text

144. Title III authorizes the use of wiretaps. See 18 U.S.C. § 2516 (1988 & Supp. V 1993). Orders for pen registers, which record the telephone numbers called but not the content of the conversation, are much more frequent than wiretap orders. Also, not every wiretap order necessarily falls under Title III. Back to text

145. See Communications and Computer Surveillance, Privacy and Security: Hearing Before the Subcomm. on Technology, Environment and Aviation of the House Comm. on Science, Space, and Technology, 103d Cong., 2d Sess. 10 (1994) (statement of James Kallstrom, Special Agent in Charge, Federal Bureau of Investigation) [hereinafter Kallstrom Statement] (indicating that, in the 10-year period ending in 1992, more than 22,000 convictions have resulted from court-authorized surveillances); Administrative Office of the U.S. Courts, 1993 Report on Applications for Orders Authorizing or Approving the Interception of Wire, Oral, or Electronic Communications 6 (1993) [hereinafter Wiretap Report] (stating that in 1993 "a total of 2,428 persons [were] arrested as a result of electronic surveillance activity; of those arrested, 413 were convicted"). Back to text

146. See Kallstrom Statement, supra note 145, at 3; Wiretap Report, supra note 145, at 4 (showing fluctuations between 600 and 1000 court-ordered wiretaps per year between 1983 and 1993). Back to text

147. See Wiretap Report, supra note 145, at 5. Back to text

148. See Hoffman et al., supra note 26, at 115. Back to text

149. See Wiretap Report, supra note 145, at 5. Back to text

150. For an interesting analysis of the costs and benefits of wiretapping, which concludes that the digital telephony bill is not cost-effective, see Robin Hanson, Can Wiretaps Remain Cost-Effective?, Comm. ACM, Dec. 1994, at 13. Back to text

151. Hoffman et al., supra note 26, at 115. Similarly, the FBI warns that the conversion of telephone networks to digital and fiber-optic systems threatens to "make it impossible for the FBI to carry out court-approved surveillance in life-and-death cases." Freeh Speech, supra note 43, at 12. According to FBI Director Louis Freeh, "Development of technology is moving so rapidly that several hundred court-authorized surveillances already have been prevented by new technological impediments associated with advanced communications equipment." Id. at 13.

The Electronic Privacy Information Center (EPIC) recently filed suit under the Freedom of Information Act (FOIA) to force the FBI to release internal studies which formed the basis for testimony to Congress that new technologies were already having a harmful effect on law enforcement's wiretapping capabilities. See Electronic Privacy Information Center, Press Release, Group Seeks Release of FBI Wiretap Data, Calls Proposed Surveillance Legislation Unnecessary (Aug. 9, 1994). The FBI is resisting the suit. See Telephone Interview with David Sobel, Legal Counsel, Electronic Privacy Information Center (Nov. 29, 1994). Back to text

152. Office of the Press Secretary, The White House, supra note 20, at 1. Back to text

153. Even an enthusiastic defender of an absolutist vision of the First Amendment conceded that an "absolute" right against being tortured might nonetheless find room for an exception in the case of "the man who knew where the [atom] bomb [was ticking, but] sat grinning and silent in a chair" far from the place he had planted it. Charles L. Black, Jr., Mr. Justice Black, The Supreme Court, and the Bill of Rights, Harper's, Feb. 1961, at 63, reprinted in The Occasions of Justice: Essays Mostly on Law 89, 99 (1963). Explaining this position in a Constitutional Law class I attended at Yale in 1984, Professor Black stated that he believed torture morally justified in this extreme and hypothetical case. Once the torturer extracted the information required, Black continued, he should at once resign to await trial, pardon, and/or a decoration, as the case might be. Back to text

154. ACM Report, supra note 15, at 23. Back to text

155. See, e.g., Kahn, supra note 6, at 8 (discussing the U.S. Navy's use of traffic analysis to determine the movements of Japanese forces in World War II). Back to text

156. See Bamford, supra note 17, at 359 (quoting then-NSA Director Bobby Ray Inman as warning: "Application of the genius of the American scholarly community to cryptographic and cryptanalytic problems, and widespread dissemination of resulting discoveries, carry the clear risk that some of the NSA's cryptanalytic successes will be duplicated, with a consequent improvement of cryptography by foreign targets."). Back to text

157. See ACM Report, supra note 15, at 25.

The goals of U.S. export control policy in the area of cryptography are (i) to limit foreign availability of cryptographic systems of strategic capability, namely, those capable of resisting concerted cryptanalytic attack; (ii) to limit foreign availability of cryptographic systems of sufficient strength to present a serious barrier to traffic selection or the development of standards that interfere with traffic selection by making the messages in broad classes of traffic (fax, for example) difficult to distinguish . . . .
Id. Back to text

158. For a general survey of high-technology export controls, see Peter Swan, A Road Map to Understanding Export Controls: National Security in a Changing Global Environment, 30 Am. Bus. L.J. 607 (1992). Back to text

159. See ACM Report, supra note 15, at 25. Back to text

160. 15 C.F.R. §§ 768-99 (1994). The EAR are administered by the Bureau of Export Administration in the Department of Commerce. The statutory authority for the EAR, the Export Administration Act of 1979, 50 U.S.C. app. §§ 2401-2420 (1988 & Supp. IV 1992), lapsed on August 20, 1994. See 50 U.S.C.A. app. § 2419 (West Supp. 1994). President Clinton issued an executive order requiring that the EAR be kept in force to "the extent permitted by law" under the International Emergency Economic Powers Act (IEEPA), 50 U.S.C. §§ 1701-1706 (1988 & Supp. IV 1992). See Exec. Order No. 12,924, 59 Fed. Reg. 43,437 (1994). Back to text

161. See 22 C.F.R. § 121.1 (XIII)(b)(1) (1994). The ITAR are administered by the Office of Defense Trade Controls in the Department of State. If the State Department chooses, it can transfer jurisdiction of an export application to the Commerce Department. The statutory authority for the ITAR is the Arms Export Control Act, codified as amended at 22 U.S.C. § 2778 (1988 & Supp. IV 1992). Back to text

162. See Evan R. Berlack & Cecil Hunt, Overview of U.S. Export Controls, in Coping [**PAGE 749**]with U.S. Export Controls 1994, at 11, 26 (PLI Com. Law & Practice Course Handbook Series No. A-705, 1994) (arguing that "under-staffing, technically complex applications, [and] many layers of review within DOD . . . [as well] as between DOD and State" characterize the permit application process under ITAR); Ira S. Rubinstein, Export Controls on Encryption Software, in Coping with U.S. Export Controls 1994, supra, at 177, 194 (stating that, under the ITAR, the State Department's Office of Defense Trade Controls (DTC) has primary jurisdiction over cryptographic software). The export of articles or services on the U.S. Munitions List is regulated by the DTC under the ITAR. See 22 C.F.R. § 120.5 (1994). The DTC settles disputes regarding whether an item is on the U.S. Munitions List according to the commodity jurisdiction procedure, which determines whether the ITAR or the EAR will apply. See 22 C.F.R. § 120.4 (1994).

Whether the ITAR unconstitutionally restrict free speech is outside the scope of this Article. For a discussion of this topic, see Constitutionality of the Proposed Revision of the International Traffic in Arms Regulations, 5 Op. Off. Legal Counsel 202, 213-14 (1981) (finding that the ITAR have constitutional and unconstitutional applications, and that they should be narrowed so that they are less likely to apply to certain protected speech); Christine Alexander, Preserving High Technology Secrets: National Security Controls on University Research and Teaching, 15 Law & Pol'y Int'l Bus. 173, 203 (1983) (noting that export controls on technology raise constitutional issues because technical expression may be considered speech and because such controls infringe upon the right of academic freedom); Mary M. Cheh, Government Control of Private Ideas--Striking a Balance Between Scientific Freedom and National Security, 23 Jurimetrics J. 1, 22 (1982) (arguing that cryptographic information is protected by the First Amendment); James R. Ferguson, Scientific Inquiry and the First Amendment, 64 Cornell L. Rev. 639, 654-56 (1979) (arguing that scientific inquiry merits some degree of protection by the First Amendment); Harold P. Green, Constitutional Implications of Federal Restrictions on Scientific Research and Communication, 60 UMKC L. Rev. 619, 643 (1992) (suggesting that the national security basis of governmental restrictions on scientific freedom may be weak because it is impossible to hold national security secrets effectively for even short periods of time); Ruth Greenstein, National Security Controls on Scientific Information, 23 Jurimetrics J. 50, 76-83 (1982) (arguing that noncommercial scientific communication should receive full First Amendment protection and noting that the ITAR may be unconstitutional unless interpreted narrowly because they may lack ties to compelling government interests); David A. 
Wilson, National Security Control of Technological Information, 25 Jurimetrics J. 109, 128-29 (1985) (suggesting that research contracts be used as devices for implementing the ITAR to avoid the constitutional difficulties associated with requiring licenses); Roger Funk, Comment, National Security Controls on the Dissemination of Privately Generated Scientific Information, 30 UCLA L. Rev. 405, 441-44 (1982) (arguing that current export control laws are overbroad in their restriction of speech and therefore employ excessive means to protect national security); Kenneth J. Pierce, Note, Public Cryptography, Arms Export Controls, and the First Amendment: A Need for Legislation, 17 Cornell Int'l L.J. 197, 213-19 (1984) (arguing that the ITAR's licensing requirement is an unconstitutional prior restraint of protected speech); Allen M. Shinn, Jr., Note, The First Amendment and the Export Laws: Free Speech on Scientific and Technical Matters, 58 Geo. Wash. L. Rev. 368, 397-400 (1990) (arguing that the regulation of scientific expression that would be unconstitutional if imposed directly would also be unconstitutional if imposed by contract). Back to text

163. See supra part I.B.1 (discussing how DES became a standard in the United [**PAGE 750**]States). Back to text

164. See OTA Information Security, supra note 97, at 154; see also GAO Communications Privacy, supra note 15, at 6-7, 24-28. Back to text

165. See 22 C.F.R. § 121.1(XIII)(b) (1994) (designating cryptographic software as auxiliary military equipment). The ITAR state: "Software includes but is not limited to the system functional design, logic flow, algorithms, application programs, operating systems and support software for design, implementation, test, operation, diagnosis and repair." 22 C.F.R. § 121.8(f) (1994). Back to text

166. Freeware is software which is provided for distribution at no charge by the author, although the author typically retains the intellectual property rights. See supra note 73 (discussing the accessibility of Phil Zimmermann's PGP). Back to text

167. See E-mail from Vesselin Bontchev, Research Associate, Virus Test Center, University of Hamburg, to Michael Froomkin (July 22, 1994) (on file with author) (stating that Bontchev received a copy of PGP 2.6 from a chain of anonymous remailers and put it on the university's FTP site). According to an e-mail sent to the author by a person requesting anonymity, someone also uploaded a copy to a popular Linux archive site. Overnight, therefore, the many sites around the world that mirror sunsite faithfully and unwittingly distributed PGP 2.6 worldwide. Back to text

168. For the curious, the URL is 2.6mit. Other copies quickly appeared at Internet distribution sites in England, Italy, and other European countries. Back to text

169. See, e.g., Privacy Issues in the Telecommunications Industry: Hearings on the Administration's "Clipper Chip" Key Escrow Encryption Program Before the Subcomm. on Technology and the Law of the Senate Comm. on the Judiciary, 103d Cong., 2d Sess. (May 3, 1994) [hereinafter Clipper Chip Hearings], available in WESTLAW, USTestimony Database, 1994 WL 231119, at *13-26 (testimony of Stephen T. Walker, President, Trusted Information Systems, Inc.) (summarizing studies showing how export controls harm U.S. business); see also Charles L. Evans, Comment, U.S. Export Control of Encryption Software: Efforts to Protect National Security Threaten the U.S. Software Industry's Ability to Compete in Foreign Markets, 19 N.C. J. Int'l L. & Com. Reg. 469, 481-82 (1994) (indicating that, because most countries do not have export controls [**PAGE 751**]on encryption software, U.S. software developers are concerned about losing their foreign market shares). Back to text

170. See Letter from Martha Harris, Deputy Assistant Secretary for Export Controls, U.S. Dep't of State, to Philip R. Karn, Jr. (Oct. 7, 1994) (ODTC Case CJ 081-94) (on file with author). The source code was published in Applied Cryptography. See, e.g., Schneier, supra note 12, at 519-32 (listing the source code for the IDEA cipher). Back to text

171. See Letter from William B. Robinson to Phil Karn, supra note 118. Back to text

172. Letter from William B. Robinson, Director, Office of Defense Trade Controls, U.S. Dep't of State, to Phillip R. Karn, Jr. (May 11, 1994), available online URL pub/EFF/Policy/Crypto/ITAR_export/Karn_Schneier_export_case/floppy_2nd.response. Back to text

173. Letter from Martha Harris to Philip R. Karn, Jr., supra note 170. Back to text

174. See 22 C.F.R. § 120.10(5) (1994); see also Rubinstein, supra note 162, § 3 (describing the limited reach of the public domain exception). Back to text

175. See Letter from Martha Harris to Phillip R. Karn, Jr., supra note 170. Back to text

176. 35 U.S.C. §§ 181-188 (1988 & Supp. V 1993) (codified as "Secrecy of Certain Inventions and Filing Applications in Foreign Country"). Back to text

177. Id. § 181. Back to text

178. See id. § 186. Back to text

179. See Bamford, supra note 17, at 354-58 (describing the national publicity surrounding the NSA's issuance of secrecy orders to two inventors of cryptographic devices). Back to text

180. See id. at 355-56 ("Of the three hundred or so secrecy orders issued each year, all but a very few are either on inventions the government has originated itself and already classified, or on inventions somehow connected with the government."). Back to text

181. RSA is an example of a high-level cryptographic algorithm in circulation which has been patented. The issue of the validity of algorithmic patents is outside the scope of this Article. Back to text

182. Cf. Ferguson, supra note 162, at 659-61 (examining how the First Amendment limits effective regulation of scientific research and publication). Back to text

183. William Shakespeare, The Merry Wives of Windsor act 3, sc. 3, ll. 149-51 (H.J. Oliver ed., Methuen & Co. Ltd. 1971). Back to text

184. The text uses the Clipper Chip, which is designed for use in telephones, as an example. Similar procedures apply to the Capstone Chip and Fortezza PCMCIA card, although those also include circuitry supporting the Digital Signature Standard and data exchange such as electronic mail. Back to text

185. This prevents a "man-in-the-middle" attack by which the eavesdropper, armed with two Clipper telephones, intercepts the signal, decrypts it, records it, and then reincrypts it. In the event of such an attack, the two users will have different session keys (one with each of the attacker's phones), and will thus see different character strings appear on their readouts. See OTA Information Security, supra note 97, at 65 (Box 2-7). Back to text
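For the curious, the readout comparison can be sketched in a few lines of Python. This is a hedged illustration only: the actual derivation of the Clipper display string from the session key is not public, so a truncated SHA-256 hash and the function name `readout` are stand-ins.

```python
import hashlib

def readout(session_key: bytes, chars: int = 4) -> str:
    """Short character string displayed to the user, derived from the
    session key. A truncated SHA-256 hex digest stands in for the real,
    unpublished Clipper derivation."""
    return hashlib.sha256(session_key).hexdigest()[:chars].upper()

# Normal call: both phones negotiated a single session key, so the two
# users' readouts match.
assert readout(b"shared session key") == readout(b"shared session key")

# Man-in-the-middle attack: each leg of the call has its own session key,
# so the users' strings differ and the attack is detected (full digests
# are compared here to make the inequality certain).
assert readout(b"leg one key", chars=64) != readout(b"leg two key", chars=64)
```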

186. Well, that is the theory anyway. One preliminary report suggests that the AT&T 3600c, a $1300 Clipper-equipped telephone, is difficult to use:

The hackers who bought the things had quite a hard time getting them to work at all. There were troubles getting it set up so that it would attempt to go into secure mode, and trouble getting it to do so reliably once a pair of phones that worked were found. . . To make the unit go into secure mode, one person pushes a red button. . . . Then the modems do their thing, making modem noises for about 20 seconds (your time may vary; AT&T manual said 10 seconds.) Once connected, the sound is very weak. We in the conference had trouble hearing when the earpiece was right next to a microphone. There was also a roughly quarter second delay (presumably this is for A/D conversion + encryption) in talking. This is a longish delay, roughly equal to an overseas satellite conversation.
Posting from Adam Shostack to Cypherpunks Mailing List (Aug. 15, 1994) (on file with author) (describing a demonstration of the AT&T 3600c at the 1994 HOPE (Hackers on Planet Earth) Conference). Back to text

187. SKIPJACK is a single-key cipher similar to DES but with an 80-bit key. In a single-key cipher the sender and the receiver use the same key to encrypt and decrypt the message. Originally, the NSA intended SKIPJACK for government communications systems. See Key Escrow Initiative Q&A, supra note 134, at 2. The current estimate is that SKIPJACK should remain secure against brute-force attacks, despite continual increases in computing power, for at least 30 years. See Ernest F. Brickell, et al., SKIPJACK Review Interim Report: The SKIPJACK Algorithm 1 (July 28, 1993), available online URL skipjack-review.html [hereinafter SKIPJACK Interim Report] ("[T]here is no significant risk that SKIPJACK will be broken by exhaustive search in the next 30-40 years.").

Not only is the SKIPJACK algorithm used by the Clipper Chip classified, but it is burned into the chip in a fashion "which is highly resistant to reverse engineering (destructive or non-destructive) to obtain or modify the cryptographic algorithm." FIPS 185, supra note 14, at 6004. NIST's description of its "state of the art" anti-reverse engineering design mentions three techniques: the use of nonmetallic links [**PAGE 754**]to hold the "write once" information programmed on to the chip which it claims "cannot be investigated externally"; the addition of "ghost logic" which is additional random or intentional circuits with no purpose other than to confuse analysis; and "false heat dissipation methods." National Inst. of Standards and Technology, Reverse Engineering Protection, 3 FIPS 185 Docket at tab 3. NIST also noted that reverse engineering tends to destroy the device being examined, and that because every chip will have differences, these differences also provide a layer of protection. See id.

Nevertheless, one commentator on the proposed FIPS concluded that, based on his review of the literature and his experience in teaching a class in reverse engineering at MIT, "[p]hysics tells us that we can find out what is in these chips. Others WILL perform this analysis. The only question is when. I believe it is 3-9 months." Comments of Thomas F. Knight, Jr., 2 FIPS 185 Docket. Back to text
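The 30-to-40-year estimate for exhaustive search can be put in perspective with back-of-envelope arithmetic. The trial rate assumed below is an illustrative figure of my own, not one taken from the SKIPJACK review.

```python
import math

RATE = 10 ** 12                # assumed key trials per second (illustrative)
SECONDS_PER_YEAR = 31_557_600  # Julian year

des_years = 2 ** 56 / RATE / SECONDS_PER_YEAR       # DES's 56-bit key space
skipjack_years = 2 ** 80 / RATE / SECONDS_PER_YEAR  # SKIPJACK's 80-bit space

# The extra 24 key bits multiply the search effort by 2**24 (about 16.8
# million): at this rate DES falls in under a day, while SKIPJACK takes
# tens of millennia. The reviewers' real worry is therefore growth in
# RATE over time, not the search as such.
assert math.isclose(skipjack_years / des_years, 2 ** 24)
assert des_years < 1 and skipjack_years > 30_000
```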

188. A session key is the sequence of bits allowing decryption that will be used for only a single communication, one e-mail, or one telephone call. See infra text accompanying notes 790-91. Back to text

189. One such supersecure method is the Diffie-Hellman Key Exchange. See infra text following note 792. Back to text

190. Both Clipper Chips in a telephone conversation use the same session key to encrypt and decrypt their messages. The original Clipper proposal envisaged two keys, one for each chip in a telephone conversation, but NIST revised the standard to require only a single key. The NIST revision came in response to a comment it [**PAGE 755**]received which noted that a two-key system would limit law enforcement executing a wiretap on Alice to her side of the conversation unless it obtained a warrant for every Clipper Chip that communicated with Alice's telephone. See FIPS 185, supra note 14, at 6001. Capstone will generate its own keys. See Mykotronx, Inc., Capstone MYK-80: A New Breakthrough in Encryption Technology 1 (1993) (sales literature, on file with author) (stating that the MYK-80 features "Message Encryption Key generation"). Back to text

191. The chips also send each other an "initialization vector" of unspecified length, which NIST defines as a "mode and application dependent vector of bytes used to initialize, synchronize and verify the encryption, decryption and key escrow functions." FIPS 185, supra note 14, at 6004. Back to text

192. In a two-way real-time communication, the two chips send each other a LEAF, and each chip performs a check to ensure that the other LEAF is valid. In one-way communications, like e-mail, there is only one chip sending one LEAF, which will later be checked by the receiving chip. The difference is significant. If only one chip sends a LEAF, then a wiretapper will need to obtain the chip unique key for every chip which calls the line being tapped, potentially resulting in the compromise of a large number of chips. By contrast, if the LEAFs go both ways, the wiretapper is indifferent as to who started the conversation because she always has one LEAF she can decipher. Back to text

193. The exact makeup of the LEAF is classified. See FIPS 185, supra note 14, at 6004. It is known, however, that it consists of the 80-bit session key which has been encrypted with the unit key, a 32-bit serial number unique to each Clipper Chip, and a 16-bit checksum. See National Inst. of Standards and Technology, Technical Fact Sheet on Blaze Report and Key Escrow Encryption 1-2 (June 2, 1994). A checksum is a "computer technique for ensuring the integrity of an identifying number." John M. Carroll, Computer Security 334 (1977). Back to text
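The publicly acknowledged field sizes add up to a 128-bit LEAF, which can be sketched as a bit-packing exercise. Because the real field order, encryption, and checksum function are classified, the layout and names below are illustrative assumptions, not the actual format.

```python
def pack_leaf(enc_session_key: int, serial: int, checksum: int) -> int:
    """Pack the three publicly known LEAF fields: an 80-bit session key
    encrypted under the unit key, a 32-bit chip serial number, and a
    16-bit checksum (80 + 32 + 16 = 128 bits). The true ordering and
    encoding are classified; this arrangement is a guess."""
    assert enc_session_key < 2 ** 80 and serial < 2 ** 32 and checksum < 2 ** 16
    return (enc_session_key << 48) | (serial << 16) | checksum

leaf = pack_leaf(enc_session_key=0xABCDEF, serial=42, checksum=7)
assert leaf.bit_length() <= 128
assert leaf & 0xFFFF == 7  # in this layout the checksum occupies the low 16 bits
```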

194. Supporters of the Clipper Chip challenge this assumption. Because the family key will be in circulation only by means of a special circuit board which will be inserted into a personal computer operated by law enforcement agents, supporters of the Clipper Chip argue that its distribution will be relatively limited. See, e.g., Dorothy E. Denning & Miles Smid, Key Escrowing Today, IEEE Comm., Sept. 1994, at 58, 58 (emphasizing that the family key is secret and only accessible to authorized government officials). Back to text

195. See Clipper Chip Hearings, supra note 169, available in Westlaw, USTestimony Database, 1994 WL 231122, at *4 (statement of Jo Ann Harris, Assistant Attorney General, Criminal Division, U.S. Department of Justice) (describing the decrypt processor). As this Article went to press, the law enforcement community had access to one of the two existing decrypt processors, although Clipper-equipped telephones are currently being shipped to government purchasers. See Telephone Interview with Miles Smid, Security and Technology Group Manager, National Institute of Standards and Technology (Feb. 9, 1994). Back to text

196. See U.S. Dep't of Justice, Authorization Procedures for Release of Encryption Key Components in Conjunction with Intercepts Pursuant to Title III (Feb. 4, 1994) [hereinafter Title III Authorization Procedures] (establishing procedures by which escrow agents could release keys in response to requests pursuant to Title III), in Office of the Press Secretary, The White House, Key Escrow Encryption: Announcements-February 4, 1994 (Feb. 15, 1994) (information packet accompanying press release) (on file with author) [hereinafter Key Escrow Announcements]; U.S. Dep't of Justice, Authorization Procedures for Release of Encryption Key Components in Conjunction with Intercepts Pursuant to FISA (Feb. 4, 1994) [hereinafter FISA Authorization Procedures] (same, pursuant to the Foreign Intelligence Surveillance Act (FISA), 50 U.S.C. §§ 1801-1811 (1988)), in Key Escrow Announcements, supra; U.S. Dep't of Justice, Authorization Procedures for Release of Encryption Key Components in Conjunction with Intercepts Pursuant to State Statutes (Feb. 4, 1994) [hereinafter State Authorization Procedures] (same, pursuant to state statutes or Title III), in Key Escrow Announcements, supra.

The Attorney General's procedures for release of key escrow components require that the request for key components include the agency and individual conducting the wiretap, as well as the termination date of the period for which the intercept will be authorized. See U.S. Dep't of Justice, Attorney General Makes Key Escrow Encryption Announcements 2 (Feb. 4, 1994) [hereinafter Attorney General's Key Escrow Announcements], in Key Escrow Announcements, supra.

U.S. foreign intelligence agencies have the authority to listen in on all forms of electronic communication, including telephones, without seeking a warrant if the communications are between foreign powers or are signals (other than spoken communication) from a foreign country, embassy, or consulate to another foreign party. See Foreign Intelligence Surveillance Act, 50 U.S.C. §§ 1801-1811 (1988); see also Americo R. Cinquegrana, The Walls (and Wires) Have Ears: The Background and First Ten Years of the Foreign Intelligence Surveillance Act of 1978, 137 U. Pa. L. Rev. 793, 811-13 (1989) (describing FISA procedures). Back to text

197. State Authorization Procedures, supra note 196, at 1; Title III Authorization[**PAGE 758**] Procedures, supra note 196, at 1; FISA Authorization Procedures, supra note 196, at 1. Back to text

198. The Attorney General's procedures require that requests for key segments be made by the principal prosecuting attorney of a state or political subdivision, or by the responsible person in an agency. See State Authorization Procedures, supra note 196, at 2. This requirement overlaps with the authority for emergency wiretaps in Title III, 18 U.S.C. § 2518(7) (1988). Back to text

199. See Clifford S. Fishman, Wiretapping and Eavesdropping § 30 (1978) ("Law enforcement officials have been reluctant to use [the emergency eavesdropping] authorization for fear it is unconstitutional."); 1 Wayne R. LaFave & Jerold H. Israel, Criminal Procedure § 4.2(g) (1984) ("There has been virtually no use of this emergency power . . . ."). But see Clifford S. Fishman, Wiretapping and Eavesdropping §§ 30-30f (Supp. 1993) [hereinafter Fishman Supplement] (describing various procedures for emergency wiretaps which have been used in life-threatening situations). Back to text

200. Posting from Matt Blaze to Cypherpunks Mailing List (Feb. 2, 1994) (on file with author) (reporting surprising frankness on the part of NSA spokespersons at a meeting discussing key escrow). Back to text

201. The Department of Justice, however, is required to ascertain, after the fact, that the legal authorization existed for Title III wiretaps and FISA wiretaps. See Title III Authorization Procedures, supra note 196, at 2 (stating that the "Department of Justice shall" ascertain the existence of authorizations for electronic surveillance); FISA Authorization Procedures, supra note 196, at 2 (same). Strangely, the Justice Department has no such obligation when the key segment is requested by a state or local police force. See State Authorization Procedures, supra note 196, at 2 (stating that the "Department of Justice may" inquire into the authorization for electronic surveillance). Back to text

202. See 18 U.S.C. §§ 2510-2521 (1988 & Supp. V 1993). Back to text

203. XOR is a binary operation by which two binary numbers are compared a bit at a time. If both bits have the same value then XOR returns zero; if the two bits differ, XOR returns one. Both segments are thus equally necessary to retrieve the key, and neither segment alone provides the holder with any more information about the value of the key than would be possessed by a person who held no segments at all.

An example, using a hypothetical 1-bit key divided into two 1-bit segments, A and B, may make this clearer. Even if you know that segment A is 1, you still have no more information about the key's value than does anyone else. If segment B is 0, the key is 1 (because 1 XOR 0 = 1); but if segment B is 1, then the key is 0 (because 1 XOR 1 = 0). Similarly, someone holding segment B but not A is equally uninformed as to the key's actual value. Back to text
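For the curious, the splitting arithmetic can be written out in a few lines of Python. This is a hedged sketch of the XOR secret-sharing idea only; the function and variable names are mine and are not part of the EES specification.

```python
import secrets

def split_key(key: int, bits: int = 80) -> tuple[int, int]:
    """Split a key into two XOR segments; either segment alone is a
    uniformly random number carrying no information about the key."""
    segment_a = secrets.randbits(bits)  # escrow agent A's segment
    segment_b = key ^ segment_a         # escrow agent B's segment
    return segment_a, segment_b

def recover_key(segment_a: int, segment_b: int) -> int:
    """Both segments are needed: XOR them to recover the key."""
    return segment_a ^ segment_b

# The note's 1-bit example: with segment A = 1, the key is 1 if B = 0
# and 0 if B = 1, so segment A alone reveals nothing.
assert 1 ^ 0 == 1 and 1 ^ 1 == 0

key = secrets.randbits(80)
a, b = split_key(key)
assert recover_key(a, b) == key
```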

204. See Attorney General's Key Escrow Announcements, supra note 196, at 1. Back to text

205. Currently a company called Mykotronx is the only supplier authorized to produce Clipper Chips. The secure, compartmented information facility is located at Mykotronx. See Key Escrow Initiative Q&A, supra note 134, at 6. This may impose barriers to entry for potential competitors. Back to text

206. NIST has devised a fairly elaborate procedure for the key generation process. Someone working for each of the escrow agents will type 80 characters into a computer, which will store the characters, the amount of time between the keystrokes, the date, and the time. The computer will then feed these values into NIST's secure hash algorithm to produce a number. For a discussion of the secure hash algorithm, see Approval of Federal Information Processing Standards Publication 180, Secure Hash Standard (SHS), 58 Fed. Reg. 27,712 (1993); Proposed Revision of Federal Information Processing Standard (FIPS) 180, Secure Hash Standard, 59 Fed. Reg. 35,317 (1994) (correcting a technical flaw and confirming the algorithm's security reliability). See also Dorothy E. Denning, The Clipper Encryption System, Am. Scientist, July-Aug. 1993, at 319, 321-22 (describing how two escrow agents and a computer are needed to create the unit key, thus increasing public confidence that the failure of one escrow agent cannot compromise the system); Denning & Smid, supra note 194, at 60-61 (describing how escrow agents generate a key number and a random seed number for use in each programming session). Currently, the system is able to program about 120 Clipper Chips per hour, although NIST contemplates switching to a higher volume system at some future date. See id. at 64.

The procedure for generating the random numbers is important because anyone who knows which pseudorandom number generator was used and who also knows the "seed" could use this information to recreate all the keys without going to the trouble of consulting the escrow agents. Back to text
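The danger described above, that anyone who knows both the generator and the seed can recreate every key, follows from the determinism of pseudorandom number generators. A Python sketch illustrates the principle (the function name and parameters are hypothetical; NIST's actual procedure uses the Secure Hash Algorithm, not Python's `random` module):

```python
# A deterministic PRNG seeded with the same value reproduces the same
# "random" stream, so seed secrecy is essential to key secrecy.
import random

def derive_keys(seed: int, n: int) -> list[int]:
    """Hypothetical sketch: derive n 80-bit keys from one seed."""
    rng = random.Random(seed)  # deterministic generator
    return [rng.getrandbits(80) for _ in range(n)]

# Anyone who learns the seed can regenerate every key without
# ever consulting the escrow agents ...
assert derive_keys(1234, 5) == derive_keys(1234, 5)
# ... while a different seed yields an unrelated sequence.
assert derive_keys(1234, 5) != derive_keys(9999, 5)
```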

207. Technically, the two 80-bit segments held by the escrow agents are XORed to produce the actual key. See Denning & Smid, supra note 194, at 64-65. Back to text

208. The procedures were devised in collaboration with the Department of Justice, the FBI, NIST, the NSA, the Department of the Treasury Automated Systems Division, and Rapid System Solutions, Inc. See id. at 58. Back to text

209. See OTA Information Security, supra note 97, at 65 n.5 (Box 2-7) (citing presentation by NIST Security Technology Manager Miles Smid in June 1994). Back to text

210. Title III Authorization Procedures, supra note 196, at 3; FISA Authorization Procedures, supra note 196, at 3; State Authorization Procedures, supra note 196, at 3. The government is completely correct to warn users of EES that their rights to exclude illegally seized or tainted evidence in any criminal proceeding are unchanged by EES. Back to text

211. Traffic analysis using pen registers (which record the numbers called by a telephone) and trap and trace devices (which record numbers calling the telephone) does not implicate the Fourth Amendment. See Smith v. Maryland, 442 U.S. 735, 741-46 (1979). Under Title III, however, both pen registers and trap and trace devices require a court order, although an actual warrant is not required. See 18 U.S.C. §§ 3121-3123 (1988); Criminal Procedure Project, Twenty-Second Annual Review of Criminal Procedure: United States Supreme Court and Court of Appeals 1991-1992, 81 Geo. L.J. 853, 952-54 (1993). Because decrypting the LEAF with the family key involves listening to at least a few seconds of the conversation, the act of intercepting and decrypting the LEAF constitutes wiretapping. This is so even though the information thus gathered is no better than could be obtained by a trap and trace device or a pen register. Perversely, however, even though the decryption of the LEAF is a wiretap, it may not violate the Fourth Amendment if the telephone user has no reasonable expectation of privacy for the LEAF. Whether a chip user would have a reasonable expectation of privacy in her LEAF, as that term is used in Fourth Amendment cases, is not as clear as it should be. The difficulty arises because the user is aware that the government has the information needed to decrypt the LEAF. Although the government has promised to use that information only in specific circumstances, the government cannot be estopped and is therefore free to renege. See, e.g., Office of Personnel Management v. Richmond, 496 U.S. 414, 434 (1990) (holding that payments from the Federal Treasury may be made only if authorized by statute, and that erroneous advice given to a claimant by a government employee does not estop the government's denial of the claim); Heckler v. Community Health Servs., Inc., 467 U.S. 51, 63 (1984) (noting "the general rule that those who deal with the Government are expected to know the law and may not rely on the conduct of Government Agents contrary to law"); Federal Crop Ins. Corp. v. Merrill, 332 U.S. 380, 385 (1947) (holding that claimants' lack of knowledge of regulations published in the Federal Register does not prevent those claimants from being bound by such regulations). Back to text

212. See 18 U.S.C. § 2520 (1988). Back to text

213. If the agent knowingly released a key improperly, the agent might be a co-[**PAGE 763**]conspirator or abettor of the illegal wiretapper. Back to text

214. If the executive order were not classified, it would presumably have to be disclosed pursuant to the Administrative Procedure Act. See Administrative Procedure Act, 5 U.S.C. § 552(a)(2)(C) (1988) (requiring agencies to make publicly available instructions to staff that affect members of the public). Back to text

215. See 18 U.S.C. § 1030(a)(3) (1988) (codifying the Computer Fraud and Abuse Act, which makes it illegal to trespass into federal computer systems). Section 1030(a)(4) proscribes the use of federal computers to defraud. Section 1030(a)(5) makes illegal any unauthorized access to a computer system used in interstate commerce, as well as the alteration or destruction of records. This last provision applies only to those acting without authority. See § 1030(a)(5). Thus, the "plumber" would violate the statute, but arguably the escrow agent would not. Back to text

216. See Smith v. Maryland, 442 U.S. 735, 743-44 (1979) ("[A] person has no legitimate expectation of privacy in information he voluntarily turns over to third parties."). Back to text

217. See infra note 329 and accompanying text (discussing the Electronic Communications Privacy Act of 1986). Back to text

218. H.R. 5199, 103d Cong., 2d Sess. (1994). Back to text

219. Id. § 31(h)(2). The Encryption Standards and Procedures Act of 1994, if enacted, would have provided:

The United States shall not be liable for any loss incurred by any individual or other person resulting from any compromise or security breach of any [**PAGE 764**]encryption standard established under subsection (b) or any violation of this section or any regulation or procedure established by or under this section by-
(1) any person who is not an official or employee of the United States; or
(2) any person who is an official or employee of the United States, unless such compromise, breach, or violation is willful.

Id. Back to text