A. Michael Froomkin
[Page n] references relate to the pagination of the printed version.
The increases in personal privacy and communications security
promised by cryptography come at the expense of those who benefit
from insecure communications. If every telephone call is routinely
encrypted, domestic law enforcement agencies, such as the FBI and
local police forces, will find wiretapping harder or even
impossible. If information on computers is routinely encrypted,
police may find evidence inaccessible or incomprehensible. When
sophisticated encryption technologies are used abroad, intelligence
agencies such as the NSA, which routinely seek to penetrate the
communications of foreign powers, find their missions complicated.
To the extent [Page 743]
that
American citizens are better off because wiretaps help catch and
convict criminals, and to the extent that communications
intelligence protects the national interest from foreign threats,
developments that impede legitimate wiretaps may make us all worse
off.
The fear of losing electronic surveillance capabilities because
of advances in encryption technology has produced a three-pronged
reaction from the law enforcement and intelligence communities.
First, their spokespersons have begun a public relations offensive
designed to explain why these capabilities matter.{137} Second, they have
sought legislation requiring that telephone networks and other
similar communications channels be designed in a manner that
facilitates wiretapping.{138} Third, they have designed and supported EES,
best known through its most prominent implementation, the Clipper Chip,
which enables the government to keep a copy of the key needed to
decrypt all communications using EES. These activities share the
premise that it is reasonable for the government to request, and in
some cases require, that private persons communicate in a manner
that makes interception by the government at least practical and
preferably easy.
The Administration{140} makes two types of arguments in favor of EES.
In its hard sell, the Administration, primarily through the [Page 744]
FBI, paints a lurid picture
of law enforcement stripped of an essential crime-detection and
evidentiary tool--wiretapping--while pornographers, drug dealers,
terrorists, and child molesters conspire via unbreakable ciphers,
storing their records and child pornography in computers that
become virtual cryptographic fortresses. Meanwhile, the
intelligence agencies, primarily the NSA, quietly murmur that
existing policies have proved ineffective in preventing the
increasing use of unescrowed encryption, and suggest that their
proposals should be adopted to prevent developments that might (or
might not, they won't say) undermine the nation's communications
intelligence capabilities.
In its soft sell, the government argues that if the NSA has
designed a cryptographic system that it is willing to certify as
secure and make available to the American public, the government
has an obligation to take steps to prevent that cipher from being
used against it by criminals and foreign governments. In fact, the
current national standard cipher, DES, is strong enough that the
U.S. government has sought to prevent its export and may indeed
regret having let the algorithm become publicly available.{141} EES, the argument
goes, just maintains the status quo. Even if everyone used a
Clipper-equipped telephone, telephone conversations would be no
less secure against legitimate government wiretapping than they are
today, while being more secure against illicit eavesdropping.{142}
a. Domestic Law Enforcement
According to FBI Director Louis Freeh, electronic intelligence,
especially wiretapping, is crucial to effective law enforcement:
if the FBI and local police were to lose the ability to tap
telephones because of the widespread use of strong cryptography,
the "country [would] be unable to protect itself against
terrorism, violent crime, foreign threats, drug trafficking,
espionage, kidnapping, and other crimes."{143}
From the statistics available, it is difficult to determine how
[Page 745]
much difference
wiretaps actually make.{144} The FBI estimates that wiretaps play a role
in an average of 2200 convictions per year,{145} but it is unclear how
many of these convictions could have been obtained without
wiretaps. Despite an almost 50% increase since 1983, court-ordered
wiretaps are still relatively rare: only 919 were authorized in
1992 for all federal, state, and local police forces.{146} Of these, only 141
wiretap orders covered electronic devices such as faxes, digital
display pagers, voice pagers, cellular phones, or electronic mail.
In 1993, the 976 active court-ordered wiretaps allowed police to
hear approximately 1.7 million conversations involving nearly
94,000 persons. The listeners described about 20% of the
conversations as incriminating.{147} The law enforcement community suggests that
wiretaps make the biggest difference in the largest cases because
wiretaps have been used to gather evidence in 90% of the terrorism
cases brought to trial.{148} The average cost of a wiretap was $57,256 in
1993,{149} so it may be
that the biggest cases are the only ones in which the expense of
monitoring a telephone line seems justified.{150}
Statistics aside, it seems only logical that the spread of
strong, user-friendly cryptography would increase the risk that
evil people will be able to frustrate law enforcement attempts to
crack their computers or bug their telephones. Whether the risk
has yet [Page 746]
manifested
itself is less clear. For all its predictions of disaster in the
making, "the FBI has not been able to point to a single
instance to date [(September 1994)] where encryption has hampered
[its] investigation of a case."{151}
Nevertheless, the fear that rogue cryptography might allow
"terrorists, drug dealers, and other criminals"{152} to evade law enforcement seems to supply a large part of the motivation for the
Administration's support for EES. One can only sympathize with
officials who were, no doubt, asked whether they wished to go down
in history as the individuals responsible for letting loose a
technology that might someday hamper the investigation of a
terrorist threat to a large population center.{153} Faced with the FBI's
Manichaean vision of, on the one hand, a world of rampant
cryptography in which the bad guys remain impregnable behind
cryptological walls and, on the other hand, an ambitious plan to
return to the status quo ante in which the police remain able to
intercept and understand most if not all electronic communication,
it is not surprising that the Clinton Administration opted for what
must have appeared to be the safer course.
[Page 747]
b.
Intelligence-Gathering
The communications intelligence capabilities of the United States
are a subject "characterized by secrecy even greater than that
surrounding nuclear weapons."{154} Unclassified discussion of the effect of
strong private cryptography on the capabilities of intelligence
agencies quickly becomes conjecture. We do know, however, that two
of the most important functions of the NSA are to acquire and
decrypt foreign communications, and to conduct traffic analysis of
foreign and international communications.
The two functions are related, but different. Acquisition and decryption of foreign communications are the stuff of headlines: listening to the Soviet President's telephone calls made from his limousine or breaking German codes during World War II. Traffic analysis is more subtle, but no less important. It is the study of the sources and recipients of messages, including messages that the eavesdropper cannot understand. In wartime, traffic analysis allows intelligence agencies to deduce lines of command. Changes in the volume and direction of traffic can signal the imminence of operations.{155}
Widespread foreign access to even medium-grade cryptography makes
it more difficult for U.S. communications intelligence to select
the messages that are worth decrypting, or even worth reading.{156} Worse, it makes
traffic analysis much more difficult. So long as most electronic
communications are unencrypted, intelligence agencies are able to
sort messages in real time, and identify those of interest, or
those which warrant further attention.{157}
[Page 748]
Furthermore, if most traffic is plaintext, then ciphertext cries
out for attention--here is someone with something to hide. Even if
the message cannot be decrypted quickly, the source can be flagged
for traffic analysis, which enables the intelligence agency to
build up a picture of the persons with whom the source
communicates. If everyone is using strong cryptography, then the
most secret messages no longer stand out.
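To make the mechanics of traffic analysis concrete, the following sketch (in Python) shows how an analyst who can read none of the intercepted content might still map who communicates with whom and notice a surge in volume. The message records, station names, and the volume threshold are invented for illustration; they do not describe any actual intelligence system.

    from collections import Counter

    # Hypothetical intercept metadata: (sender, recipient, day). The message
    # bodies are assumed to be ciphertext the analyst cannot read.
    intercepts = [
        ("station_a", "hq", 1), ("station_b", "hq", 1),
        ("station_a", "hq", 2), ("station_a", "station_b", 2),
        ("station_a", "hq", 3), ("station_b", "hq", 3),
        ("station_a", "hq", 3), ("station_a", "station_b", 3),
        ("station_b", "station_a", 3),
    ]

    # Who talks to whom: a rough picture of lines of command.
    links = Counter((src, dst) for src, dst, _day in intercepts)
    for (src, dst), n in links.most_common():
        print(f"{src} -> {dst}: {n} messages")

    # Changes in the volume and direction of traffic can signal imminent operations.
    per_day = Counter(day for _src, _dst, day in intercepts)
    baseline = sum(per_day.values()) / len(per_day)
    for day, n in sorted(per_day.items()):
        flag = "  <-- unusual volume" if n > 1.5 * baseline else ""
        print(f"day {day}: {n} messages{flag}")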
c. Failure of Laws Designed to Prevent the Spread of Strong
Cryptography
The United States has several long-standing laws and policies
designed to prevent strong cryptography from spreading abroad, and
even from being widely used at home. Although these may have
served to slow the spread of strong cryptography, ultimately they
have failed to stop it. The following is only a brief summary of
two exemplary policies and their effects.{158}
i. Export Control: The ITAR
U.S. export control is designed to prevent foreigners from
acquiring cryptographic systems that are strong enough to create a
serious barrier to traffic analysis, or that are difficult to
crack.{159} Two sets of
regulations govern the export of encryption software: the Export
Administration Regulations (EAR) govern "dual use"
technologies{160} and
the International Traffic in Arms Regulations (ITAR) apply to items
that the government considers inherently military in nature.{161} The EAR are
generally less demanding, but the ITAR take precedence.{162} Under the ITAR
regime, applications [Page 749]
to export cryptographic software as strong as (or stronger than)
DES are routinely denied.{163} Only strong products that lack [Page 750]
the capability of being
adapted for encryption, or which are designed for specific banking
applications, receive official export clearance.{164}
The ITAR have failed to prevent the spread of strong cryptography. The ITAR prohibit export of cryptographic software,{165} yet software created in the United States routinely and quickly finds its way abroad. For example, when version 2.6 of PGP, a popular military-grade cryptography program, was released in the United States by graduate students at MIT as freeware,{166} a researcher at the Virus Test Center at the University of Hamburg, in Germany, received a copy within days from an anonymous remailer.{167} He then placed it on his internationally known Internet distribution site.{168} As would-be sellers of cryptographic products have frequently testified to Congress, the major effect of the ITAR is to prevent U.S. companies from competing with those foreign companies that sell sophisticated cryptographic software abroad.{169}
[Page 751]
Meanwhile,
enforcement of the ITAR has produced absurd results. The State
Department has refused to license the export of a floppy disk
containing the exact text of several cryptographic routines
identical to those previously published in book form.{170} The refusal was all
the more bizarre because the book itself was approved for export.{171} The only reasons
given by the State Department for its refusal were that
"[e]ach source code listing has been partitioned into its own
file and has the capability of being compiled into an executable
subroutine,"{172}
and that the source code is "of such a strategic level as to
warrant" continued control.{173} The State Department also concluded that the
"public domain" exception to the ITAR{174} did not apply
and--most bizarrely of all--that its decision was consistent with the
First Amendment.{175}
ii. "Classified at Birth"
The Inventions Secrecy Act{176} gives the Commissioner of Patents the
authority to issue patent secrecy orders. Even if the government
has no ownership interest in the invention, the orders block the
issuance of a patent and place the application under seal. If the
Nuclear Regulatory Commission or the Department of Defense states
that publicizing the invention would be detrimental to the national
security, the patent will be withheld "for such period as the
national interest requires."{177} Willful disclosure of an invention covered by
a secrecy order is a criminal offense.{178} [Page 752]
While the application of the Inventions Secrecy Act to privately
created cryptographic devices has sometimes occasioned publicity,{179} most devices covered
by secrecy orders are invented at government expense.{180}
The existence of a number of high-level cryptographic algorithms in public circulation, some patented,{181} some not, suggests that the Inventions Secrecy Act has been far from successful at preventing the spread of strong cryptography.{182}
"Here, here, here be my keys; ascend my chambers;The Escrow Encryption Standard is designed to provide users with communications that are secure against decryption by all third parties except authorized agents of the U.S. government. Before a Clipper Chip is installed in a telephone,{184} the government will permanently inscribe it with a unique serial number and a unique encryption key. The government will keep both of these numbers on file. In order to reduce the danger that the file might be stolen or otherwise compromised, the chip's unique encryption key will be split into two pieces, each held by a different "escrow agent." The escrow agents will be required to guard the segments and release them only to persons who can demonstrate they will be used for authorized intercepts. Reuniting the pieces of a chip's unique key gives the government the capability to decrypt any Clipper conversations.
search, seek, find out."{183}
[Page 753]
a. A Tale of Three Keys
From the user's point of view, the Clipper Chip is a black box:
pick up your Clipper-equipped telephone, dial another Clipperphone,
push a red button to initiate the security feature, wait a few
seconds for the two chips to synchronize, read off the character
string displayed on the telephone to the other party to confirm the
security of the conversation,{185} and start the conversation.{186} The conversation is
scrambled with a classified algorithm called SKIPJACK, which took
the NSA ten years to develop, and which the government certifies as
secure for the foreseeable future.{187} What [Page 754]
happens during those few seconds before the
conversation begins, and why, are the essence of EES and the source
of controversy.
From the government's point of view, EES relies on three keys: the session key,{188} the chip key, and the family key. The session key is what SKIPJACK uses to encrypt and decrypt the conversation. Every conversation has a new session key, and any third party seeking to eavesdrop on the conversation would need to have the session key to decrypt the conversation. Oddly, the Clipper Chip does not select the session key; indeed, the Clipper Chips do not care how the telephones do this.
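Because SKIPJACK itself is classified, no faithful reconstruction of the cipher is possible here. The following Python sketch therefore substitutes a toy keystream cipher purely to illustrate the role of the session key described above: each conversation is scrambled under a fresh 80-bit key, and only a party holding that key recovers the call. The chip key and family key enter the picture through the LEAF, described below; the toy cipher and all names are illustrative assumptions, not the actual design.

    import os, hashlib

    def toy_cipher(key: bytes, data: bytes) -> bytes:
        """Toy XOR-keystream cipher; a stand-in for the classified SKIPJACK."""
        stream = hashlib.shake_256(key).digest(len(data))
        return bytes(a ^ b for a, b in zip(data, stream))

    # A fresh 80-bit (10-byte) session key is chosen for every conversation.
    session_key_call_1 = os.urandom(10)
    session_key_call_2 = os.urandom(10)

    scrambled = toy_cipher(session_key_call_1, b"meet me at noon")

    # The correct session key recovers the call; any other key yields noise.
    assert toy_cipher(session_key_call_1, scrambled) == b"meet me at noon"
    assert toy_cipher(session_key_call_2, scrambled) != b"meet me at noon"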
Suppose Alice wants to have a secure conversation with Bob.
Alice calls Bob, then pushes the red button. At this point, the
two Clipperphones have to agree on a session key according to a
method selected by the manufacturer. The maker of the Clipperphone
is free to use as secure a method as she likes. The two
Clipperphones might, for example, use a supersecure method of
agreeing on the session key which is so safe that two strangers who
have never met before can agree on a session key in public while
being overheard, and yet anyone who overhears what they say will
still be unable to work out what the key is.{189} Assume that Alice
and Bob use telephones that have this supersecure selection method
built in. Once the two telephones agree on the session key, each
phone feeds the key to its Clipper Chip.{190} As soon as the Clipper Chips are [Page 755]
told the session key, they
begin the Clipper telephone session. The first step in a Clipper
telephone session is to undermine the eavesdropper-proof creation
of the session key by transmitting the session key in encrypted
form for the benefit of any public servants who may be
listening.
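One well-known technique with the property just described, in which two strangers agree on a secret while being overheard, is Diffie-Hellman key exchange. The sketch below uses deliberately tiny parameters and is offered only to illustrate the idea; it is an assumption for exposition, not a description of how any actual Clipperphone negotiates its session key.

    import hashlib
    import secrets

    # Toy Diffie-Hellman exchange; real systems use far larger parameters.
    p = 4294967291          # a small prime modulus (2**32 - 5)
    g = 5                   # public generator

    a = secrets.randbelow(p - 2) + 1     # Alice's secret exponent
    b = secrets.randbelow(p - 2) + 1     # Bob's secret exponent

    A = pow(g, a, p)        # Alice announces A over the open line
    B = pow(g, b, p)        # Bob announces B over the open line

    # Each side computes the same shared value; with realistic parameters,
    # an eavesdropper who hears only p, g, A, and B cannot feasibly do so.
    shared_alice = pow(B, a, p)
    shared_bob = pow(A, b, p)
    assert shared_alice == shared_bob

    # Toy derivation of an 80-bit session key to hand to each Clipper Chip.
    session_key = hashlib.sha256(str(shared_alice).encode()).digest()[:10]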
At the start of every Clipper session, a Clipper Chip sends a
stream of data called a Law Enforcement Access Field (LEAF).{191} Unless Bob's Clipper
Chip receives a valid LEAF from Alice's chip, Bob's chip will not
talk with it.{192} As
can be seen from the Figure on page 756, the LEAF is built in
layers. At the center lies the session key. The chip encrypts the
session key with the unique chip key. It then appends the sending
chip's serial number and a checksum, then reencrypts the data with
the family key, which is a master key held by the government.{193}
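The layering of the LEAF can be sketched as follows. The field widths (an 80-bit session key sealed under the chip key, a 32-bit serial number, and a 16-bit checksum, all sealed under the family key to give a 128-bit LEAF) come from the summary later in this section; the toy cipher and checksum are placeholders, since the actual algorithms are classified.

    import os, hashlib

    def toy_cipher(key: bytes, data: bytes) -> bytes:
        """Toy XOR-keystream cipher; a stand-in for the classified algorithms."""
        stream = hashlib.shake_256(key).digest(len(data))
        return bytes(a ^ b for a, b in zip(data, stream))

    family_key = os.urandom(10)                  # 80-bit master key, common to all chips
    chip_key = os.urandom(10)                    # 80-bit key unique to this chip
    chip_serial = (123456).to_bytes(4, "big")    # 32-bit chip serial number
    session_key = os.urandom(10)                 # 80-bit key for this conversation

    # Inner layer: the session key, sealed under the chip's unique key.
    sealed_session = toy_cipher(chip_key, session_key)

    # Outer layer: serial number + sealed session key + checksum, sealed
    # under the family key held by the government.
    checksum = hashlib.sha256(sealed_session).digest()[:2]    # 16-bit stand-in
    leaf = toy_cipher(family_key, chip_serial + sealed_session + checksum)
    assert len(leaf) * 8 == 128                  # the 128-bit LEAF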
[Page 756]
In short, eavesdroppers seeking access to the session key must
use two keys to decrypt the LEAF: the family key (which is common
to all chips) and the chip key (which is different for every chip).
Assuming that the family key will be in fairly wide circulation,{194} the security of the
Clipper Chip stands or falls on the security of the master list of
chip keys. This list, or the two lists of key segments, would be
of enormous value to any attacker, such as a foreign government
bent on industrial espionage. The way in which the keys are
created, and the method by which they are held and released, are
critical elements of the user's security.
When a public servant engaged in a lawful wiretap first comes
across a Clipper session, she records it, including the LEAF. The
public servant must now acquire the family key if she does not
already possess it. According to NIST, the family keys will not be
transmitted to law enforcement personnel, but will instead be
stored [Page 757]
in special
circuit boards capable of being installed in ordinary PCs.{195} Once decrypted with
the family key, the LEAF reveals the serial number of the Clipper
Chip and also reveals the encrypted session key. The public
servant must then contact the two escrow agencies, giving them the
chip's serial number and a legally valid reason for the wiretap,
usually in the form of a warrant from a state court, a federal
court, or the special Foreign Intelligence Surveillance Act (FISA)
court.{196} The
requestor must "certify that [the] necessary legal
authorization for interception has been obtained to conduct
electronic surveillance regarding these communications."{197} How this
certification operates when the legal basis [Page 758]
is "exigent circumstances" (which is determined by the same officer who would be
requesting the key segment), is not explained,{198} perhaps because
warrantless wiretaps based on exigent circumstances are relatively
rare.{199} There
remains some doubt as to how the NSA and other agencies in the
national security community will obtain keys. It is notable that
in a recent meeting involving the FBI, the NSA, and AT&T's Bell
Labs, "the NSA did not answer a question as to whether the
national security community would obtain keys from the same escrow
mechanism for their (legally authorized) intelligence gathering or
whether some other mechanism would exist for them to get the
keys."{200}
The escrow agents have no duty to make any independent inquiries
as to the adequacy of the certification before releasing the key
segments.{201} Once
satisfied that the wiretap request appears legitimate (in that it
comes from someone authorized to make a request and contains her
certification that adequate legal authority exists), the escrow
agents are required to disclose the key segments for the chip whose
serial number was submitted. The public servant
requesting the key fragments puts them together and uses [Page 759]
the reconstituted chip key to
decrypt the session key. Armed with the decrypted session key, the
public servant can at last decrypt the conversation. Because the
presence of the Clipper Chip has no effect on the applicable
constitutional and statutory rules, the public servant remains
obligated to minimize the intrusion.{202}
In summary, a public servant might decrypt an EES message as follows:
Public servant
(1) intercepts the message, including the LEAF (128-bit LEAF encrypted with the family key);
(2) decrypts the LEAF with the family key (32-bit chip ID, 80-bit session key encrypted with chip key, 16-bit checksum);
(3) contacts her escrow agents, reports the chip ID, and avers existence of the legal authority for the wiretap;
(4) receives two 80-bit key segments;
(5) XORs{203} the key segments to produce an 80-bit chip key;
(6) decrypts the encrypted session key with the chip key;
(7) decrypts the entire message with her decrypted session key.
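Continuing the illustrative sketch, the seven steps above can be traced end to end in a few lines of Python. The toy cipher, the XOR splitting of the chip key, and the field widths are assumptions carried over from the earlier sketches; only the sequence of operations mirrors the summary.

    import os, hashlib

    def toy_cipher(key: bytes, data: bytes) -> bytes:
        """Toy XOR-keystream cipher; a stand-in for the classified algorithms."""
        stream = hashlib.shake_256(key).digest(len(data))
        return bytes(a ^ b for a, b in zip(data, stream))

    # Setup: what the chips and the escrow system are assumed to have done already.
    family_key = os.urandom(10)
    chip_key = os.urandom(10)
    chip_serial = (123456).to_bytes(4, "big")
    session_key = os.urandom(10)
    segment_1 = os.urandom(10)                                      # escrow agent 1's share
    segment_2 = bytes(a ^ b for a, b in zip(chip_key, segment_1))   # escrow agent 2's share
    sealed_session = toy_cipher(chip_key, session_key)
    checksum = hashlib.sha256(sealed_session).digest()[:2]
    leaf = toy_cipher(family_key, chip_serial + sealed_session + checksum)
    conversation = toy_cipher(session_key, b"incriminating conversation")

    # (1) intercept the message, including the 128-bit LEAF
    intercepted_leaf, intercepted_msg = leaf, conversation
    # (2) decrypt the LEAF with the family key and parse its fields
    fields = toy_cipher(family_key, intercepted_leaf)
    serial, enc_session = fields[:4], fields[4:14]
    # (3) report the chip ID (serial) to the escrow agents, certifying legal authority
    # (4) receive the two 80-bit key segments
    # (5) XOR the segments to reconstruct the 80-bit chip key
    rebuilt_chip_key = bytes(a ^ b for a, b in zip(segment_1, segment_2))
    # (6) decrypt the encrypted session key with the chip key
    rebuilt_session_key = toy_cipher(rebuilt_chip_key, enc_session)
    # (7) decrypt the entire message with the recovered session key
    assert toy_cipher(rebuilt_session_key, intercepted_msg) == b"incriminating conversation"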
[Page 760]
they will be taken to a secure, compartmented
information facility,{205} which is the vault-like room that the government uses when handling classified documents. Each of the escrow
agents will provide a list of random numbers which, when combined,
will provide the numbers from which the keys will be generated.{206} After the keys are generated, the escrow agents will be given a disk containing lists of chip serial numbers and an associated 80-bit number which represents half the information needed to recreate a chip's key. Both key segments must be combined to retrieve the chip key, and neither segment alone provides the holder with any information as to the chip key's contents.{207}
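The splitting property described above, in which neither escrow agent's segment by itself reveals anything about the chip key, can be achieved by a simple XOR scheme, illustrated below. Whether the programming facility uses exactly this construction is an assumption; the sketch shows only why two shares can each be individually uninformative yet jointly sufficient.

    import os

    chip_key = os.urandom(10)       # the 80-bit chip-unique key to be escrowed

    # Agent 1's segment is pure randomness; agent 2's is the key XORed with it.
    segment_1 = os.urandom(10)
    segment_2 = bytes(k ^ s for k, s in zip(chip_key, segment_1))

    # Either segment alone is a uniformly distributed 80-bit string, so it
    # carries no information about the chip key; XORing both recreates it.
    assert bytes(a ^ b for a, b in zip(segment_1, segment_2)) == chip_key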
Although the escrow agents do not check the bona fides of any
requests for key fragments, they do require a substantial amount of
paperwork before releasing a key. The escrow agents are also
required to keep detailed records of key segment requests and
releases. The existence of this paper trail should provide a
significant disincentive to rogue wiretapping requests by agents in
the field. Similarly, NIST has announced an elaborate system of
safeguards to protect each Clipper Chip's unique key. The scheme
[Page 761]
involves complex
rationing of information and mutual monitoring by the escrow agents
from the moment the Clipper Chip is created. Further security
attends the inscription of the key upon a Clipper Chip, its
subsequent division into two key segments, and ultimate
safeguarding by the two escrow agents.{208}
The security precautions introduced by NIST in late 1994 are
complex. To the nonspecialist they appear sufficient to prevent
security breaches at the time the keys are "burned in"
and to prevent surreptitious copying or theft of the key list from
the escrow agents. But no amount of technical ingenuity will
suffice to protect the key fragments from a change in the legal
rules governing the escrow agents. Thus, even if the technical
procedures are sound, the President could direct the Attorney
General to change her rules regarding the escrow procedures.
Because these rules were issued without notice or comment, affect
no private rights, and (like all procedural rules) can therefore be
amended or rescinded at any time without public notice, there is no
legal obstacle to a secret amendment or supplement to the existing
rules permitting or requiring that the keys be released to
whomever, or according to whatever, the President directs. Because
the President's order would be lawful, none of the security
precautions outlined by NIST would protect the users of the EES
system from disclosure of the key segments by the escrow agents.
Nothing in the EES proposal explicitly states that the NSA will not
keep a set of keys; indeed, the only way to acquire a set of EES-compliant chips is to have the device that incorporates them tested
and approved by the NSA. Similarly, although the specifications
for the decrypt processor call for it to delete keys when a warrant
expires and to automatically send a confirmation message to the key
escrow agents, the interim model (there is only one) in use by law
enforcement organizations relies on manual deletion.{209}
[Page 762]
c. Limited Recourse for Improper Key
Disclosure
The escrow system lacks legal guarantees for the people whose
keys are generated by the government and held by the escrow agents.
Indeed, the Attorney General's escrow procedures state that they
"do not create, and are not intended to create, any
substantive rights for individuals intercepted through electronic
surveillance."{210}
In short, the government disclaims in advance any reliance interest
that a user of an EES-equipped device might have in the
government's promise to keep the key secret.{211} A victim of an
illegal wiretap would have a cause of action under Title III
against the wiretapper,{212} but, it appears, no remedy against the escrow
agents, even if the escrow agents acted negligently or failed to
follow their own procedures.{213} The Attorney General's procedures
[Page 763] themselves are merely
directives. They are not even legislative rules, which might be
subject to notice and comment restrictions before being rescinded.
A future administration could, if it wanted, secretly{214} instruct the escrow
agents to deliver copies of the keys to an intelligence or law
enforcement agency, or even White House "plumbers,"
thereby violating no law or regulation (the plumbers, though, would
violate Title III when they used the information).{215} Because the chip-unique keys were voluntarily disclosed to the government, the
chip's owner might lack a "legitimate" (that is,
enforceable) expectation of privacy in the information.{216}
If the intercepted communication were an e-mail or a file transfer, rather than a telephone call, the chip owner subject to an illegal or inadvertent disclosure by the escrow agents may be in a particularly weak position if the information ever makes its way to court: many Title III protections granted to voice communications do not apply to transfers of digitized data.{217}
Shortly before the 103d Congress adjourned, Congressman George
Brown introduced the Encryption Standards and Procedures Act of
1994,{218} which would
have waived the sovereign immunity of the United States for
"willful" but unauthorized disclosures of key fragments
by its officials--and excluded liability in all other circumstances.{219} In the
absence of similar legislation, however, there [Page 764]
may currently be no monetary
remedy even for a "willful" disclosure.