A. Michael Froomkin


[Page n] references relate to the pagination of the printed version.



[Page 885]

Technical Appendix: Brute-Force Cryptanalysis,
Public-Key Encryption and Digital Signatures

There are two kinds of cryptography in this world: cryptography that will stop your kid sister from reading your files, and cryptography that will stop major governments from reading your files.{761}

Cryptography makes it possible to talk privately even if someone is listening in on your telephone line:{762} you can have a secure communication over an insecure channel. Cryptography also allows you to keep electronic records in a form that is easily accessible to you but inaccessible to snoops, whether siblings or governments. Ironclad protection, however, requires effort.

Part of the art of cryptography consists of choosing an appropriate level of security because, for long texts, the highest-level encryption can be slow even with a computer.{763} If the purpose of the encryption is to keep someone from peeking at the answer to a riddle, printing the answer upside down may be enough. If the purpose is to keep salacious material out of the hands of the lower classes, then Latin was for many years the traditional cipher.{764} If the encryption is used, however, to authenticate wire transfers of funds or to send messages to submarines in wartime, a high-level method is required. In each case, the determination is driven by considerations of the time and effort it takes the sender and receiver to use the system, the resources that a third party would [Page 886] likely be willing to devote to cracking the system, and the costs of guessing wrong.

The strength of a cryptographic system is usually measured by the amount of effort that an enemy who knows the algorithm would need to crack it. Absent the discovery of a latent vulnerability in the cipher that allows the attacker to take a mathematical short-cut,{765} this means a "brute-force" attack in which computers try every possible key.{766} Surprisingly, modern cryptography does not require that the algorithm be kept secret. Indeed, the rule of thumb is that ciphers that depend on keeping the algorithm secret are unreliable.{767} In modern cryptography, only the key, and not the method of encryption, needs to remain secret to preserve the security of the message.{768}

The optimal cryptographic system would be easy to use and impossible to crack. The real world imposes tradeoffs between ease of use and vulnerability to attack,{769} although computers now put very powerful cryptography within the reach of anyone who has access to a personal computer (PC). Cryptography can be used to defend against various forms of attack including decryption by third parties, modification of messages, and fabrication of authentic-looking [Page 887] but spurious messages. A strong cryptographic system protects messages that may be intercepted by an enemy, and also authenticates messages received.

A. Brute-Force Cryptanalysis

There are three fundamental ways for a third party to crack a cipher where the algorithm is known but the key is not. First, an enemy can simply steal the key or suborn a key-holder. Second, if the enemy knows the algorithm but not the key, the enemy can try to analyze the cipher, hoping to find a weakness in the algorithm. Some very trusted ciphers have fallen to mathematical analysis, at which point decryption becomes easy.{770} Third, and of particular relevance to modern cryptography, the attacker can mount a "brute-force" attack using computers, perhaps with large numbers of specially optimized chips running in parallel, to try every possible key until the message is decrypted. Finding the sender's key gives the attacker more than the text of a single message; it gives access to all future messages that use the same key. If the sender uses a digital signature, the attacker can forge the sender's signature as well.

If the only (known) way to decrypt a ciphertext is to try every possible key, then a longer key makes for a stronger cipher because a longer key has more possible values. An eight-bit{771} key has 2^8 (256) possible values. A computer would have to try all 256 possible values to be certain of finding the key, although on average it would encounter the answer after testing only 128 of them. Similarly, if the key is 128 bits long, which is the equivalent of the maximum amount of information in a sixteen-character message on a personal computer,{772} a [Page 888] brute-force attack would require that 2^128 keys be tested to be certain of finding the key. To put this number in perspective, a computer processing a million keys per second would require about 10^25 years to complete the task, an amount of time 10^15 times greater than the estimated age of the universe.{773}
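The arithmetic in the paragraph above is easy to reproduce. The following sketch checks the figures using the text's illustrative 1970s-era assumption of one million keys tested per second; the function name is mine, not part of any standard.

```python
# Back-of-the-envelope keyspace arithmetic from the text above.
# "One million keys per second" is the article's illustrative
# assumption, not a claim about any particular hardware.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(key_bits: int, keys_per_second: float) -> float:
    """Years needed to try every key of the given length at the given rate."""
    return 2 ** key_bits / keys_per_second / SECONDS_PER_YEAR

# An eight-bit key has 2^8 = 256 possible values.
assert 2 ** 8 == 256

# A 128-bit key at a million keys per second: about 10^25 years,
# roughly 10^15 times the estimated age of the universe.
print(f"{years_to_exhaust(128, 1_000_000):.2e} years")
```

Run as written, this prints a figure on the order of 1.08e+25 years, matching the text's estimate.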

Although 10^15 times the age of the universe makes for an impressive statistic, it is misleading. Chips capable of trying 256 million keys per second are foreseeable within the decade, and several of these can be harnessed to work together. Trying out keys is often ideally suited to parallel processing. Parallel processing allows a problem to be split up between many computer chips running simultaneously, trying many millions, even billions, of keys per second. The chips need not be part of a single computer: the key-cracking problem can be distributed among large numbers of workstations on a network, with each workstation being only slightly more powerful than today's desktop PCs. Indeed, the problem can be parceled out to several different networks, each communicating with one another only infrequently. A distributed processor of this nature, or a single optimized parallel processor, can try vastly more than a million keys per second, making large keys susceptible to being broken in a reasonable period of time.{774} Parallel processors already on the drawing board might make it possible to break even a 512-bit key at a cost which, although out of reach of the average citizen, would be well within the means of the poorest government.{775} The cryptographer's solution, of course, is to use a longer key.
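The parallel search described above is easy to sketch: the keyspace is simply split into disjoint ranges, one per worker, and no coordination is needed until some worker finds the key. The "cipher" below is a deliberately trivial one-byte XOR, used only to make the example self-contained; the ranges here are run sequentially to stand in for separate chips or workstations.

```python
# A minimal sketch of a distributed brute-force search over a toy
# 8-bit keyspace. Real attacks of the kind described in the text
# divide a vastly larger keyspace among many processors the same way.

def toy_encrypt(plaintext: bytes, key: int) -> bytes:
    """A trivial 'cipher': XOR every byte with a one-byte key."""
    return bytes(b ^ key for b in plaintext)

def search_range(ciphertext: bytes, known_plaintext: bytes,
                 start: int, stop: int):
    """Each worker independently tries every key in [start, stop)."""
    for key in range(start, stop):
        if toy_encrypt(ciphertext, key) == known_plaintext:
            return key
    return None

secret = toy_encrypt(b"ATTACK AT DAWN", 0x5A)

# Split the 256-key keyspace among four "workers".
ranges = [(w * 64, (w + 1) * 64) for w in range(4)]
found = [search_range(secret, b"ATTACK AT DAWN", a, b) for a, b in ranges]
print(found)  # only the worker whose range contains 0x5A (= 90) succeeds
```

Because the ranges are independent, doubling the number of workers halves the expected search time, which is why key search parallelizes so well.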

Exactly how large a key would be vulnerable to an economical brute-force attack is a matter of debate. Advances in computer power continue to make longer and longer keys vulnerable, but the same advances make it easier and cheaper to encrypt and decrypt with longer keys. If one assumes the existence of economically [Page 889] rational, albeit immoral, attackers seeking to maximize the return on an investment in the computing power needed in order to mount a sustained attack, then high-value secrets are more at risk than low-value ones. They therefore deserve longer keys. No consensus exists as to how much security is enough for high-value secrets, however, and in any case that point is probably well past the security provided by DES.

DES's inventors originally planned to use a 128-bit key, which would have provided 40 sextillion (40,000,000,000,000,000,000,000) times more security, but the NSA persuaded them that 56 bits sufficed.{776} A 56-bit key provides a keyspace of 2^56, or 72 quadrillion possible keys.{777} Although this is a very large number, even upon DES's adoption as the U.S. standard in 1977, critics predicted that an optimized computer costing $20 million to build could break an average of two 56-bit DES keys a day. Depreciated over the machine's first five years of service, this would have worked out to only $5000 per solution.{778} This was and is a sum well within the reach of many governments, and the price would be much lower today.{779} DES's inventors estimated that the cost of building such a computer would be closer to $200 million, or $50,000 per key.{780} If DES had used the 128-bit key originally contemplated, breaking each key would have cost an average of $200 septillion in the 1970s, even using Diffie and Hellman's lowest-cost assumptions. Perhaps as a result of its shorter key, DES is not considered sufficiently secure to protect classified data.{781} Despite suggestions from the NSA that DES might not deserve recertification after 1988,{782} NIST recently recertified DES as suitable for commercial purposes for five more years.{783} [Page 890]

B. Public-Key Cryptography

Before the invention of public-key cryptography in 1976,{784} a sender and receiver who wanted to use a cipher had to agree on a key in order to communicate securely. This method was very burdensome to both parties. First, sender and receiver needed a secure means to transmit the key itself. For example, if you were trying to send a message to your agent behind enemy lines, and you were afraid that the bad guys had cracked your cipher, it was too late to send a coded message saying "the new key is X."{785} Second, even if the key was transmitted securely (for example, by handing it to the agent before she left for her mission), the security of a single-key cipher evaporated as soon as the key was compromised. If you wrote the secret key down, someone could have found it; if you did not write it down, either it must have been short or you needed a phenomenal memory. Third, the ever-present danger of key compromise cast a doubt over the authenticity of every message. A third party who had the key could use it to alter messages, or to send fake messages purporting to be from any of the parties who legitimately held the key.

Public-key cryptography solves all of these problems. As a result, public-key cryptography has been described as a "revolutionary technology," which will make routine communication encryption ubiquitous.{786} In a public-key system, each user creates a public key, which is published, and a private key, which is secret.{787} [Page 891] Messages encrypted with one key can be decrypted only with the other key, and vice-versa. Thus, if Alice wants to send a secure e-mail message to Bob, and they both use compatible public-key cryptographic software, Alice and Bob can exchange public keys on an insecure line. Alice inputs her plaintext and Bob's public key, and the program outputs the ciphertext, a stream of characters that looks like garbage to anyone who happens to see it. When Bob receives the ciphertext, he inputs both it and his private key to his program, which reveals Alice's plaintext. One of the wonderful properties of public-key encryption is that, so far as we know,{788} a third party's possession of Alice's public key and a complete description of the encryption algorithm puts that third party no closer to deducing Alice's private key or reading her messages than if the third party had the ciphertext alone. Thus, it is easy to establish a secure line of communication with anyone who is capable of implementing the algorithm. (In practice, this is anyone with a compatible decryption program or other device.) Sender and receiver no longer need a secure way to agree on a shared key. If Alice wishes to communicate with Bob, a new recipient with whom she has never communicated before, Alice and Bob can exchange the plaintext of their public keys or look each other up in a freely accessible directory of public keys. Then, Alice and Bob can each encrypt their outgoing messages with the other's public key and decrypt their received messages with their own secret private key.{789}
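The "encrypt with one key, decrypt with the other" property can be seen concretely in textbook RSA, one well-known public-key algorithm. The sketch below uses absurdly small primes so the arithmetic is visible; real systems use moduli hundreds of digits long and padding schemes omitted here.

```python
# Textbook RSA with toy parameters, illustrating the public/private
# key property described in the text. Not secure at this size.

p, q = 61, 53
n = p * q                   # public modulus: 3233
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, chosen coprime to phi
d = pow(e, -1, phi)         # private exponent: e*d = 1 (mod phi)

def encrypt(m: int) -> int:
    """Anyone holding the public key (n, e) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private exponent d can decrypt."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
print(ciphertext)   # looks unrelated to 65 without knowing d
```

Knowing n, e, and the ciphertext, an attacker must factor n to recover d; at toy size that is instant, but at real key sizes it is, so far as we know, infeasible.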

One drawback, however, is that public-key encryption and [Page 892] decryption are much slower than commonly used single-key systems such as DES.{790} Thus, although public-key encryption is ideal for short messages, it is less than ideal for longer ones, and is particularly unsuitable for high-speed real-time applications like fast data transfer or telephone conversations. The speed problem can be overcome, however, by using a hybrid system. In a hybrid system, two parties who wish to communicate securely over an insecure medium use a public-key system to agree on a session key, which then becomes the one-time key for a faster, conventional, relatively secure, single-key cipher such as DES.{791} Each time the parties initiate a new conversation, they generate a new session key, which, though lasting for the entire conversation, is never repeated.
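The hybrid scheme can be sketched as follows: a short random session key is wrapped with (toy-sized, textbook) RSA, and the bulk of the message travels under a fast symmetric cipher. A repeating-XOR routine stands in for DES here purely to keep the example self-contained; it is not a secure cipher.

```python
# A hybrid-system sketch: public-key wrapping of a session key,
# symmetric encryption of the message body. The XOR "cipher" is an
# insecure stand-in for DES, used only for illustration.

import secrets

# Bob's toy RSA key pair (tiny primes; illustration only).
n, e = 3233, 17
d = 2753                    # pow(17, -1, 3120)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Alice: pick a fresh one-byte session key, wrap it with Bob's public key.
session_key = secrets.randbelow(256)
wrapped_key = pow(session_key, e, n)
body = xor_cipher(b"a long message sent under the fast cipher",
                  bytes([session_key]))

# Bob: unwrap the session key with his private key, then decrypt quickly.
recovered = pow(wrapped_key, d, n)
plaintext = xor_cipher(body, bytes([recovered]))
assert plaintext == b"a long message sent under the fast cipher"
```

Only the short session key pays the cost of the slow public-key operation; everything else uses the fast cipher, which is the whole point of the hybrid design.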

If Alice and Bob are going to change their session key every time they talk so as to maximize the security of their single-key cipher, they need a secure means of agreeing on a session key each time they want to communicate. Ideally, the method would allow them to choose the session key in public, or on an insecure line, without fear of eavesdroppers. Public-key cryptography allows Alice and Bob to achieve this feat in either of two ways. Using the first method, Alice generates the session key, encrypts it with Bob's public key, and sends it to him. Bob decrypts the message with his private key, inputs the session key to his single-key software or telephone, and then the data exchange or conversation begins.{792} Alternatively, the parties can use Diffie-Hellman Key Exchange, in which Alice and Bob publicly send each other numbers from which they, and only they, can jointly calculate a session key. In Diffie-Hellman [Page 893] Key Exchange, Alice and Bob agree on a number, b, which does not have to be secret, to use as the basis for their calculations. They also agree, again without any attempt at secrecy, on a large prime number which will serve as their modulus, m. (Arithmetic modulus m is arithmetic in which numbers wrap around upon reaching m, much as the hours on a clock wrap around upon reaching twelve. Alice and Bob do all of their calculations modulus the large prime m.) Alice then selects a (secret) large random number A; Bob meanwhile selects a (secret) large random number B. Alice sends Bob a number she calculates as b^A (modulus m). Bob sends Alice a number he calculates as b^B (modulus m). Both Alice and Bob then compute b^AB (modulus m): Alice raises the number Bob sent her to the power A, while Bob raises the number Alice sent him to the power B, and both arrive at the same value. This becomes their secret session key.
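The exchange just described fits in a few lines of code. The modulus below is a small prime chosen for readability; as the text notes, real implementations use primes on the order of a thousand bits.

```python
# Diffie-Hellman Key Exchange, following the text's notation:
# public base b, public prime modulus m, secrets A and B.

m = 2147483647            # public prime modulus (2^31 - 1; toy-sized)
b = 5                     # public base

A = 123456789             # Alice's secret large random number
B = 987654321             # Bob's secret large random number

alice_sends = pow(b, A, m)    # b^A (modulus m), sent in the clear
bob_sends = pow(b, B, m)      # b^B (modulus m), sent in the clear

# Each party raises the number it received to its own secret power:
alice_key = pow(bob_sends, A, m)   # (b^B)^A = b^AB (modulus m)
bob_key = pow(alice_sends, B, m)   # (b^A)^B = b^AB (modulus m)

assert alice_key == bob_key        # the shared secret session key
print(alice_key)
```

An eavesdropper sees b, m, b^A, and b^B, but recovering the shared key from those values requires computing a discrete logarithm, which is the hard problem the next paragraph describes.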

Diffie-Hellman works because it is computationally easy to calculate powers of numbers modulus m, but very, very difficult to reverse the process, that is, to compute the discrete logarithm of such a number modulus m. Yet, this is what an eavesdropper would have to do to compute b^AB (modulus m) without knowing either A or B. Thus, if m is about 1000 bits long, Alice and Bob can do their calculations in seconds. At the current state of the art, however, the logarithmic computation would take a powerful computer a quintillion (a billion billion) years.{793}

Diffie-Hellman Key Exchange works for real-time communication because the parties can generate a session key at the start of the exchange, but it imposes potentially long delays on e-mail because the parties must exchange several messages to generate a session key before the parties can have a secure conversation. On the other hand, e-mail can be encrypted and decrypted at leisure, so the entire message can use public key encryption rather than just the session key. As a result, all that Bob needs in order to send Alice a secure e-mail is a reliable way of getting Alice's public key.

Key servers provide a simple way of making public keys generally available. Essentially, a key server is a computer with a "white pages" approach to public key management. Bob enters Alice's name and the key server replies with Alice's public key--if she has registered it. Key servers, whether run by the U.S. Post Office or [Page 894] others, are likely to be an essential element of the National Information Infrastructure.

Key servers generally work on one of two principles: the certification authority or the web of trust. Under the certification authority paradigm, some central body authenticates the identity of the registrant when the key is first deposited. For example, the U.S. Post Office has proposed that it act as a certifying authority. Alice could then identify herself to the Post Office by providing identification similar to that currently required to get a passport. The Post Office would then add her key to its server, and/or provide Alice with a copy of her public key signed with the Post Office's private key.{794}

By contrast, there is no central authority for web-of-trust systems: Alice can upload a key to the key server at any time. In order to demonstrate that the key purporting to be "Alice's" is hers, Alice must then find other persons to "sign" her key by uploading authentications signed with their private keys. Typically this is done by meeting face-to-face, exchanging public keys, and showing identification. If Alice has her key signed by Carol, whom Bob knows or trusts, Bob can safely assume that the key purporting to be "Alice's" is not in fact an impostor's. Suppose, however, that Alice and Bob do not have any friends in common, but that Bob's friend Carol has signed Ted's key, and Ted has signed Alice's key. From Bob's point of view this is not as good as if Carol, whom he knows, had signed Alice's key, but it is considerably better than nothing. Bob needs to decide how many intermediaries he is willing to accept before he considers a public key unreliable. The increase in the length of the chain of authentication can be offset by finding multiple routes to Alice. For example, Bob may still feel reasonably secure if he can establish three relatively long but independent chains of authentication.{795} This web-of-trust approach is the foundation of the PGP encryption system.{796} [Page 895]
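Bob's decision can be modeled as a graph problem: each signature is an edge, and Bob looks for a short enough chain from himself to Alice. The sketch below uses the names from the text; the chain-length cutoff is Bob's own policy choice, not part of any PGP specification.

```python
# A web-of-trust sketch: signatures form a directed graph, and Bob
# accepts Alice's key if a short enough chain of signers connects
# him to her. Names follow the example in the text.

from collections import deque
from typing import Optional

# "X signed Y's key" becomes an edge X -> Y.
signatures = {
    "Bob": ["Carol"],
    "Carol": ["Ted"],
    "Ted": ["Alice"],
}

def chain_length(graph: dict, start: str, target: str) -> Optional[int]:
    """Shortest number of signature hops from start to target, or None."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        person, hops = queue.popleft()
        if person == target:
            return hops
        for signed in graph.get(person, []):
            if signed not in seen:
                seen.add(signed)
                queue.append((signed, hops + 1))
    return None

hops = chain_length(signatures, "Bob", "Alice")
print(hops)          # 3 hops: Bob -> Carol -> Ted -> Alice
MAX_HOPS = 3         # Bob's personal policy for an acceptable chain
assert hops is not None and hops <= MAX_HOPS
```

A fuller model would also count independent chains, reflecting the text's point that several long but disjoint paths to Alice can substitute for one short one.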

C. Digital Signatures

Verification of a message's origins takes on a new importance in the world of electronic mail, a world where all messages arrive on identical stationery in an identical typeface, having passed through many electronic hands en route. The fact that a message comes properly encrypted, even if the cipher is a simple one, tends to show that it was sent by the person who had access to the cipher and knew she was supposed to use it.{797} If the cipher is strong and the key tightly guarded, the use of the correct cipher strongly suggests that the message was sent by the person it purports to be from. Public-key systems in which a private key is held by only one person take this security feature to its logical extreme.

Public-key systems also allow users to append a so-called digital signature to an unencrypted message. A digital signature uniquely identifies the sender and connects the sender to the message. Because the signature uses the plaintext as an input to the encryption algorithm, if the message is altered in even the slightest way, the signature will not decrypt properly, showing that the message was altered in transit or that the signature was forged by copying it from a different message.{798} A digital signature copied from one message has an infinitesimal chance of successfully authenticating any other message.{799}
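Because the plaintext feeds into the signature, changing even one character breaks verification, as the paragraph above explains. The sketch below shows this with the same toy-sized textbook RSA used for illustration; real signature schemes use full-length keys and standardized padding, and the hash-then-reduce step here is only to fit the tiny modulus.

```python
# A digital-signature sketch: hash the message, sign the hash with
# the private key, verify with the public key. Toy RSA parameters;
# illustration only, not a real signature scheme.

import hashlib

n, e, d = 3233, 17, 2753     # toy RSA: public (n, e), private d

def digest(message: bytes) -> int:
    # Reduce the hash modulus n so it fits the toy modulus.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    """Only the holder of the private key d can produce this."""
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone with the public key (n, e) can check it."""
    return pow(signature, e, n) == digest(message)

sig = sign(b"Pay Bob $100")
assert verify(b"Pay Bob $100", sig)
assert not verify(b"Pay Bob $900", sig)   # an altered message fails
```

Copying sig onto a different message fails for the same reason the altered message does: the signature decrypts to the hash of the original plaintext, not of the new one.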

The importance of a Digital Signature Standard (DSS), the prerequisite to a system of electronic commerce, has not been lost on the federal government. Nevertheless, for many years NIST was unable or unwilling to promote a standard. Although NIST had begun working on a DSS in the early 1980s, its progress was slowed by the close relation between digital signature systems and cryptographic systems: an algorithm that produces a secure digital signature can [Page 896] also be used to produce a secure cipher. The U.S. government, however, wished to find a DSS which, although secure enough to be useful, nonetheless could not be used as an encryption system that the government might be powerless to break.{800} In August 1991, NIST announced its selection of the Digital Signature Algorithm (DSA) which, having been patented by an NSA employee, would be available royalty-free to all users.{801} NIST then encountered patent difficulties. These were exacerbated when a U.S. corporation acquired the rights to the main patent allegedly infringed by the DSA. In June 1993, NIST announced that it would give the corporation an exclusive license to the DSA; the U.S. government would have free use but everyone else would have to pay. Reaction from industry, which was already gravitating towards a competing corporation's product and was poised to accept NIST's standard only if it were royalty-free, was very negative.{802} In May 1994, NIST announced that it would not give an exclusive license to anyone, and that the DSA would be available royalty-free after all.{803} The alleged patent infringement was dealt with by NIST's statement that it "has concluded there are no valid claims."{804} The DSS is now scheduled to be incorporated into Fortezza PCMCIA chips.

Meanwhile, academic cryptologists, ever vigilant for intentional weaknesses in government-sponsored cryptography, found an interesting property in the DSS proposed by NIST and approved by [Page 897] the NSA. Small "subliminal" messages can be inserted into the digital signature without the signer's knowledge.{805} This property "allows an unscrupulous implementer of DSS to leak a [small] piece of the [user's] private key with each signature."{806} One commentator noted that the DSA "provides the most hospitable setting for subliminal communications discovered to date."{807} The NSA has not let on whether it knew about this feature when it agreed to the DSS.{808}

The proposed Capstone Chip combines the encryption functions of Clipper with the DSA.{809} It will be available in a PCMCIA card called Fortezza. Thus, if users wish to encrypt computer data transmissions with a government-approved algorithm, they will have to take the DSA as part of the package.

Although digital signatures use strong cryptographic algorithms, they raise fewer, and different, policy issues than does the Clipper Chip. The U.S. government is not concerned about the world-wide use of authentication software because affixing a signature to a plaintext message does not make that message any harder to read.{810} The real power struggle centers on whether the government will retain the capability to read private electronic messages and listen to private telephone conversations at will.

