A. Michael Froomkin


Notes for Appendices

761. Schneier, supra note 12, at xv. Back to text

762. Note that the best scrambler phones in the world will not protect you from a listening device in your room. Back to text

763. Successful cryptographic systems combine two basic principles: confusion and diffusion. Confusion is nothing more than some form of substitution: letters or, more commonly, sets of information bits stand in for other letters or information bits in some fashion. Replacing every letter with the one next to it on the typewriter keyboard is an example of confusion by substitution. Diffusion means mixing up the pieces of the message so that they no longer form recognizable patterns. Jumbling the letters of a message is a form of diffusion. Because modern ciphers usually operate at the bit level, many modern cryptographic systems produce confusion and diffusion that depend on every bit of the plaintext. Changing a single character changes the operations performed during encryption, making the resulting ciphertext that much more difficult to crack. See Schneier, supra note 12, at 193; Feistel, supra note 7, at 15. Back to text
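The two principles can be made concrete in a toy Python sketch (an editorial illustration only, not a secure cipher): confusion as a Caesar-style substitution, diffusion as a simple columnar transposition.

```python
# Toy illustration of confusion and diffusion (not a secure cipher).

def confuse(text, shift=3):
    # Confusion: substitute each lowercase letter with the one
    # `shift` places later in the alphabet.
    return "".join(
        chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.islower() else c
        for c in text
    )

def diffuse(text, columns=2):
    # Diffusion: jumble letter positions by reading the text off in columns.
    return "".join(text[i::columns] for i in range(columns))

scrambled = diffuse(confuse("attackatdawn"))
print(scrambled)  # letters are both substituted and rearranged
```

In a real cipher the two steps are interleaved and repeated over many rounds, so that every output bit depends on every input bit.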

764. Cf. Note, 17 L.Q. Rev. 223, 223 (1901) (urging "prudent persons not to write defamatory statements of any kind on postcards, even in the decent obscurity of a learned language"). Back to text

765. As we will see, this vulnerability may have been unknown to the cryptographers who designed the cipher, or it may have been inserted intentionally. Back to text

766. The two can sometimes be combined: a mathematical attack might, for example, demonstrate that only certain types of keys need to be tried, thus lowering the computational effort involved. Back to text

767. Algorithms whose security depends on the algorithm being kept secret have "historical interest, but by today's data security standards they provide woefully inadequate security. A large or changing group of users cannot use them, because users will eventually reveal the secret. When they do, the whole security of the system fails. . . . [They are also] trivial to break by experienced cryptanalysts." Schneier, supra note 12, at 2. Back to text

768. Ordinarily, in systems using multiple keys, only one of the keys need be kept secret. Back to text

769. The only type of algorithm guaranteed to be secure against all forms of mathematical and brute-force attacks is known as the "one-time pad." A one-time pad is "nothing more than a nonrepeating set of truly random key letters . . . . The sender uses each key letter on the pad to encrypt exactly one plaintext character. The receiver has an identical pad and uses each key on the pad, in turn, to decrypt each letter of the ciphertext." Schneier, supra note 12, at 13; see also Gardner, supra note 128, at 120 (stating that ciphers providing "absolute secrecy" are not always used because they are "too impractical"). The critical features of a one-time pad are that the pad must be kept from the enemy, the characters on the pad must be truly random, and the pad must never be used twice. Because large pads are difficult to generate, must be communicated to the recipient in utmost secrecy, and are unwieldy, the one-time pad is difficult to use for anything other than short messages of the highest security. See Schneier, supra note 12, at 14-15. Back to text
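The scheme can be sketched in a few lines of Python (an illustration under the assumptions stated in the note: the pad is truly random, kept secret, and never reused). Each message byte is XORed with one pad byte, and the identical XOR recovers the plaintext.

```python
import secrets  # cryptographically strong randomness from the standard library

def otp(data: bytes, pad: bytes) -> bytes:
    # XOR each byte of data with the corresponding pad byte.
    # The same operation both encrypts and decrypts.
    assert len(pad) >= len(data), "the pad must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, pad))

message = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(message))  # random pad, shared secretly, used once
ciphertext = otp(message, pad)
recovered = otp(ciphertext, pad)
print(recovered == message)  # True
```

Because every pad byte is random and used only once, the ciphertext is statistically independent of the plaintext; the unwieldiness lies in generating and secretly distributing a pad as long as all traffic.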

770. See Eric Bach et al., Cryptography FAQ (06/10: Public-key Cryptography) § 6.6 (June 7, 1994), available online URL ("Historically even professional cryptographers have made mistakes in estimating and depending on the intractability of various computational problems for secure cryptographic properties."). Back to text

771. A bit is a binary unit of information that can have the value zero or one. Computers organize bits into larger units such as bytes, now almost always eight bits, and words, often 16 or 32 bits in length. For example, DOS-based personal computers use eight-bit bytes to represent alphanumeric characters. Back to text

772. Although a 16-character message on a PC has 128 bits, most 8-bit-to-a-byte PCs limit the characters that can be represented to fewer than 256 per byte because one bit is used for error-checking. Hence, although the amount of information in a 128-bit key is equal to a 16-character text on a PC, such a text itself would in effect be a much shorter key on most personal computers because an attacker would know that many possible values for certain bits could be ignored. Limiting the possible keys only to lowercase letters and digits would have even more drastic effects: a 56-bit key, which if unconstrained would produce 2^56 (approximately 10^16) possible keys, would be limited to 10^12 possible keys, making it 10,000 times easier to crack. See Schneier, supra note 12, at 141. Back to text
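The arithmetic can be checked directly. The sketch below assumes, as an illustration, an eight-character key (DES keys occupy eight bytes, with 56 effective bits) drawn only from the 36 lowercase letters and digits:

```python
unconstrained = 2 ** 56   # every 56-bit pattern is a possible key
constrained = 36 ** 8     # eight characters, lowercase letters and digits only

print(f"{unconstrained:.1e}")       # about 7.2e16, i.e. roughly 10^16
print(f"{constrained:.1e}")         # about 2.8e12, i.e. roughly 10^12
print(unconstrained / constrained)  # on the order of 10^4: ~10,000 times easier
```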

773. See id. at 7, 129. The universe is estimated to be about 10^10 years old. See id. at 7. Back to text

774. See Garon & Outerbridge, supra note 26, at 179-81; Schneier, supra note 12, at 7. Back to text

775. See Ronald L. Rivest, Responses to NIST's Proposal, Comm. ACM, July 1992, at 41, 44-45 (estimating that today a 512-bit key can be broken with about $8.2 million worth of equipment, and noting that the cost will continue to shrink). Back to text

776. See Bamford, supra note 17, at 346-47 (describing the "closed door negotiations" between the NSA and IBM that resulted in the key size reduction from 128 to 56 bits). Back to text

777. See supra text accompanying note 108. Back to text

778. See Bamford, supra note 17, at 348; Whitfield Diffie & Martin E. Hellman, Exhaustive Cryptanalysis of the NBS Data Encryption Standard, Computer, June 1977, at 74, 74. Back to text

779. See Diffie & Hellman, supra note 778, at 74 (predicting that the rapidly decreasing cost of computation should have, by 1987, reduced the solution cost to the $50 range); Rivest, supra note 775, at 45 (estimating the cost to break a 512-bit key as $8.2 million); see also Bamford, supra note 17, at 348. Back to text

780. See Bamford, supra note 17, at 348-49 (comparing arguments about the cost of a computer that could break a 56-bit key). Back to text

781. See FIPS 46-2, supra note 106, at 69,348. Back to text

782. See H.R. Rep. No. 153 (Part I), 100th Cong., 1st Sess. 18 (1987), reprinted in 1987 U.S.C.C.A.N. 3120, 3133. Back to text

783. See FIPS 46-2, supra note 106, at 69,347. NIST suggested, however, that it may not recertify DES when its current five-year certification expires. See id. at 69,350 (noting that in 1998, when the standard will be over 20 years old, NIST will consider alternatives that offer a higher level of security). Back to text

784. For the original papers describing public-key cryptography, see Whitfield Diffie & Martin E. Hellman, New Directions in Cryptography, IT-22 IEEE Transactions Info. Theory 644 (1976), and Ralph C. Merkle, Secure Communication over Insecure Channels, Comm. ACM, Apr. 1978, at 294. For further information about public-key cryptography, see generally Schneier, supra note 12, at 29. More concentrated descriptions can be found in Bach et al., supra note 770, § 6; RSA Cryptography Today FAQ, supra note 129; and in Whitfield Diffie, The First Ten Years of Public-Key Cryptography, 76 Proc. IEEE 560 (1988) (discussing the history of public-key cryptography). Back to text

785. This proved to be a real problem during World War II, resulting in the capture of several spy rings. See Kahn, supra note 6, at 530. Back to text

786. Bach et al., supra note 770, § 6.2. Despite the prediction of ubiquity, the fact remains that to date nongovernmental commercial uses of public-key cryptography have been very limited. "It is easy to build a case for buying cryptography futures. . . . Nonetheless, cryptography remains a niche market in which (with the exception of [sales to the government]) a handful of companies gross only a few tens of millions of dollars annually." ACM Report, supra note 15, at 12. Back to text

787. The ASCII version of the author's public key for his 1024-bit key in Pretty Good Privacy is:

Version: 2.6.2
Back to text

788. See Schneier, supra note 12, at 284-85 (stating that security of RSA evaporates if someone discovers a rapid means of factoring large numbers); id. at 318-20 (explaining that security of certain other public-key algorithms depends on the continuing inability of mathematicians to solve the long-standing problem of calculating discrete logarithms). Back to text

789. Clearly, the security of the system evaporates if the private key is compromised, that is, transmitted to anyone. Back to text

790. For example, RSA, one of the leading public-key programs, is at least 100 times slower than DES in software implementations, and up to 10,000 times slower in hardware. See RSA Cryptography Today FAQ, supra note 129, § 2.3. Back to text

791. Another popular single-key cipher, which is not hampered by a 56-bit limit on key length, is called IDEA. See Schneier, supra note 12, at 260-66. Back to text

792. If Bob does not have a public key on record somewhere that Alice considers reliable, then Bob needs to authenticate it in a manner that protects against a "man in the middle" attack when he sends his public key to Alice. In a "man in the middle" attack, a third party intercepts Bob's first message. The man in the middle substitutes his public key for Bob's. Now Alice thinks she has Bob's key, and sends messages that are easily decrypted by the third party. The third party then reencrypts them with Bob's public key and sends them on to Bob who may never know the difference. A "man in the middle" attack will be prevented if Bob signs his public key with a digital signature that Alice can recognize. But this requires either that Bob register his public key for digital signatures somewhere trustworthy or that he find someone whose digital signature Alice already knows and who can affix a digital signature to Bob's transmission of his public key, thus attesting to the fact that it really comes from Bob. Back to text
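The attack can be simulated in a short Python sketch. The "encryption" below is a stand-in XOR chosen only to show the message flow (it is symmetric, not public-key cryptography), and the key values are hypothetical.

```python
def encrypt(msg: str, key: int) -> list[int]:
    return [ord(c) ^ key for c in msg]

def decrypt(blob: list[int], key: int) -> str:
    return "".join(chr(b ^ key) for b in blob)

bob_key, mallory_key = 42, 99  # hypothetical key values

# The man in the middle intercepts Bob's key in transit and substitutes
# his own, so the key Alice receives is not Bob's.
key_alice_receives = mallory_key

# Alice encrypts for "Bob": in fact for the interceptor, who reads it...
blob = encrypt("meet at noon", key_alice_receives)
plaintext_seen_by_interceptor = decrypt(blob, mallory_key)

# ...then re-encrypts it under Bob's real key and forwards it.
forwarded = encrypt(plaintext_seen_by_interceptor, bob_key)

# Bob decrypts successfully and never notices the interception.
print(decrypt(forwarded, bob_key))
```

A digital signature on Bob's key defeats the attack because the interceptor cannot forge the signature when substituting his own key.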

793. See ACM Report, supra note 15, at 8 (noting that the logarithmic computation would "demand more than 2^100 (or approximately 10^30) operations" and that "today's supercomputers . . . would take a billion billion years to perform this many operations"); Schneier, supra note 12, at 275-77 (discussing Diffie-Hellman); see also Diffie & Hellman, supra note 784, at 644 (noting that in "a public key cryptosystem, enciphering and deciphering are governed by distinct keys, E and D, such that computing D from E is computationally infeasible" (emphasis omitted)). Back to text
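A toy Diffie-Hellman exchange with deliberately tiny numbers shows the structure (real systems use moduli hundreds of digits long, which is what makes the discrete-logarithm computation infeasible; the values below are illustrative):

```python
p, g = 23, 5      # small public modulus and generator (illustration only)
a, b = 6, 15      # Alice's and Bob's private exponents

A = pow(g, a, p)  # Alice publishes g^a mod p
B = pow(g, b, p)  # Bob publishes g^b mod p

# Each side combines its own secret with the other's public value;
# both arrive at the same shared key, which was never transmitted.
shared_by_alice = pow(B, a, p)
shared_by_bob = pow(A, b, p)
print(shared_by_alice, shared_by_bob)  # the two values are equal
```

An eavesdropper who sees p, g, A, and B must recover a or b (a discrete logarithm) to compute the shared key; at realistic key sizes that computation is the one the note describes as infeasible.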

794. See OTA Information Security, supra note 97, at 55-56. For a thorough survey of the legal and policy issues involved in setting up and running a certification authority, see generally Michael Baum, National Institute of Standards and Technology, Federal Certification Authority Liability and Policy: Law and Policy of Certificate-Based Public Key and Digital Signatures (1994). Back to text

795. See PGP(TM) User's Guide, supra note 73. Back to text

796. See Garfinkel, supra note 73, at 235-36. Back to text

797. This is not irrefutable proof because a third party could obtain the key from the authorized user by stealth, purchase, accident, or torture (also known as "rubber hose cryptanalysis"). Back to text

798. Consider the following example: To sign a message, Alice does a computation involving both her private key and the message itself; the output is called the digital signature and is attached to the message, which is then sent. Bob, to verify the signature, does some computation involving the message, the purported signature, and Alice's public key. If the results properly hold in a simple mathematical relation, the signature is verified as genuine; otherwise, the signature may be fraudulent or the message altered, and they are discarded. See RSA Cryptography Today FAQ, supra note 129, § 2.13. Back to text
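In RSA, the "simple mathematical relation" is modular exponentiation. The sketch below uses deliberately small primes (real keys are hundreds of digits long; all values are illustrative):

```python
p, q = 61, 53
n = p * q           # public modulus, 3233
e = 17              # public exponent
d = 2753            # private exponent: (e * d) % lcm(p - 1, q - 1) == 1

digest = 1234       # stand-in for a hash of Alice's message

signature = pow(digest, d, n)   # Alice signs with her private key
check = pow(signature, e, n)    # Bob verifies with Alice's public key
print(check == digest)          # True: the signature is genuine
```

If the message or signature were altered in transit, the verification exponentiation would not reproduce the digest and the check would fail.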

799. See Schneier, supra note 12, at 35 (noting that a digital signature using a 160-bit checksum has only a one in 2^160 chance of misidentification). Back to text

800. See, e.g., Sherry L. Harowitz, Building Security into Cyberspace, Security Mgmt., June 1994, available in LEXIS, News Library, Curnws File (noting that the U.S. government "wants to discourage" the use of digital signatures other than the Clipper Chip for encryption purposes); Robert L. Hotz, Sign on the Electronic Dotted Line, L.A. Times, Oct. 19, 1993, at A1 ("Federal officials refused to adopt the earlier technique, called RSA, as a national standard because they were concerned that it could be used to conceal clandestine messages that could not be detected by law enforcement or national security agencies."). Back to text

801. See A Proposed Federal Information Processing Standard for Digital Signature Standard (DSS), 56 Fed. Reg. 42,980, 42,980 (1991). The DSS was to be "applicable to all federal departments and agencies for the protection of unclassified information," and was "intended for use in electronic mail, electronic funds transfer, electronic data interchange, software distribution, data storage, and other applications which require data integrity assurance and data origin authentication." Id. at 42,981. Back to text

802. See Kevin Power, Use DSS with No Fear of Patent Liability, NIST Says, Gov't Computer News, Oct. 17, 1994, at 64 (noting overwhelming opposition to NIST's proposal to give a patent group an exclusive license to the Digital Signature Algorithm). Back to text

803. See Approval of Federal Information Processing Standards Publication 186, Digital Signature Standard (DSS), 59 Fed. Reg. 26,208, 26,209 (1994). Back to text

804. Id. Back to text

805. A subliminal message is invisible to the user because it is encrypted and mixed in with the garbage-like stream of characters that constitutes the signature. Back to text

806. Schneier, supra note 12, at 313; see also id. at 390-92 (explaining how the subliminal channel works). The DSA is the algorithm used in the DSS. Back to text

807. Simmons, supra note 56, at 218. Back to text

808. See id. Back to text

809. See Capstone Chip Technology, supra note 16 (The Capstone Chip "implements the same cryptographic algorithm as the CLIPPER chip. In addition, the CAPSTONE Chip includes . . . the Digital Signature Algorithm (DSA) proposed by NIST . . . ."). Back to text

810. See ACM Report, supra note 15, at 3 (noting that "information that is authenticated and integrity-checked is not necessarily confidential; that is, confidentiality can be separated from integrity and authenticity"). Back to text
