
The Identity Layer is infrastructure for the Internet. It offers value (and raises concerns) to many beyond Microsoft. But though Microsoft’s work is an important gift to the Internet, the Identity Layer is not altruism. “Microsoft’s strategy is based on web services,” Cameron explained to me. “Web services are impossible without identity.[9]” There is important public value here, but private interest is driving its deployment.

The Identity Layer would benefit individuals, businesses, and the government, but each differently. Individuals could more easily protect themselves from identity theft[10]; if you get an e-mail from PayPal demanding you update your account, you’ll know whether the site asking for the update is actually PayPal. Or if you want to protect yourself against spam, you could block all e-mail that doesn’t come from an authenticated server. In either case, the technology increases confidence in the Internet. And the harms that come from a lack of confidence — mainly fraud — would therefore be reduced.

Commerce too would benefit from this technology, both from the reduction in fraud and from a more secure infrastructure for conducting online transactions.

And finally, the government would benefit from this infrastructure of trust. If there were a simple way to demand that people authenticate facts about themselves, it would be easier for the government to insist that they do so. If it were easier to have high confidence that the person on the website was who he said he was, then it would be cheaper to deliver certain information across the web.

But while individuals, commerce, and government would all benefit from this sort of technology, there is also something that each could lose.

Individuals right now can be effectively anonymous on the Net. A platform for authenticated identity would make anonymity much harder. We might imagine, for example, a norm developing to block access to a website by anyone not carrying a token that at least made it possible to trace back to the user — a kind of driver’s license for the Internet. That norm, plus this technology, would make anonymous speech extremely difficult.

Commerce could also lose something from this design. To the extent that there are simple ways to authenticate that I am the authorized user of this credit card, for example, it’s less necessary for websites to demand all sorts of data about me — my address, my telephone numbers, and in one case I recently encountered, my birthday. That fact could build a norm against revealing extraneous data. But that data may be valuable to business beyond simply confirming a charge.

And governments, too, may lose something from this architecture of identification. Just as commerce would lose the extra data that individuals now must reveal to authenticate themselves, so too would the government lose access to that data. It may feel that such data is necessary for some other purpose, but gathering it would become more difficult.

Each of these benefits and costs can be adjusted, depending upon how the technology is implemented. And because the resulting mix of privacy and security will be the product of competition and equilibrium between individuals and businesses, there’s no way to predict in advance what it will be.

But for our purposes, the only important fact to notice is that this infrastructure could effectively answer the first question that regulability requires answering: Who did what where? With an infrastructure enabling cheap identification wherever you are, the frequency of unidentified activity falls dramatically.

This final example of an identification technology throws into relief an important fact about encryption technology. The Identity Layer depends upon cryptography. It thus demonstrates the sense in which cryptography is Janus-faced. As Stewart Baker and Paul Hurst put it, cryptography “surely is the best of technologies and the worst of technologies. It will stop crimes and it will create new crimes. It will undermine dictatorships, and it will drive them to new excesses. It will make us all anonymous, and it will track our every transaction.[11]”

Cryptography can be all these things, both good and bad, because encryption can serve two fundamentally different ends. In its “confidentiality” function it can be “used to keep communications secret.” In its “identification” function it can be “used to provide forgery-proof digital identities.[12]” It enables freedom from regulation (as it enhances confidentiality), but it can also enable more efficient regulation (as it enhances identification).[13]
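To make the two functions concrete, here is a minimal sketch in Python using the “cryptography” package (the library and the example values are my own illustrative choices, not anything the text prescribes): symmetric encryption supplies confidentiality, while a digital signature supplies identification.

from cryptography.exceptions import InvalidSignature
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Confidentiality: only holders of the secret key can read the message.
secret_key = Fernet.generate_key()
box = Fernet(secret_key)
ciphertext = box.encrypt(b"meet me at noon")
assert box.decrypt(ciphertext) == b"meet me at noon"

# Identification: a signature ties the message to a key holder and
# fails if the message is altered, a forgery-proof digital identity.
signing_key = Ed25519PrivateKey.generate()
verifying_key = signing_key.public_key()
signature = signing_key.sign(b"this message really is from me")

verifying_key.verify(signature, b"this message really is from me")  # passes silently
try:
    verifying_key.verify(signature, b"this message is a forgery")
except InvalidSignature:
    print("forgery detected")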

Its traditional use is keeping secrets. Encrypt a message, and only those with the proper key can open and read it. This type of encryption has been around as long as language itself. But until the mid-1970s it suffered from an important weakness: the same key that was used to encrypt a message was also used to decrypt it. So if that key fell into the wrong hands, every message hidden with it was exposed; if a large number of messages had been encrypted with the same key, losing the key compromised the whole archive of secrets it protected. This risk was significant: you always had to “transport” the key needed to unlock the message, and inherent in that transport was the risk that the key would be lost.
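A short sketch of that single-key weakness, again in Python with the “cryptography” package (an illustrative choice): one shared key both locks and unlocks, so the key itself must travel to the recipient, and whoever obtains it in transit can read everything it ever protected.

from cryptography.fernet import Fernet

# One shared key both encrypts and decrypts.
shared_key = Fernet.generate_key()
cipher = Fernet(shared_key)

# An archive of messages, all protected by that single key.
archive = [cipher.encrypt(m) for m in (b"message 1", b"message 2", b"message 3")]

# The key must somehow be transported to the recipient. Anyone who
# intercepts it along the way can read the entire archive.
interceptor = Fernet(shared_key)
print([interceptor.decrypt(c) for c in archive])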

In the mid-1970s, however, a breakthrough in encryption technique was announced by two computer scientists, Whitfield Diffie and Martin Hellman[14]. Rather than relying on a single key, the Diffie-Hellman system used two keys — one public, the other private. What is encrypted with one can be decrypted only with the other, and knowing one key gives no practical way to derive the other.
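The two-key idea can be sketched in Python with the “cryptography” package; RSA is used here purely as a stand-in for the public-key principle the paragraph describes (the Diffie-Hellman paper itself proposed key agreement rather than direct encryption), and every name in the sketch is illustrative.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair: the private key stays with me, the public key
# can be published anywhere.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone may encrypt with the public key...
ciphertext = public_key.encrypt(b"for your eyes only", oaep)

# ...but only the holder of the matching private key can decrypt,
# and knowing the public key does not reveal the private one.
assert private_key.decrypt(ciphertext, oaep) == b"for your eyes only"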

This discovery was the clue to an architecture that could build an extraordinary range of confidence into any network, whether or not the physical network itself was secure[15]. And that confidence cuts both ways: it can assure me that my secrets won’t be revealed, and it can assure me that the person using my site just now is you. The technology thus works both to keep secrets and to make secrets harder to keep; it works to make behavior less regulable, and more regulable.

In the Internet’s first life, encryption technology was on the side of privacy; its most common use was to keep information secret. But in the Internet’s next life, encryption technology’s most important role will be to make the Net more regulable. As an Identity Layer gets built into the Net, the ability to cheaply demand some form of identity as a condition of access to the Net’s resources increases. And as that ability increases, so will its prevalence. Indeed, as Shawn Helms describes, the next generation of the Internet Protocol — IPv6 — “marks each packet with an encryption ‘key’ that cannot be altered or forged, thus securely identifying the packet’s origin. This authentication function can identify every sender and receiver of information over the Internet, thus making it nearly impossible for people to remain anonymous on the Internet.[16]”


10.

A number of states have now passed legislation dealing with ID theft. A current listing follows: Alabama, Ala. Code § 13A-8-190 through 201; Alaska, Alaska Stat. § 11.46.565; Arizona, Ariz. Rev. Stat. § 13-2008; Arkansas, Ark. Code Ann. § 5-37-227; California, Cal. Penal Code § 530.5-8; Connecticut, Conn. Stat. § 53a-129a and § 52-571h; Delaware, Del. Code Ann. tit. 11, § 854; District of Columbia, Title 22, Section 3227; Florida, Fla. Stat. Ann. § 817.568; Georgia, Ga. Code Ann. § 16-9-120 through 128; Guam, 9 Guam Code Ann. § 46.80; Hawaii, Haw. Rev. Stat. § 708-839.6-8; Idaho, Idaho Code § 18-3126; Illinois, 720 Ill. Comp. Stat. 5/16G; Indiana, Ind. Code § 35-43-5-3.5; Iowa, Iowa Code § 715A.8; Kansas, Kan. Stat. Ann. § 21-4018; Kentucky, Ky. Rev. Stat. Ann. § 514.160; Louisiana, La. Rev. Stat. Ann. § 14:67.16; Maine, Me. Rev. Stat. Ann. tit. 17-A, § 905-A; Maryland, Md. Code Ann. art. 27, § 231; Massachusetts, Mass. Gen. Laws ch. 266, § 37E; Michigan, Mich. Comp. Laws § 750.285; Minnesota, Minn. Stat. Ann. § 609.527; Mississippi, Miss. Code Ann. § 97-19-85; Missouri, Mo. Rev. Stat. § 570.223; Montana, Mont. Code Ann. § 45-6-332; Nebraska, Neb. Rev. Stat. § 28-608 and 620; Nevada, Nev. Rev. Stat. § 205.463-465; New Hampshire, N.H. Rev. Stat. Ann. § 638:26; New Jersey, N.J. Stat. Ann. § 2C:21-17; New Mexico, N.M. Stat. Ann. § 30-16-24.1; New York, N.Y. Penal Law § 190.77-190.84; North Carolina, N.C. Gen. Stat. § 14-113.20-23; North Dakota, N.D. Cent. Code § 12.1-23-11; Ohio, Ohio Rev. Code Ann. § 2913.49; Oklahoma, Okla. Stat. tit. 21, § 1533.1; Oregon, Or. Rev. Stat. § 165.800; Pennsylvania, 18 Pa. Cons. Stat. § 4120; Rhode Island, R.I. Gen. Laws § 11-49.1-1; South Carolina, S.C. Code Ann. § 16-13-510; South Dakota, S.D. Codified Laws § 22-30A-3.1; Tennessee, Tenn. Code Ann. § 39-14-150 and § 47-18-2101; Texas, Tex. Penal Code § 32.51; Utah, Utah Code Ann. § 76-6-1101 through 1104; Virginia, Va. Code Ann. § 18.2-186.3; Washington, Wash. Rev. Code § 9.35.020; West Virginia, W. Va. Code § 61-3-54; Wisconsin, Wis. Stat. § 943.201; Wyoming, Wyo. Stat. Ann. § 6-3-901.


11.

Stewart A. Baker and Paul R. Hurst, The Limits of Trust: Cryptography, Governments, and Electronic Commerce (Boston: Kluwer Law International, 1998), xv.


13.

See Hal Abelson et al., "The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption," World Wide Web Journal 2 (1997): 241, 245: "Although cryptography has traditionally been associated with confidentiality, other cryptographic mechanisms, such as authentication codes and digital signatures, can assure that messages have not been tampered with or forged."


14.

Whitfield Diffie and Martin E. Hellman, "New Directions in Cryptography," IEEE Transactions on Information Theory IT-22 (November 1976): 29–40. The idea had apparently been discovered earlier by James Ellis at the British Government Communications Headquarters, but it was not then published; see Baker and Hurst, The Limits of Trust, xvii–xviii.


15.

Even if the wires are tapped, this type of encryption still achieves its magic. We can get a hint of how in a series of cases whose accumulating impact makes the potential clear.

A. If I want to send a message to you that I know only you will be able to read, I can take your public key and use it to encrypt that message. Then I can send that message to you knowing that only the holder of the private key (presumably you) will be able to read it. Advantage: My message to you is secure. Disadvantage: You can't be sure it is I who sent you the message. Because anyone can encrypt a message using your public key and then send it to you, you have no way to be certain that I was the one who sent it. Therefore, consider the next example.

B. Before I send the message I have encrypted with your public key, I can encrypt it with my private key. Then when you receive the message from me, you can first decrypt it with my public key, and then decrypt it again with your private key. After the first decryption, you can be sure that I (or the holder of my private key) was the one who sent you the message; after the second decryption, you can be sure that only you (or other holders of your private key) actually read the content of the message. But how do you know that what I say is the public key of Larry Lessig is actually the public key of Larry Lessig? How can you be sure, that is, that the public key you are using is actually the public key it purports to be? Here is where the next example comes in.

C. If there is a trustworthy third party (say, my bank, or the Federal Reserve Board, or the ACLU) with a public key (a fact I am able to verify because of the prominence of the institution), and that third party verifies that the public key of Larry Lessig is actually the public key of Larry Lessig, then along with my message sent to you, encrypted first in your public key and second in my private key, would be a certificate, issued by that institution, itself encrypted with the institution's private key. When you receive the message, you can use the institution's public key to decrypt the certificate; take from the certificate my public key (which you are now fairly confident is my public key); decrypt the message I sent you with the key held in the certificate (which you can then be fairly confident came from me); and then decrypt the message encrypted with your public key (which you can be fairly confident no one else has read).

If we did all that, you would know that I am who I say I am and that the message was sent by me; I would know that only you read the message; and you would know that no one else read the message along the way.
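These three cases translate fairly directly into code. The sketch below uses the Python "cryptography" package with modern primitives (a dedicated signature algorithm rather than literally "encrypting with the private key"), and the names alice, bob, and authority are purely illustrative: the sender signs for identity, encrypts for secrecy, and a trusted third party's signature over the sender's public key plays the role of the certificate.

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Key pairs: Bob's for receiving secrets, Alice's for signing, and a
# trusted authority's for certifying Alice's public key (case C).
bob_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
alice_signing = Ed25519PrivateKey.generate()
authority = Ed25519PrivateKey.generate()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Case C: the authority vouches for Alice's public key by signing it.
alice_public_bytes = alice_signing.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)
certificate = authority.sign(alice_public_bytes)

# Cases A and B: Alice signs the message (identity), then encrypts it
# with Bob's public key (secrecy).
message = b"only Bob can read this, and only Alice could have sent it"
signature = alice_signing.sign(message)
ciphertext = bob_private.public_key().encrypt(message, oaep)

# Bob's side: verify the certificate to learn Alice's key, decrypt
# with his own private key, then check Alice's signature.
authority.public_key().verify(certificate, alice_public_bytes)
alices_key = Ed25519PublicKey.from_public_bytes(alice_public_bytes)
plaintext = bob_private.decrypt(ciphertext, oaep)
alices_key.verify(signature, plaintext)
print(plaintext.decode())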


16.

Shawn C. Helms, "Translating Privacy Values with Technology," Boston University Journal of Science and Technology Law 7 (2001): 288, 299.