It’s not just the Internet’s infrastructure that I’m defining as fundamentally American—it’s the computer software (Microsoft, Google, Oracle) and hardware (HP, Apple, Dell), too. It’s everything from the chips (Intel, Qualcomm), to the routers and modems (Cisco, Juniper), to the Web services and platforms that provide email and social networking and cloud storage (Google, Facebook, and the most structurally important but invisible Amazon, which provides cloud services to the US government along with half the Internet). Though some of these companies might manufacture their devices in, say, China, the companies themselves are American and are subject to American law. The problem is, they’re also subject to classified American policies that pervert law and permit the US government to surveil virtually every man, woman, and child who has ever touched a computer or picked up a phone.

Given the American nature of the planet’s communications infrastructure, it should have been obvious that the US government would engage in this type of mass surveillance. It should have been especially obvious to me. Yet it wasn’t—mostly because the government kept insisting that it did nothing of the sort, and generally disclaimed the practice in courts and in the media in a manner so adamant that the few remaining skeptics who accused it of lying were treated like wild-haired conspiracy junkies. Their suspicions about secret NSA programs seemed hardly different from paranoid delusions involving alien messages being beamed to the radios in our teeth. We—me, you, all of us—were too trusting. But what makes this all the more personally painful for me is that the last time I’d made this mistake, I’d supported the invasion of Iraq and joined the army. When I arrived in the IC, I felt sure that I’d never be fooled again, especially given my top secret clearance. Surely that had to count for some degree of transparency. After all, why would the government keep secrets from its secret keepers? This is all to say that the obvious didn’t even become the thinkable for me until some time after I moved to Japan in 2009 to work for the NSA, America’s premier signals intelligence agency.

It was a dream job, not only because it was with the most advanced intelligence agency on the planet, but also because it was based in Japan, a place that had always fascinated Lindsay and me. It felt like a country from the future. Though mine was officially a contractor position, its responsibilities and, especially, its location were more than enough to lure me. It’s ironic that only by going private again was I put in a position to understand what my government was doing.

On paper, I was an employee of Perot Systems, a company started by that diminutive, hyperactive Texan who founded the Reform Party and twice ran for the presidency. But almost immediately after my arrival in Japan, Perot Systems was acquired by Dell, so on paper I became an employee of Dell. As at the CIA, this contractor status was just a formality and a cover, and I only ever worked in an NSA facility.

The NSA’s Pacific Technical Center (PTC) occupied one-half of a building inside the enormous Yokota Air Base. As the headquarters of US Forces Japan, the base was surrounded by high walls, steel gates, and guarded checkpoints. Yokota and the PTC were just a short bike ride from where Lindsay and I got an apartment in Fussa, a city at the western edge of Tokyo’s vast metropolitan spread.

The PTC handled the NSA’s infrastructure for the entire Pacific and provided support for the agency’s spoke sites in nearby countries. Most of these sites were devoted to managing the secret relationships with regional governments that let the NSA blanket the Pacific Rim with spy gear, so long as the agency promised to share some of the intelligence it gleaned with those governments, and so long as their citizens never found out what the agency was doing. Communications interception was the major part of the mission. The PTC would amass “cuts” from captured signals and push them across the ocean to Hawaii, which in turn would push them on to the continental United States.

My official job title was systems analyst, with responsibility for maintaining the local NSA systems, though much of my initial work was that of a systems administrator, helping to connect the NSA’s systems architecture with the CIA’s. Because I was the only one in the region who knew the CIA’s architecture, I’d also travel out to US embassies, like the one I’d left in Geneva, establishing and maintaining the links that enabled the agencies to share intelligence in ways that hadn’t previously been possible. This was the first time in my life that I truly realized the power of being the only one in a room with a sense not just of how one system functioned internally, but of how it functioned together with multiple systems—or didn’t. Later, as the chiefs of the PTC came to recognize that I had a knack for hacking together solutions to their problems, I was given enough of a leash to propose projects of my own.

Two things about the NSA stunned me right off the bat: how technologically sophisticated it was compared with the CIA, and how much less vigilant it was about security in its every iteration, from the compartmentalization of information to data encryption. In Geneva, we’d had to haul the hard drives out of the computer every night and lock them up in a safe—and what’s more, those drives were encrypted. The NSA, by contrast, hardly bothered to encrypt anything.

In fact, it was rather disconcerting to find out that the NSA was so far ahead of the game in terms of cyberintelligence yet so far behind it in terms of cybersecurity, including the most basic: disaster recovery, or backup. Each of the NSA’s spoke sites collected its own intel, stored the intel on its own local servers, and, because of bandwidth restrictions—limitations on the amount of data that could be transmitted at speed—often didn’t send copies back to the main servers at NSA headquarters. This meant that if any data were destroyed at a particular site, the intelligence that the agency had worked hard to collect could be lost.

My chiefs at the PTC understood the risks the agency was taking by not keeping copies of many of its files, so they tasked me with engineering a solution and pitching it to the decision makers at headquarters. The result was a backup and storage system that would act as a shadow NSA: a complete, automated, and constantly updating copy of all of the agency’s most important material, which would allow the agency to reboot and be up and running again, with all its archives intact, even if Fort Meade were reduced to smoldering rubble.

The major problem with creating a global disaster-recovery system—or really with creating any type of backup system that involves a truly staggering number of computers—is dealing with duplicated data. In plain terms, you have to handle situations in which, say, one thousand computers all have copies of the same single file: you have to make sure you’re not backing up that same file one thousand times, because that would require one thousand times the amount of bandwidth and storage space. It was this wasteful duplication, in particular, that was preventing the agency’s spoke sites from transmitting daily backups of their records to Fort Meade: the connection would be clogged with a thousand copies of the same file containing the same intercepted phone call, 999 of which the agency did not need.

The way to avoid this was “deduplication”: a method to evaluate the uniqueness of data. The system that I designed would constantly scan the files at every facility at which the NSA stored records, testing each “block” of data down to the slightest fragment of a file to find out whether or not it was unique. Only if the agency lacked a copy of it back home would the data be automatically queued for transmission—reducing the volume that flowed over the agency’s transpacific fiber-optic connection from a waterfall to a trickle.
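The book describes this system only at a high level, so what follows is a minimal sketch of the general technique of block-level deduplication rather than of the actual design, written in Python. The fixed 4 MB block size, the SHA-256 fingerprints, and the in-memory set standing in for the index of blocks the central archive already holds are all assumptions made for the sake of the example, not details from the text.

    import hashlib

    BLOCK_SIZE = 4 * 1024 * 1024  # assumed fixed block size of 4 MB

    def block_fingerprints(path):
        # Yield a (digest, block) pair for each fixed-size block of a file.
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                yield hashlib.sha256(block).hexdigest(), block

    def blocks_to_transmit(paths, central_index):
        # central_index is a set of digests standing in for the blocks that
        # headquarters already holds; only blocks it lacks are queued.
        queue = []
        for path in paths:
            for digest, block in block_fingerprints(path):
                if digest not in central_index:
                    queue.append((digest, block))
                    central_index.add(digest)  # never queue the same block twice
        return queue

Applied across every spoke site, a scheme like this would let only the first copy of any given block cross the transpacific link; the other 999 copies of the same intercepted phone call would be recognized by their fingerprints and skipped.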