Encryption works, I explained, by way of algorithms. It’s a mathematical method of transforming information—such as your emails, phone calls, photos, videos, and files—in such a way that it becomes incomprehensible to anyone who doesn’t have a copy of the encryption key. And it’s reversible. You can think of a modern encryption algorithm as a magic wand that you can wave over a document to change each letter into a language that only you and those you trust can read. The encryption key is the magic spell that puts the wand to work. It doesn’t matter how many people know that you used the wand, so long as you can keep your personal magic spell from anyone you don’t trust.
Encryption algorithms are basically just sets of math problems designed to be incredibly difficult even for computers to solve. If all of our data, including our communications, were encrypted, then no government would be able to understand them. It could still intercept and collect the signals, but it would be intercepting and collecting pure noise. Encrypting our communications would essentially delete them from the memories of every entity we deal with. It would effectively withdraw permission from those to whom it was never granted to begin with.
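The claim that intercepted ciphertext is "pure noise" can be illustrated with the simplest reversible cipher there is, a one-time pad: XOR the message with a random key of the same length. This is a toy sketch, not a modern algorithm like AES, but it shows the property exactly — without the key, the intercepted bytes are indistinguishable from random static; with the key, the very same operation recovers the message.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    # Applying the same key twice undoes the transformation.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))   # random key, same length as message

ciphertext = xor_bytes(message, key)      # what an eavesdropper sees: noise
recovered = xor_bytes(ciphertext, key)    # the same wave of the wand reverses it

assert recovered == message
```

An interceptor who collects `ciphertext` without `key` learns nothing but the message's length; every possible plaintext of that length is equally consistent with what was captured.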
Any government hoping to access encrypted communications has only two options: It can either go after the keymasters or go after the keys. The best means we have for keeping our keys safe is called “zero knowledge,” a method that ensures that any data you try to store externally—say, for instance, on a company’s cloud platform—is encrypted by an algorithm running on your device before it is uploaded. With zero knowledge, the key is never shared and remains in the users’ hands—and only in the users’ hands. No company, no agency, no enemy can touch it.
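The zero-knowledge pattern can be sketched in a few lines. This is an illustration only, assuming a toy hash-based stream cipher in place of a real one (production systems use vetted ciphers such as AES-GCM): the point is simply that encryption happens on the device, only ciphertext goes to the cloud, and the key never leaves the user's hands.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream built by hashing key + nonce + a block counter.
    # Illustration only; real systems use an established cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_before_upload(plaintext: bytes, key: bytes) -> bytes:
    # Runs on the user's device, BEFORE anything touches the network.
    nonce = secrets.token_bytes(16)
    stream = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    # Also runs only on the device; the provider never sees the key.
    nonce, ciphertext = blob[:16], blob[16:]
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

key = secrets.token_bytes(32)             # stays on the device, always
uploaded = encrypt_before_upload(b"my private files", key)
assert decrypt_after_download(uploaded, key) == b"my private files"
```

The cloud provider stores `uploaded` and nothing else; a subpoena served on the company can yield only ciphertext, because the company was never given the means to read it.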
My key to the NSA’s secrets went beyond zero knowledge: It was a zero-knowledge key consisting of multiple zero-knowledge keys.
My keys to the drive were hidden everywhere. But I retained one for myself. And if I destroyed that single piece that I kept on my person, I would destroy all access to the NSA’s secrets forever.
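One way a key can consist of multiple keys is secret sharing. A minimal XOR-based sketch (hypothetical, not the actual scheme described here): the key is split into shares that must all be recombined to reconstruct it, so destroying any single share destroys access permanently.

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    # Split a key into n shares. ALL n shares are required to rebuild it;
    # any subset of fewer shares reveals nothing about the key.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine_shares(shares: list[bytes]) -> bytes:
    # XOR every share together to recover the original key.
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = secrets.token_bytes(32)
shares = split_key(key, 5)                # hide the shares in five places
assert combine_shares(shares) == key      # all five together recover the key
# Destroy any one share, and no computation can recover the key from the rest.
```

Because each share is random on its own, whoever holds the one destructible share holds an effective kill switch over the whole archive.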
TWENTY-THREE
The Boy
I was more curious than ever about the one fact I was still finding elusive: the absolute limit of who the IC could turn its gaze against.
The only way to discover the answer was to narrow my vision to that of the NSA employees with the freest access to the rawest forms of intelligence. They could type into their computers the names of individuals who’d fallen under the agency’s suspicion, foreigners and US citizens alike. The NSA was interested in finding out everything about these individuals and their communications.
The program that enabled this access was called XKEYSCORE, which is perhaps best understood as a search engine. Imagine a kind of Google that instead of showing pages from the public internet returns results from your private email, your private chats, your private files, everything. Though I’d read enough about the program to understand how it worked, I hadn’t yet used it. But I was looking for a personal confirmation of the depths of the NSA’s surveillance intrusions—the kind of confirmation you don’t get from documents but only from direct experience.
One of the few offices in Hawaii with truly unfettered access to XKEYSCORE was the National Threat Operations Center. As luck would have it, NTOC had an opening, through a contractor, for a position as an infrastructure analyst. The role involved using the complete spectrum of the NSA’s mass surveillance tools, including XKEYSCORE.
I’d decided to bring my archives out of the country and pass them to the journalists I’d contacted, but before I could even begin to contemplate the logistics of that act I had to go shake some hands. I had to fly east to DC and spend a few weeks meeting and greeting my new bosses and colleagues. This was what brought me back home to the Beltway for the very last time, and back to Fort Meade.
The NSA described XKEYSCORE, in the documents I’d later pass on to journalists, as its “widest-ranging” tool, used to search “nearly everything a user does on the internet.” It was, simply put, the closest thing to science fiction I’ve ever seen in science fact: an interface that allows you to type in pretty much anyone’s address, telephone number, or IP address, and then basically go through the recent history of their online activity. In some cases you could even play back recordings of their online sessions, so that the screen you’d be looking at was their screen, whatever was on their desktop. You could read their emails, their browser history, their search history, their social media postings, everything. You could set up notifications that would pop up when some person or some device you were interested in became active on the internet for the day.
My weeks at Fort Meade, and the short stint I put in at my new job back in Hawaii, were the only times I saw, firsthand, the abuses actually being committed. I didn’t type the names of the agency director or the president into XKEYSCORE, but after enough time with the system I realized I could have. Everyone’s communications were in there—everyone’s. I was initially fearful that if I searched those in the uppermost echelons of state, I’d be caught and fired, or worse.
But it was simple to disguise a query by encoding my search terms. If any of the auditors who were responsible for reviewing the searches ever bothered to look more closely, they would see only a snippet of obfuscated code, while I would be able to scroll through the most personal activities of a Supreme Court justice or a congressperson.
One thing you come to understand very quickly while using XKEYSCORE is that nearly everyone in the world who’s online stores photos and videos of their family. This was true for virtually everyone of every gender, ethnicity, race, and age—from the meanest terrorist to the nicest senior citizen, who might be the meanest terrorist’s grandparent or parent or cousin.
It’s the family stuff that got to me the most. I remember this one child in particular, a little boy in Indonesia. Technically, I shouldn’t have been interested in this little boy, but I was, because my employers were interested in his father.
The boy’s father, like my own father, was an engineer—but unlike my father, this guy wasn’t government or military affiliated. He was just a regular academic who’d been caught up in a surveillance dragnet. I can’t even remember how or why he’d come to the agency’s attention, beyond sending a job application to a research university in Iran. The grounds for suspicion were often poorly documented, if they were documented at all, and the connections could be incredibly tenuous—“believed to be potentially associated with.”
Selections from the man’s communications had been assembled into folders—here was the fatal copy of the résumé sent to the suspect university; here were his texts; here was his Web browser history; here was the last week or so of his correspondence both sent and received, tagged to IP addresses. Here were the coordinates of a “geo-fence” the analyst had placed around him to track whether he strayed too far from home, or perhaps traveled to the university for his interview.
Then there were his pictures, and a video. He was sitting in front of his computer, as I was sitting in front of mine. Except that in his lap he had a toddler, a boy in a diaper.
The father was trying to read something, but the kid kept shifting around, smacking the keys and giggling. The computer’s internal mic picked up his giggling and there I was, listening to it on my headphones. The father held the boy tighter, and the boy straightened up and, with his dark crescent eyes, looked directly into the computer’s camera—I couldn’t escape the feeling that he was looking directly at me. Suddenly I realized that I’d been holding my breath. I shut the session, got up from the computer, and left the office for the bathroom in the hall, head down, headphones still on with the cord trailing.