When the authorities—in their unlimited ignorance—failed to notice, I was mad with power, galloping laps around the living room. I, the master of time, would never again be sent to bed. I was free. And so it was that I fell asleep on the floor, having finally seen the sunset on June 21, the summer solstice, the longest day of the year. When I awoke, the clocks in the house once again matched my father’s watch.
IF ANYBODY BOTHERED to set a watch today, how would they know what to set it to? If you’re like most people these days, you’d set it to the time on your smartphone. But if you look at your phone, and I mean really look at it, burrowing deep through its menus into its settings, you’ll eventually see that the phone’s time is “automatically set.” Every so often, your phone quietly—silently—asks your service provider’s network, “Hey, do you have the time?” That network, in turn, asks a bigger network, which asks an even bigger network, and so on through a great succession of towers and wires until the request reaches one of the true masters of time, a Network Time Server run by or referenced against the atomic clocks kept at places like the National Institute of Standards and Technology in the United States, the Federal Institute of Metrology in Switzerland, and the National Institute of Information and Communications Technology in Japan. That long invisible journey, accomplished in a fraction of a second, is why you don’t see a blinking 12:00 on your phone’s screen every time you power it up again after its battery runs out.
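If you want to see that exchange for yourself, here is a minimal sketch, in Python, of the request a phone (or any computer) makes under the hood, using the Network Time Protocol I'll come back to in a moment. The server name pool.ntp.org is only an illustrative public time server, not necessarily the one your carrier's network consults.

```python
# A minimal sketch of the "do you have the time?" exchange described above,
# speaking NTP (Network Time Protocol) over UDP to a public time server.
import socket
import struct
import time

NTP_SERVER = "pool.ntp.org"   # assumption: any reachable public NTP server
NTP_PORT = 123
NTP_DELTA = 2208988800        # seconds between the NTP epoch (1900) and the Unix epoch (1970)

def fetch_ntp_time(server: str = NTP_SERVER) -> float:
    """Ask an NTP server for the current time; return it as a Unix timestamp."""
    # 48-byte request; first byte 0x1B = leap indicator 0, version 3, mode 3 (client)
    packet = b"\x1b" + 47 * b"\0"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(packet, (server, NTP_PORT))
        data, _ = sock.recvfrom(512)
    # The server's transmit timestamp (seconds since 1900) sits at bytes 40-43.
    ntp_seconds = struct.unpack("!I", data[40:44])[0]
    return ntp_seconds - NTP_DELTA

if __name__ == "__main__":
    print(time.ctime(fetch_ntp_time()))
```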
I was born in 1983, at the end of the world in which people set the time for themselves. That was the year that the US Department of Defense split its internal system of interconnected computers in half, creating one network for the use of the defense establishment, called MILNET, and another network for the public, called the Internet. Before the year was out, new rules defined the boundaries of this virtual space, giving rise to the Domain Name System that we still use today—the .govs, .mils, .edus, and, of course, .coms—and the country codes assigned to the rest of the world: .uk, .de, .fr, .cn, .ru, and so on. Already, my country (and so I) had an advantage, an edge. And yet it would be another six years before the World Wide Web was invented, and about nine years before my family got a computer with a modem that could connect to it.
Of course, the Internet is not a single entity, although we tend to refer to it as if it were. The technical reality is that there are new networks born every day on the global cluster of interconnected communications networks that you—and about three billion other people, or roughly 42 percent of the world’s population—use regularly. Still, I’m going to use the term in its broadest sense, to mean the universal network of networks connecting the majority of the world’s computers to one another via a set of shared protocols.
Some of you may worry that you don’t know a protocol from a hole in the wall, but all of us have made use of many. Think of protocols as languages for machines, the common rules they follow to be understood by one another. If you’re around my age, you might remember having to type the “http” at the beginning of a website’s address into the address bar of your Web browser. This refers to the Hypertext Transfer Protocol, the language you use to access the World Wide Web, that massive collection of mostly text-based but also audio- and video-capable sites like Google and YouTube and Facebook. Every time you check your email, you use a language like IMAP (Internet Message Access Protocol), SMTP (Simple Mail Transfer Protocol), or POP3 (Post Office Protocol). File transfers pass through the Internet using FTP (File Transfer Protocol). And as for the time-setting procedure on your phone that I mentioned, those updates get fetched through NTP (Network Time Protocol).
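To make "languages for machines" concrete, here is a hedged sketch of what speaking one of them, HTTP, actually looks like on the wire: plain lines of text sent down a connection. The host example.com is just a stand-in; any public Web server that still answers unencrypted HTTP on port 80 would do.

```python
# A sketch of "speaking" an application protocol. HTTP is literally lines of
# text: the client states a request, the server answers with a status line
# and headers before the page itself.
import socket

HOST = "example.com"   # assumption: any public web server will illustrate the point
PORT = 80              # plain, unencrypted HTTP, so the exchange stays readable

request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.sendall(request.encode("ascii"))
    reply = b""
    while chunk := sock.recv(4096):
        reply += chunk

# Print just the status line and headers -- the grammar of the conversation.
print(reply.split(b"\r\n\r\n", 1)[0].decode("ascii", errors="replace"))
```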
All these protocols are known as application protocols, and comprise just one family of protocols among the myriad online. For example, in order for the data in any of these application protocols to cross the Internet and be delivered to your desktop, or laptop, or phone, it first has to be packaged up inside a dedicated transport protocol—think of how the regular snail-mail postal service prefers you to send your letters and parcels in their standard-size envelopes and boxes. TCP (Transmission Control Protocol) is used to carry, among other applications, Web pages and email. UDP (User Datagram Protocol) is used to carry more time-sensitive, real-time applications, such as Internet telephony and live broadcasts.
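The difference between those two envelopes is easiest to see in code. The sketch below, using the local loopback address and a port assumed to have nothing listening on it, shows UDP sending a datagram without waiting for any answer, while TCP insists on negotiating a connection first and complains immediately when no one picks up.

```python
# A sketch of the two transport "envelopes" named above: UDP fire-and-forget
# versus TCP's negotiated, connection-oriented delivery.
import socket

UNUSED_PORT = 9  # assumption: no local service is listening on this port

# UDP: the datagram is sent whether or not anyone is listening, which is why
# it suits live audio and video, where a lost packet is simply skipped.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"hello?", ("127.0.0.1", UNUSED_PORT))
print("UDP: datagram sent; no acknowledgment expected")
udp.close()

# TCP: connect() performs a handshake before any data moves, so talking to a
# port with no listener fails immediately and loudly.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    tcp.connect(("127.0.0.1", UNUSED_PORT))
except ConnectionRefusedError:
    print("TCP: connection refused -- delivery is negotiated, not assumed")
finally:
    tcp.close()
```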
Any recounting of the multilayered workings of what in my childhood was called cyberspace, the Net, the Infobahn, and the Information Superhighway is bound to be incomplete, but the takeaway is this: these protocols have given us the means to digitize and put online damn near everything in the world that we don’t eat, drink, wear, or dwell in. The Internet has become almost as integral to our lives as the air through which so many of its communications travel. And, as we’ve all been reminded—every time our social media feeds alert us to a post that tags us in a compromising light—to digitize something is to record it, in a format that will last forever.
Here’s what strikes me when I think back to my childhood, particularly those first nine Internet-less years: I can’t account for everything that happened back then, because I have only my memory to rely on. The data just doesn’t exist. When I was a child, “the unforgettable experience” was not yet a threateningly literal technological description, but a passionate metaphorical prescription of significance: my first words, my first steps, my first lost tooth, my first time riding a bicycle.
My generation was the last in American and perhaps even in world history for which this is true—the last undigitized generation, whose childhoods aren’t up on the cloud but are mostly trapped in analog formats like handwritten diaries and Polaroids and VHS cassettes, tangible and imperfect artifacts that degrade with age and can be lost irretrievably. My schoolwork was done on paper with pencils and erasers, not on networked tablets that logged my keystrokes. My growth spurts weren’t tracked by smart-home technologies, but notched with a knife into the wood of the door frame of the house in which I grew up.
WE LIVED IN a grand old redbrick house on a little patch of lawn shaded by dogwood trees and strewn in summer with white magnolia flowers that served as cover for the plastic army men I used to crawl around with. The house had an atypical layout: its main entrance was on the second floor, accessed by a massive brick staircase. This floor was the primary living space, with the kitchen, dining room, and bedrooms.
Above this main floor was a dusty, cobwebbed, and forbidden attic given over to storage, haunted by what my mother promised me were squirrels, but what my father insisted were vampire werewolves that would devour any child foolish enough to venture up there. Below the main floor was a more or less finished basement—a rarity in North Carolina, especially so close to the coast. Basements tend to flood, and ours, certainly, was perennially damp, despite the constant workings of the dehumidifier and sump pump.
At the time my family moved in, the back of the main floor had been extended and divided up into a laundry room, a bathroom, my bedroom, and a den outfitted with a TV and a couch. From my bedroom, I had a view of the den through the window set into what had originally been the exterior wall of the house. This window, which once looked outside, now looked inside.
For nearly all the years that my family spent in that house in Elizabeth City, this bedroom was mine, and its window was, too. Though the window had a curtain, it didn’t provide much, if any, privacy. From as far back as I can remember, my favorite activity was to tug the curtain aside and peek through the window into the den. Which is to say, from as far back as I can remember, my favorite activity was spying.