Most importantly, at this moment our species became more aware of itself. While it's evident that the mind co-evolved with language, we were now able not only to communicate increasingly complex concepts to one another, but also to store them in our brains effectively. Our cognition advanced.
Knowledge Is Power
With language came the ability to coordinate with each other more effectively. Nomadic tribes began to develop symbols to keep themselves better organized. Calendars appear to have been developed about 10,000 years ago, improving our ability to plant seasonal crops. Armed with the seeds of agriculture, we no longer needed to be so transient; gradually, supported by farming surpluses, nomadic tribes turned into civilizations, and more sophisticated governments emerged.
The Sumerians and then the Egyptians started using glyphs to express the value of currency around 6,000 years ago. Once the Egyptians later settled on a standard alphabet, they reaped another information technology boom, reaching heights no previous civilization had: they took on massive engineering projects, built new modes of transportation, and acquired vast power across an empire.
Yet carving symbols into stone tablets was painstaking work, and errors were costly. Stone tablets didn't travel well either, and if they broke, weeks, months, or even years of work could be lost in an instant. As such, producing this kind of information was largely reserved for a special class: scribes.
Being a good scribe meant holding significant power in Egypt.[15] Not only were scribes exempt from the manual labor of the lower classes, but many also supervised developments and large-scale government projects. They were considered part of the royal court, didn't have to fight in the military, and had guaranteed employment not only for themselves, but for their sons as well.[16] In the case of the third dynasty's chief scribe, Imhotep, it even meant post-mortem deification.[17]
Later, another era in communication began with the creation of the first form of mass media: Gutenberg's movable type, fitted to a printing press, which enabled writing to be produced as typeset books, each copy identical to the last, without scribes.
Again, society was transformed. Literacy spread along with printing. As books became plentiful and inexpensive, they could be acquired by any prosperous, educated person, not just by the ruling or religious classes. This set the stage for the Renaissance, the flowering of artistic, scientific, cultural, religious, and social life that swept across Europe. Next came the revival and spread of democracy. By the time of the American Revolution, printing had made Thomas Paine's pamphlets bestsellers that rallied the troops to victory. The modern metropolitan newspaper, radio, television: all were based on the same basic idea, that communication could be mass-produced from a central source.
The latest transformational change came in earnest just three decades ago, when the personal computer and then the Internet converged to throw us firmly into the digital age. Today, five billion people have cell phones. A constantly flowing electron cloud encircles and unites a networked planet. Anyone with a broadband connection to the Internet has access to much, if not all, of the knowledge that came before, and the ability to communicate not just as a single individual but as a broadcaster. Smartphones are pocket-sized libraries, printing presses, cameras, radios, televisions: all that came before, in the palm of your hand.
The Arguments Against Progress
Technological progress always comes with its critics, and the greater the speed and power of that progress, the greater the criticism. Intel researcher Genevieve Bell notes that every time we have a shift in technology, we also have a new moral panic. Panic? Here's just one example: during the development of the railway, some believed that a woman's uterus could go flying out of her body if she accelerated to 50 miles per hour.[18]
Electricity came with its own set of critics, too: the electric light, they warned, could inform miscreants that women and children were home. The lightbulb was a recipe for total social chaos.
These Luddite folk tales are funny, looking back. But other criticisms have gained traction over the centuries.
Our connection to the teachings of Socrates, for instance, comes through the written word of Plato, because Socrates himself was vehemently against writing. He thought the book would do terrible things to our memories: we'd keep knowledge in books and not in our heads. And he was right. People don't carry around stories like The Iliad in their heads anymore, though it was passed down in an oral tradition for hundreds of years before it was ever written down. We traded memorization for the ability to learn less about more: for choice.
Critics of the printing press believed that books would spread sin and eventually destroy the relationship between people and the church. As author and New York University professor Clay Shirky rightly points out, the printing press did indeed fuel the Protestant Reformation, and, yes, a boom in erotic fiction.[19]
Though some critiques of the written word have fared better than others, all have faded over time. There just aren't many people today who think the printing press was a bad idea, not when five billion of them are voting with their purchases to carry one around, in smartphone form, all day.
Despite this, critiques of technology on moral grounds look very similar today. The strongest critiques (as Bell notes) tend to be about women and children. Like a modern-day version of the critique of electricity, the television show "To Catch a Predator" features sexual predators using the Internet to seduce children; the subtext is that this powerful new tool can be used to steal your babies.
Still, there is a serious trend emerging in digital-age critiques. Distinguished journalists, acclaimed scholars, and prominent activists are worried about what the information explosion is doing to our attention spans, or even to our general intelligence. Bill Keller, former executive editor of The New York Times, equated allowing his daughter to join Facebook to passing her a pipe of crystal meth.[20]
Nicholas Carr’s book The Shallows (W. W. Norton) is full of concerns that the Internet is making his brain demand “to be fed the way the Net fed it—and the more it was fed, the hungrier it became.”[21] In The Atlantic, Carr’s “Is Google Making Us Stupid?” expresses similar fears: “Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.”[22]
In his book The Filter Bubble (Penguin), my friend and left-of-center activist Eli Pariser warns us of the dangers of personalization. Your Google search results, your Facebook News Feed, and even your daily news are being tailored specifically to you through the magic of advanced technology. The result: an increasingly narrow view of the world, delivered in such a fashion that you’re only able to consume what you already agree with.
These kinds of critiques of the Web are nothing new. They’re as old as the Web itself—older, actually. In 1995, in the very early days of the World Wide Web, Clifford Stoll wrote in Silicon Snake Oil (Anchor), “Computers force us into creating with our minds and prevent us from making things with our hands. They dull the skills we use in everyday life.”
15. It’s important to note, in the context of power’s relationship to information, that reading and writing quickly became trade secrets belonging to this set of professionals. Women were excluded. Lower-class citizens needn’t apply.
17. M. Lichtheim.
18. http://blogs.wsj.com/tech-europe/2011/07/11/women-and-children-first-technology-and-moral-panic/
22. Carr, Nicholas, “Is Google Making Us Stupid?,” The Atlantic.