Another information technology that has seen smooth exponential growth is our ability to communicate with one another and transmit vast repositories of human knowledge. There are many ways to measure this phenomenon. Cooper’s law, which states that the total bit capacity of wireless communications in a given amount of radio spectrum doubles every thirty months, has held true from the time Guglielmo Marconi used the wireless telegraph for Morse code transmissions in 1897 to today’s 4G communications technologies.4 According to Cooper’s law, the amount of information that can be transmitted over a given amount of radio spectrum has been doubling every two and a half years for more than a century. Another example is the number of bits per second transmitted on the Internet, which is doubling every one and a quarter years.5
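
Because these statements are simply claims about doubling times, the implied long-run multipliers are easy to check. A minimal Python sketch, using the doubling periods quoted above and round example spans of years (my illustrative numbers, not the book's):

```python
# Back-of-the-envelope illustration of how the doubling times cited
# above compound. The doubling periods come from the text; the spans
# of years are round examples, not precise historical figures.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Total multiplication of capacity after `years`, given one
    doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Cooper's law: wireless capacity doubles every 2.5 years (thirty months).
print(f"Wireless, 100 years: ~{growth_factor(100, 2.5):.2e}x")   # ~1.1e12

# Internet traffic: bits per second doubling every 1.25 years.
print(f"Internet, 20 years: ~{growth_factor(20, 1.25):.0f}x")    # 65,536
```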

The reason I became interested in trying to predict certain aspects of technology is that I realized about thirty years ago that the key to becoming successful as an inventor (a profession I adopted when I was five years old) was timing. Most inventions and inventors fail not because the gadgets themselves don’t work, but because their timing is wrong, appearing either before all of the enabling factors are in place or too late, having missed the window of opportunity.

The international (country-to-country) bandwidth dedicated to the Internet for the world.6

The highest bandwidth (speed) of the Internet backbone.7

Being an engineer, about three decades ago I started to gather data on measures of technology in different areas. When I began this effort, I did not expect that it would present a clear picture, but I did hope that it would provide some guidance and enable me to make educated guesses. My goal was—and still is—to time my own technology efforts so that they will be appropriate for the world that exists when I complete a project—which I realized would be very different from the world that existed when I started.

Consider how much and how quickly the world has changed only recently. Just a few years ago, people did not use social networks (Facebook, for example, was founded in 2004 and had 901 million monthly active users at the end of March 2012),8 wikis, blogs, or tweets. In the 1990s most people did not use search engines or cell phones. Imagine the world without them. That seems like ancient history but was not so long ago. The world will change even more dramatically in the near future.

In the course of my investigation, I made a startling discovery: If a technology is an information technology, the basic measures of price/performance and capacity (per unit of time or cost, or other resource) follow amazingly precise exponential trajectories.

These trajectories outrun the specific paradigms they are based on (such as Moore’s law). But when one paradigm runs out of steam (for example, when engineers were no longer able to reduce the size and cost of vacuum tubes in the 1950s), it creates research pressure to develop the next paradigm, and so another S-curve of progress begins.

The exponential portion of that next S-curve for the new paradigm then continues the ongoing exponential of the information technology measure. Thus vacuum tube–based computing in the 1950s gave way to transistors in the 1960s, and then to integrated circuits and Moore’s law in the late 1960s, and beyond. Moore’s law, in turn, will give way to three-dimensional computing, the early examples of which are already in place. The reason why information technologies are able to consistently transcend the limitations of any particular paradigm is that the resources required to compute or remember or transmit a bit of information are vanishingly small.
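
To make the picture of stacked S-curves concrete, here is a small numerical sketch. It is entirely my own illustration, with made-up ceilings and dates rather than a model from the book: each paradigm is a logistic curve that saturates, the next paradigm arrives with a roughly tenfold higher ceiling, and the combined capability keeps climbing in an approximately exponential way.

```python
import math

def logistic(t: float, midpoint: float, ceiling: float, steepness: float = 1.5) -> float:
    """One paradigm's contribution: an S-curve that saturates at `ceiling`."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

# Three hypothetical paradigms; each new one has a ~10x higher ceiling
# and takes off as the previous one saturates (illustrative numbers only).
paradigms = [
    {"midpoint": 5,  "ceiling": 1},
    {"midpoint": 15, "ceiling": 10},
    {"midpoint": 25, "ceiling": 100},
]

for t in range(0, 31, 5):
    total = sum(logistic(t, p["midpoint"], p["ceiling"]) for p in paradigms)
    print(f"t={t:2d}  combined capability ~ {total:8.2f}")
```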

We might wonder, are there fundamental limits to our ability to compute and transmit information, regardless of paradigm? The answer is yes, based on our current understanding of the physics of computation. Those limits, however, are not very limiting. Ultimately we can expand our intelligence trillions-fold based on molecular computing. By my calculations, we will reach these limits late in this century.

It is important to point out that not every exponential phenomenon is an example of the law of accelerating returns. Some observers misconstrue the LOAR by citing exponential trends that are not information-based: For example, they point out, men’s shavers have gone from one blade to two to four, and then ask, where are the eight-blade shavers? Shavers are not (yet) an information technology.

In The Singularity Is Near, I provide a theoretical examination, including (in the appendix to that book) a mathematical treatment of why the LOAR is so remarkably predictable. Essentially, we always use the latest technology to create the next. Technologies build on themselves in an exponential manner, and this phenomenon is readily measurable if it involves an information technology. In 1990 we used the computers and other tools of that era to create the computers of 1991; in 2012 we are using current information tools to create the machines of 2013 and 2014. More broadly speaking, this acceleration and exponential growth applies to any process in which patterns of information evolve. So we see acceleration in the pace of biological evolution, and similar (but much faster) acceleration in technological evolution, which is itself an outgrowth of biological evolution.
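
The feedback argument can be stated compactly. As a minimal sketch of the standard formalization (the appendix referred to above gives a fuller treatment), if the rate of improvement in a capability C is proportional to the capability already on hand, because today's tools build tomorrow's, then:

```latex
\frac{dC}{dt} = k\,C(t) \quad\Longrightarrow\quad C(t) = C(0)\,e^{kt}
```

Self-reinforcing improvement alone already yields exponential growth; if the resources devoted to the field also grow, the effective k rises over time and the curve steepens further.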

I now have a public track record of more than a quarter of a century of predictions based on the law of accelerating returns, starting with those presented in The Age of Intelligent Machines, which I wrote in the mid-1980s. Examples of accurate predictions from that book include: the emergence in the mid- to late 1990s of a vast worldwide web of communications tying together people around the world to one another and to all human knowledge; a great wave of democratization emerging from this decentralized communication network, sweeping away the Soviet Union; the defeat of the world chess champion by 1998; and many others.

I described the law of accelerating returns, as it is applied to computation, extensively in The Age of Spiritual Machines, where I provided a century of data showing the doubly exponential progression of the price/performance of computation through 1998. It is updated through 2009 below.
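
“Doubly exponential” here means the exponent itself grows exponentially rather than linearly; schematically, with a, b, and c as placeholder constants rather than fitted values:

```latex
C(t) = a\,e^{b\,e^{ct}}
```

In practical terms, the doubling time of price/performance is itself shrinking over the decades rather than staying fixed.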

I recently wrote a 146-page review of the predictions I made in The Age of Intelligent Machines, The Age of Spiritual Machines, and The Singularity Is Near. (You can read the essay via the link in this endnote.)9 The Age of Spiritual Machines included hundreds of predictions for specific decades (2009, 2019, 2029, and 2099). For example, I made 147 predictions for 2009 in The Age of Spiritual Machines, which I wrote in the 1990s. Of these, 115 (78 percent) are entirely correct as of the end of 2009; the predictions that were concerned with basic measurements of the capacity and price/performance of information technologies were particularly accurate. Another 12 (8 percent) are “essentially correct.” A total of 127 predictions (86 percent) are correct or essentially correct. (Since the predictions were made specific to a given decade, a prediction for 2009 was considered “essentially correct” if it came true in 2010 or 2011.) Another 17 (12 percent) are partially correct, and 3 (2 percent) are wrong.

Calculations per second per (constant) thousand dollars of different computing devices.10