Yet the wedding of science and technology proposed by Bacon was not soon consummated. Over the next 200 years, carpenters and mechanics—practical men of long standing—built iron bridges, steam engines, and textile machinery without much reference to scientific principles, while scientists—still amateurs—pursued their investigations in a haphazard manner. But the body of men who formed the Royal Society in London in 1660, inspired by Baconian principles, represented a determined effort to direct scientific research toward useful ends, first by improving navigation and cartography, and ultimately by stimulating industrial innovation and the search for mineral resources. Similar bodies of scholars developed in other European countries, and by the 19th century scientists were moving toward a professionalism in which many of their goals were clearly the same as those of the technologists. Thus, Justus von Liebig of Germany, one of the fathers of organic chemistry and the first proponent of mineral fertilizer, provided the scientific impulse that led to the development of synthetic dyes, high explosives, artificial fibres, and plastics, and Michael Faraday, the brilliant British experimental scientist in the field of electromagnetism, prepared the ground that was exploited by Thomas A. Edison and many others.
The role of Edison is particularly significant in the deepening relationship between science and technology, because the prodigious trial-and-error process by which he selected the carbon filament for his electric lightbulb in 1879 resulted in the creation at Menlo Park, N.J., of what may be regarded as the world’s first genuine industrial research laboratory. From this achievement the application of scientific principles to technology grew rapidly. It led easily to the engineering rationalism applied by Frederick W. Taylor to the organization of workers in mass production, and to the time-and-motion studies of Frank and Lillian Gilbreth at the beginning of the 20th century. It provided a model that was applied rigorously by Henry Ford in his automobile assembly plant and that was followed by every modern mass-production process. It pointed the way to the development of systems engineering, operations research, simulation studies, mathematical modeling, and technological assessment in industrial processes. This was not just a one-way influence of science on technology, because technology created new tools and machines with which the scientists were able to achieve an ever-increasing insight into the natural world. Taken together, these developments brought technology to its modern highly efficient level of performance.

Criticisms of technology
Judged entirely on its own traditional grounds of evaluation—that is, in terms of efficiency—the achievement of modern technology has been admirable. Voices from other fields, however, began to raise disturbing questions, grounded in other modes of evaluation, as technology became a dominant influence in society. In the mid-19th century the non-technologists were almost unanimously enchanted by the wonders of the new man-made environment growing up around them. London’s Great Exhibition of 1851, with its arrays of machinery housed in the truly innovative Crystal Palace, seemed to be the culmination of Francis Bacon’s prophetic forecast of man’s increasing dominion over nature. The new technology seemed to fit the prevailing laissez-faire economics precisely and to guarantee the rapid realization of the Utilitarian philosophers’ ideal of “the greatest good for the greatest number.” Even Marx and Engels, espousing a radically different political orientation, welcomed technological progress because in their eyes it produced an imperative need for socialist ownership and control of industry. Similarly, early exponents of science fiction such as Jules Verne and H.G. Wells explored with zest the future possibilities opened up to the optimistic imagination by modern technology, and the American utopian Edward Bellamy, in his novel Looking Backward (1888), envisioned a planned society in the year 2000 in which technology would play a conspicuously beneficial role. Even such late Victorian literary figures as Lord Tennyson and Rudyard Kipling acknowledged the fascination of technology in some of their images and rhythms.
Yet even in the midst of this Victorian optimism, a few voices of dissent were heard, such as Ralph Waldo Emerson’s ominous warning that “Things are in the saddle and ride mankind.” For the first time it began to seem as if “things”—the artifacts made by man in his campaign of conquest over nature—might get out of control and come to dominate him. Samuel Butler, in his satirical novel Erewhon (1872), drew the radical conclusion that all machines should be consigned to the scrap heap. And others such as William Morris, with his vision of a reversion to a craft society without modern technology, and Henry James, with his disturbing sensations of being overwhelmed in the presence of modern machinery, began to develop a profound moral critique of the apparent achievements of technologically dominated progress. Even H.G. Wells, despite all the ingenious and prophetic technological gadgetry of his earlier novels, lived to become disillusioned about the progressive character of Western civilization: his last book was titled Mind at the End of Its Tether (1945). Another novelist, Aldous Huxley, expressed disenchantment with technology in a forceful manner in Brave New World (1932). Huxley pictured a society of the near future in which technology was firmly enthroned, keeping human beings in bodily comfort without knowledge of want or pain, but also without freedom, beauty, or creativity, and robbed at every turn of a unique personal existence. An echo of the same view found poignant artistic expression in the film Modern Times (1936), in which Charlie Chaplin depicted the depersonalizing effect of the mass-production assembly line. Such images were given special potency by the international political and economic conditions of the 1930s, when the Western world was plunged into the Great Depression and seemed to have forfeited the chance to remold the world order shattered by World War I. In these conditions, technology suffered by association with the tarnished idea of inevitable progress.
Paradoxically, the escape from a decade of economic depression and the successful defense of Western democracy in World War II did not bring a return of confident notions about progress and faith in technology. The horrific potentialities of nuclear war were revealed in 1945, and the division of the world into hostile power blocs prevented any such euphoria and served to stimulate criticisms of technological aspirations even more searching than those already mentioned. J. Robert Oppenheimer, who directed the design and assembly of the atomic bombs at Los Alamos, N.M., later opposed the decision to build the thermonuclear (fusion) bomb and described the accelerating pace of technological change with foreboding:
One thing that is new is the prevalence of newness, the changing scale and scope of change itself, so that the world alters as we walk in it, so that the years of man’s life measure not some small growth or rearrangement or moderation of what he learned in childhood, but a great upheaval.