There have been a number of criticisms of actor-network theory.[8] It tends to overlook groups, such as women and the unemployed, who are not prominent in networks associated with technological innovation. Actor-network theorists often seem to smuggle in concepts of social structure that they supposedly have jettisoned.

More importantly, social constructivists seem to restrict their efforts to explaining existing technology, taking no stance on whether it is good or bad for humans and not saying how to go about changing it.[9] Since actor-network theory builds on actors — including artefacts — that exist, there is no theoretical warrant for examining technology that might be designed in a social system putting a priority on nonviolent struggle, especially since social structural analysis, including the concept of the military, is avoided.

Biased Technology

A useful framework for analysing technology for nonviolent struggle is to think of artefacts as non-neutral, biased, political or selectively useful.[10] In other words, they are easier to use for some purposes than others. A key aim of a social analysis of technology then is to find out which purposes a technology can be most easily used for, and why.

Most technologies developed by the military are biased, or selectively useful, for killing and destruction. This obviously is because the aim of most military science and technology has been to develop more lethal and destructive weapons.[11]

It is quite possible to kill or incapacitate someone without technology. For example, a suitable blow from the hand at the back of the neck can do this. Mass killing can occur without technology, but it is much easier — and more tempting — if technology designed for killing is available. Spears, axes, bows and arrows, rifles and explosives make killing easier. Admittedly, they can be used for killing animals and other less lethal purposes, but in many cases they have been specially designed for battles.

The idea of biased technology obviously is incompatible with the idea of technology as good, bad or neutral. On the other hand, the idea of biased technology is quite compatible with the social shaping perspective. One would expect that when the military influences the development of an artefact — such as designing a radar system or grenade — it is likely to be selectively useful to the military. But there are no automatic connections. It is necessary to examine actual technologies, not just the social shaping process, to determine which groups can most easily use them. The Internet had military origins but has turned out to be highly useful for communication between antiwar activists.

Another way to describe this approach is to say that technologies embody social values or social interests. The idea of embodiment suggests that technologies take on the values of the interest groups crucial to their development and in turn are likely to be selectively useful to these same interest groups. For example, nuclear technology was developed by scientists and engineers working in the service of governments and militaries. Some of the key characteristics of nuclear weapons and nuclear power are high potential danger and large scale, both generating a need for high security and centralised control. These features make nuclear technology selectively useful to the military and the state.

The idea of biased technology is quite common among those who examine technological alternatives, such as appropriate technology. But it has never been the centre of popular or scholarly perceptions. The most common popular perceptions of technology seem to be that it is neutral, good or bad. The social study of technology has focussed on social shaping approaches; in the past couple of decades, social analysis of the impacts of technology has not been nearly as common as analysis of social influences on technology. There is not even a good name for the view of technology as biased. To talk of biased technology certainly counters the idea of neutral technology, but it suggests that there is something wrong with it: in a general sense, being biased is not seen as a good thing, even if it is biased in favour of harmony or biased against torture. Also, to talk of biased technology suggests that bias could be removed, which is not possible — the question is which way technology is biased, and in whose interests. The meanings of alternative terms such as embodiment or selective usefulness are not immediately obvious.

Whatever its name, though, this perspective is quite useful for analysing technology for nonviolent struggle. This appendix began with the assumption that it is worthwhile to analyse technologies, including yet-to-be-developed technologies, according to their value to a system for nonviolent struggle. Working backwards, it is possible to judge theories of technology to see how well they serve this purpose. Ideas that technology or technologies are inherently good, bad, neutral or inevitable are not helpful at all. Ideas of social shaping have more potential, but are not well adapted to looking at alternatives to what exists. Most useful is the idea that technologies embody social values and are selectively useful for certain purposes. It should not be surprising that this has been the framework implicitly used throughout this book!

8. Olga Amsterdamska, “Surely you are joking, Monsieur Latour?” Science, Technology, & Human Values, Vol. 15, 1990, pp. 495-504; Pam Scott, “Levers and counterweights: a laboratory that failed to raise the world,” Social Studies of Science, Vol. 21, 1991, pp. 7-35.

9. Langdon Winner, “Upon opening the black box and finding it empty: social constructivism and the philosophy of technology,” Science, Technology, and Human Values, Vol. 18, No. 3, Summer 1993, pp. 362-378. See also Stewart Russell, “The social construction of artefacts: a response to Pinch and Bijker,” Social Studies of Science, Vol. 16, 1986, pp. 331-346, a critique of another constructivist approach called “social construction of technology.”

10. There are no central references on this approach. Some representative works are David Elliott and Ruth Elliott, The Control of Technology (London: Wykeham, 1976); Ivan Illich, Tools for Conviviality (London: Calder and Boyars, 1973); Richard E. Sclove, Democracy and Technology (New York: Guilford Press, 1995); Langdon Winner, The Whale and the Reactor: A Search for Limits in an Age of High Technology (Chicago: University of Chicago Press, 1986).

11. Harvey M. Sapolsky, “Science, technology and military policy,” in Ina Spiegel-Rösing and Derek de Solla Price (eds.), Science, Technology and Society: A Cross-disciplinary Perspective (London: Sage, 1977), pp. 443-471, makes this point nicely, commenting that, in the shadow of weapons development, there is some work “in repairing battle wounds, in making rations more tasty, and in preventing machinery from rusting” (p. 459).