There are some precedents for this sort of participatory R&D. Citizen groups in Japan — often with participation by some scientists — have investigated environmental problems, using simple techniques such as talking to people about local health problems and testing for the presence of radioactivity by observing specially sensitive plants. Such an approach was more successful in determining the cause of Minamata disease — due to mercury pollution in the ocean — than heavily funded teams of traditional scientists using sophisticated ocean sampling and computer models.[6]
Many parts of the women’s health movement — most prominently, the Boston Women’s Health Book Collective — have reassessed available evidence and drawn on their own personal experiences to provide a different perspective about women’s health, one that is less responsive to the interests of drug companies and medical professionals and more responsive to the concerns and experiences of women themselves.[7]
AIDS activists in the US, concerned about the slow and cumbersome processes for testing and approving drugs to treat AIDS, developed their own criteria and procedures and tried them out with drugs, some of which were produced and distributed illicitly. Their efforts and political pressure led to changes in official procedures.[8]
These examples show that nonscientists can make significant contributions to the process of doing science, and in some cases can do better than establishment approaches or cause changes in them. However, the issue is not a competition between scientists and nonscientists, but rather promotion of a fruitful interaction between them. Scientists, to do their jobs effectively, need to bring the community “into the lab,” and nonscientists need to learn what it means to do research. In the process, the distinction between the two groups would be blurred.
A good case study of the two models is the debate over encryption of digital communication described in chapter 5. The military model was embodied, literally and figuratively, in the Clipper chip, designed by the US National Security Agency so that authorised parties could decipher any encrypted messages. Clipper was designed in secrecy. It was based on the Skipjack algorithm, which remained a secret. Clipper and related systems were planned for installation in telephones and computer networks essentially as “black boxes,” which people would use but not understand. If Clipper had been a typical military technology, such as a ballistic missile or fuel-air explosive, it would have been implemented in military arenas with little debate (except perhaps from peace activists) and certainly little public input into the choice of technology.
At first glance, the participatory alternative to Clipper is public key encryption, widely favoured by computer users. But rather than identifying the alternative with a particular technology, it is more appropriate to look at the process of choosing a technology. Encryption has been the subject of vigorous and unending discussions, especially on computer conferences. Different algorithms have been developed, tested, scrutinised and debated. This has occurred at a technical level and also at a social level. Various encryption systems have been examined by top experts, who have then presented their conclusions for all to examine. As well, the social uses and implications of different systems have been debated. Last but not least, lots of people have used the encryption systems themselves. The contrast to Clipper is striking.
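To make the contrast concrete, consider how an openly documented public key system can be used by anyone. The short sketch below is a minimal illustration rather than anything drawn from the encryption debates described above; it uses the RSA algorithm via the open-source Python cryptography library, and the key size, padding scheme and sample message are assumptions chosen purely for the example. The point is that the algorithm, the code and the keys are all in the user’s hands, whereas Clipper was a sealed black box built around a secret algorithm.

    # Illustrative sketch: public key encryption with the open-source Python
    # "cryptography" library. Key size, padding and message are example choices.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Anyone can generate their own key pair; the algorithm itself is public.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Others encrypt with the public key; only the private key holder can decrypt.
    ciphertext = public_key.encrypt(b"an openly scrutinised message", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"an openly scrutinised message"

Because both the algorithm and implementations like this one are published, anyone with sufficient interest can inspect, test and debate them, which is precisely the participatory process at issue here.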
Even the more participatory process used in developing and assessing encryption is still limited to a small part of the population. This is inevitable, since not everyone can be involved in looking at every technology. The point is that the process is relatively open: far more people have investigated cryptography in relation to public key encryption than could ever be the case with a government-sponsored technology such as Clipper. The other important point is that the participatory process requires informed popular acceptance of the technology, rather than imposition through government pressure. The best indicator of the participatory process is a vigorous and open debate involving both technical and social considerations.
The case of encryption shows that participatory R&D does not eliminate the role of expertise. What it does reduce is the automatic association of expertise with degrees, jobs in prestigious institutions, high rank, awards, and service to vested interests. Expertise has to be tested in practical application. Just as an athlete cannot claim current superiority on the basis of degrees or past victories, so an expert in a process of participatory R&D cannot rely on credentials, but is always subject to the test of current practice.
These comments on participatory R&D are inevitably tentative. By their very nature, participatory systems are shaped by the process of participation itself, so what they become is not easy to predict.
10. TECHNOLOGY POLICY FOR NONVIOLENT STRUGGLE
The basic idea of technology for nonviolent struggle is straightforward. Actually bringing this alternative about — doing relevant research and developing, testing and implementing relevant technologies — is much more difficult. In this chapter I discuss priorities for moving towards technology that serves nonviolent rather than violent struggle.
The term usually used when discussing priorities of this sort is “policy,” in this case technology policy. The idea of policy, though, has come to refer primarily to decisions and implementation by governments. Governments are certainly important players in R&D, but not the only ones. After discussing priorities, I look at what can be done by three particular groups: governments; scientists and engineers; and community groups.[1]
Before beginning, it is worth emphasising that there are enormous institutional and conceptual obstacles to promoting nonviolent struggle.[2] Many government and corporate leaders would do everything they could to oppose development of grassroots capacity for nonviolent action, since this would pose a direct threat to their power and position. Furthermore, the idea of popular nonviolent struggle is extremely challenging to many people, given the standard expectation that the “authorities” or experts will take care of social problems, including defence. Therefore, to talk of technology policy for nonviolent struggle may seem utopian. But if alternatives are ever to be brought about, it is important to talk about them now. Without vision and dialogue, there is little hope of building a nonviolent future.
Priorities
The traditional idea of technological advance was the “linear model”: first there is scientific research; the results of the research are applied, thereby producing a technological application; finally, the technology is taken up in the marketplace. Among those who study technological innovation, this simple model is pretty much discredited. Innovation seldom happens this way.
Another model is “market pull.” There is a demand for a certain product or service. This encourages technologists to search for a suitable solution; sometimes this involves doing directed research.
In practice, the process of innovation is usually complex. It involves market incentives, new ideas coming out of basic research, economic and psychological commitments to current systems, and the particular agendas of interest groups such as politicians, government bureaucracies, corporate elites, and various pressure groups. Nevertheless, the usual models of innovation focus on two key players, government and the market, and on their relation to R&D. The “market” is constituted by those who buy and sell the product in question.
6. Jun Ui, “The interdisciplinary study of environmental problems,”
7. Boston Women’s Health Book Collective,
8. Steven Epstein, “Democratic science? AIDS activism and the contested construction of knowledge,”
1. Conventional technology policy literature is not deployed in this chapter. It is almost entirely oriented to top-down decision making and provides few insights about policy making for a participatory system such as social defence. Issues such as the suppression of innovation by vested interests, the influence of managerial control, worker opposition and social movements are almost entirely absent from the conventional policy literature. Innovation from the grassroots, or more generally any innovation that is noncommercial or a challenge to state interests, is given virtually no attention. Some typical sources that fit this characterisation are Rod Coombs, Paolo Saviotti and Vivien Walsh,