I am now completing a book titled Informating Environmentalism, which examines how developments in information technology and culture since the late 1980s have played out in the environmental field, shaping how environmental problems are conceived and deemed significant, or rendered marginal. It is both a history of technology that charts how people have developed databases, web interfaces, computer models and simulations for visualizing and governing environmental problems, and an ethnographic study of how new information flows and processing capabilities have transformed people’s motivations, practices, judgments and priorities, and the political-economic spaces in which they operate. My primary interlocutors in the research for the book were information technologists of various kinds, working in academic and government laboratories, at the grassroots, and in international environmental organizations like Greenpeace.
The key argument of the book is that the environmental field is one site of a significant historical shift in the way knowledge is produced and used, largely driven by information technology and culture. The informating of environmentalism, I argue, has changed what counts as epistemologically robust and objective. The argument builds on work in the history of science focused on the ways scientific concepts and methodologies evolve, affecting scientific practice, how science is used in practical affairs, and cultural constructs about what makes knowledge robust, credible and operational. I show how information-processing capabilities have encouraged people to develop understanding through comparison, extrapolation and cumulative effect, displacing the dominance of linear constructs of causality in sense-making practices. In the environmental field, where the inability to establish direct causality has made it difficult to act on many problems, this has quite profound implications.
I illustrate and develop this argument with examples from many different corners of the environmental field. One section of the book, for example, focuses on exposure science, a relatively new scientific field that aims to connect toxic emission data to human health effects. It is “crafty” science, often dependent on incomplete monitoring data sets and complex, interlinked computer models and simulations. Practitioners tend to be very aware of the limits of their knowledge and speak fluently about what they term “requisite precision” – a level of accuracy attuned simultaneously to the demands of good science and timely policy recommendations. Practitioners also tend to be highly aware of the cost of carrying out exposure science (monitoring with currently available technology, for example, is very costly), and of the political climate in which their claims will be circulated. They thus tend to be reflective and articulate about how knowledge is created and legitimated.
Another section of the book focuses on toxicogenomics, an emergent scientific field that aims to understand the impact of environmental stressors at the genetic level. Toxicogenomics is carried out with gene chips that make it possible to profile thousands of genes in a single sweep, identifying which ones “express” in response to specific toxicants. Gene chips, however, are noisy, creating many signals that don’t necessarily link to adverse effects. The work is highly interpretive, and dependent on databases that allow toxicogenomics researchers to “anchor” their findings to more established indices of toxicity – provided by pathologists or clinical chemists, for example. Toxicogenomics researchers thus work through comparison at multiple levels, ever trying to tease out differences that make a difference. The dream of many toxicogenomics researchers is to be able to extrapolate to whole chemical classes, so that toxicology catches up with the thousands of chemicals in use, most of which have never been tested for toxicity.
Increased awareness of what we don’t know about environmental risk has driven many of the developments that I describe, and it is with a key event in the development of this awareness that the book begins. The US Emergency Planning and Community Right-to-Know Act of 1986, passed in response to the Bhopal disaster, mandated a range of initiatives to support emergency planning and public access to information. High-risk facilities, for example, had to provide the information needed by local rescue personnel to plan emergency evacuations. By the time amendments to the Clean Air Act were passed in 1990, this had evolved into a mandate for “worst-case scenarios” for 66,000 high-risk facilities around the United States. Another key component of the 1986 Right-to-Know Act was the Toxics Release Inventory (TRI), the first federal database that Congress required to be released to the public in a computer-readable format.
Environmental right-to-know laws have dramatically increased the quantity and quality of environmental risk information available to the public. They have also tightened the entwining of environmental politics with information politics. Recognition of environmental problems has always depended on extensive information practices, often involving intricate networks of people and organizations. Since the late 1980s, it has become clear how politically charged these information practices and networks are. Some corporations, for example, have “gone green” in efforts to control the flow of information about their activities and – even before 9/11 – lobbied to have environmental risk information removed from the public sphere, claiming that it would enable terrorism. It is now illegal to post details of “worst-case scenarios” – Bhopal-like incidents – on the Internet. Greenpeace has figured out ways to work around and within the restriction, becoming a leading voice on “chemical security.”
Informating Environmentalism is made up of stories like these – showing how information technology and culture have played out in the environmental field, showing readers what the so-called Information Society looks like in practice. The book is intended to provoke reconsideration of the now common argument that digitization is a disenchanting, reductive project that feeds what Avital Ronell calls “metaphysical cravings” to transcend the body. This argument is not incorrect. But nor does it account for all that information technology and culture have produced. Many of the projects that I describe have been driven by corrective logics and a desire to describe a particular phenomenon in its totality. Information shortage is the problem many people start with. Sustained engagement with information technology and culture often leads elsewhere, however, compelling people to play with different ways of knowing, helping build understanding even when information is imperfect, incomplete, or so voluminous as to be illegible. Much of the work of the environmental field over the last two decades has involved this kind of play. I interpret this turn enthusiastically, seeing information technology as a material cultural form that encourages constant re-ordering, substitution, addition, displacement and re-alignment. It thus has great conceptual force, literally changing the grounds of imagination and thought. Informating Environmentalism tells this story, a story of how toxics, through the work and play of information technology, theory and culture, may finally be heard as signal rather than as noise.