How do computational technologies shape ecological thought? This paper develops a richer understanding of how technologies influence ecological worldviews through examples drawn from the histories of both art and science. Technologies profoundly structure our understanding of complex systems, informing the analogies and metaphors we think with. For example, it is commonplace to say that we live in ecosystems, depend on food chains, and that our brains store, compute and recall information. Just as Earth’s environments have been materially refigured by infrastructures for energy, resource extraction and communications, our language and epistemologies have also been shaped by these systems. We take for granted that the environment is a system, a notion propped up by the dominance of computation and the narratives that surround these technologies. However, as I will argue, environments and their ecologies are not systems, and this pervasive idea leaves much of the world out.
As we scramble to understand and respond to a rapidly destabilizing environment characterized by human disturbance, climate change and extinction, we are simultaneously reveling in an unprecedented surplus of computing. It is therefore tempting (and lucrative) to make claims that neat technological fixes can address thorny existential problems, a modernist impulse that remains well and truly alive in projects like the smart city and even more dramatically in proposals for environmental interventions such as atmospheric engineering. 2018 will see the first atmospheric tests of geoengineering technologies carried out in the deserts of Arizona (Temple). However, these responses often fail to adequately acknowledge the social and economic factors that cannot be addressed by technology alone, and readily overlook the always incomplete nature of a data-driven perspective on the world. At a time when computing giants like Google and Microsoft are pushing to insert the logic of computation into the management of environments that can only ever be partially understood, it is hard to overstate the importance of tending to a deeper and more robust understanding of the relationship between computational technologies and ecological thinking.
If environments are not systems, then what are they?
Ecological thought refers to how we perceive Earth’s environments and understand the multitude of relations within them. I build upon Anna Tsing’s work, understanding environments to be open-ended assemblages of non-humans, living and nonliving, entangled in ways of life.
“Ecologists turned to assemblages to get around the sometimes fixed and bounded connotations of ecological “community.” The question of how the varied species in a species assemblage influence each other—if at all—is never settled: some thwart (or eat) each other; others work together to make life possible; still others just happen to find themselves in the same place. Assemblages are open-ended gatherings. They allow us to ask about communal effects without assuming them.” (Tsing 54)
For Tsing, metaphors like system or community imply narratives of progress, overemphasize intentionality and chains of cause and effect, and obscure the role that precarity and indeterminacy play in the conditions of life. Both precarity, the state of being vulnerable to another, and indeterminacy, the unplanned or unpredictable nature of encounters, are downplayed when the world is viewed as a system.
I argue that these worldviews are intimately tied to our technological apparatus, which influences how we measure, value and manipulate environmental assemblages. When armed with a computer, the world becomes data and relations appear as systems. In this way computational approaches like modeling and AI simultaneously reveal and obscure aspects of reality. Just this year Microsoft announced “AI for Earth”, a program that will put artificial intelligence in the hands of those who are trying to “monitor, model and manage the earth’s natural systems” (“AI for Earth Grant”). Pledging $50 million to the initiative, Microsoft gives environmental researchers access to its Azure cloud platform and its AI products. In the last decade Silicon Valley ideology has saturated a diverse range of fields from urban design to the justice system, and this announcement is imbued with a familiar mix of solutionism and teleology, this time promising to transform “the way we are currently managing complex environmental challenges” (“AI for Earth Grant”).
If we are to believe the hype, and seriously consider if and how these technologies can reshape our relationship with environments, we must thoroughly examine what they amplify and what they edit out. This is my starting point, and I consider several histories that highlight characteristics of a computational view of the world. My paper will go on to explore how this view might be rounded out.
When models fail
The history of environmental management is riddled with embarrassing failures that stem from incorrect assumptions and the oversimplification of environmental assemblages. An early example comes from James Scott’s account of German forestry through the 18th and 19th centuries. Although this precedes the advent of contemporary computation, the German use of numerical and bureaucratic methodologies to understand and manage natural systems remains strongly resonant. Scott’s study of the ‘scientific forest’ or ‘fiscal forest’ shows how decisions regarding the management and replanting of commercial forests were driven to optimize timber yield (Scott 11). As such, the data collected represented trees as resources, and the once biodiverse forests came to be replanted almost as monocultures. The ecological entanglements of species in the forest were deemed external to a system designed to efficiently transform species into commodities. Modeling the forest as a timber factory oversimplified the relations and interdependencies of its species; after a couple of generations yields began to drop, and the forest ecosystem destabilized to the point of near collapse.
Scott’s scientific forest is an example of what Tsing calls “the plantation” (Tsing 56). The plantation is an extreme simplification of environments optimized to produce commodities, something that often results in monoculture. As in Scott’s forest, the economic drivers of capitalism make crop yields the ultimate goal of an agricultural landscape, and this determines how the landscape is measured, modeled and manipulated. The landscape becomes a factory and its species become assets alienated from their lifeworlds, like workers who fulfill hits on Mechanical Turk with no bearing on each other or on what they produce. When the asset can no longer be extracted, the landscape becomes a ruin and disappears from view, deemed worthless (Tsing 31). Both Scott’s scientific forest and Tsing’s plantation are examples of the complex interplay of economics and numerical approaches to landscape management, processes that are now accelerated through contemporary computation. What they show us is that data collection and modeling practices are never neutral. Rather, they reflect what is deemed important or trivial in the eyes of the modeler, and are therefore profoundly shaped by economics, hubris and culture.
Today we can collect orders of magnitude more data from environments to produce higher resolution models of the world. However, data collection and modeling practices remain shaped by often tacit assumptions about what is typical or atypical. The story of British geophysicist Joe Farman’s discovery of the ozone hole is a well-known example of the dangers of assuming that more data equals more reality. Farman maintained a single ground-based ozone sensor in the Antarctic in spite of the launch of NASA atmospheric monitoring satellites that collected vastly larger quantities of data (Vitello). When Farman’s data began to show a 40% drop in ozone levels, he assumed the sensor was damaged and replaced it, as NASA’s satellites had reported no such change. After years of study and careful checking, Farman published this alarming result in Nature, confirming the destruction of the ozone layer by human pollutants (Farman). How had NASA’s satellites missed such a marked change in ozone composition? One response from NASA suggests that their data processing software was programmed to discard readings that appeared to be outliers, thus ignoring the drastic changes that were occurring (Vitello). In this case, reality itself was an outlier and assumed to be an error.
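The mechanism at work here can be made concrete with a minimal sketch. The filter below is hypothetical, not NASA’s actual processing pipeline, and the numbers are invented for illustration; it simply shows how a quality-control rule that discards “implausible” readings will silently erase a genuine anomaly.

```python
# A minimal, hypothetical sketch of outlier-based quality control.
# Values are illustrative, not actual ozone measurements.

CLIMATOLOGY_MEAN = 300.0    # expected long-term level (hypothetical units)
PLAUSIBLE_DEVIATION = 0.25  # discard readings >25% away from the mean

def quality_filter(readings, mean=CLIMATOLOGY_MEAN, tol=PLAUSIBLE_DEVIATION):
    """Keep readings within tol of the expected mean; set aside the rest."""
    kept, discarded = [], []
    for r in readings:
        (kept if abs(r - mean) / mean <= tol else discarded).append(r)
    return kept, discarded

# A series in which the measured level genuinely collapses by ~40%
readings = [298, 301, 295, 180, 175, 172]
kept, discarded = quality_filter(readings)

print(kept)       # [298, 301, 295]
print(discarded)  # [180, 175, 172] -- the real signal, flagged as error
```

The filter is doing exactly what it was designed to do; the failure lies in the tacit assumption, baked into the tolerance threshold, that reality stays within expected bounds.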
The limits of computation
What if there were no cap on the amount of data produced from the environment for analysis? What if computational technologies developed to a point where we could stop discarding outliers in datasets? Would the resulting models of the world solve the problems highlighted in these examples and produce a robust representation of reality, free of human assumption? This is the utopian narrative at the heart of big data and artificial intelligence initiatives like Microsoft’s AI for Earth. However, there are two reasons to tend to a more reflexive view of what these technologies can and cannot do.
Firstly, at the center of a data-driven approach is the assumption that the past is indicative of the future. Big data has led to developments in machine learning, a family of statistical approaches that derive models from patterns in large quantities of data without any knowledge of underlying structures, whether these are language, environmental processes or genomics. As Chris Anderson succinctly observes in his 2008 Wired article “The End of Theory,” “Correlation is enough. We can stop looking for models” (Anderson). And yet climate change, more aptly described as climate destabilization, means that as environmental change accelerates, our ability to make accurate predictions from existing data diminishes. At best, environmental datasets like precipitation records span 250 years, with most covering a much shorter period (Simpson and Jones). From a geological point of view this is an absurdly small slice of time, and one in which the earth’s climate has been relatively stable. As the patterns, rhythms and cycles in both climatic and biological phenomena are drastically disrupted, it becomes increasingly difficult to make predictions based on this short, stable interval of climate data. This is what William B. Gail calls the coming of “a new dark age”, in which our accumulated observations of Earth’s irreducibly complex conditions are increasingly rendered obsolete, revealing the limits of machine learning methods in environmental modeling (Gail).
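The past-predicts-future assumption can be sketched in a few lines. The model below is deliberately the simplest possible data-driven predictor (the historical average), and the data are synthetic, invented purely for illustration; the point is that any model fit to a stable historical record degrades once the underlying regime shifts.

```python
# Sketch of the "past is indicative of the future" assumption.
# All data are synthetic and illustrative.

def fit_mean(history):
    """The simplest data-driven model: predict the historical average."""
    return sum(history) / len(history)

def mean_abs_error(prediction, observations):
    """Average distance between a constant prediction and what occurred."""
    return sum(abs(prediction - o) for o in observations) / len(observations)

# A stable "historical" record, e.g. annual rainfall in arbitrary units
stable_past = [100, 98, 103, 101, 99, 102, 100, 97]
model = fit_mean(stable_past)  # 100.0

# Under stability the model generalizes well...
stable_future = [99, 101, 100]
print(round(mean_abs_error(model, stable_future), 2))   # 0.67

# ...but once the regime destabilizes, the same model fails badly
shifted_future = [125, 131, 138]
print(round(mean_abs_error(model, shifted_future), 2))  # 31.33
```

A more sophisticated learner would fit richer patterns, but the structural problem is the same: its accuracy is anchored to the interval of history it was trained on.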
We must acknowledge these limits, but perhaps more significantly we must acknowledge how deeply entrenched we are within a computational worldview that assumes the systematicity of environments and under-acknowledges the indeterminacy of environmental encounters. We should not assume that atmospheric gases and the species that encounter them can be adequately represented in a purely mechanistic, data-driven way any more than we should assume humans fit the behaviorist models proposed by researchers like B. F. Skinner in the 1930s (Skinner). Data and models are just one method in our toolkit for understanding environments and ecology. Yet the authority of science, amplified by computation, makes it easy to forget that there is always more going on.
What other methods do we have then to round out ecological thinking and add to the perspectives offered by science and technology? In what ways can we build a capacity for understanding environmental assemblages that can behave like systems, but are not systems themselves? How can we break free from the reductive metaphors that bind technological thought?
What tactics do we have, then, to build a richer view of environmental assemblages? What tactics construct a more robust understanding of what Bruno Latour calls “matters of concern” (Latour)? As a start, we should look to practices that explore the unforeseen consequences and possibilities of technologies, applying them to reveal the edges of systems: where they break down, what they leave out and how they might be repurposed. These are practices of eccentric engineering, practices that blend scientific and conceptual languages and that reveal technologies as products of ideology in performative and public ways.
“AI for Earth Grant.” Microsoft, 2017, https://www.microsoft.com/en-us/research/academic-program/azure-research-award-ai-earth/
Anderson, Chris. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Wired 16.7 (2008).
Farman, Joseph C., Brian G. Gardiner, and Jonathan D. Shanklin. “Large losses of total ozone in Antarctica reveal seasonal ClOx/NOx interaction.” Nature 315.6016 (1985): 207-210.
Gail, William B. “A New Dark Age Looms.” New York Times, April 19 (2016), https://www.nytimes.com/2016/04/19/opinion/a-new-dark-age-looms.html
Haraway, Donna Jeanne. When species meet. Vol. 224. U of Minnesota Press, 2008.
Latour, Bruno. “Why has critique run out of steam? From matters of fact to matters of concern.” Critical inquiry 30.2 (2004): 225-248.
Madison, M. “Potatoes Made of Oil: Eugene and Howard Odum and the Origins and Limits of American Agroecology.” Environment and History, 3(2),(1997): 209-238.
Scott, James C. Seeing like a state: How certain schemes to improve the human condition have failed. Yale University Press, 1998.
Simpson, I. R., and P. D. Jones. “Analysis of UK precipitation extremes derived from Met Office gridded data.” International Journal of Climatology 34.7 (2014): 2438-2449.
Skinner, Burrhus Frederic. The behavior of organisms: An experimental analysis. BF Skinner Foundation, 1990.
Temple, James. “Harvard Scientists Moving Ahead on Plans for Atmospheric Geoengineering Experiments.” MIT Technology Review March 24, 2017, https://www.technologyreview.com/s/603974/harvard-scientists-moving-ahead-on-plans-for-atmospheric-geoengineering-experiments/
Tsing, Anna Lowenhaupt. The mushroom at the end of the world: On the possibility of life in capitalist ruins. Princeton University Press, 2015.
Vitello, Paul. “Joseph Farman, 82, Is Dead; Discovered Ozone Hole.” New York Times, May 18, 2013.