Tega Brain – The Environment is not a System

How do computational technologies shape ecological thought? This paper is concerned with developing a richer understanding of how technologies influence ecological worldviews, drawing on examples from the histories of both art and science. Technologies profoundly structure our understanding of complex systems, informing the analogies and metaphors we think with. For example, it is commonplace to say that we live in ecosystems, depend on food chains, and that our brains store, compute and recall information. Just as Earth’s environments have been materially refigured by infrastructures for energy, resource extraction and communications, our language and epistemologies have also been shaped by these systems. We take for granted that the environment is a system, a notion propped up by the dominance of computation and the narratives that surround these technologies. However, as I will argue, environments and their ecologies are not systems, and this pervasive idea leaves much of the world out.

Ecosystem diagram by Howard Odum, 1960.
A prominent biologist of the 1960s, Howard Odum presented ecosystems using the symbolism and aesthetics of electric circuit diagrams (Madison).


As we scramble to understand and respond to a rapidly destabilizing environment characterized by human disturbance, climate change and extinction, we are simultaneously reveling in an unprecedented surplus of computing. It is therefore tempting (and lucrative) to make claims that neat technological fixes can address thorny existential problems, a modernist impulse that remains well and truly alive in projects like the smart city and even more dramatically in proposals for environmental interventions such as atmospheric engineering. 2018 will see the first atmospheric tests of geoengineering technologies carried out in the deserts of Arizona (Temple). However, these responses often fail to adequately acknowledge the social and economic factors that cannot be addressed by technology alone, and readily overlook the always incomplete nature of a data-driven perspective on the world. At a time when computing giants like Google and Microsoft are pushing to insert the logic of computation into the management of environments that can only ever be partially understood, it is hard to overstate the importance of tending to a deeper and more robust understanding of the relationship between computational technologies and ecological thinking.

If environments are not systems, then what are they?

Ecological thought refers to how we perceive Earth’s environments and understand the multitude of relations within them. I build upon Anna Tsing’s work, understanding environments to be open-ended assemblages of nonhumans, living and nonliving, entangled in ways of life.

“Ecologists turned to assemblages to get around the sometimes fixed and bounded connotations of ecological “community.” The question of how the varied species in a species assemblage influence each other—if at all—is never settled: some thwart (or eat) each other; others work together to make life possible; still others just happen to find themselves in the same place. Assemblages are open-ended gatherings. They allow us to ask about communal effects without assuming them.” (Tsing 54)

For Tsing, metaphors like system or community imply narratives of progress, overemphasize intentionality and chains of cause and effect, and obscure the role that precarity and indeterminacy play in the conditions of life. Both precarity, the state of being vulnerable to another, and indeterminacy, the unplanned or unpredictable nature of encounters, are downplayed when the world is viewed as a system.

I argue that these worldviews are intimately tied to our technological apparatus, which influences how we measure, value and manipulate environmental assemblages. When armed with a computer, the world becomes data and relations appear as systems. In this way, computational approaches like modeling and AI simultaneously reveal and obscure aspects of reality. Just this year Microsoft announced “AI for Earth”, a program that will put artificial intelligence in the hands of those who are trying to “monitor, model and manage the earth’s natural systems” (“AI for Earth Grant”). The company has pledged $50 million to the initiative, which gives environmental researchers access to Microsoft’s Azure cloud platform and its AI products. In the last decade Silicon Valley ideology has saturated a diverse range of fields from urban design to the justice system, and this announcement is imbued with a familiar mix of solutionism and teleology, this time promising to transform “the way we are currently managing complex environmental challenges” (“AI for Earth Grant”).

If we are to believe the hype, and seriously consider if and how these technologies can reshape our relationship with environments, we must thoroughly examine what they amplify and what they edit out. This is my starting point, and I consider several histories that highlight some characteristics of a computational view of the world. The paper will go on to explore how this view might be rounded out.

When models fail

The history of environmental management is riddled with embarrassing failures that stem from incorrect assumptions and the oversimplification of environmental assemblages. An early example comes from James Scott’s account of German forestry through the 18th and 19th centuries. Although this precedes the advent of contemporary computation, the German use of numerical and bureaucratic methodologies to understand and manage natural systems remains strongly resonant. Scott’s study of the ‘scientific forest’ or ‘fiscal forest’ shows how decisions regarding the management and replanting of commercial forests were driven by the goal of optimizing timber yield (Scott 11). As such, the data collected represented trees as resources, and the once biodiverse forests came to be reproduced almost as monocultures. The ecological entanglements of species in the forest were deemed external to a system designed to efficiently transform species into commodities. Modeling the forest as a timber factory oversimplified the reality of the relations and interdependencies among its species, and after a couple of generations yields began to drop, the forest ecosystem destabilized and it came close to collapse.

Scott’s scientific forest is an example of what Tsing calls “the plantation” (Tsing 56). The plantation is an extreme simplification of an environment, optimized to produce commodities, which often results in monoculture. As in Scott’s forest, the economic drivers of capitalism make crop yields the ultimate goal of an agricultural landscape, and this determines how the landscape is measured, modeled and manipulated. The landscape becomes a factory and its species become assets alienated from their lifeworlds, like workers who fulfill hits on Mechanical Turk with no bearing on each other or on what they produce. When the asset can no longer be extracted, the landscape becomes a ruin and disappears from view, deemed worthless (Tsing 31). Both Scott’s scientific forest and Tsing’s plantation are examples of the complex interplay of economics and numerical approaches to landscape management, processes that are now accelerated through contemporary computation. What they show us is that data collection and modeling practices are never neutral. Rather, they reflect what is deemed important or trivial in the eyes of the modeler and are therefore profoundly shaped by economics, hubris and culture.

Today we can collect orders of magnitude more data from environments to produce higher resolution models of the world. However, data collection and modeling practices remain shaped by often tacit assumptions about what is typical or atypical. The story of British geophysicist Joe Farman’s discovery of the ozone hole is a well-known example of the dangers of assuming that more data equals more reality. Farman maintained a single ground-based ozone sensor in the Antarctic in spite of the launch of NASA atmospheric monitoring satellites that collected vastly larger quantities of data (Vitello). When Farman’s data began to show a 40% drop in ozone levels, he assumed the sensor was damaged and replaced it, as NASA’s climate models had reported no such change. After years of study and careful checking, Farman published this alarming result in Nature, confirming the destruction of the ozone layer by human pollutants (Farman). How had NASA’s satellites missed such a marked change in ozone composition? One response from NASA suggests that their data processing software was programmed to discard readings that appeared to be outliers, thus ignoring the drastic changes that were occurring (Vitello). In this case, reality itself was an outlier and was assumed to be an error.
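NASA never published the routine in question, but the failure mode is easy to sketch. The Python fragment below, with purely invented, illustrative numbers, shows how a simple quality-control filter that rejects readings deviating too far from a historical baseline will silently discard a real and drastic change along with genuine sensor errors:

```python
import statistics

def quality_filter(readings, baseline, threshold=3.0):
    """Drop readings more than `threshold` standard deviations from the
    historical baseline mean, treating them as presumed instrument error."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [r for r in readings if abs(r - mean) <= threshold * stdev]

# Historical ozone column values (Dobson units, invented numbers):
baseline = [300, 310, 305, 295, 308, 302, 298, 306]

# New observations reflecting a genuine ~40% drop:
observed = [303, 180, 175, 182]

kept = quality_filter(observed, baseline)
# The anomalous-but-real readings are discarded; only the one
# "expected" value survives the filter.
print(kept)  # [303]
```

The point is not that filtering outliers is wrong, but that the filter encodes an assumption about what reality is allowed to look like.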

The limits of computation

What if there were no cap on the amount of data produced from the environment for analysis? What if computational technologies developed to a point where we could stop discarding outliers in datasets? Would the resultant models of the world solve the problems highlighted in these examples and produce a robust representation of reality, free of human assumption? This is the utopian narrative at the heart of big data and artificial intelligence initiatives like Microsoft’s AI for Earth. However, there are two reasons to maintain a reflexive view of what these technologies can and cannot do.

Firstly, at the center of a data-driven approach is the assumption that the past is indicative of the future. Big data has led to developments in machine learning, which describes new statistical approaches that derive models from patterns in large quantities of data without any knowledge of underlying structures, whether these concern language, environmental processes or genomics. As Chris Anderson succinctly observes in his 2008 Wired article “The End of Theory”: “Correlation is enough. We can stop looking for models” (Anderson). And yet climate change, more aptly described as climate destabilization, means that as environmental change accelerates, our ability to make accurate predictions from our existing data is diminished. At best, environmental datasets like precipitation records span 250 years, with most spanning a much shorter period (Simpson and Jones). From a geological point of view this is an absurdly small slice of time, and one in which Earth’s climate has been relatively stable. As the patterns, rhythms and cycles in both climatic and biological phenomena are drastically disrupted, it becomes increasingly difficult to make predictions based on this short, stable interval of climate data. This is what William B. Gail calls the coming of “a new dark age”, where our accumulated observations of Earth’s irreducibly complex conditions are increasingly rendered obsolete, revealing the limits of machine learning methods in environmental modeling (Gail).
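This limit can be made concrete with a toy sketch (all numbers fabricated). The fragment below fits a trend to a century of stable, invented “rainfall” data; because the record contains no signal of destabilization, a model derived purely from past patterns can only ever extrapolate business as usual:

```python
import statistics

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b * x."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# A century of fabricated annual rainfall (mm): stable around 1000,
# with a small periodic wiggle and no underlying trend.
years = list(range(1900, 2000))
rainfall = [1000 + 5 * ((y % 7) - 3) for y in years]

a, b = fit_line(years, rainfall)

# The fitted slope is essentially zero, so extrapolating to 2050
# simply projects the stable past onto a destabilized future.
prediction_2050 = a + b * 2050
```

No amount of additional data from the same stable interval would change this prediction; the model is blind to conditions it has never observed.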

We must acknowledge these limits, but perhaps more significantly we must acknowledge how deeply entrenched we are within a computational worldview that assumes the systemacity of environments and under-acknowledges the indeterminacy of environmental encounters. We should not assume that atmospheric gases and the species that encounter them can be adequately represented in a purely mechanistic, data-driven way, any more than we should assume humans fit the behaviorist models proposed by researchers like B. F. Skinner in the 1930s (Skinner). Data and models are just one method in our toolkit for understanding environments and ecology. Yet the authority of science, amplified by computation, makes it easy to forget that there is always more going on.

What other methods do we have, then, to round out ecological thinking and add to the perspectives offered by science and technology? In what ways can we build a capacity for understanding environmental assemblages that can behave like systems, but are not systems themselves? How can we break free from the binding, reductive metaphors of technological thought?

What tactics do we have, then, to build a richer view of environmental assemblages? What tactics construct a more robust understanding of what Bruno Latour calls “matters of concern” (Latour)? As a start, we should look to practices that explore the unforeseen consequences and possibilities of technologies, applying them to reveal the edges of systems: where they break down, what they leave out and how they might be repurposed. These are practices of eccentric engineering, practices that blend scientific and conceptual languages and that reveal technologies as products of ideology in performative and public ways.

References

“AI for Earth Grant.” Microsoft (2017), https://www.microsoft.com/en-us/research/academic-program/azure-research-award-ai-earth/

Anderson, Chris. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Wired 16.7 (2008).

Farman, Joseph C., Brian G. Gardiner, and Jonathan D. Shanklin. “Large losses of total ozone in Antarctica reveal seasonal ClOx/NOx interaction.” Nature 315.6016 (1985): 207-210.

Gail, William B. “A New Dark Age Looms.” New York Times, April 19 (2016), https://www.nytimes.com/2016/04/19/opinion/a-new-dark-age-looms.html

Haraway, Donna Jeanne. When species meet. Vol. 224. U of Minnesota Press, 2008.

Latour, Bruno. “Why has critique run out of steam? From matters of fact to matters of concern.” Critical inquiry 30.2 (2004): 225-248.

Madison, M. “Potatoes Made of Oil: Eugene and Howard Odum and the Origins and Limits of American Agroecology.” Environment and History 3.2 (1997): 209-238.

Scott, James C. Seeing like a state: How certain schemes to improve the human condition have failed. Yale University Press, 1998.

Simpson, I. R., and P. D. Jones. “Analysis of UK precipitation extremes derived from Met Office gridded data.” International Journal of Climatology 34.7 (2014): 2438-2449.

Skinner, Burrhus Frederic. The behavior of organisms: An experimental analysis. BF Skinner Foundation, 1990.

Temple, James. “Harvard Scientists Moving Ahead on Plans for Atmospheric Geoengineering Experiments.” MIT Technology Review March 24, 2017, https://www.technologyreview.com/s/603974/harvard-scientists-moving-ahead-on-plans-for-atmospheric-geoengineering-experiments/

Tsing, Anna Lowenhaupt. The mushroom at the end of the world: On the possibility of life in capitalist ruins. Princeton University Press, 2015.

Vitello, Paul. “Joseph Farman, 82, Is Dead; Discovered Ozone Hole.” New York Times, May 18, 2013.


3 Comments

  1. Hi Tega,

    thanks for your great ideas and critique of systems, which make me think about my own usage of the term “systems” when talking about database systems.

    Recently, in relation to so-called artificial intelligence, I came across another good example of the failure of models, namely neuroscience models being applied to the human brain. Eric Jonas and Konrad Kording evaluated standard models of neuroscience against an old microprocessor (where one can expect to determine the exact input/output relation). It was expected that the neuroscience models could explain what the computer was actually computing, but surprisingly they didn’t. Jonas and Kording question whether these neuroscience models should then be used for researching the functioning of the human brain. Eric Jonas & Konrad Paul Kording, http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1005268

    Also I liked your observation about how the ozone hole beat the models and might have gone unrecognized …

    What I would be interested in is seeing more (real-world) examples of how the usage of “assemblages” instead of “systems” makes a difference to the discourse. And I would like to discuss whether machinic computation (using mathematics and logic for symbol processing) is able to grasp anything at all. In the end, it is humans who ascribe meaning to whatever is computed.

    best
    Francis


  2. Hi Tega,

    Thanks for the text, it was really inspiring. I didn’t know that geoengineering was so “advanced” already; thanks for bringing up this issue.

    I think that it is essential to question the tools and methods that we are currently using to read and quantify our life-giving habitats. Probably you are familiar with the work of Catherine D’Ignazio on feminist mapping; in case not, I think it would give you great insights into this topic.

    Really looking forward to keep on discussing this!


  3. Hi Tega,

    Really good piece… very much enjoyed reading it. Not sure I can offer particularly productive comments at this stage, but in any case I look forward to discussing more.

    What it brought to mind is this bit from Latour, which you probably already know:

    “But there is another more philosophical advantage I wish to underline in conclusion: the notion of critical zone, because of the number of disciplines involved in monitoring the chunk of land they explore together, help resists the temptation to think that we are dealing with a “unified system”. The great geopolitical fallacy of political ecology is that the Earth is a whole where “everything is connected” and if only we could bring together the boxes representing the “natural” elements with the “social” ones, we would have unified the question and could zoom in from the larger scales to the smaller ones. The problem of such a view is that it imports a technical metaphor (mechanical or cybernetic) that implies (most of the time surreptitiously) the hidden presence of an engineer at work who has devised the whole as a system of which we see only the parts. But there is no engineer at work and thus the relations between elements cannot be that of the parts with a whole. Hence a scientific qua political puzzle that should not be solved too quickly by jumping at the idea that we are dealing with a system. Thus, it is much easier to realize the necessity of composing the common world because of the sheer difficulty of gathering the various ingredients that make up a critical zone.”

    from here:
    http://www.bruno-latour.fr/sites/default/files/P-169-GAILLARDET-pdf.pdf

    So of course that’s exactly the problem you address. One question could be whether you use the concept of the “critical zone” that he refers to? More generally, because Tsing and Latour appear to count among your conceptual allies, I guess one question could be whether they are the right allies to have when the actual disastrous effects of climate destabilisation start manifesting themselves – and this time has already started in some regions of the world – in terms of developing practical politics to respond to this condition. Is it still the time to try “building a richer view of environmental assemblages”, which they certainly can help with? Or are we already in emergency/damage-control mode – where the question is: how to get out of here, including how to “compute” our way out of here? Not that I think the latter is the way to go, at all, obviously, and of course it all depends on how you define the “we” here, because there’s nothing more hierarchical than an evacuation / rescue operation – but I think the problem you pose, as to how the environment should be regarded and understood today, is crucially linked to where you position yourself within this problem, on which side you are of the increasingly lethal lines that it traces across the planet. If this makes sense?

