* This text is an excerpt of ongoing research carried out by Joana Moll and Vladan Joler on the material impact of surveillance capitalism.
Our so-called networked society has so far failed to transpose the logic of interconnectedness into our daily lives. Citizens are becoming increasingly machine-like and dependent on data, threatening the connection between humans and their (life-giving) natural habitats. Although most of our daily transactions are carried out through electronic devices, we know very little about the apparatus that facilitates such interactions – in other words, about the factory that lies beyond the interface. The Internet is the biggest “thing” that humanity has ever built. Its massive infrastructure is composed of billions of computers and thousands of kilometers of submarine and inland cables. This immense infrastructure rests on the shoulders of invaluable supporting technologies that go largely unnoticed by its audiences – namely human labour, intangible legions of algorithms, and a vast consumption of natural resources. In 2008, the Internet was already responsible for 2% of global CO2 emissions, exceeding those of the entire aviation industry1. The number of users and network connections has increased at a whopping pace ever since. In 2015, the Internet registered 966 exabytes of IP traffic (1,037,234,601,984 GB) and was expected to reach 1,579.2 exabytes by 20182.
Yet despite the growing number of Internet users and information flows, the material reality of the Internet, and of the surveillance economy behind it, remains blurred in the social imagination. It seems that the more interactions and information are generated over the Internet, the more invisible it becomes to its users.
Dig more coal, the PCs are coming
The Carboniferous is a geological period that lasted from about 359.2 to 299 million years ago3. “Carboniferous” means “coal-bearing”, from the Latin words carbō (“coal”) and ferō (“I bear, I carry”)4. This period is known for producing much of the coal that is consumed today. Coal is essential to humankind, as it is the main resource used to produce electricity worldwide. According to the World Coal Association, coal-fired plants currently serve 41% of global electricity needs5 and are responsible for 44% of global CO2 emissions6. In countries like Mongolia, South Africa, Poland and China, coal fuels more than 75% of electricity generation, while the U.S., China and India are the top producers and consumers of coal.
Coal-fired energy is not just responsible for powering machines; coal is also essential to the production of our devices – for example, the 206.6 million tablets shipped from China in 20157, the 2,292.45 million mobile phones manufactured in China between October 2015 and October 20168, and the 238.5 million computers sold globally in 20159. Moreover, more than 80% of the energy that a computer consumes during its lifespan is used in its production process10.
“Coal Fired Computers”, 2010. YoHa. Source: Jamie Woodley.
Machines Connecting Humans
Our society is framed by our main structures and the activities that we practice within them, facilitated by a vast network of interconnected machines that process information. In December 2017, an estimated 20 billion devices were connected to the Internet, a number expected to reach 30 billion by 202011. According to a recent study published by Cisco, in 2020 it would take more than 5 million years to watch the amount of video that will cross global IP networks each month – a rate of a million minutes of video per second. Less than half of this web traffic will be attributable to humans: a massive 56% of web traffic is already generated by bots, impersonators, hacking tools, scrapers and spammers12.
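The Cisco projection above can be sanity-checked with back-of-the-envelope arithmetic – a sketch using round figures from the text, not Cisco's own methodology:

```python
# Sanity check of the Cisco projection: at one million minutes of video
# crossing global IP networks per second, how many years of viewing does
# a single month of traffic contain?
VIDEO_MINUTES_PER_SECOND = 1_000_000
SECONDS_PER_MONTH = 60 * 60 * 24 * 30   # assuming a 30-day month
MINUTES_PER_YEAR = 60 * 24 * 365

video_minutes_per_month = VIDEO_MINUTES_PER_SECOND * SECONDS_PER_MONTH
years_to_watch = video_minutes_per_month / MINUTES_PER_YEAR

print(f"{years_to_watch:,.0f} years of video per month of traffic")
```

The result comes out close to the “more than 5 million years” figure quoted above.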
Daily distribution of minutes spent in front of screens in 2014. Source: http://bit.ly/2idCKwR
Infrastructures Beyond the Interface
The Internet is far from being a purely immaterial entity like a cloud; rather, it is an extremely complex physical structure composed of a massive number of actors that have a direct and deep impact on every aspect of our daily lives. Despite its crucial role in many aspects of our society, the material and computational architectures that allow the Internet to exist are widely ignored by most of its users. Our interactions are mediated by electronic devices and, in turn, by their underlying physical and graphical interfaces – often overly simplified in the process of building bridges of communication between human and machine – which translate desired actions into semiotic representations of output, or into triggers for understanding possibilities for input. As stated, the Internet is the biggest infrastructure that humanity has ever built, yet its materiality remains mostly hidden. The number of things needed to operate this giant entity seems endless; in fact, it is practically impossible to trace a precise map capable of visualizing all the “things” that allow the Internet to exist. Nevertheless, in this section we will try to reveal the complexity of some of its crucial infrastructures and processes, along with their environmental impacts.
Simply put, the Internet is a global web of interconnected computers. This interconnectedness, or cyberspace, allows each computer to send and receive information from any other computer connected to the network, regardless of its physical location.
Cyberspace’s history is not only a recent one: it has a long antecedence, with roots in the development of the telegraph and the telephone in the 19th century. These technologies were the first to connect distant places and allow the instant communication of data13.
Map of the Telegraph stations in the United States, the Canadas & Nova Scotia created in 1853. Source: http://bit.ly/2h2mulm
The early origins of the modern Internet date back to 1969, when a research project called Arpanet was started, funded by the Advanced Research Projects Agency, a branch of the US military, and initially connecting the University of Utah and three research centers based in California. Arpanet was built as an experiment to test a new technology called “packet switching”, which cuts information into smaller segments in order to transmit them efficiently across the network14.
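Packet switching can be illustrated with a toy sketch – hypothetical code for illustration only, not the actual Arpanet protocol: a message is cut into numbered segments that may travel and arrive independently, then get reassembled in order at the destination.

```python
import random

# Toy packet switching: cut a message into fixed-size, numbered packets,
# let them arrive in any order, and reassemble them by sequence number.

def packetize(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Cut a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Restore the original message by sorting on sequence numbers."""
    return "".join(payload for _, payload in sorted(packets))

message = "LOGIN"          # the first Arpanet session famously crashed at "LO"
packets = packetize(message, size=2)
random.shuffle(packets)    # simulate out-of-order arrival
assert reassemble(packets) == message
```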
Record of the first message sent on the Internet on October 29th 1969 at 22:30 from Boelter Hall 3420 at UCLA. Source: http://bit.ly/2lWDmYm
Arpanet expanded greatly in the following years, going international in 1973 when it connected London and Norway to the several nodes already established in the US. Arpanet became what we now know as the Internet in 1984, when the military realized that it could not manage the network if it continued to grow, and decided that Arpanet had to be reorganized as ‘a decentralized “network of networks”’15.
The Internet’s Spine
The Internet backbone is often described as the spine of the Internet: a conglomerate of high-speed, long-distance data transmission lines that provide connectivity all over the world through inland and submarine cables. The first backbone was built in the US in 1992 and has expanded greatly ever since, conquering the global territory and especially the undersea world.
Machinery on the U.S.S. Niagara, provided for the second attempt to lay the first transatlantic communication cable. Source: http://bit.ly/2hTRiUM
The first submarine communication cable, a telegraph cable, was completed in 1858, connecting western Ireland to eastern Newfoundland in Canada16. The first transatlantic communication took place on August 16th of that very same year, when Queen Victoria sent a telegram to congratulate US president James Buchanan on his election win. The message took seventeen hours to transmit, at a rate of two minutes per character17. The cable was destroyed a couple of weeks later when Wildman Whitehouse, an English surgeon and ‘electrical experimenter by avocation’18, applied too much voltage to it in an attempt to increase its speed and burnt it out.
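The cited figures also imply how short that first telegram must have been – a quick inference from the numbers above, not a documented character count:

```python
# At two minutes per character, a seventeen-hour transmission implies
# a message of only a few hundred characters.
HOURS = 17
MINUTES_PER_CHARACTER = 2

characters = HOURS * 60 / MINUTES_PER_CHARACTER
print(f"≈ {characters:.0f} characters")  # 510
```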
Wildman Whitehouse, the first human to destroy a transatlantic communication cable. Source: Wikipedia.
By January 2017, the backbone was composed of approximately 300 submarine and inland cables, covering a total of 885,000 km19. SEA-ME-WE 3, finished in the year 2000, is considered the longest underwater communication cable in the world20, with a length of 39,000 kilometers and 39 landing points across Europe, Africa and Asia.
Map of the global network of submarine cables. Source: http://bit.ly/2hmEB5Q
Submarine communication cables enable massive streams of data to be exchanged in an incredibly short amount of time: in June 2016 the global average connection speed was 6 Mbit/s21, compared with 13.7 kbit/s in 199622. Nevertheless, efficiency (quantified by speed of performance) carries a huge environmental toll, and undersea cables are far from harmless to the undersea habitats that they colonize. The frequent transmission losses of submarine power cables generate a much higher environmental footprint than the cable’s entire production chain, including the extraction of raw materials, manufacturing and transportation. In addition, noise emission, heat dissipation, electromagnetic fields, contamination and disturbance have been identified as five critical potential environmental issues arising from the installation, operation and maintenance of submarine cables23.
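To put the two average connection speeds cited above into perspective, a rough calculation (idealized line rates, ignoring protocol overhead) shows how long moving one gigabyte would take in each era:

```python
# Idealized time to transfer one gigabyte at the two average connection
# speeds cited above (line rate only, no protocol overhead).

def transfer_seconds(size_bytes: float, bits_per_second: float) -> float:
    """Time to move size_bytes at a given line rate, in seconds."""
    return size_bytes * 8 / bits_per_second

ONE_GB = 1_000_000_000  # decimal gigabyte

t_2016 = transfer_seconds(ONE_GB, 6_000_000)  # 6 Mbit/s  -> ~22 minutes
t_1996 = transfer_seconds(ONE_GB, 13_700)     # 13.7 kbit/s -> ~6.8 days

print(f"2016: {t_2016 / 60:.0f} minutes, 1996: {t_1996 / 86400:.1f} days")
```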
Sketch of migratory behaviour influenced by submarine cables. Source: Impacts of submarine cables on the marine environment – A literature review.
Noise potentially has dramatic effects on animals that rely on sound to communicate. Whales, for instance, use sound to navigate, monitor their surroundings and talk to one another across hundreds of kilometers of water; whale sonar allows them to acquire food, travel safely by confusing predators and avoid geographical obstacles, facilitating migrations to safe breeding areas. A large number of submarine species, on the other hand, rely on the Earth’s magnetic field to orient themselves. Magnetic fields generated by cables can therefore affect the orientation of marine fish and mammals during their migrations, or even redirect a migration24, with devastating effects on the survival of several species. In February 2017, a total of fifty-six dolphins and whales were found dead on the coasts of Ireland, the highest number registered to date25.
Two dead sperm whales on Skegness beach in 2016. Source: © knsnews.co.uk
As we have already seen, submarine cables are in charge of transporting millions of megabytes across the globe – but where does data “live”? Typically, data is stored in so-called data centers: large buildings or industrial warehouses that contain thousands, sometimes millions, of interconnected microcomputers (servers). For example, every time we upload a picture to Facebook, Twitter or Instagram, this information is sent to and stored on one or more servers in one of these company-run data center facilities, spread all over the world.
A study carried out in 2014 estimated that the total number of data centers around the globe would reach 8.6 million by 2017, although this number is likely to decrease as small data centers move into mega-facilities run by big IT corporations. However, while the number of data centers is expected to shrink, the floor space dedicated to data storage will continue to grow, from 1.58 billion square feet in 2013 to 1.94 billion in 201826.
Undersea cables landing port in a data center facility in Lower Manhattan, one of the most heavily guarded hubs of the Internet. Source: Peter Garritano.
Data centers have evolved greatly throughout history, responding to the need to keep, maintain and access large amounts of information quickly and efficiently. Prior to 1960, a single computer would occupy an entire room27 and was mostly used by government agencies. As technology evolved, computers became smaller and more widely available for commercial and domestic uses. Early data centers started to document disaster recovery plans28 in 1973, but it wasn’t until the dot-com bubble of the 1990s that the use of these giant data pantries exploded, as many companies needed fast, constant and reliable Internet connectivity.
CyberBunker is a data center built in a decommissioned NATO above-ground nuclear bunker in The Netherlands that is designed to operate for over 10 years without outside contact. Source: http://www.cyberbunker.com/web/gallery2.php
According to Marcus Hurst, data centers are ‘the second most power-hungry elements of the internet, after devices’29. On average, one data center consumes as much energy as 25,000 homes30. In 2016, data centers worldwide consumed around 416.2 terawatt-hours (TWh) of electricity, more than the individual consumption of 182 countries. With the increase in data production, this figure is doubling every year. A recent study carried out in Japan states that, at this rate, Japanese data centers will consume the country’s entire electricity production by 203031.
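Taking the essay's figures at face value (416.2 TWh consumed in 2016, demand doubling every year), a simple exponential projection illustrates how quickly such growth compounds – an illustrative sketch under those stated assumptions, not a forecast:

```python
# Exponential projection from the 2016 baseline of 416.2 TWh, under the
# text's claim that data center consumption doubles every year.
BASE_YEAR, BASE_TWH = 2016, 416.2

def projected_twh(year: int, doubling_years: float = 1.0) -> float:
    """Project consumption by compounding doublings from the baseline."""
    return BASE_TWH * 2 ** ((year - BASE_YEAR) / doubling_years)

print(f"2020: {projected_twh(2020):,.1f} TWh")  # 16x the 2016 figure
```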
In 2012, Google claimed to use just 0.01% of global electricity production, equivalent to the amount of energy used by Turkey in the same year. Likewise, Facebook used the same amount of energy as Burkina Faso in 201332.
Although this energy could come from renewable sources, this is not usually the case. As Greenpeace stated in its 2012 report “How Clean is Your Cloud?”: ‘70% of the 400,000 mobile phone antennas in India don’t have access to reliable electricity sources, relying on diesel-powered generators to make up for the inadequate power supply. The big data centers in Western countries also rely on back-up diesel generators that kick into action in the event of power cuts’. For example, Amazon Web Services (AWS) data centers use over 6.5 million MWh of power per year, enough to power 600,000 American households, 77% of it dirty and non-renewable energy33.
Clean Energy Index study developed by Greenpeace. Source: http://bit.ly/1hmLMhs
Most of the energy spent in data centers is not consumed by computing tasks but by the fans and chillers used to cool down the chips. Recently, the IT giants have made many efforts to reduce the environmental impact of their operations, and most of these companies are developing plans to run their facilities on renewable sources. There have also been several experiments in creatively recycling the residual heat of data center operations, such as turning it into electricity or using it to heat water. Such approaches are fundamental: by some estimates, the waste heat of a 10-megawatt data center could warm about 700 homes34.
A rack of University of Notre Dame servers (at rear) heats an enclosed botanical garden at the South Bend Conservatory in South Bend, Indiana. Air drawn from outdoors cools the computers; hot air is released into the greenhouse. The servers are connected to the university’s main computing cluster and are given more processing tasks if higher temperatures are needed. Source: https://www.technologyreview.com/s/425858/greenhouse-effect-5-ideas-for-reusing-data-centers-waste-heat/
Data flows and CO2
So far we have described the physical infrastructures and data architectures that operate the Internet, along with a few of their environmental impacts. But how much does data actually pollute? As we have already seen, the complex set of actors involved in the configuration and operation of the Internet makes it impossible to determine its exact CO2 emissions, so the figures presented here are approximate. A paper published in February 2008 estimated that it takes 13 kWh to transmit 1 GB of information35. On average, the production of 1 kWh emits 544 g of CO236; thus, every time we upload or download 1 GB of data we generate around 7,072 g of CO2, or roughly 7 g per MB. If the transmission is carried out over 3G or 4G networks, the emissions may increase four- to five-fold, to about 35 g of CO2 per MB37.
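The arithmetic can be made explicit. A minimal sketch using the two figures cited above (13 kWh per GB transmitted, 544 g of CO2 per kWh produced):

```python
# CO2 cost of data transfer, from the two figures cited above:
# 13 kWh to transmit 1 GB, and 544 g of CO2 emitted per kWh produced.
KWH_PER_GB = 13.0
CO2_G_PER_KWH = 544.0

def co2_grams(megabytes: float) -> float:
    """Approximate grams of CO2 emitted to transmit `megabytes` of data."""
    return megabytes / 1000 * KWH_PER_GB * CO2_G_PER_KWH

print(f"1 MB -> {co2_grams(1):.3f} g CO2")    # ≈ 7.072 g
print(f"1 GB -> {co2_grams(1000):.0f} g CO2")  # ≈ 7072 g
```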
Avatar (2009) by Michael Saup. Source: http://www.z-n-e.info/?root=2&sub=0&id=260&pic=1&lang=en
In 2009, Michael Saup, a German artist based in Berlin, calculated the amount of coal required to power one million YouTube views of the trailer for the film “Avatar”. The final tally of coal was represented as a cube measuring three meters on each side and weighing 37 tons. By June 2010, the trailer’s views had exceeded 14.5 million, enlarging the cube to a side length of seven meters and a weight of 540 tons38. According to Saup’s calculations, streaming the 3.39-minute Avatar trailer one million times on YouTube required 1,235,000 gigabytes of data and 49,942 kWh of energy, represented as 54 tons of CO2 emissions39.
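Dividing Saup's totals by the one million views gives the approximate footprint of a single stream – simple division of the cited figures, not Saup's own method:

```python
# Per-view breakdown of Saup's Avatar-trailer totals as cited above:
# 1,235,000 GB streamed over one million views, 49,942 kWh, 54 t of CO2.
VIEWS = 1_000_000
TOTAL_GB = 1_235_000
TOTAL_KWH = 49_942
TOTAL_CO2_TONS = 54

gb_per_view = TOTAL_GB / VIEWS                       # ≈ 1.235 GB per stream
kwh_per_view = TOTAL_KWH / VIEWS                     # ≈ 0.05 kWh per stream
co2_g_per_view = TOTAL_CO2_TONS * 1_000_000 / VIEWS  # ≈ 54 g per stream

print(f"per view: {gb_per_view:.3f} GB, {kwh_per_view:.3f} kWh, "
      f"{co2_g_per_view:.0f} g CO2")
```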
Filtering and Diluting Tangible Realities
Our widespread techno-ecological habitat is expansive, yet it is only accessible through interfaces. Interfaces play a key role in the configuration and functioning of surveillance capitalism. Despite their crucial role within our society, interfaces have come to be perceived as something natural and neutral – mostly invisible structures that dissolve into our daily landscapes.
Typically, an interface is an entity that allows a human to efficiently operate a machine. Thus, interfaces are in charge of translating machines’ underlying functional logics to “end users”, effectively teaching humans how to operate and think like machines.
The heavy computerization of the workplace that took place throughout the 1990s gave birth, among other things, to a new conceptual design discipline that Donald Norman named user experience design (UX). UX design “is the process of enhancing user satisfaction with a product by improving the usability, accessibility, and pleasure provided in the interaction with the product”40. Bill Gates once put it brilliantly: “power in the digital age is about making things easy”. While attractive in nature and productive in some cases, ease of use unfolds as a core strategy of the surveillance capitalism machinery. We attribute the quality of easiness to something that is cheap (or free), fast, reliable, accessible and free of negative consequences or guilt. The more easily our access to, and interactions within, the surveillance capitalism conglomerate are designed, the more information is produced and accumulated in the hands of global elites. Easiness can therefore act as a key offensive tactic, silently generating asymmetries within power structures.
Sketchpad, the first program to use a graphical user interface. Source: http://digital-archaeology.org/bitworld-the-creative-history-of-computers/
According to an article published in 2013, 40% of the Internet’s total carbon footprint may be attributed to the design of websites41. In March 2017, the average weight of a web page was 2.5 MB, almost 3.5 times the average size of a page in 2010. This rapid increase is mostly attributable to the images and videos displayed on websites: while in 2010 the images on an average page weighed 430 KB (and videos were practically nonexistent), in 2017 embedded images weighed 1,664 KB and videos 199 KB. Likewise, the sizes of stylesheets, scripts, fonts and other files have also tripled in the last seven years42.
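Combining the March 2017 average page weight with the transmission figures cited earlier (13 kWh per GB, 544 g CO2 per kWh) yields a rough CO2 cost per page load – an illustrative estimate, not a measurement:

```python
# Rough CO2 cost of loading one average web page, combining the March 2017
# average page weight with the transmission figures cited earlier.
AVG_PAGE_MB = 2.5
KWH_PER_GB = 13.0
CO2_G_PER_KWH = 544.0

co2_per_load_g = AVG_PAGE_MB / 1000 * KWH_PER_GB * CO2_G_PER_KWH
print(f"≈ {co2_per_load_g:.1f} g CO2 per average page load")  # ≈ 17.7 g
```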
The Google search engine website weighed 43 KB in 1997; on March 21st 2017, it weighed 1,241 KB.
The numerous domestic interfaces that we use in our everyday lives play an essential role in diluting the many tangible realities of our networked society. This is particularly true of the tangible and intangible infrastructures that construct the Internet, and of their underlying material impacts. In my opinion, interfaces’ tendency to blur the materiality behind their own operations directly dilutes the user control they claim to enable. The result is a comfortable limbo in which the user can interact “free” of guilt, thought and reflection. In that respect, we can argue that the interface may become a critical agent in the generation of a culture of irresponsibility.
I firmly believe that, when we operate electronic devices, interfaces can play a key role in raising broad public awareness of the relationships between our actions and their material impact on the physical world. By designing mechanisms capable of triggering thoughts and actions, interfaces can stimulate and re-appropriate subjectivity. Interfaces hold not only the power but also the responsibility to generate critical thought about the true nature of technology, and to help us imagine alternative techno-paradigms that take greater responsibility for our environmental and human conditions.
1 http://lab.cccb.org/en/how-polluting-is-the-internet/ (retrieved February 2014).
3 http://www.ucmp.berkeley.edu/carboniferous/carboniferous.php (retrieved January 2017).
5 https://www.worldcoal.org/coal/uses-coal/coal-electricity (retrieved January 2017).
7 https://www.statista.com/statistics/272070/global-tablet-shipments-by-quarter/ (retrieved December 2017).
8 https://www.statista.com/statistics/226434/production-of-cell-phones-in-china-by-month/ (retrieved December 2017).
9 http://www.statisticbrain.com/computer-sales-statistics/ (retrieved December 2017).
11 https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/ (retrieved December 2017).
12 https://hostingfacts.com/internet-facts-stats-2016/ (retrieved December 2017).
13 Dodge, M., Kitchin, R. Atlas of Cyberspace. London: Pearson Education Ltd, 2001.
16 https://en.wikipedia.org/wiki/Transatlantic_telegraph_cable (retrieved June 2015).
17 https://en.wikipedia.org/wiki/Transatlantic_telegraph_cable (retrieved June 2015).
19 https://www.sciencealert.com/watch-this-map-shows-the-885-000-km-of-internet-cable-hidden-under-the-ocean (retrieved November 2016).
23 Worzyk, T. Submarine Power Cables: Design, Installation, Repair, Environmental Aspects. Page 251. Springer, Berlin 2009.
24 Meißner, K., Schabelon, H., Bellebaum, J., Sordyl, H. Impacts of submarine cables on the marine environment – A literature review. Neu Broderstorf: Institute of Applied Ecology Ltd, 2006.
27 In 1945 the US army developed a large computer called ENIAC that weighed 30 tons, occupied 1,800 sq ft of floor space, needed 6 full-time technicians to keep it running and carried out 5000 operations per second.
28 ‘Disaster recovery involves keeping all essential aspects of a business functioning despite significant disruptive events.’
29 http://lab.cccb.org/en/how-polluting-is-the-internet/ (retrieved February 2014).
30 Kindler, “Revolutionizing Data Center Energy Efficiency”, McKinsey, July 2009.
31 http://www.tomshardware.com/reviews/intel-xeon-e5-2600-v4-broadwell-ep,4514-8.html (retrieved November 2016).
32 https://www.theatlantic.com/technology/archive/2015/12/there-are-no-clean-clouds/420744/ (retrieved January 2016).
33 http://www.greenpeace.org/usa/wp-content/uploads/legacy/Global/usa/planet3/PDFs/2015ClickingClean.pdf (retrieved November 2016).
35 http://evanmills.lbl.gov/commentary/docs/carbonemissions.pdf (retrieved November 2016).
37 James Christie. (2013) Sustainable Web Design. https://alistapart.com/article/sustainable-web-design (retrieved December 2013).
39 http://kunst.1001suns.com/XOR/avatar/avatar_calc001.pdf (retrieved February 2015).
40 https://en.wikipedia.org/wiki/User_experience_design (retrieved January 2017).
41 James Christie. (2013) Sustainable Web Design. https://alistapart.com/article/sustainable-web-design (retrieved December 2013).