The algorithmic has increasingly suffused laboring bodies, domestic interiors, and urban fabrics. For a platform like Uber this entails new forms of algorithmic governance that usher drivers to particular locations in the city at particular times of the day, and draw out a specific type of performance understood as ‘best practice.’ For the ‘always listening’ digital assistant that is Amazon Alexa, this means filling the traditionally private space of the kitchen or living room with an invisible new zone of capture. And within a system like Airbnb, the algorithmic indexing of listings exerts unseen pressures on architectures—rearranging apartments, transforming homes into hotels and subtly reconstituting the wider geographies of the city itself. Alongside these consumer-facing examples are less visible but equally significant intrusions made at the enterprise or governmental levels. These come without focus-grouped product names, but determine teacher rankings, credit scores, loan approvals, parole sentences, and no-fly lists. More and more, the algorithmic permeates the processes and people around us, impinging upon society and culture in highly significant ways.
But this we already know. What is less clear is how this shaping is accomplished. How does the algorithmic invest bodies, enlist subjects, move matter, and coordinate relationships? In short, how does an algorithmic procedure attain and exert power?
The algorithmic is typically conflated with code, with a special form of language written by a programmer. If, the argument goes, the user or researcher could only read back this text, then all would be revealed. Mark Marino, to take just one example, states that “we can read and explicate code the way we might explicate a work of literature” (2006). But this text is typically proprietary, available only to employees or selected developers. The moment of enlightenment never arrives and the algorithmic becomes the oft-cited black box, an opaque object that can neither be examined nor intervened in. A new approach is needed. Indeed, the development of these systems has itself moved away from the monolithic application and adopted a micro-service approach. These are small, targeted services that do one thing and do it well—converting currency, logging miles, tracking ads. Hundreds of these services sit within a wider ecosystem, responding asynchronously to requests as they arrive.
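This architecture can be sketched in miniature. The following is an illustrative toy in Python, not any platform's actual code; the service names, rates and values are all hypothetical:

```python
# Toy sketch of a micro-service ecosystem: small, single-purpose
# services responding asynchronously to requests as they arrive.
# All service names, rates and values here are hypothetical.
import asyncio

async def convert_currency(amount: float, rate: float) -> float:
    """One service, one job: convert an amount at a given rate."""
    await asyncio.sleep(0)  # stand-in for network latency
    return round(amount * rate, 2)

async def log_miles(trip_id: str, miles: float) -> dict:
    """Another single-purpose service: record the miles of a trip."""
    await asyncio.sleep(0)
    return {"trip": trip_id, "miles": miles}

async def main():
    # No monolithic flow: requests are dispatched and gathered
    # concurrently across the ecosystem of services.
    return await asyncio.gather(
        convert_currency(20.0, 0.92),
        log_miles("trip-001", 5.3),
    )

fare, log = asyncio.run(main())
print(fare, log)  # 18.4 {'trip': 'trip-001', 'miles': 5.3}
```

The point is less the syntax than the topology: hundreds of such services, each trivially simple, compose into the opaque whole that resists being read as a single 'text.'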
So algorithmic objects can be understood as ecologies, a term which better encompasses this heterogeneous mix of cables and wire, bodies and vehicles, capital and code. Ecologies provide a way of “understanding the various scales and layers through which media are articulated together with politics, capitalism and nature, in which processes of media and technology cannot be detached from subjectivation” (Parikka and Goddard 2011, 1). These elements come together in various ways to carry out activity in the world. But if the world is promising, it is also unpredictable. Here, data becomes messy, subjects turn contentious, space can be antagonistic. Execution, Wendy Chun insists (2008, 304), is not simply a “perfunctory affair.” Nothing is guaranteed. Instead, any power must be incessantly negotiated. The forces exerted by the ‘merely’ technical operations of the algorithmic—storing, searching, indexing, presenting—must accumulate into meta-operations: encapsulating life, enlisting subjects, remaking space, and enchanting users. In focusing on these performances, we move away from secret codes and software to a set of observable and embodied operations that can be analysed.
These operations are political in that they determine the possible. Rather than parties and politicians, this is the politics of protocols (Galloway 2004) and parameters (Magee and Rossiter 2015, 76). Or to update Michel Foucault (1978, 94), one might say the politics of power relations now become immanent within the algorithmic. Take Alexa, for example. Amazon chose to construct this intelligent agent as a female assistant. In doing so, they tapped into a deep historical vein of gendered service embodied in the female voice, from present-day airport announcements back to Otis elevators and all the way back to Bell’s telephone operators. As Sadie Plant reminds us (1997, 126), this operator routing connections at the switchboard exemplified the role of the woman “poised as an interface between man and world.” The new breed of digital assistants for the smarthome (Microsoft Cortana, Apple Siri, and Google’s Project Majel) simply continues this lineage. The gendered telephone exchange thus establishes the precedent for the gendered Internet of Things exchange. Alexa-as-interface builds directly atop the older concept of woman-as-interface. In doing so, the gender politics of Silicon Valley are infused into the qualities and responses of ‘Alexa’ and in turn determine our range of possible activities and agencies.
Indeed, we can continue to follow this ecology, investigating the microprocessors that power the ‘cloud-based’ Alexa. These Intel chips are themselves based on years of research and development originating in Silicon Valley. As Pellow and Park demonstrate (2002, 114, 120), a particular workforce built these chips, one composed of Latino and Asian-American women, considered a more docile labor force by management; working with highly toxic chemicals day after day, they developed nausea, migraines, skin burns, cancers and long-term reproductive harm. Across times and spaces, a logic of extraction from the servile female remains constant. Thus Alexa’s ‘effortless’ experience is built atop historical and cultural operations that supplement its technicity in crucial ways. Infused into technicity, these parameters inform how we see ourselves and others.
And yet this power is never totalizing. A certain “grammar of operations” (Fuller 2005, 167) must be performed in order to map subjects and spaces, draw them into a functional sequence, and exhaust their productive potential. But in moving from whiteboard to world, the algorithmic must deal with a diverse set of materials—materials which can be antagonistic to its objectives. Operations are never guaranteed, but must be negotiated. What occurs when these operations are unsuccessful? To explore this, we can sketch out a failure of algorithmic power taking place within a particular ecology—Uber.
Uber as Algorithmic Failure
Uber’s worker starts life as a data-object. The object specifies the properties that represent the platform’s so-called Driver-Partner: name, city, rating, current status and so on. The rich life of the subject is thus mapped onto an internal informational schema, a process I call encapsulation. Within Uber’s inner world, every Driver-Partner is abstracted into a collection of variables or parameters. This abstraction is productive, constructing a generic object that can be tracked, messaged, rated and compared against other drivers. But to abstract is also to ignore—including these properties means excluding other possible understandings of identity: race, culture, class. The result is a generic driver, interchangeable with any other. Important complexities and contingencies are not encapsulated, leaking out of this strict envelope of subjectivity. So Uber’s understanding of the worker is universal, fungible—a driver is a driver. And this thin understanding recoils on the ride-share company in various ways.
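The encapsulation described here can be sketched as a data-object. The field names below mirror the properties mentioned above but are hypothetical illustrations, not Uber's actual schema:

```python
# Hypothetical sketch of encapsulation: the driver reduced to a
# fixed set of fields. These names are illustrative only, not
# Uber's actual internal schema.
from dataclasses import dataclass, asdict

@dataclass
class DriverPartner:
    name: str
    city: str
    rating: float
    status: str  # e.g. "online", "on_trip", "offline"

driver = DriverPartner(name="A. Driver", city="Sydney",
                       rating=4.8, status="online")

# The schema is exhaustive by design: race, culture, class and other
# dimensions of identity have no field, and so no existence, within
# the platform's inner world.
print(asdict(driver))
```

Whatever falls outside these few parameters is not merely deprioritized by the platform; it is unrepresentable within it.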
Every time a Rider requests a ride, a driver needs to be there. And they not only need to show up, but to perform a professional and timely service. In other words, Uber requires an enlistment of the worker towards a specific objective. This operation is made all the more difficult by the conditions of platform labor—conditions which insist on the autonomy of the remote worker, ruling out many of the standard employer/employee tactics. A cluster of operations—timed messaging, gamified missions, citywide campaigns, surge notifications—attempt to direct drivers to a particular understanding of ‘best practice’ labor conducted in particular places at particular times.
But enlistment can only operate on the understanding of the Driver that Uber has encapsulated—a universal everyman, a generic caricature. This explains why, for instance, Uber’s attempts to funnel workers into shift work have been largely ineffective, and why many drivers ignore mechanisms like Surge pricing altogether (Lee et al. 2015, 5). These ‘targeted communications’ largely miss their target and instead fall on an abstracted, algorithmically constructed subject that often fails to incorporate the complex and varied motivations unique to each worker. As Rosenblat and Hwang argue (2016, 4), the universal platform mistakenly sees the labor pool as monolithic, a “relatively equivalent mass”. A clear gap begins to emerge between the worker and Uber’s understanding of the worker. One byproduct is turnover—more than half of all Uber drivers quit within the first year.
For those who continue to drive, a particular subjectivity is desired. It is not enough to simply show up and perform a task. The task must include an affective element. For Uber this often entails a smile, a conversation, an offer of bottled water or mints. Technicity cannot code for this ‘affective labor’ (Hochschild 2003), because it must appear to be spontaneous, improvised, from the heart. The task must also be made algorithmically legible. Here, again, technicity reaches its limits—only so much information can be gleaned from the smartphone and its sensors. Instead, Uber seeks to enchant the worker, drawing out a particular subjectivity that accommodates itself to the algorithmic. This entails the worker “turning to face the algorithm” (Gillespie 2014), and attempting to mirror the desired response. In playing to strengths and overlooking weaknesses, the subject supplements ‘pure’ technicity in important ways, tuning activities for maximum recognition: views, stars, results.
But Uber’s ineffective encapsulation and enlistment instead often disenchants the worker. Disillusioned, drivers work to obfuscate rather than make legible, discovering ‘hacks’ and sharing them on forums. For example, one technique for resetting temporary bans is to log off immediately and then log back on. Here the gap between subject and Uber’s understanding takes the form of a temporal distinction—a difference between the smooth, cohesive time of the subject and the syncopated temporality of the platform. Far from being glitches or errors, these techniques rely on the very consistency of computation—logically working with its internal (and inevitably partial) understandings. Rather than ‘breaking’ the system, they are better understood as immanent to it, widening a fundamental gap that already exists, the gap between subjects and their algorithmically understood counterpart.
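The logic of such a hack can be modeled as a toy state machine. Everything below, from the decline threshold to the ban mechanism, is a hypothetical illustration, not Uber's actual implementation:

```python
# Toy model of the temporal gap: a platform that tracks driver
# state only per session. The threshold and ban logic are
# entirely hypothetical.
class Session:
    def __init__(self):
        self.declined = 0
        self.banned = False

    def decline_ride(self):
        self.declined += 1
        if self.declined >= 3:   # hypothetical threshold
            self.banned = True   # temporary timeout

class Platform:
    """The platform only 'sees' the current session."""
    def __init__(self):
        self.session = Session()

    def log_off_and_on(self):
        # A new session wipes the platform's partial memory;
        # the driver's own continuous time is untouched.
        self.session = Session()

platform = Platform()
for _ in range(3):
    platform.session.decline_ride()
assert platform.session.banned       # platform time: banned
platform.log_off_and_on()
assert not platform.session.banned   # the 'hack': ban reset
```

The hack does not break the system's logic but follows it exactly: the platform's syncopated, session-bound memory is internally consistent, and the driver simply exploits what it cannot remember.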
Today, power is conducted through the prism of the algorithmic. This power is never given or assumed, but must be incessantly performed through a set of operations. These technical operations—instantiating objects and indexing data—must coalesce into meta-operations: creating subjectivities, forming relations, and directing work. In carrying out encapsulation, enlistment and enchantment, algorithmic platforms exert significant force on subjects. Yet the opposite also applies—when this grammar of operations is partial or unsuccessful, traction is not attained and a gap between subject and referent emerges. As each new technique is layered on top, this gap only widens. In this sense, the algorithmic is often constructed, not unlike finance, as “long chains of increasingly speculative instruments that all rest on the alleged stability of that first step” (Sassen 2014, 118). Instrumentalizing this discrepancy suggests more intentional and effective interventions in the algorithmic regimes that increasingly shape our everyday.
Luke Munn uses the body and code, objects and performances to activate relationships and responses. His projects have featured in the Kunsten Museum of Modern Art, the Centre de Cultura Contemporània de Barcelona, Fold Gallery London, Causey Contemporary Brooklyn and the Istanbul Contemporary Art Museum, with commissions from Aotearoa Digital Arts and TERMINAL. He is a Studio Supervisor at Whitecliffe College of Art & Design and a current PhD Candidate at the Institute for Culture and Society, Western Sydney University.
Chun, Wendy Hui Kyong. 2008. “On ‘Sourcery,’ or Code as Fetish.” Configurations 16 (3):299–324.
Foucault, Michel. 1978. The History of Sexuality. Translated by Robert Hurley. New York: Pantheon Books.
Fuller, Matthew. 2005. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, MA: MIT Press.
Galloway, Alexander R. 2004. Protocol: How Control Exists after Decentralization. Cambridge, MA: MIT Press.
Gillespie, Tarleton. 2014. “The Relevance of Algorithms.” In Media Technologies: Essays on Communication, Materiality, and Society, edited by Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot, 167–94. Cambridge, MA: MIT Press.
Hochschild, Arlie Russell. 2003. The Managed Heart: Commercialization of Human Feeling. Berkeley: University of California Press.
Lee, Min Kyung, Daniel Kusbit, Evan Metsky, and Laura Dabbish. 2015. “Working With Machines: The Impact of Algorithmic and Data-Driven Management on Human Workers.” In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 1603–1612. ACM. https://www.cs.cmu.edu/~mklee/materials/Publication/2015-CHI_algorithmic_management.pdf.
Magee, Liam, and Ned Rossiter. 2015. “Service Orientations: Data, Institutions, Labor.” In There Is No Software, There Are Just Services, edited by Irina Kaldrack and Martina Leeker, 73–89. Leuphana, Germany: meson press.
Marino, Mark C. 2006. “Critical Code Studies.” Electronic Book Review, December 4, 2006. http://www.electronicbookreview.com/thread/electropoetics/codology.
Parikka, Jussi, and Michael Goddard. 2011. “Unnatural Ecologies.” The Fibreculture Journal, no. 17:1–5.
Pellow, David N., and Lisa Sun-Hee Park. 2002. The Silicon Valley of Dreams: Environmental Injustice, Immigrant Workers, and the High-Tech Global Economy. New York: New York University Press.
Plant, Sadie. 1997. Zeroes + Ones: Digital Women + the New Technoculture. New York: Doubleday.
Rosenblat, Alex, and Tim Hwang. 2016. “Regional Diversity in Autonomy and Work: A Case Study from Uber and Lyft Drivers.” https://datasociety.net/pubs/ia/Rosenblat-Hwang_Regional_Diversity-10-13.pdf.
Sassen, Saskia. 2014. Expulsions: Brutality and Complexity in the Global Economy. Cambridge, MA: Harvard University Press.