
Editorial

Big Data & Society
January–June 2018: 1–4
© The Author(s) 2018
DOI: 10.1177/2053951718779194
journals.sagepub.com/home/bds

Deconstructing the algorithmic sublime

Morgan G Ames

Abstract
This special theme contextualizes, examines, and ultimately works to dispel the feelings of the “sublime”—of awe and terror that override rational thought—that much of the contemporary public discourse on algorithms encourages. Employing critical, reflexive, and ethnographic techniques, these authors show that while algorithms can take on a multiplicity of different cultural meanings, they ultimately remain closely connected to the people who define and deploy them, and to the institutions and power relations in which they are embedded. Building on a conversation we began at the Algorithms in Culture conference at U.C. Berkeley in December 2016, we collectively study algorithms as culture (Seaver, this special theme), fetish (Thomas et al.), imaginary (Christin), bureaucratic logic (Caplan and boyd), method of governance (Coletta and Kitchin; Lee; Geiger), mode of inquiry (Baumer), and mode of power (Kubler).

Keywords
Algorithms in culture, technological sublime, ethics, imaginaries, ethnography, critical scholarship

This article is part of a special theme on Algorithms in Culture. To see a full list of all articles in this special theme, please click here: http://journals.sagepub.com/page/bds/collections/algorithms-in-culture.

Center for Science, Technology, Medicine and Society and School of Information, University of California, Berkeley, CA, USA

Corresponding author:
Morgan G Ames, Center for Science, Technology, Medicine and Society and School of Information, University of California, Berkeley, CA, USA.
Email: morganya@berkeley.edu

Creative Commons CC-BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License (http://www.creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).

A growing number of scholars have noted a distinct algorithmic moment in the contemporary zeitgeist. With machine learning again in ascendancy amid ever-expanding practices to digitize not only all of the important records of our lives but an increasing quantity of our casual traces—mining them like archeologists at digital middens—it is indeed no wonder that the academy has also made an “algorithmic turn.” In response, universities are adding interdisciplinary programs in “data science,” and scholars across the sciences and humanities are weighing in on the promises and perils of algorithmic approaches to their work.

While it is true that algorithms—loosely defined as a set of rules to direct the behavior of machines or humans—are shaping infrastructures, practices, and daily lives around the world via their computerized instantiations, they are neither strictly digital nor strictly modern. The word “algorithm,” a Latinization of the name of ninth-century Persian mathematician and scholar al-Khwārizmī, in fact predates the digital computer by over a thousand years (Al-Daffa, 1977). For many of these years, “algorithm” was an obscure term associated with the algebraic manipulations for which al-Khwārizmī was best known, or a stand-in for the decimal number system more generally. This changed starting in the mid-20th century, when the emerging field of computer science adopted the term to refer to a specification for solving a particular kind of problem that could be implemented by a computer.

As computers spread and digital information systems replaced paper, there was a concomitant rise in interest in algorithms as a social phenomenon (and concern). Some of the first algorithms that computer science students learn allow them to tackle fairly simple data organization and retrieval tasks—but these tend not to be the algorithms that capture social imaginations. Instead, those that are more complicated and difficult to

understand—or those that have been intentionally black-boxed by their creators—are also those most prone to evoking in us feelings of a technological sublime in all its awe-inspiring, rationality-subsuming glory (Mosco, 2005; Nye, 1996). Media theorist Vincent Mosco describes the technological sublime as a feeling of “astonishment, awe, terror, and psychic distance”—feelings once reserved for natural wonders or intense spiritual experiences, but increasingly applied to technologies that are new and potentially transformative, but also complex and poorly understood (Mosco, 2005: 23).

In the 1960s, for instance, alongside the heady transdisciplinary rubric of cybernetics, the first artificial intelligence algorithms sought not just to define rules for human–machine systems, but to govern them through far-reaching behaviorist feedback mechanisms, conjuring sublime worlds of automatically regulated harmony (Dupuy, 2009). Alongside this utopian dream developed a dystopian nightmare of a mechanized society in which individual agency was subsumed by coercive algorithmic control via institutions and governments. U.C. Berkeley students protested these kinds of visions, riffing on the text printed on computer punch cards to make signs that read, “I am a UC student. Please don't bend, fold, spindle, or mutilate me” (Lubar, 1992). Both aspects of this cybernetic sublime largely turned to disillusionment as these early algorithms failed to deliver on their utopian promises1—even as some of the dystopian fears of algorithmic control were quietly implemented by corporations and governments in the decades since.

A strikingly similar algorithmic sublime has been reinvigorated in the last few years with powerful new techniques enabled by massive datasets (demurely called “Big Data”), increased computing power,2 and new techniques in machine learning—techniques that are difficult to understand, with potentially massive social ramifications—that take advantage of both. More mundane algorithms already do play a role in many aspects of our daily lives, from healthcare to creditworthiness to the management of utilities. But the ways that algorithms ignite the contemporary cultural imagination—much like those attached to cybernetic visions in decades past—make them seem still in the realm of science fiction, harbingers of a revolutionary future of which we are forever on the cusp.

Under the hood, even cutting-edge deep machine learning algorithms and the Big Data on which they depend are, if not fully understandable, at least partially interrogatable. Methods like backpropagation and visualization techniques can help researchers understand what an algorithm “sees,” and a growing number of scholars are taking seriously the effects of implicit (and explicit) biases, exclusions, and unpredictability in both data sets and data models. The growing popularity and rigor of conferences such as FATML/FAT* (Fairness, Accountability, and Transparency, either in Machine Learning specifically or in technical systems more generally) attest to a growing recognition of the responsibilities that some in the field feel toward building (and teaching) ethical systems.

Alongside these technical advancements, the burgeoning field of “algorithm studies” has been working to dispel the algorithmic sublime by attending closely to the grounded material implications of algorithms, with more and more critical social scientists with strong technical skills assessing the rhetoric and the realities of algorithms (see, e.g., Gillespie and Seaver, 2016). Applying many of the same methods and theories that have enabled social scientists and humanists to understand processes of scientific inquiry and technological development for nearly 50 years, this work explores the sociotechnical implications of algorithms in politics, media, science, organizations, culture, and the construction of the self.

However, sublimes can be stubborn. On one side, some researchers in engineering and computer science still see themselves as engaged in “basic research” that need not attend to ethics (e.g., Hutson, 2018). Some instead “discover” the importance of this area, ignoring decades of scholarship from across the social sciences and humanities that has closely scrutinized the social implications of technology. On the other side, current public discourse about algorithms tends to reinforce claims that despite the often-extensive human tuning that goes into these systems, even partial transparency and interpretability are impossible. In the middle, scholars who are actually engaging with these questions tend either to be ignored in these broader discourses (e.g., O'Neil, 2017) or, in a few cases, to succumb to the pressure to sensationalize their research for public consumption, which unfortunately often involves shallow and incomplete interpretations that can lead technologists to think that social scientists do not really understand them after all. The result is a widespread impression that many algorithms are “black boxes” with little hope for supervision or regulation—and that (despite ample evidence to the contrary) academia has been woefully remiss in neglecting to interrogate the implications of this algorithmic turn.

This special theme works against this mythology. Adding to the growing field of algorithm studies, the papers here consider algorithms as an object of cultural inquiry from a social scientific and humanistic perspective. We explore the sociotechnical implications of the development, deployment, and resistance of algorithms across various social worlds (Becker, 1982). Moreover, we examine how algorithms are not only embedded in these cultures, but are what Seaver in this special theme calls “of cultures”: they are co-constituted by the same cultural processes and take on a multiplicity of different cultural meanings. In short, these authors collectively

find that algorithms have everything to do with the people who define and deploy them, and the institutions and power relations in which they are embedded.

We began this conversation at a conference on Algorithms in Culture hosted by the Center for Science, Technology, Medicine and Society and the Berkeley Institute for Data Science at the University of California, Berkeley in December 2016.3 Our explorations were motivated by a number of wide-ranging questions. How broadly might we usefully define algorithms, for instance? Why has there been an explosion of discourse about “algorithms” in popular culture in the last decade? Are contemporary algorithms a necessarily computational phenomenon, or might we learn something from their algebraic history? What kinds of work are done to make algorithms computable, and what are their material effects? What does it mean to study algorithms as culture (Seaver, this special theme), fetish (Thomas et al.), imaginary (Christin), bureaucratic logic (Caplan and boyd), method of governance (Coletta and Kitchin; Lee; Geiger), mode of inquiry (Baumer), or mode of power (Kubler)? And in what ways can our methods and theories for answering these questions contribute back to computer science, data science, and Big Data initiatives?

In this special theme, Seaver compellingly makes the case that there is no one stable definition of algorithms: they are “multiples—unstable objects that are enacted through the varied practices that people use to engage with them.” Each author in this special theme grapples with this multiplicity, largely drawing on reflexive ethnographic methods (e.g., Burawoy, 1998) to richly account for how algorithms play out in the “mangle of practice” (Pickering, 1993). As such, these scholars also avoid casting the companies that often create these algorithms as monoliths with unified moral visions. While critiques of the hegemonic ideologies that circulate within the technology industry are important, it is equally important to grapple with the complex and heterogeneous practices on the ground: trade-offs, conflicts, even acts of resistance.

Christin describes some of these heterogeneous practices based on ethnographic work with a news agency governed by web analytics and a criminal court using algorithms to understand the recidivism risks of potential parolees. She finds that algorithms are co-opted as symbolic resources in both of these communities, and that both draw on “algorithmic imaginaries” to make sense of what these algorithms are doing—though each has distinct practices based on its institutional context. Christin notes that there are differences between the intended and actual effects of algorithms, which she terms “decoupling.” Participants then take up various strategies of “buffering”—acts of resistance such as foot-dragging, gaming, or open critique—to reclaim agency and expertise within algorithmically mediated work environments.

Geiger's ethnography of the infrastructure of Wikipedia provides another example of some of the decoupling and buffering that Christin describes, with an eye toward the kinds of tacit knowledges that can make the community particularly difficult for newcomers to navigate. He focuses on the specific interactions and workarounds that volunteer moderators (of which he is one) develop by working alongside a host of algorithmic “bots” on Wikipedia. Geiger discusses the implications these “bots” have for governance, gatekeeping, and newcomer socialization in communities like Wikipedia, which can come to rely quite heavily on these algorithmic mediators to function.

Like Geiger, Lee considers how people make sense of what algorithms do in the process of working alongside them—though she takes a different approach in exploring this question. Drawing on an experimental design where participants react to the same managerial decisions variously presented as coming from a human or from an algorithm, Lee considers how those who are subject to managerial control understand authority, fairness, and trust differently in these two cases, and with different kinds of tasks. Where humans were often seen as “authoritative,” algorithms could be at times efficient and objective, but also unfair and untrustworthy in decisions that seemed to rely on intuition—and algorithms, seen as dehumanizing, did not as reliably elicit the kind of positive emotional response that humans did.

The next several articles explore how algorithmic control has been playing out at scale, shaping the temporal rhythms of cities, bureaucratic norms and expectations across the technology industry, and the possible scope of state surveillance. Drawing on ethnographic observation of two Dublin city systems that feed into a “city dashboard”—one that manages traffic patterns in real time and another that monitors noise levels—Coletta and Kitchin examine how the rhythms of city life shift with the possibility of real-time data collection and processing. They discuss how these come to change governance practices and even constitute new modes of “algorhythmic” governance.

The kinds of things that commonly used algorithms make (more easily) possible, and the kinds of institutional logics that they come to embody in the process, have had impacts that go beyond just one company, Caplan and boyd argue. Using the example of Facebook to examine the spread of sensationalism, “Fake News,” and other forms of propaganda—web content that has proven to be fairly attention-grabbing and thus lucrative for online advertising—these authors demonstrate that the norms that algorithms enforce (such as catchy “clickbait” headlines, more extreme related article recommendations, and easy mechanisms for “going viral”) can end up homogenizing entire

industries in their quest to optimize for the algorithm (and for profit). Because these algorithms have largely been written by private companies, they have lacked the oversight that a state institution would (ideally) have—oversight they might have to be subject to in the future if we want to regain the possibility for deliberative discourse.

On the other hand, Kubler provides a sobering example of how these algorithmic norms can be used even by ostensibly democratic states for ever-more-intrusive surveillance practices—especially when conditions in those systems also preclude sufficient oversight. Intervening in discussions of posthegemonic power and post-panoptic surveillance mechanisms, he examines how IBM's “i2 Analyst's Notebook” allows French law enforcement to quickly draw from massive databases of information in order to make associations, craft narratives around criminal and potential terrorist activity, and ultimately exercise power in ways specifically afforded by the information from these algorithms. This level of intrusion and surveillance, while enabled by algorithms, has been routinized by the repeated extensions of national states of emergency, and has precipitated shifts in state subjectivity.

How might we interrogate these systems? In addition to the ethnographic methods Seaver advocates and that most of the authors here enact, Baumer uses design-based interventions to more actively consider how people perceive algorithms and to explore the disconnects between these “lay” understandings and how algorithms are actually implemented. By using three human-centered design techniques—speculative design, participatory design, and theoretical framing—for algorithm studies, Baumer compellingly makes the case for shifting from focusing exclusively on performance to more human-centered metrics in evaluating algorithms. Because discussions of algorithmic bias often assume that algorithms can solve the problems that algorithms create, such a reframing becomes especially important. At the same time, Baumer also highlights some of the challenges that arise from this translation work, particularly the need for those involved to develop both technical and critical skills.

Thomas, Nafus, and Sherman take a broader view of the power we invest in algorithms. Drawing on theoretical engagement with Graeber and ethnographic research with the computer-vision and quantified-self worlds, they cast the kinds of “social contracts” that algorithms make possible as fetishistic, characterized by a faith in what algorithms can (or should be able to) do—in other words, a faith in algorithmic power and agency. The authors show that this lens can lay bare the priorities of those who hold this faith and those who are objects of algorithmic control, allowing those more critical of the algorithmic fetish to contest these priorities before algorithms stabilize “into full-fledged gods and demons.”

In sum, this special theme considers how algorithms are enacted, practiced, and contested, and provides tools for others doing the same. Together, we work to examine and dispel the algorithmic sublime that characterizes contemporary discourses on algorithms—not by simplistically collapsing the definition of algorithms, but by considering the rich, multifaceted, and at times contradictory meanings that algorithms take on across many domains.

Notes

1. Some elements of this sublime lived on in “cyberspace,” but the algorithmic focus receded in favor of free-wheeling imaginaries of an untamed “electronic frontier” (Mosco, 2005; Turner, 2006).
2. https://blog.openai.com/ai-and-compute/
3. http://cstms.berkeley.edu/algorithms-in-culture/

References

Al-Daffa AA (1977) The Muslim Contribution to Mathematics. London: Croom Helm.
Becker HS (1982) Art Worlds. Berkeley, CA: University of California Press.
Burawoy M (1998) The extended case method. Sociological Theory 16(1): 4–33.
Dupuy J-P (2009) On the Origins of Cognitive Science: The Mechanization of the Mind. Cambridge, MA: The MIT Press.
Gillespie T and Seaver N (2016) Critical algorithm studies: A reading list. Social Media Collective. Available at: socialmediacollective.org/reading-lists/critical-algorithm-studies/ (accessed 15 May 2018).
Hutson M (2018) Artificial intelligence could identify gang crimes – and ignite an ethical firestorm. Science, 28 February. Available at: http://www.sciencemag.org/news/2018/02/artificial-intelligence-could-identify-gang-crimes-and-ignite-ethical-firestorm.
Lubar S (1992) 'Do Not Fold, Spindle or Mutilate': A cultural history of the punch card. The Journal of American Culture 15(4): 43–55.
Mosco V (2005) The Digital Sublime: Myth, Power, and Cyberspace. Cambridge, MA: The MIT Press.
Nye DE (1996) American Technological Sublime. Cambridge, MA: The MIT Press.
O'Neil C (2017) The ivory tower can't keep ignoring tech. The New York Times, 14 November. Available at: https://www.nytimes.com/2017/11/14/opinion/academia-tech-algorithms.html.
Pickering A (1993) The mangle of practice: Agency and emergence in the sociology of science. American Journal of Sociology 99(3): 559–589.
Turner F (2006) From Counterculture to Cyberculture: Stewart Brand, The Whole Earth Network, and The Rise of Digital Utopianism. Chicago: University of Chicago Press.
