
History of Computing

Thomas Haigh Editor

Exploring
the Early
Digital
History of Computing

Founding Editor
Martin Campbell-Kelly, University of Warwick, Coventry, UK

Series Editors
Gerard Alberts, University of Amsterdam, Amsterdam, The Netherlands
Jeffrey R. Yost, University of Minnesota, Minneapolis, USA

Advisory Board
Jack Copeland, University of Canterbury, Christchurch, New Zealand
Ulf Hashagen, Deutsches Museum, Munich, Germany
Valérie Schafer, CNRS, Paris, France
John V. Tucker, Swansea University, Swansea, UK
Thomas Haigh
Editor

Exploring the Early Digital


Editor
Thomas Haigh
Department of History
University of Wisconsin–Milwaukee
Milwaukee, WI, USA
Comenius Visiting Professor
Siegen University
Siegen, Germany

ISSN 2190-6831 ISSN 2190-684X (electronic)


History of Computing
ISBN 978-3-030-02151-1 ISBN 978-3-030-02152-8 (eBook)
https://doi.org/10.1007/978-3-030-02152-8

Library of Congress Control Number: 2019931874

© Springer Nature Switzerland AG 2019


Chapter 4: © This is a U.S. government work and not under copyright protection in the U.S.; foreign
copyright protection may apply 2019
This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of
the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology
now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the
editors give a warranty, express or implied, with respect to the material contained herein or for any errors
or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims
in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG.
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Contents

1 Introducing the Early Digital . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Thomas Haigh
2 Inventing an Analog Past and a Digital Future
in Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Ronald R. Kline
3 Forgotten Machines: The Need for a New
Master Narrative . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Doron Swade
4 Calvin Mooers, Zatocoding, and Early Research
on Information Retrieval . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Paul E. Ceruzzi
5 Switching the Engineer’s Mind-Set to Boolean:
Applying Shannon’s Algebra to Control Circuits
and Digital Computing (1938–1958) . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
Maarten Bullynck
6 The ENIAC Display: Insignia of a Digital Praxeology . . . . . . . . . . . . 101
Tristan Thielmann
7 The Evolution of Digital Computing Practice
on the Cambridge University EDSAC, 1949–1951. . . . . . . . . . . . . . . . 117
Martin Campbell-Kelly
8 The Media of Programming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Mark Priestley and Thomas Haigh


9 Foregrounding the Background: Business, Economics,
Labor, and Government Policy as Shaping Forces
in Early Digital Computing History . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
William Aspray and Christopher Loughnane
10 “The Man with a Micro-calculator”: Digital Modernity
and Late Soviet Computing Practices . . . . . . . . . . . . . . . . . . . . . . . . . . 179
Ksenia Tatarchenko

Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
Chapter 1
Introducing the Early Digital

Thomas Haigh

Abstract This introductory chapter outlines the objectives of the book, explaining
how adopting “early digital” as a frame can encourage new perspectives on estab-
lished topics within the history of computing and productively integrate concerns
from related fields such as media theory and communications history. Haigh encour-
ages historians to take digitality seriously as an analytical category, probes the dif-
ferences between analog and digital computing, and argues that the ability of a
machine to follow a program is fundamentally digital. He also introduces the con-
tributions of the individual chapters in the book, situating each within this broader
analysis of digitality and its historical materiality.

At the workshop held to prepare this book, Paul Ceruzzi noted that the digital
computer was a “universal solvent.” This idea comes from alchemy, referring to an
imaginary fluid able to dissolve any solid material. In the 1940s the first digital
computers were huge, unreliable, enormously expensive, and very specialized.
They carried out engineering calculations and scientific simulations with what was,
for the time, impressive speed. With each passing year, digital computers have
become smaller, more reliable, cheaper, faster, and more versatile. One by one they
have dissolved other kinds of machine. Some have largely vanished: fax machines,
cassette players, telegraph networks, and encyclopedias. In other cases the digital
computer has eaten familiar devices such as televisions from the inside, leaving a
recognizable exterior but replacing everything inside.
For those of us who have situated ourselves within the “history of computing,” this
provides both a challenge and an opportunity: an opportunity because when the
computer is everywhere, the history of computing is a part of the history of every-
thing; a challenge because the computer, like any good universal solvent, has dis-
solved its own container and vanished from sight. Nobody ever sat down in front of

T. Haigh ( )
Department of History, University of Wisconsin–Milwaukee, Milwaukee, WI, USA
Comenius Visiting Professor, Siegen University, Siegen, Germany

© Springer Nature Switzerland AG 2019 1


T. Haigh (ed.), Exploring the Early Digital, History of Computing,
https://doi.org/10.1007/978-3-030-02152-8_1
2 T. Haigh

their television at the end of the day, picked up their remote control, and said “let’s
do some computing.” Our object of study is everywhere and nowhere.
The startling thing is that “computer” stuck around so long as a name for these
technologies. The word originally described a person carrying out complex techni-
cal calculations, or “computations.” (Campbell-Kelly and Aspray 1996) The “auto-
matic computers” of the 1940s inherited both the job and the title from their human
forebears and retained it even when, after a few years, their primary market shifted
to administrative work. Well into the 1990s, everyone knew what a computer was.
The word stuck through many technological transitions: supercomputers, minicom-
puters, personal computers, home computers, and pocket computers. Walking
through a comprehensive computing exhibit, such as the Heinz Nixdorf
MuseumsForum or the Computer History Museum, one passes box after box after
box. Over time the boxes got smaller, and toggle switches were eventually replaced
with keyboards. Computing was implicitly redefined as the business of using one of
these boxes to do something, whether or not that task involved computations.
Even then, however, other kinds of computers were sneaking into our lives, in
CD players, microwave ovens, airbags and antilock brakes, ATMs, singing greeting
cards, and videogame consoles. Within the past decade, the box with keys and a
screen has started to vanish. Laptop computers are still thought of as computers, but
tablets, smartphones, and high-definition televisions are not. To the computer scien-
tist, such things are simply new computing platforms, but they are not experienced
in this way by their users or thought of as such by most humanities scholars.
Instead many people have come to talk of things digital: digital transformation,
digital formats, digital practices, digital humanities, digital marketing, and even
digital life. Other newly fashioned areas of study, such as algorithm studies and
platform studies, also define themselves in terms of distinctly digital phenomena. In
some areas “digital” now denotes anything accomplished by using computers which
requires an above average level of technical skill. This is the usual meaning of “digi-
tal” in the “digital humanities” and in various calls for “digital STS” and the like.
Using “digital” as a more exciting synonym for “computerized” is not wrong,
exactly, as modern computers really are digital, but it is arbitrary. Historians of
computing have so far been somewhat suspicious of this new terminology, only
occasionally using it to frame their own work (Ensmenger 2012). I myself wrote an
article titled “We Have Never Been Digital.” (Haigh 2014) Yet historians of comput-
ing cannot expect the broader world to realize the importance of our community’s
work to understanding digitality if we are reluctant to seriously engage with the
concept. Instead the work of conceptualizing digitality and its historical relationship
to computer technology has been left largely to others, particularly to German media
scholars (Kittler 1999; Schröter and Böhnke 2004).
In this volume, we approach digitality primarily from within the history of com-
puting community, rethinking the technologies of computation within a broader
frame. Subsequent efforts will build on this reconceptualization of computational
digitality as an underpinning to the study of digital media. This book therefore
makes the case that historians of computing are uniquely well placed to bring rigor
to discussion of “the digital” because we are equipped to understand where digital

technologies, platforms, and practices come from and what changes (and does not
change) with the spread of the digital solvent into new areas of human activity.
Hence the title of our book, Exploring the Early Digital.

1.1 Digital Materiality

Let’s start with what digitality isn’t. In recent usage, digital is often taken to mean
“immaterial.” For example, entertainment industry executives discuss the shift of
consumers toward “digital formats” and away from the purchase of physical disks.
The woman responsible for the long-running Now That’s What I Call Music! series
of hit music compilations was recently quoted (Lamont 2018) as saying that songs
for possible inclusion are now sent to her by email, unlike the “more glamorous…
analogue era, when labels sent over individual songs on massive DAT tapes by cou-
rier.” Such statements make sense only if one adopts a definition of “digital” that
excludes all disks and tapes. That is a stretch, particularly as the D in DVD stands
for Digital. So does the D in DAT.
The recent idea of “the digital” as immaterial is both ridiculous and common,
deserving its own historical and philosophical analysis. Langdon Winner’s classic
Autonomous Technology, which explored the history of the similarly odd idea of
technology as a force beyond human control, might provide a model. Some impor-
tant work in that direction has been done in “A Material History of Bits” (Blanchette
2011). Although the modern sense of “digital” was invented to distinguish between
different approaches to automatic and electronic computing, the characteristics it
described are much older. The mathematical use of our current decimal digits began
in seventh-century India before picking up steam with the introduction of zeros and
the positional system in the ninth century. Devices such as adding machines incor-
porated mechanical representations of digits. For example, in his chapter Ronald
Kline quotes John Mauchly, instigator of the ENIAC project and one of the creators
of the idea of a “digital computer,” explaining the new concept with reference to
“the usual mechanical computing machine, utilizing gears.”
Most of the chapters in this book deal with specific forms of digital materiality,
emphasizing that the history of the digital is also the history of tangible machines
and human practices. Ksenia Tatarchenko’s contribution deals with Soviet program-
mable calculators. These displayed and worked with digits, like the machines
Mauchly described, but exchanged gears for electronics.
Other digital technologies, used long after the invention of electronic computers,
avoided electronics entirely. Doron Swade provides a close technical reading of the
ways in which a complex mechanical odds-making and ticket selling machine rep-
resented and manipulated numbers.
Paul Ceruzzi’s chapter explores Zatocoding, a digital method of categorizing and
retrieving information using notches cut into the side of punched cards. Its creator,
Calvin Mooers, had early experience with digital electronics and used information
theory to help formulate a highly compressed coding scheme able to combine many
possible index terms. His work was foundational to modern information retrieval
systems, including web search engines. Yet when Mooers went into business, he
was more excited by the promise of paper-based digital information media. The
cards represented information digitally, through different combinations of notches,
which were read using what Ceruzzi calls a “knitting needle-like device.”

1.2 Digital vs. Analog

The antonym of digital is “analog,” not “material.” As Kline explains, this distinc-
tion arose during the 1940s, with the spread of automatic computers. He locates it
in discussions between the creators of digital computers, tracing its initial spread
through enthusiasts for the new metascience of cybernetics, building on his work in
(Kline 2015). Both analog and digital machines could automate the solution of
mathematical problems, whether at the desk, in the laboratory, or, as control equip-
ment, in the field. However, the two kinds of computer represented the quantities
they worked on in fundamentally different ways.
Digital machines represented each quantity as a series of digits. Mechanisms
automated the arithmetic operations carried out by humans, such as addition and
multiplication, mechanizing the same arithmetic tricks such as carrying from less
significant digits to more significant digits or multiplying by repeated addition.
Within the limits imposed by their numerical capabilities, the machines could be
relied upon (when properly serviced, which was not a trivial task) to be accurate and
to give reproducible results. Machines with more digits provided answers with more
precision.
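The digit-at-a-time arithmetic described above is easy to mimic in modern code. The sketch below is a present-day illustration, not a description of any historical mechanism; the function name and digit layout are invented. It adds two numbers held as sequences of decimal digits, carrying from the less significant columns to the more significant ones just as the machines did:

```python
def add_digits(a, b):
    """Add two numbers stored as lists of decimal digits (least
    significant first), column by column with explicit carries,
    much as a mechanical adding machine's wheels did."""
    width = max(len(a), len(b)) + 1      # one spare column for overflow
    a = a + [0] * (width - len(a))
    b = b + [0] * (width - len(b))
    result, carry = [], 0
    for x, y in zip(a, b):
        total = x + y + carry
        result.append(total % 10)        # digit left showing on this wheel
        carry = total // 10              # carried to the next wheel
    return result

# 478 + 694 = 1172, written least significant digit first
print(add_digits([8, 7, 4], [4, 9, 6]))  # [2, 7, 1, 1]
```

Adding more digit positions to the lists widens the result, echoing the point that machines with more digits provided answers with more precision.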
In analog machines, in contrast, each quantity being manipulated was repre-
sented by a distinct part of the machine such as a shaft, a reservoir, or an electrical
circuit. As the quantity represented grew or diminished, the component representing
it would change likewise. The shaft would spin more or less rapidly, the reservoir
empty or fill, or the voltage across the circuit rise or fall. This explains the name
“analog computer.” An analogy captures the relationship between things in the
world, defining a specific correspondence between each element of the analogy and
something in the system being modelled. Some, like model circuits used to simulate
the behavior of electrical power networks, were essentially scale models. Others
substituted one medium for another, such as water for money. The accuracy of an
analog computer was a matter of engineering precision. In practice analog comput-
ers were specialized for a particular kind of job, such as solving systems of differ-
ential equations. They were often faster than digital computers but usually accurate
to only a few significant figures.
Once the categories of analog and digital computers were established, it became
natural to project the idea of analog vs. digital back onto earlier technologies. In
these broader terms, any discrete representation of numbers appears digital, whereas continuous representations appear analog. Kline notes that some computing specialists of the 1950s were wary of “analog” for this reason and preferred the precision gained by speaking of “continuous” representations. Adding machines, calculating
machines, cash registers, tabulating machines, and many other common technolo-
gies were digital. These machines typically represented each digit as a ten-faced
cog, which rotated to store a larger number. Newer, higher-speed devices stored
numbers as patterns in electromagnetic relay switches or electronic tubes. Other
calculating devices, going back to ancient navigational tools such as the astrolabe,
were analog. So was the once-ubiquitous slide rule (approximating the relationship
between a number and its logarithm). Automatic control devices before the 1970s
were, for the most part, analog: thermostats, the governors that regulated steam
engines, and a variety of military fire control and guidance systems. As David
Mindell has shown (Mindell 2002), engineers across a range of institutions and dis-
ciplinary traditions developed these techniques long before the mid-century fad for
cybernetics provided a unified language to describe them.
Although the application of “digital” to computing and communication was new
in the 1940s and bound up with automatic computers, many of the engineering
techniques involved were older and arose in other contexts. In his chapter, Doron
Swade explores technologies found in two recreational machines from the early
twentieth century which embedded computational capabilities: a golf simulator and
“automatic totalizator” machines used by dog racing tracks to calculate odds in real
time based on ticket sales. Swade notes that these machines have been left out of
traditional master narratives in the history of computing, which focus on scientific
calculation and office administration as the primary precursors of digital computer
technology. His work demonstrates the benefits of moving beyond the limitations
imposed by this traditional frame and taking a broader approach to the study of
computational technology. Indeed, even the categories of digital and analog, accord-
ing to Swade, are sufficiently tangled in engineering practice for him to challenge
the “faux dichotomous categories used retrospectively in the context of pre-elec-
tronic machines.”

1.3 Computer Programs Are Inherently Digital

Programmability, often seen as the hallmark of the computer, is itself a fundamen-


tally digital concept. As John von Neumann wrote when first describing modern
computer architecture, in the “First Draft of a Report on the EDVAC,” an “automatic
computing system is a (usually high composite) device which can carry out instruc-
tions to perform calculations of a considerable order of complexity….” (von
Neumann 1993). In that formulation, the device is a computer because it computes:
it carries out laborious and repetitive calculations according to a detailed plan. It is
automatic because like a human computer, but unlike a calculating or adding
machine, it goes by itself from one step in the plan to the next.
Kline’s contribution notes that digital and analog were not the only possible
terms discussed during the 1940s. Some participants advocated strongly for the
established mathematical terms “continuous” (instead of analog) and discrete
(instead of digital). These distinctions apply not only to number representations,
which analysis has usually focused on, but also to the way the two kinds of com-
puter carry out their calculations. The latter distinction is perhaps the more funda-
mental, as it explains the ability of digital computers to carry out programs.
Analog computers work continuously, and each element does the same thing
again and again. Connections between these components were engineered to mimic
those between the real-world quantities being modelled. A wheel and disk linked
one shaft’s rotation to another’s, a pipe dripped fluid from one reservoir to another,
and an amplifier tied together the currents flowing in two circuits.
The MONIAC analog computer (Fig. 1.1) designed by William Phillips to simu-
late the Keynesian understanding of the economy illustrates these continuous flows.
Different tanks are filled or emptied to represent changing levels of parameters such
as national income, imports, and exports. Adjustable valves and plastic insets
expressing various functions governed the trickling of water from one chamber to
another. This gave a very tangible instantiation to an otherwise hard to visualize
network of equations, as economic cycles played themselves out and the impact of
different policy adjustments could be tested by tweaking the controls.
In a very obvious sense, analog computations occur continuously. In contrast a
modern computer, or to use the vocabulary of the 1940s an automatic digital com-
puter, breaks a computation into a series of discrete steps and carries them out over
time. At each stage in the computation, the mechanism may work on a different
variable. For example, most early digital computers had only one multiplying unit,
so every pair of numbers to be multiplied had first to be loaded into two designated
storage locations. Over the course of the computation, the numbers loaded into
those locations would refer to completely different quantities in the system being
modelled.
The first step in planning to apply a digital computer to a problem was to figure
out what steps the machine should carry out and what quantities would be stored in
its internal memory during each of those steps. Two of the earliest efforts to plan
work for automatic computers were made by Lovelace and Babbage in the 1830s
and 1840s for the unbuilt Analytical Engine (Fig. 1.2) and by the ENIAC team in
1943 to plan out the firing table computations for which their new machine was
being built (Fig. 1.3). When I explored these episodes in collaboration with Mark
Priestley, we were startled to realize that both teams came up with essentially the
same diagramming notation when setting out sample applications for their planned
computers: a table in which most columns represented different storage units of the
machine and each row represented one step in the algorithm being carried out. A
single cell thus specified the mathematical significance of an operation being car-
ried out on one of the stored quantities.
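The shared tabular idea can be suggested with a toy modern example; all names and the sample calculation (2 × 3 + 4 × 5) are invented here, and the code reconstructs neither the Lovelace nor the ENIAC notation. Each printed row is one step, and each column reports the number currently held in one storage location, so the same location holds different quantities at different steps:

```python
# A toy trace in the spirit of the Lovelace and ENIAC tables:
# rows are successive steps, columns are storage locations.
store = {"arg1": 0, "arg2": 0, "product": 0, "total": 0}
trace = []

def step(operation, **updates):
    """Perform one step: update storage, then snapshot every location."""
    store.update(updates)
    trace.append((operation, dict(store)))

# The single multiplier is reused: 'arg1' and 'arg2' hold
# different quantities of the modelled system at different steps.
step("load operands", arg1=2, arg2=3)
step("multiply", product=store["arg1"] * store["arg2"])
step("accumulate", total=store["total"] + store["product"])
step("load operands", arg1=4, arg2=5)
step("multiply", product=store["arg1"] * store["arg2"])
step("accumulate", total=store["total"] + store["product"])

print(f"{'operation':<16}" + "".join(f"{name:>9}" for name in store))
for operation, snapshot in trace:
    print(f"{operation:<16}" + "".join(f"{snapshot[name]:>9}" for name in store))
```

Reading down any one column shows why a single cell in the historical tables needed a mathematical annotation: the number stored there means something different at each step.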
The word “program” was first applied in computing by the ENIAC team (Haigh
and Priestley 2016). Our conclusion was that its initial meaning in this context was
simply an extension of its use in other fields, such as a program of study, a lecture
program, or a concert program. In each case the program was a sequence of discrete
activities, sequenced over time. An automatic computer likewise followed a program
of operations. The word was first applied to describe the action of a unit within
ENIAC that triggered actions within other units: the master programmer. (The same
term is given to the electromechanical control unit in an automatic washing machine, though we are not sure which came first.) Quickly, however, “program” came to describe what von Neumann (1993) called “The instructions which govern this operation” which “must be given to the device in absolutely exhaustive detail.” “Programmer” became a job title instead of a control unit. Thus “programmer” and “computer” passed between the domains of human and machine at around the same time but in opposite directions.

Fig. 1.1 The Phillips Machine, or MONIAC, illustrates two key features of analog computing: the “analogy” whereby different parts of the machine represent different features of the world and the fixed relationships between these parts during the computation, which consisted of continuous processes rather than discrete steps. (Reproduced from (Barr 2000), courtesy of Cambridge University Press)

Fig. 1.2 This 1842 table, prepared by Ada Lovelace, is a trace of the expected operation of Babbage’s Analytical Engine running a calculation. Each line represents one of 25 steps in the computation (some of them repeated). Most of the columns represent quantities stored in particular parts of the engine
Because each part of an analog computer carried out the same operation throughout the computation, analog computer users did not originally talk about “programming” their machines, though as digital computers became more popular, the term
was eventually applied to configuring analog computers. In contrast, digital comput-
ers built under the influence of von Neumann’s text adopted very simple architec-
tures, in which computations proceeded serially as one number at a time was fetched
from memory to be added, subtracted, multiplied, divided, or otherwise manipu-
lated. Such machines possessed a handful of general-purpose logic and arithmetic
capabilities, to be combined and reused as needed for different purposes.
As an automatic computer begins work, its instructions, in some medium or
another, are present within it and its peripheral equipment. If the computer is pro-
grammable, then these instructions are coded in a form that can be changed by its
users. In operation it translates some kind of spatial arrangement of instructions into
a temporal sequence of operations, moving automatically from one task to the next.

Fig. 1.3 A detail from the ENIAC project diagram PX-1-81, circa December 1943. As with the
Babbage and Lovelace table, the rows represent discrete steps in the calculation, and the columns
(32 in the full diagram) represent different calculating units within ENIAC

Early computers stored and arranged these instructions in a variety of media.


ENIAC, the first programmable electronic computer, was wired with literal chains
and branches, along which control pulses flowed from one unit to another to trigger
the next operation. In his chapter, Tristan Thielmann uses ideas from media theory
to explore ENIAC’s user interface, specifically the grids of neon bulbs it used to
display the current content of each electronic storage unit. Because ENIAC could
automatically select which sequence of operations to carry out next, and shifted
between them at great speed, its designers incorporated these lights and controls to
slow down or pause the machine to let its operators monitor its performance and
debug hardware or configuration problems.
The chapter Mark Priestley wrote with me for this volume explores the range of
media used to store programs during the 1940s and the temporal and spatial meta-
phors used to structure these media into what would eventually be called a “memory
space.” Several computers of the mid-1940s read coded instructions one at a time,
from paper tape. These tapes could be physically looped to repeat sequences.
Computers patterned after von Neumann’s conception for EDVAC stored coded
instructions in one or another kind of addressable memory. Whether this was a delay
line, tube memory or magnetic drum had major implications for the most efficient
way of spacing the instructions over the medium. Like ENIAC, these machines
could branch during the execution of a program, following one or another route to
the next instruction depending on the results of previous calculations. These media
held both instructions and data, laying the groundwork for later systems that
encoded text, audio, and eventually video data in machine-readable digital forms.
The fundamental technology of digital computers and networks takes many dif-
ferent shapes and supports many different kinds of practice. In his chapter, Martin
Campbell-Kelly explores the variety of use practices that grew up around one of the
earliest general-purpose digital computers, the EDSAC. Its users were the first to
load programs from paper tape media into electronic memory, quickly devising a
system that used the computer itself to translate mnemonics into machine code as it
read the tape. The ability of computers to treat their own instructions as digital data
to be manipulated has been fundamental to their proliferation as the universal
machines of the digital age. Some practices from the first computer installations,
such as the preparation of data and instructions in machine-readable digital form, or
the practice of debugging programs by tracing their operation one instruction at a
time, spread with the machines themselves into many communities. Others were
specific to particular areas of scientific practice and remained local.
Almost all of the earliest projects to build automatic digital devices were spon-
sored in some way or another by government money. Charles Babbage was bank-
rolled by the British government, as was Colossus. Konrad Zuse relied on the
patronage of the Nazi regime, while ENIAC was commissioned by the US Army.
With the exception of the (widely misunderstood) connection of what became the
Internet to the military’s interest in building robust networks, the role of the state in
the later exploitation and improvement of digital technology is less widely appreci-
ated. Yet, as William Aspray and Christopher Loughnane show in their chapter in
this volume, the state remained vitally important in structuring the early use of digi-
tal computers as a procurer of digital technologies, a sponsor of research, and a
regulator of labor markets. Their chapter illustrates the contribution broader-based
historical analysis can provide to understanding the spread of digital technology, in
contrast to popular history with its focus on brilliant individuals, as demonstrated by
the title of the recent blockbuster The Innovators: How a Group of Hackers,
Geniuses, and Geeks Created the Digital Revolution (Isaacson 2014).

1.4 Digital Information

The chapters collected here give a new and broader idea of the material culture of
digitality. Nothing is immaterial. Yet there is something special about the relation-
ship of bits to their material representations: different material representations are,
from a certain viewpoint, interchangeable. Digital information can be copied from
one medium to another without any loss of data, and the same sequence of bits can
be recovered from each. Transcribe the text of a book into a text file, save that file,
compress it, email it, download it, and print it out. The text has been represented in
many material forms during this process, but after all those transformations and
transcriptions, one retains the same series of characters. Matthew Kirschenbaum
called this the “formal materiality” of digitality (Kirschenbaum 2007). Discussion
of “digital formats” as an alternative to material media, misleading as it is, captures
something about the truth of this experience.
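This interchangeability is simple to demonstrate with the Python standard library. The snippet below is a present-day illustration of the book-to-email chain sketched above, not anything drawn from the chapter’s sources; the sample sentence is invented. It pushes the same bits through several representations and recovers them intact:

```python
import base64
import gzip
import hashlib

original = "Nothing is immaterial, but bit sequences survive transcription.".encode("utf-8")

# Re-represent the same bits in successive "media": compressed,
# then ASCII-armored as if for email, then decoded and decompressed.
compressed = gzip.compress(original)
mailed = base64.b64encode(compressed)
recovered = gzip.decompress(base64.b64decode(mailed))

# The intermediate forms differ materially, yet the identical
# bit sequence is recovered at the end of the chain.
assert recovered == original
print(hashlib.sha256(recovered).hexdigest() == hashlib.sha256(original).hexdigest())
```

Each intermediate object is a different material inscription, but the hash comparison confirms that the “formal” sequence of bits survives every transcription unchanged.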
Claude Shannon’s “mathematical theory of communication” (Shannon and Weaver
1949), popularized as “information theory,” is foundational to our sense of “the
digital” and to the modern sense of “information” (Kline 2006; Soni and Goodman
2017). Maarten Bullynck’s chapter in this volume examines the early adoption and
development of Shannon’s efforts to “synthesize” networks of relay switches from
logical descriptions defined using Boolean algebra. Such circuits provided the
building blocks of digital machines, including early computers. He suggests that it
took a decade of work by practicing engineers, and the creation of new craft practices and diagramming techniques, to turn Shannon’s work into a practical basis for digital
electronic engineering. This also reminds us that Shannon’s work had a very spe-
cific institutional and technological context, looking backward to new digital com-
munication techniques developed during WWII and forward to anticipate the
generalization of these as a new basis for routine telecommunication.
Over time, the connection of digitality and information with computer technol-
ogy grew ever stronger and tighter. “Information” had previously been inseparable
from a process in which someone was informed of something. It now became what
Geoff Nunberg (1997) memorably called an “inert substance” that could be stored,
retrieved, or processed. “Information” became a synonym for facts or data – and in
particular for digitally encoded, machine-readable data. This process gained
steam with the spread of the idea of “management information systems” within
corporate computing during the 1960s (Haigh 2001), followed by the widespread
discussion of “information technology” from the 1970s and the introduction of the
job title “chief information officer” for corporate computing managers in the 1980s.
I suspect that the root of all this is in the digital engineering practices used to build
computers and other devices during the 1950s. Shannon-esque digital communica-
tion was taking place within computers, as memory tanks, tape drives, printers, and
processing units swapped signals. This is the context in which it became natural to
think of a process of informing occurring without human involvement and, with a
slight linguistic and conceptual slippage, to think of the stored data itself as “infor-
mation” even when it was not being communicated.
12 T. Haigh

1.5 When Was the Early Digital?

Our sense of what, exactly, “early digital” means shifted during our discussions. It
originally appealed as something indicating an era of historical interest, not unlike
the “early modern” period referred to by historians of science. This volume focuses
primarily on a time period from the 1930s to the 1950s, an era that provided the
original locus for work on the history of modern computing. It is during this era that
the concepts of “analog” and “digital” were invented, as were the technologies such
as programmable computers and electronic memories that we associate with “the
digital.” It is the era in which digital computational technologies are most clearly
defined against analog alternatives and a period in which their unmistakable and
sometimes monumental materiality makes it clearest that digitality does not mean
invisibility.
Yet “early digital” has an attractive temporal flexibility and encompasses other
devices that are not always considered to be “computers,” stretching back in time to
Babbage’s planned difference engine and wartime devices such as the codebreaking
Bombes. The phrase initially appealed to me because, like “modern” and “early
modern,” its boundaries are obviously permeable. Tatarchenko, for example, looks
at a late Soviet version of the early digital, which spread during the early 1980s and
centered on a more obviously digital technology: the programmable calculator.
Users coded programs, including games, as sequences of digits displayed on tiny
screens. From the viewpoint of a future in which humans have given up corporeal
existence to live forever in cyberspace, the present day would seem like the very
early digital.
As our thinking evolved over the course of several workshops, we came to think
of “early digital” less as something defining a general epoch in a society and more
as a very local designation describing the transformation of a specific practice
within a specific community. In particular, we do not believe that there was a single
“early digital” epoch or that one can follow those who talk about “digital revolu-
tions” into a view of the world in which a single event or invention creates a univer-
sal rupture between digital and pre-digital worlds.
The first instinct of the responsible historian is to challenge assumptions of
exceptionalism, whether made for nations or for technologies. Discourses of the
kind Gabrielle Hecht (2002) termed “rupture talk” have grown up around many new
technologies. These claim that the technology represents a break with all prior prac-
tice so dramatic that historical precedents are irrelevant. The now fashionable idea
of the “postdigital” is likewise premised on the idea that we are currently on the far
side of some kind of digital rupture.
Recognizing that rhetoric of a “digital transformation” parallels claims made for
nuclear power or space exploration as the defining technology of a new epoch, the
careful historian should begin with a default assumption that computer technology
is neither exceptional nor revolutionary. Yet all around us, we see the rebuilding of
social practices around computers, networks, and digital media. Even the most care-
ful historian might be moved to entertain the hypothesis that some kind of broadly
based “digital transformation” really is underway. The challenge is to find a point of
engagement somewhere between echoing the naïve boosterism of Silicon Valley
(Kirsch 2014) and endorsing the reflex skepticism of those who assume that digital
technology is just a novel façade for the ugly business of global capitalism and neo-
liberal exploitation.
As the papers gathered in this volume begin to suggest, technologies and prac-
tices did not become digital in a single world-historic transformation sometime in
the 1940s or 1950s (or the 1980s or 1990s), but in a set of localized and partial
transformations enacted again and again, around the world and through time, as digital
technologies were adopted by specific communities. Within those communities, one
can further segment the arrival of the early digital by task. The EDVAC users dis-
cussed by Campbell-Kelly were using a digital computer to solve equations, reduce
data, and run simulations, but it would be decades before they could watch digital
video or count their steps digitally. From this viewpoint, the “early digital” tag
indicates the period during which a human practice is remade around the affordances of
a cluster of digital technologies.
The early digital is also structured geographically. For most of humanity, it
arrived within the past 5 years, with the profusion of cheap smartphones. The poor-
est billion or two people are still waiting for it to begin.

1.6 Warming Up to the Early Digital

Our opportunity as historians of computing confronting a so-called “digital revolution”
is to explain, rigorously and historically, what is really different about digital
electronic technology, how the interchangeability of digital representations has
changed practices in different areas, and how the technological aspects of digital
technologies have intertwined with political and social transformations in recent
decades. This means taking the “digital” in “digital computer” as seriously as the
“computer” part.
At one of the workshops in the Early Digital series, the phrase “I’m warming up
to the early digital” was repeated by several participants, becoming, by the end of the
event, a kind of shared joke. The new phrase was beginning to feel familiar and
useful as a complement to more established alternatives such as “history of computing”
and “media history.”
The identity of “history of computing” was adopted back in the 1970s, at a time
when only a small fraction of people had direct experience with digital electronic
technologies. Its early practitioners were computer pioneers, computer scientists,
and computer company executives – all of whom identified with “computing” as a
description of what people did with computers as well as with “the computer” as a
clearly defined artifact.
The history of computing community devoted a great deal of its early energy to
deciding what was and what was not a computer, a discussion motivated largely by
the desire of certain computer pioneers, their family members, and their friends to
name an “inventor of the computer.” As I have discussed elsewhere (Haigh et al.
2016), this made the early discourse of the field a continuation of the lawsuits and
patent proceedings waged since the 1940s. Though these disputes alas continue to
excite some members of the public, they have little to offer scholars and were
resolved to our satisfaction (Williams 2000) by issuing each early machine with a
string of adjectives its fans were asked to insert between the words “first” and
“computer.”
This did not resolve the larger limitation of “the history of computing” as an
identity, which is that it makes some questions loom disproportionately large while
banishing others from view. Our inherited focus on the question of “what is a com-
puter,” combined with the fixation of some historically minded computer scientists
and philosophers on “Turing completeness,” has forced a divorce between closely
related digital devices. Digital calculators, for example, have been discussed within
the history of computing largely for the purposes of discounting them as not being
computers, and therefore not being worthy of discussion. Yet, as Tatarchenko’s
chapter shows, electronic calculators (the most literally digital of all personal elec-
tronic devices) shared patterns of usage and practice, as well as technological com-
ponents, with personal computers.
Neither can the history of computing cut itself off from other historical com-
munities. Computing, informing, communicating, and producing or consuming
media can no longer be separated from each other. Thirty or forty years ago, that
statement might have been a provocative claim, made by a professional futurist
or media-savvy computer scientist looking for a lavish book deal. Today that digital
convergence is the taken-for-granted premise behind much of modern capitalism,
embodied in the smartphones we carry everywhere. Yet the histories of these differ-
ent phenomena occupy different literatures, produced by different kinds of histori-
ans writing in different journals and in many cases working in different kinds of
academic institution. “Information history” (Black 2006; Aspray 2015) happens for
the most part within information schools, media history within media studies, and
so on.
My own recent experience in writing about the 1940s digital electronic code-
breaking machine Colossus makes clear the distorting effect this boundary mainte-
nance has had on our historical understanding. Since its gradual emergence from
behind government secrecy in the 1970s, Colossus has been claimed by its vocal
proponents (Copeland 2013) to have been not just a computer, in fact the first fully
operational digital electronic computer, but also programmable. These claims, first
made (Randell 1980) at a time when its technical details were less well documented
than they are today, do not hold up particularly well – the basic sequence of
operations of the Colossus machines was fixed in their hardware, and they could carry out
no mathematical operation more complicated than counting. So “computer” is a bad
fit, whether applied according to the usage of the 1940s (carrying out complicated
series of numerical operations) or that of later decades (programmable general-
purpose machines). The Colossus machines did, however, incorporate some com-
plex electronic logic and pioneer some of the engineering techniques used after the
war to build early electronic computers. Their lead engineer, Tommy Flowers, spent
his career in telecommunications engineering dreaming of building an all-electronic
telephone exchange (Haigh 2018). Decades later, he was still more comfortable
viewing the devices as “processors” rather than computers. They applied logical
operations to transform and combine incoming bitstreams, anticipating some of the
central techniques behind modern digital communications. The Colossus machines
played an appreciable, if sometimes exaggerated, part in determining the course of
the Second World War. Within the frame of the “history of computing,” however,
they matter only if they can somehow be shoehorned into the category of computers,
which has motivated a great deal of special pleading and fuzzy thinking. Positioning
them, and other related machines used at Bletchley Park, as paradigmatic technolo-
gies of the “early digital” requires no intellectual contortions.
The “early digital” is also a more promising frame than “the history of comput-
ing” within which to examine digital networking and the historical development of
electronic hardware. Historians of computing have had little to say about the history
of tubes, chips, or manufacturing – these being the domain of the history of engi-
neering. While important work (Bassett 2002; Lecuyer 2006; Thackray et al. 2015)
has been done in these areas by scholars in close dialog with the history of comput-
ing, the material history of digital technologies has not been integrated into the
mainstream of the history of technology. Overview accounts such as (Campbell-Kelly
and Aspray 1996) have focused instead on computer companies, architectures,
crucial products, and operating systems.
The need to engage with the history of digital communications is just as impor-
tant. Perhaps mirroring the origins of the Internet as a tool for the support of com-
puter science research, scholarly Internet history such as Abbate (1999) and Russell
(2014) has fallen inside the disciplinary wall surrounding the computer as a histori-
cal subject. On the other hand, the history of mobile telephony, which in the late
1990s became the story of digitally networked computers, has not. At least in the
United States (and particularly within the Society for the History of Technology),
historians of communications have so far focused on analog technologies – though
anyone following the story of telephony, television, and radio or the successors to
telegraphy past a certain point in time will have to get to grips with the digital. So
far, though, the history of computing and history of communications remain largely
separate fields despite the dramatic convergence of their respective technologies and
industries since the 1980s. Some scholars within media and communication studies
have been more engaged than historians of computing in situating digital technolo-
gies within broader political contexts (Mosco 2004; Schiller 2014), something from
which we can surely learn.
Media studies and media archaeology (Parikka 2011) have their own active areas
of historical enquiry. This work is often written without engagement with the his-
tory of technology literature and, in some cases, such as (Brügger 2010), has delib-
erately eschewed the disciplinary tools and questions of history to situate the
exploration of “web history” as a kind of media studies rather than a kind of history.
Enthusiasm for “platform studies” (Montfort and Bogost 2009) has similarly pro-
duced a new body of historical work only loosely coupled to the history of computing
and history of technology literatures. Neither have the authors of path-breaking
work at the intersection of digital media studies, cultural studies, and literature
such as (Chun 2011) and (Kirschenbaum 2016) found it useful to identify as histo-
rians of computing. The “history of computing” does not resonate in most such
communities, whereas study of “the early digital” may be more successful in forg-
ing a broader scholarly alliance.

1.7 Conclusions

As the computer dissolves itself into a digital mist, we are presented with a remark-
able opportunity to use lessons from decades of scholarship by historians of com-
puting to bring rigor and historical perspective to interdisciplinary investigation of
“the digital.” By embracing, for particular questions and audiences, the frame of the
early digital as a new way of looking at the interaction of people with computers,
networks, and software, we can free our work from its historical fixation on “the
computer” as a unit of study.
The papers gathered here form a series of provocative steps in this direction,
spreading out from different starting points to explore different parts of this new
terrain. Our position recalls Michael Mahoney’s description, a generation ago, of
the challenge faced by the first historians of computing: “historians stand before the
daunting complexity of a subject that has grown exponentially in size and variety,
looking not so much like an uncharted ocean as like a trackless jungle. We pace on
the edge, pondering where to cut in.” (Mahoney 1988) Today we face in “the digi-
tal” a still larger and more varied jungle, but like so many of the exotic places that
daunted Western explorers, it turns out to be already inhabited. To comprehend it
fully, we will need to find ways to communicate and collaborate with the tribes of
scholars who have made their homes there.

References

Abbate, Janet. Inventing the Internet. Cambridge, MA: MIT Press, 1999.
Aspray, William. “The Many Histories of Information.” Information & Culture 50, no. 1 (2015):
1–13.
Barr, Nicholas. “The History of the Phillips Machine.” In A.W.H. Phillips: Collected Works in
Contemporary Perspective, 89–114. New York, NY: Cambridge University Press, 2000.
Bassett, Ross Knox. To The Digital Age: Research Labs, Start-Up Companies, and the Rise of
MOS Technology. Baltimore: Johns Hopkins University Press, 2002.
Black, Alistair. “Information History.” Annual Review of Information Science and Technology 40
(2006): 441–473.
Blanchette, Jean-François. “A Material History of Bits.” Journal of the American Society for
Information Science and Technology 62, no. 6 (2011): 1042–1057.
Brügger, Niels, ed. Web History. New York: Peter Lang, 2010.
Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine.
New York, NY: Basic Books, 1996.
Chun, Wendy Hui Kyong. Programmed Visions: Software and Memory. Cambridge, MA: MIT
Press, 2011.
Copeland, B. Jack. Turing: Pioneer of the Information Age. New York, NY: Oxford University
Press, 2013.
Ensmenger, Nathan. “The Digital Construction of Technology: Rethinking the History of
Computers in Society” Technology and Culture 53, no. 4 (October 2012): 753–776.
Haigh, Thomas. “Inventing Information Systems: The Systems Men and the Computer, 1950–
1968.” Business History Review 75, no. 1 (Spring 2001): 15–61.
———. “Thomas Harold (“Tommy”) Flowers: Designer of the Colossus Codebreaking Machines.”
IEEE Annals of the History of Computing 40, no. 1 (January–March 2018): 72–78.
———. “We Have Never Been Digital.” Communications of the ACM 57, no. 9 (Sep 2014):
24–28.
Haigh, Thomas, and Mark Priestley. “Where Code Comes From: Architectures of Automatic
Control from Babbage to Algol.” Communications of the ACM 59, no. 1 (January 2016):
39–44.
Haigh, Thomas, Mark Priestley, and Crispin Rope. ENIAC In Action: Making and Remaking the
Modern Computer. Cambridge, MA: MIT Press, 2016.
Hecht, Gabrielle. “Rupture-talk in the Nuclear Age: Conjugating Colonial Power in Africa.” Social
Studies of Science 32, no. 6 (December 2002).
Isaacson, Walter. The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the
Digital Revolution. New York: Simon and Schuster, 2014.
Kirsch, Adam. “Technology is Taking Over English Departments: The False Promise of the Digital
Humanities.” The New Republic (May 2, 2014).
Kirschenbaum, Matthew. Mechanisms: New Media and the Forensic Imagination. Cambridge,
MA: MIT Press, 2007.
Kirschenbaum, Matthew G. Track Changes: A Literary History of Word Processing. Cambridge,
MA: Harvard University Press, 2016.
Kittler, Friedrich A. Gramophone, Film, Typewriter. Stanford, CA: Stanford University Press,
1999.
Kline, Ronald. The Cybernetics Moment, Or Why We Call Our Age the Information Age. Baltimore:
Johns Hopkins University Press, 2015.
Kline, Ronald R. “Cybernetics, Management Science, and Technology Policy: The Emergence of
‘Information Technology’ as a Keyword, 1948–1985.” Technology and Culture 47, no. 3 (June
2006): 513–535.
Lamont, Tom. “‘You Can’t Judge a Generation’s Taste’: Making Now That’s What I Call Music.”
The Guardian, 23 June 2018. Available from https://www.theguardian.com/music/2018/jun/23/
generation-making-now-thats-what-i-call-music.
Lecuyer, Christoph. Making Silicon Valley: Innovation and the Growth of High Tech, 1930–70.
Cambridge, MA: MIT Press, 2006.
Mahoney, Michael S. “The History of Computing in the History of Technology.” Annals of the
History of Computing 10, no. 2 (April 1988): 113–125.
Mindell, David A. Between Human and Machine: Feedback, Control, and Computing Before
Cybernetics. Baltimore: Johns Hopkins University Press, 2002.
Montfort, Nick, and Ian Bogost. Racing the Beam: The Atari Video Computer System. Cambridge,
MA: MIT Press, 2009.
Mosco, Vincent. The Digital Sublime: Myth, Power, and Cyberspace. Cambridge, MA: MIT Press,
2004.
Nunberg, Geoffrey. “Farewell to the Information Age.” In The Future of the Book, 103–138.
Berkeley: University of California Press, 1997.
Parikka, Jussi. “Operative Media Archaeology: Wolfgang Ernst’s Materialist Media
Diagrammatics.” Theory, Culture & Society 28, no. 5 (September 2011): 52–74.
Randell, Brian. “The Colossus.” In A History of Computing in the Twentieth Century, edited by
N. Metropolis, J. Howlett, and Gian-Carlo Rota, 47–92. New York: Academic Press, 1980.
Russell, Andrew L. Open Standards and the Digital Age: History, Ideology and Networks.
New York, NY: Cambridge University Press, 2014.
Schiller, Dan. Digital Depression: Information Technology and Economic Crisis. Champaign, IL:
University of Illinois Press, 2014.
Schröter, Jens, and Alexander Böhnke, eds. Analog/Digital – Opposition oder Kontinuum? Zur
Theorie und Geschichte einer Unterscheidung. Bielefeld, Germany: Transcript, 2004.
Shannon, Claude E., and Warren Weaver. The Mathematical Theory of Communication. Urbana:
University of Illinois Press, 1949.
Soni, Jimmy, and Rob Goodman. A Mind at Play: How Claude Shannon Invented the Information
Age. New York, NY: Simon & Schuster, 2017.
Thackray, Arnold, David Brock, and Rachel Jones. Moore’s Law: The Life of Gordon Moore,
Silicon Valley’s Quiet Revolutionary. New York: Basic Books, 2015.
von Neumann, John. “First Draft of a Report on the EDVAC.” IEEE Annals of the History of
Computing 15, no. 4 (October 1993): 27–75.
Williams, Michael R. “A Preview of Things to Come: Some Remarks on the First Generation of
Computers.” In The First Computers: History and Architectures, edited by Raúl Rojas and Ulf
Hashagen, 1–16. Cambridge, MA: MIT Press, 2000.