
Memory Machines: The Evolution of Hypertext
Ebook · 287 pages · 3 hours

About this ebook

This book explores the history of hypertext, an influential concept that forms the underlying structure of the World Wide Web and innumerable software applications. Barnet tells both the human and the technological story by weaving together contemporary literature and her exclusive interviews with those at the forefront of hypertext innovation, tracing its evolutionary roots back to the analogue machine imagined by Vannevar Bush in 1945.

Language: English
Publisher: Anthem Press
Release date: Jul 15, 2013
ISBN: 9780857281968


    Memory Machines - Belinda Barnet

    PREFACE

    This book would not have been possible without the cooperation of its subjects. Like technology historian Steve Shapin and sociologist Thierry Bardini, I write in the ‘confident conviction that I am less interesting than the subjects I write about’ (Shapin cited in Bardini 2000, xiv). I have had the great privilege of meeting many of the people you will read about in these pages – except for Vannevar Bush, who died in 1974. The colourful anecdotes, magical visions and prescient ideas you will find here have come directly from them or from their work. At times I felt like a media studies bowerbird, procuring brightly coloured memories, sticky notes and cryptic computer manuals from various computing science professors, writers and visionaries across the globe. In that sense, this book may be read as a simple history book. There is no need to be self-reflexive or clever when presented with such a treasure trove; it is intrinsically interesting and needs no posthistorical garnish (I’ll confine that to the preface). That said, I do not claim to present you with the final word on hypertext history; this is an edited selection, a woven structure of deeply interconnected stories contributed in large part by the people who lived them.

    I have spent months, years in fact, arranging this collection; it was first assembled as a PhD thesis in 1999, before I had children and consequently when I had the luxury of time. Time to do things like roam around Brown University gathering documents and stories from Andries van Dam and the Hypertext Editing System (HES) team; time to rummage through the Vannevar Bush archives at the Library of Congress looking for interesting correspondence; time to interview Doug Engelbart and feel embarrassingly starstruck; and time to travel to Keio University in Japan to meet Ted Nelson. I also had time to ponder how it all might fit together, and more deeply, if it is even possible to say that a technical system ‘evolves’. What, exactly, am I tracing the path of here? In one of those delightful recursive feedback loops that punctuates any history, I discovered that Doug Engelbart is also concerned with how technology evolves over time – so concerned, in fact, that he constructed his own ‘framework’ to explain it. Inspired, I went off and interviewed one of the world’s most eminent palaeontologists, Niles Eldredge, and asked him what he thought about technical evolution. His response forms the basis of Chapter 1.

    This hodgepodge of oral histories and technical visions worked quite well, so in the following decade I added a few extra twigs and blue plastic bottle caps and published it as a series of articles. Now, with the encouragement of my editors at Anthem Press, Paul Arthur and Stuart Moulthrop, I have chased down some of my original interviewees and coaxed more stories from them. I’ve also added a new chapter on Storyspace and the birth of a literary hypertext culture. In the interim, the subjects I am writing about have also published important work of their own, which I have duly read and woven into the story (though not soon enough, as Ted crankily pointed out). So I’ve arranged a new cross-tangle of stories for you to walk around in and inspect. These stories are not at all linear; many of them overlap in time, and most of them transfer ideas and innovations from earlier systems and designs.

    According to Eldredge (2006), this is actually how technologies evolve over time – by transfer and by borrowing (he calls this the ‘lateral spread of information’). Ideally the chapters would be viewed and read in parallel, side-by-side strips, with visible interconnections between them: you would be able to see where all the different ideas have come from and trace them back to their original source. If that comes across all Xanadu, well, that’s because it is.

    The metaphor of a bower, a matted thicket, is an appropriate one. A leitmotif recurs throughout this history, a melody that plays through each chapter, slightly modified in each incarnation but nonetheless identifiable; it is the overwhelming desire to represent complexity, to ‘represent the true interconnections’ that crisscross and interpenetrate human knowledge, as Ted puts it (1993). This riff does not come from me; it is as old as information science. How can we best organize the tangle of human thought, yet preserve its structure? It is not just the ideas that are important when we read or write or conduct historical research, but the connections between them. We should be able to see and follow those connections, and we should be able to write with them. This was playing in the background of every interview I conducted, and it was explicitly stated in many of the dusty old manuals, project proposals, letters and articles from the ’40s, ’50s and ’60s through which I sneezed and highlighted my way. It is the theme song of hypertext.

    Now for the important bit: this story stops before the Web. More accurately, it stops before hypertext became synonymous with the Web. As Ted put it to me in 1999, ‘people saw the Web and they thought, Oh, that’s hypertext, that’s how it’s meant to look’. But hypertext is not the Web; the Web is one particular implementation of hypertext, one particular model of how it might work. It is without a doubt the most successful and prevalent model, but it is also an arguably limited one. The goal of this book is to explore the visions of the early hypertext pioneers, and in the process, to broaden our conception of what hypertext could or should be. I will define precisely what I mean by the word hypertext in the next chapter, but for now we will use Ted Nelson’s popular definition: ‘branching and responding text, best read at a computer screen’ (Nelson 1993).

    The first hypertext systems were deep and richly connected, and in some respects more powerful than the Web. For example, the earliest built system we will look at here – the oN-Line System (NLS) – had fine-grained linking and addressing capabilities by 1968. On the Web, the finest level of ‘intrinsic’ addressability is the URL, and if that page is moved, your link breaks. In NLS, the address was attached to the object itself, so if the document you were linking to moved, the link followed it. No broken links, no 404 errors, and the ‘granularity’ was as fine as sifted flour. Try doing that with HTML. As Stuart Moulthrop writes in Hegirascope (1997), HTML stands for many things (‘Here True Meaning Lies’), one of which is ‘How to Minimise Linking’.
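    The distinction here – a link that names a location versus a link that names the object itself – can be sketched in a few lines of Python. This is an illustrative toy under my own invented names (Library, by_path, by_id), not NLS’s actual addressing scheme: it simply shows why a location-based link breaks when a document moves, while an identity-based link follows it.

```python
# Illustrative sketch: location-based links (like URLs) versus
# identity-based links (addresses attached to the object itself).
# All names here are invented for the example.

class Library:
    def __init__(self):
        self.by_path = {}   # location-based lookup: a link stores a path
        self.by_id = {}     # identity-based lookup: a link stores a stable object ID

    def add(self, path, doc_id, text):
        self.by_path[path] = text
        self.by_id[doc_id] = text

    def move(self, old_path, new_path):
        # The document's location changes, but its identity does not.
        self.by_path[new_path] = self.by_path.pop(old_path)

lib = Library()
lib.add("/docs/report", doc_id="obj-42", text="Augmenting Human Intellect")
lib.move("/docs/report", "/archive/report")

print(lib.by_path.get("/docs/report"))  # None: the path-based link is now broken
print(lib.by_id.get("obj-42"))          # the identity-based link still resolves
```

    A Web link behaves like the first lookup: it breaks the moment the page moves. An NLS-style address behaves like the second: because it names the object rather than its location, the link survives the move.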

    These early systems were not, however, connected to hundreds of millions of other users. You could not reach out through FRESS and read a page hosted in Thailand or Libya. The early systems worked on their own set of documents in their own unique environments. Although Ted certainly envisioned that Xanadu would have the domestic penetration of the Web, and NLS had nifty collaborative tools and chalk-passing protocols, none of the early ‘built’ systems we look at either briefly or in depth in this book – NLS, HES, FRESS, WE, EDS, Intermedia or Storyspace – were designed to accommodate literally billions of users.¹ That’s something only the Web can do.

    As Jay Bolter put it in our interview:

    What the World Wide Web did was two things. One is that it compromised, as it were, on the ‘vision’ of hypertext. It said ‘this is the kind of linkage it’s always going to be, it’s always going to work in this way’, [but] more importantly it said that the really interesting things happen when your links can cross from one computer to another […] So global hypertext – which is what the Web is – turned out to be the way that you could really engage, well, ultimately hundreds of millions of users. (Bolter 2011)

    This story stops before the birth of mass ‘domesticated’ hypertext – before hypertext escaped its confines in university labs and wrapped itself around the globe like electronic kudzu to become the Web. It stops before we made up our mind about what hypertext is meant to look like. In that sense, the book is also a call to action. Hypertext could be different; it doesn’t have to be the way it is today. Imagine if we could read history in side-by-side parallel strips with visible interconnections, such as Xanadu would provide. Imagine a hypertext system with no broken links, with no 404 errors, with ‘frozen state’ addressability like NLS. Imagine if the link structure were separable from the content. Imagine if there were no artificial distinctions between readers and writers, such as HES would provide. As Ted has been saying for over 50 years, the computing world could be completely different.

    This book is about an era when people had grand visions for their hypertext systems, when they believed that the solution to the world’s problems might lie in finding a way to organize the mess of human knowledge, to create powerful tools for reading and writing: the ‘first generation’ of hypertext systems. These systems were prescient and visionary, but they were also useful. Most of this story took place well before I was born, and it terminates in the late ’80s. With the exception of Memex, which existed entirely on paper, the first generation of hypertext systems were mainframe-based; they were built on large, clunky machines that cost a small fortune and had to be kept in special air-conditioned rooms.

    The story changes in the mid-1980s with the emergence of several smaller workstation-based, ‘research-oriented’ systems (Hall et al. 1996, 14). I have included only two of these here, arguably the two that had the least commercial or financial success: Storyspace and Intermedia. I could have included several others from that period – most obviously NoteCards, Office Workstations Limited’s Guide, KMS, Microcosm, or Apple’s very popular HyperCard program released in 1987. The latter systems were all commercialized in some form, but HyperCard was the most successful, largely because it was bundled free with every Macintosh sold after 1987 (Nielsen 1995).

    HyperCard is the elephant in the pre-Web hypertext room; it popularized hypermedia before the Web and introduced the concept of linking to the general public. In some media accounts, one could be forgiven for thinking there was no hypertext before HyperCard (as New Scientist’s Helen Knight implies in her piece on pre-Web hypertext, ‘The Missing Link’ (2012, 45)). Actually computer-based hypertext had been around for twenty years before HyperCard, and the systems we will explore in this book were all built (or imagined) well before its release in 1987.

    If I had another ten thousand words to play with, I would include a chapter on Microcosm and Professor Dame Wendy Hall. Microcosm was a commercial application, but it deserves a mention here because it was also the harbinger of a new era: network culture. Microcosm was a pioneering open-linking service designed to deal with large-scale distributed hypertexts and (unlike Intermedia or Storyspace, for that matter, which still arguably belong to an earlier era) it was quickly incorporated into the Web. It was built by Wendy Hall, Gary Hill and Hugh Davis at the University of Southampton, and influenced by the ideas of Ted Nelson. As Wendy Hall put it to me:

    Microcosm started out as a research system. But we did commercialise it. We launched the company – then called Multicosm Ltd and later Active Navigation – in 1994 just as the Web was taking off. The company did well for a while but in the end was overtaken by events. (Hall 2012)

    Hall started her career in hypermedia and hypertext at Southampton in 1984, and is now one of the world’s foremost computer scientists. She is founding director, along with Professor Sir Tim Berners-Lee, Professor Nigel Shadbolt and Daniel J. Weitzner, of the Web Science Research Initiative. She became a Dame Commander of the British Empire in 2009, due in no small part to her contributions to computing science.

    These systems were all pioneering in their own right, but they are (to my mind, at least) part of a different story: the commercialization of hypertext and the birth of network culture. That is a different book for a different time, and perhaps a different scholar. As Stuart Moulthrop pointed out to me one day as we were rattling along a freeway in Melbourne, this story traces the paired antiparticle that disappeared in a puff of subatomic debris before the Web. The other antiparticle, the one that survived, smashed into the bright links and baubles of HTML. The systems I look at here were exciting and revolutionary in their era, but like mermaid gold from children’s storybooks, they turned to ashes when brought to the surface.

    This is not my story; it is the story of the hypertext pioneers who designed and built the first systems and wrote the first hypertexts – and I need to thank them now. First and foremost, I would like to thank Stuart Moulthrop. Stuart has been a mentor and an inspiration to me for over fifteen years, and he first set me on the forking path to hypertext theory by hosting me as a visiting researcher at the University of Baltimore in 1999. His work has been deeply influential, both on my own writing and on hypertext as a critical
