This post is the fifth in a series of six blogs which will document and critically engage with a workshop series hosted by Dr. David Geiringer (QMUL) and Dr. Helen McCarthy (Cambridge) under the title ‘Rethinking Britain in the 1990s: Towards a new research agenda’. Running between January and March 2021, the series brings together contemporary historians from a range of career stages to map existing work and stimulate new thinking on a decade which, from the perspective of our present times, looks very unfamiliar indeed.
by David Dahlborn (St. John’s College, University of Cambridge)
Do we historians, by privileging digital technology with its own theories and interpretations, risk replicating technologically deterministic narratives of ‘digital revolution’ in the 1990s and beyond? What makes digital technology special to the extent that it deserves its own history, alongside political history or cultural history? How is it fundamentally different from analogue technology, to the extent that it is considered epoch-making?
Having considered the many thoughtful contributions at last week’s workshop, I think Richard Barbrook and Andy Cameron’s remarkable 1996 essay ‘The Californian Ideology’ still holds up. I recommend it to contemporary historians, not least because some of the most lucid comments at the workshop leaned towards its conclusions. It argued that, from business owners’ perspectives, digital machines were a cheaper and faster version of analogue machines, able to produce more while paying less for labour and maintaining more control over workers. Stories of revolutionary utopian futurism were a Californian ideology, promoted by tech enterprises. Simultaneously, however, the digital machine’s unique capacity to copy information at almost zero cost contained the potential for an economy based on voluntary gift exchange rather than the controlled sale of commodities.
Because of this intimate connection between digital machines and work, James Baker’s opening proposal not to overstate the Internet’s significance, and to pay attention to workplaces instead, was appropriate. Baker noted how most Britons first encountered information technology (IT) at work and almost nowhere else, although this changed during the 1990s as managers began assuming that workers would learn computer skills at home. As Tom Lean added, work computers and home computers became more similar as the decade progressed.
In this blog post, I argue that historians, by asking what digital machines actually are, can go further and recognise the significance of many other digital machines in this period. At the same time, we should recognise that the ‘revolution’ of digital technology might itself be part of the Californian ideology. Finally, since Baker pointed out the importance of considering the motivations behind creating technological infrastructures, I will emphasise the investments by public and private capital which have shaped the technology that characterised the 1990s and our interpretations of it.
These investments far predated the 1990s. In 1985, Ursula Huws observed the effects of IT on women’s lives for Artificial Intelligence for Society, edited by Karamjit S. Gill. Computers and new technology would cut many jobs, she argued, while leaving others ‘simplified [and] casualized’. She predicted that electronic homework would increase as workers were ‘remotely monitored by machine’, with ‘pretty depressing implications for women’. She rejected as ‘completely Pavlovian’ the magazine advice to homeworking mothers on how to keep their children quiet with carrots and sticks while processing data on home computers. All this was possible before the Internet.
Notably, Huws and Gill worked for a period at the Greater London Council (GLC) under Ken Livingstone’s administration in the early 1980s. Similarly, the GLC Women’s Committee bulletin and the London New Technology Network’s Women’s Training Course Project Group, whom Baker and Francesca Sobande cite, were enabled by local government funds. The GLC also supported the Minorities Information Technology Awareness Group. Therefore, rather than ‘community action groups’, it was the GLC that produced many 1980s analyses of digital machines and labour. While Sobande uses GLC-funded work as a prehistory for analysing black women’s work in the social media industry in the 2010s, it came about within a region-wide London Industrial Strategy that followed the Labour left’s Alternative Economic Strategy and the Lucas Plan, under which digital machines would be used for improving workers’ standards, freedom and control.
Nor, as John Naughton’s response highlighted, should the influence of private capital be underestimated in shaping digital machines. He placed the Internet centre stage, with the 1990s characterised by its discovery by the ‘non-tech-savvy world’. Most of the Internet boom technology he mentioned was developed abroad, often with public capital, dating back to ARPANET, built by the US military, and the World Wide Web, invented at CERN. The significant systems developed partly with British capital and expertise were, he noted, the Joint Academic Network and the Global System for Mobile Communications (GSM, the 2G mobile phone network), both public sector. Nevertheless, the relationship between private and public capital was clear in his example of Netscape, a privately owned web browser launched for e-commerce in 1994, but only after the breakthrough of the US public sector’s Mosaic. His mentions of lastminute.com and FriendsReunited showed how much private companies feature in memories of this period. Meanwhile, singling out the pornography market as a driver of technological development was a reminder of popular demand as an agent of change – a point of continuity since the development of photography or, indeed, Palaeolithic cave art.
With this in mind, we might consider creative destruction a feature, not a bug, of capitalist economies. This places Lean’s emphasis on ‘failed technologies’, like CD-ROM encyclopedias, in a different interpretative context. He noted that the development of the digital machines that became everyday items in the late twentieth century was marked by remarkable historical contingency, using the pertinent example of BT’s Prestel videotex system. Similar to teletext, only interactive, Prestel launched in 1979 with many features later typical of the Internet; like Compunet, it was a significant path not taken. Notably, Prestel was established by a publicly owned BT and sold off after privatisation. How might this change interpretations of failure? We might also question the line increasingly drawn in the 1990s between digital and analogue: videotex and Ceefax (possibly, since 1974, the earliest mass digital interface most Brits chose to use) were combinations of digital and analogue machines.
John Agar presented a thought-provoking point by naming the mobile phone the most momentous 1990s machine. He called the GSM telephone network ‘the spectacular technological achievement of European bureaucracy’, and argued that the way mobiles, and the computer chips they contained, became mass items in the late 1990s anticipated their ubiquity and prime position in the 2010s attention economy. Similarly, the Tamagotchi, he argued, set the stage for caring for smartphones, symbolically naming the mobile era ‘the Tamagotchicene’. By stressing that few items have been significant enough to be carried on our person, like clothes or glasses, he presented the mobile phone’s popularity as an epochal shift. However, during the discussion, Lawrence Black made the point that the function of the mobile phone was itself prefigured by the Filofax (produced since 1921). There might, therefore, be more continuity between the pre- and post-digital than we might assume if we are looking for historical turning points.
None of the nostalgia in the late Tamagotchicene could replace experiencing Pokemania.
The area where historians can make many novel observations might, as Jane Winters argued, be how technology users experienced their machines. In contrast to a material approach, she cited Valérie Schafer’s work on early French experiences of the Web. Her observation was linked to her valuable methodological point that the way born-digital sources are preserved may misrepresent what waiting for a dial-up modem or visiting a games arcade was really like, as might be the case with oldweb.today or the Internet Archive’s games section. Born-digital sources are deceptive, she warned, as they may give impressions of completeness when only partially archived, or lack context. She called Web archives ‘cracked windows’ at best: incomplete and without functioning links. As a complement, she recommended printed manuals, website guides and web page print-outs, which might be the only way to view pages unavailable on the Internet Archive. A further methodological problem, raised by Baker, was the ethical question of privacy, as many GeoCities posts, although accessible today, were not written at a time when they were searchable.
Overall, however, an enormous amount of digital material beyond computers and mobiles permeated, or began to flood, the 1990s. Significant digital or hybrid technologies that received virtually no mention at the workshop included traffic control systems, barcodes, satellite television, CDs and synthesisers. Indeed, to Barbrook and Cameron in 1996, it was digital music, like jungle and techno, that anticipated a new gift economy. Furthermore, more households had satellite TV in 1996 (16.9 percent, or four million, rising to 29 percent in 1999) than had Internet in 1999. For years they received analogue signals transmitted by a digital machine – the Astra 1A satellite – shot into space from a French colony and owned by a company initially guaranteed by the Grand Duchy of Luxembourg. If a feudal remnant could back Europe’s premier satellite initiative, what might the GLC have founded had it enjoyed greater devolution?
On a serious note, the continued popularity of TV intersects with several of the issues raised here – firstly, the boundary between digital and analogue; secondly, the need to decentre computers and the Internet; thirdly, the considerable continuities between analogue and digital ‘worlds’; fourthly, the methodological problem of handling overwhelming amounts of material without full context; and, fifthly, the transnational and public-private crossovers inherent in satellite technology.
Finally, on a transnational point, I must mention that radical writers in the 1980s and 1990s observed something that we did not discuss: Europe’s digital machines utterly depended on cheap labour in Asia. Huws noted in 1985 how labour-intensive manual data entry, fundamental to US systems, was offshored to the Philippines – where it remained in 2014. Barbrook and Cameron, likewise, argued that creating machine slaves is impossible without human slavery. Seen from a Chinese factory, a Manila sweatshop or even a British call centre, the abundance of digital machines in the UK perhaps looks less like a ‘digital revolution’ than the continuing global expansion of state-led capitalism.
It is an important and popular fact that things are not always what they seem.
From James Dale Davidson and William Rees-Mogg’s The Sovereign Individual in 1997 to Yuval Noah Harari’s Homo Deus in 2015 (and in many cases before and since) we have been told stories about digital machines that are technologically determinist. Journalists, sociologists, historians, philosophers and self-proclaimed prophets have often assumed that new machines would automatically equal new times, new classes and new imaginary futures. Their implicit arguments make machines, and not humans, the subject of history and say there is no alternative to free-market capitalism. Therefore, a central digital narrative that contemporary historians can begin by recognising and criticising is the ‘revolution’ promised by the Californian ideology.