In the face of the controversy surrounding Facebook/Cambridge Analytica, and in part as a response to the loss of trust in big tech companies (for example, Chakravorti 2018 and Yao 2018), there has been some discussion seeking to revisit the original ideals of the World Wide Web and hypertext. Anil Dash recently suggested:
the time is perfect to revisit a few of the overlooked gems from past eras. Perhaps modern versions of these concepts could be what helps us rebuild the web into something that has the potential, excitement, and openness that got so many of us excited about it in the first place.
That seems a rather forlorn hope, perhaps, but his revisiting of core concepts such as ‘View Source’, ‘Authoring’, and ‘Transclusion’ rang bells in my mind and led me to exhume a paper I gave back in 2004 in a session on Archaeology and the Electronic Word at the ‘Tartan TAG’ conference in Glasgow (amazingly the programme and abstracts, if not the website, are still available via https://www.antiquity.ac.uk/tag). At that time, I suggested that discussion of hypertext within archaeology had been relatively limited, especially in relation to issues such as access, power, communication and knowledge (which admittedly overlooked the contributions on digital publication in Internet Archaeology 6 (1999), for instance). This was despite the number of archaeological theorists who were enthusiastic proponents of hypertext in archaeology.

For example, Ian Hodder wrote of enhanced participation and the erosion of hierarchical systems of archaeological knowledge, together with the emergence of a different model of knowledge based on networks and flows – an environment in which “interactivity, multivocality and reflexivity are encouraged” (Hodder 1999a; see also Hodder 1999b, 117ff). Michael Shanks wrote of the benefits of collaborative writing in his Traumwerk wiki (no longer available) and the new insights that such activity can throw up in an environment in which anyone could create and edit web pages on a particular topic, and add to or alter the content of any contribution. Cornelius Holtorf published a number of papers discussing electronic scholarship and his experience of creating and publishing a hypermedia thesis (e.g. Holtorf 1999, 2001, 2004).

In contrast, I proposed that there was a significant dislocation between the rhetoric and the reality – that what was actually being presented on our screens was masquerading as something which it was not, and that consequently there might be a utopian or even a fetishistic dialectic at work.
As archaeologists, we spend a great deal of time and effort looking at interfaces, be they between soil horizons or between cultural horizons, for instance. We pay rather less attention to the digital interfaces through which we access and analyse our evidence. And yet it is important that we do consider the nature of the negotiations that take place through the mediation of those interfaces. As Johanna Drucker has argued:
No single innovation has transformed communication as radically in the last half century as the GUI. In a very real, practical sense we carry on most of our personal and professional business through interfaces. Knowing how interface structures our relation to knowledge and behavior is essential. (Drucker 2014, vi, emphasis in original).
What’s not to like about the idea of central European folk dance being used as a means of illustrating the operation of different sorting algorithms? That’s what the Algo-rythmics did a few years ago – my personal favourite has to be the Quick Sort (below), with the hats changing with the operands, but do check them out (all six are on their YouTube page).
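For anyone who hasn’t met the algorithm the dancers are performing, Quick Sort can be sketched in a few lines of Python. This is a minimal illustrative version using list comprehensions rather than the in-place partitioning the dance choreographs, but the underlying idea – pick a pivot, split the rest into smaller and larger groups, and recurse – is the same:

```python
def quicksort(items):
    """Sort a list by recursively partitioning around a pivot."""
    if len(items) <= 1:
        return items  # a list of zero or one elements is already sorted
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x < pivot]    # dancers stepping to one side
    larger = [x for x in rest if x >= pivot]    # dancers stepping to the other
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 6, 1, 8, 2, 9, 4]))  # → [1, 2, 3, 4, 6, 8, 9]
```

The recursion mirrors what the choreography makes visible: each partition step sends elements left or right of the pivot, and the dance (like the algorithm) finishes when every sub-group is trivially sorted.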
In 1975 the computer scientist Edsger Dijkstra wrote “The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities.” Dijkstra was writing in relation to programming languages but the same might equally apply to the software products coded in those languages. In this, he recalls Marshall McLuhan’s famous dictum: “We shape our tools, and thereafter our tools shape us” (although the pedant in me insists that this was actually by John Culkin).
The title for this post is shamelessly borrowed from an article by James Somers (2015) in which he seeks to argue that programming languages shape the way their users think:
“Software developers as a species tend to be convinced that programming languages have a grip on the mind strong enough to change the way you approach problems—even to change which problems you think to solve.”
Lists are all the rage – and not just the traditional rundowns at New Year. In 2014 Paul Ford suggested that it might be possible to propose a software canon consisting of great works of deeply influential software that changed the nature of what followed. He suggested five: the office suite Microsoft Office, the image editor Photoshop, the videogame Pac-Man, the operating system Unix, and the text editor Emacs.
Earlier, in 2013, Matthew Kirschenbaum had come up with ten: like Ford, he listed Photoshop but there was no room for Microsoft Office (instead WordStar and VisiCalc) – and no Pac-Man (Maze War, Adventure, and Minecraft were the games in his list). There was no operating system, but he did include Mosaic, the first graphical web browser.
Engaging in this kind of debate is generally best done over a few drinks, but in their absence, what would be the software canon for archaeologists?
Vint Cerf, co-designer of the TCP/IP protocols that make the Internet work and vice-president and Chief Internet Evangelist for Google, warned last month (for example, here, here and here) about an information black hole into which digitised material is lost as we lose access to the programs needed to view it. Somewhat ironically, Google itself has recently seemed to be withdrawing from information projects which preserved the past – killing off archives, slowing down digitisation activities, removing the Timeline, and increasingly prioritising newness over older, more established sources in search results (Baio 2015).
Responses to the reporting of Cerf’s warnings were mixed. Some seemed relatively complacent: after all, we’re already preserving data and information in libraries and archives, aren’t we? And doesn’t using open file formats mean that bit rot is not a problem? In the process, many seemed to overlook part of Cerf’s argument – that there is a need to preserve old software and hardware so that we retain the ability to read files in their original formats: what he characterised as ‘digital vellum’.
A couple of articles have appeared in recent days which in different ways suggest that digital technology is too easy. Samuel Arbesman writes about how it is important to be able to see under the hood of the tools we use; Brian Millar suggests that simplicity of design takes away the satisfaction and confidence gained through mastery of the technology. In different ways, both link understanding with our ability to see past glossy interfaces designed to keep us at arm’s length from the guts of the process. Arbesman’s reminiscences about typing games programs from magazines into a VIC-20 and gaining an appreciation of coding and debugging certainly make me nostalgic – in my case it was working with my younger brother’s Commodore 64 which led directly to working as a research student on machines ranging from ICL and VAX mainframes, North Star Horizons and Sirius PCs running CP/M, to the original IBM PCs and MS-DOS, and using them (and more recent models!) to construct archaeological programs in FORTRAN, C and C++.