Social media platforms have been the focus of much attention in recent weeks over their unwillingness, or tardiness, in applying their own rules. Whether it’s Twitter refusing to consider Trump’s aggressive verbal threats against North Korea to be in violation of their harassment policy, or YouTube belatedly removing a video by Logan Paul showing a suicide victim (Matsakis 2018, Meyer 2018), or US and UK government attempts to hold Facebook and Twitter to account over ‘fake news’ (e.g. Hern 2017a, 2017b), there is a growing recognition not only that ‘we’ are the data for these social media behemoths, but that these platforms are optimised for this kind of abuse (Wachter-Boettcher 2017, Bridle 2017).
As archaeologists, we spend a great deal of time and effort looking at interfaces, be they between soil horizons or between cultural horizons, for instance. We pay rather less attention to the digital interfaces through which we access and analyse our evidence. And yet it is important that we do consider the nature of the negotiations that take place through the mediation of those interfaces. As Johanna Drucker has argued:
No single innovation has transformed communication as radically in the last half century as the GUI. In a very real, practical sense we carry on most of our personal and professional business through interfaces. Knowing how interface structures our relation to knowledge and behavior is essential. (Drucker 2014, vi, emphasis in original).
What’s not to like about the idea of central European folk dance being used as a means of illustrating the operation of different sorting algorithms? That’s what the Algo-rythmics did a few years ago – my personal favourite has to be the Quick Sort (below), with the hats changing with the operands, but do check them out (all six are on their YouTube page).
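For anyone who wants to follow the dance in code rather than in hats, a minimal quicksort sketch might look like this (in Python for brevity, rather than the FORTRAN or C of my research-student days; the dance itself uses in-place swaps, whereas this version builds new lists for clarity):

```python
def quicksort(items):
    """Sort a list by recursively partitioning around a pivot."""
    if len(items) <= 1:
        return items  # a list of zero or one elements is already sorted
    pivot = items[0]  # the dancer marked out as the pivot
    smaller = [x for x in items[1:] if x < pivot]   # everyone to the pivot's left
    larger = [x for x in items[1:] if x >= pivot]   # everyone to the pivot's right
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 0, 1, 8, 7, 2, 5, 4, 9, 6]))
```

The recursion mirrors what the dancers act out: pick a pivot, separate the rest into those who come before it and those who come after, and repeat within each group.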
Within the last ten days, we’ve been reminded about the invisibility of algorithms which govern much of our online activity. We’ve seen Google alter its search ranking algorithm to prioritise mobile-friendly sites in its search results, Facebook change its newsfeed algorithm to give greater precedence to posts from friends (who’d have thought it?!), and the French Senate vote to require search engines to reveal the workings of their search ranking algorithms to ensure they deliver fair and non-discriminatory results. There’s also been discussion of the role of trading algorithms in the 2010 ‘flash crash’ and stock market movements in the last month or so in the US …
In 1975 the computer scientist Edsger Dijkstra wrote “The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities.” Dijkstra was writing in relation to programming languages, but the same might equally apply to the software products coded in those languages. In this, he recalls Marshall McLuhan’s famous dictum: “We shape our tools, and thereafter our tools shape us” (although the pedant in me insists that this was actually by John Culkin).
The title for this post is shamelessly borrowed from an article by James Somers (2015) in which he seeks to argue that programming languages shape the way their users think:
“Software developers as a species tend to be convinced that programming languages have a grip on the mind strong enough to change the way you approach problems—even to change which problems you think to solve.”
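A trivial illustration of Somers’s point (my sketch, not his): even within a single language, the idioms a language makes available shape how a problem is framed. Here the same task is written first in the imperative style one might carry over from C or FORTRAN, then in the declarative style Python itself encourages:

```python
# Imperative style: spell out the steps, as one might coming from C or FORTRAN.
def sum_of_squares_loop(numbers):
    total = 0
    for n in numbers:
        total += n * n
    return total

# Declarative style: describe the result, letting the language do the iteration.
def sum_of_squares_expr(numbers):
    return sum(n * n for n in numbers)

print(sum_of_squares_loop([1, 2, 3]))
print(sum_of_squares_expr([1, 2, 3]))
```

Both compute the same answer, but the second frames the problem as a transformation of a collection rather than a sequence of mutations – exactly the kind of shift in problem-framing Somers describes.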
A couple of articles have appeared in recent days which in different ways suggest that digital technology is too easy. Samuel Arbesman writes about how important it is to be able to see under the hood of the tools we use; Brian Millar suggests that simplicity of design takes away the satisfaction and confidence gained through mastery of the technology. In different ways, both link understanding with our ability to see past glossy interfaces designed to keep us at arm’s length from the guts of the process. Arbesman’s reminiscences about typing games programs from magazines into a VIC-20 and gaining an appreciation of coding and debugging certainly make me nostalgic – in my case it was working with my younger brother’s Commodore 64 which led directly to working as a research student on machines ranging from ICL and VAX mainframes, North Star Horizons and Sirius PCs running CP/M, to the original IBM PCs and MS-DOS, and using them (and more recent models!) to construct archaeological programs in FORTRAN, C and C++.
There’s a lot of debate in the wider world about digital data – issues of access and privacy, the case of Aaron Swartz and open access to knowledge, the Ed Snowden revelations, and, at the personal level, the way that we all leave data trails behind as we traverse the Internet. Surrendering our personal data is difficult to avoid, even if we forswear Facebook, Google, and their like who build their business models on their ability to capture data about us.
In a recent paper, Richard Mortier et al. (2015) argue that this new world of data requires a new kind of study – human-data interaction – looking at the implications of the data we generate, knowingly or unknowingly, in all kinds of different ways.