Discussion of digital ethics is very much on trend: for example, the Proceedings of the IEEE special issue on ‘Ethical Considerations in the Design of Autonomous Systems’ has just been published (Volume 107 Issue 3), and the Philosophical Transactions of the Royal Society A published a special issue on ‘Governing Artificial Intelligence – ethical, legal and technical opportunities and challenges’ late in 2018. In that issue, Corinne Cath (2018, 3) draws attention to the growing body of literature on AI and ethical frameworks and to debates over laws governing AI and robotics around the world, pointing to an explosion of activity in 2018, with a dozen national strategies published and billions in government grants allocated. She also notes that many of the leaders in both the debates and the technologies are based in the USA, which itself presents an ethical issue in terms of the extent to which AI systems mirror US culture rather than socio-cultural systems elsewhere around the world (Cath 2018, 4).
Agential devices, whether software or hardware, essentially extend the human mind by scaffolding or supporting our cognition. This broad definition therefore runs the gamut of digital tools and technologies, from digital cameras to survey devices (e.g. Huggett 2017), through software supporting data-driven meta-analyses and their incorporation in machine-learning tools, to remotely controlled terrestrial and aerial drones, remotely operated vehicles, autonomous surface and underwater vehicles, lab-based robotic devices, and semi-autonomous bio-mimetic or anthropomorphic robots. Many of these devices augment archaeological practice, reducing routinised and repetitive work in the office environment and in the field. Others augment work through data-driven methods which represent, store, and manipulate information in order to undertake tasks previously thought to be uncomputable or incapable of being automated. In the process, each raises ethical issues of various kinds. Whether agency can be associated with such devices can be questioned on the basis that they have no intent, responsibility or liability, but I would simply suggest that anything we ascribe agency to acquires agency, especially bearing in mind the human tendency to anthropomorphise our tools and devices. What I am not suggesting, however, is that these systems have a mind or consciousness themselves, which represents a whole different set of ethical questions.
In his book Homo Deus, Yuval Noah Harari rather randomly chooses archaeology as an example of a job area that is ‘safe’ from Artificial Intelligence:
The likelihood that computer algorithms will displace archaeologists by 2033 is only 0.7 per cent, because their job requires highly sophisticated types of pattern recognition, and doesn’t produce huge profits. Hence it is improbable that corporations or government will make the necessary investment to automate archaeology within the next twenty years (Harari 2015, 380; citing Frey and Osborne 2013).
It’s an intriguing proposition, but is he right? Certainly, archaeology is far from a profit-generating machine, but he rather assumes that it’s down to governments or corporations to invest in archaeological automation: a very limited perspective on the origins of much archaeological innovation. The idea that archaeology is resistant to artificial intelligence is, however, worth unpicking.
Social media have been the focus of much attention in recent weeks over their unwillingness, or tardiness, in applying their own rules. Whether it’s Twitter refusing to consider Trump’s aggressive verbal threats against North Korea to be in violation of their harassment policy, YouTube belatedly removing a video by Logan Paul showing a suicide victim (Matsakis 2018, Meyer 2018), or the US and UK governments attempting to hold Facebook and Twitter to account over ‘fake news’ (e.g. Hern 2017a, 2017b), there is a growing recognition not only that ‘we’ are the data for these social media behemoths, but that these platforms are optimised for this kind of abuse (Wachter-Boettcher 2017, Bridle 2017).
Nicholas Carr has just pointed to some recently published research which suggests that the presence of smartphones diverts our attention, using up cognitive resources which would otherwise be available for other activities, with the result that our performance on those non-phone-related activities suffers. In certain respects, this might not seem to be ‘news’ – we’re becoming increasingly accustomed to the problem of technological interruptions to our physical and cognitive activities: the way that visual and aural triggers signal new messages, new emails, new tweets arriving to distract us from the task in hand. However, this particular study was rather different.
In this case, the phones were put into silent mode so that participants would be unaware of any incoming messages, calls etc. (and if the phone was on the desk, rather than in their pocket or bag or in another room altogether, it was placed face-down to avoid any visible indicators) (Ward et al. 2017, 144). Despite this, they found that
“… the mere presence of one’s smartphone may reduce available cognitive capacity and impair cognitive functioning, even when consumers are successful at remaining focused on the task at hand” (Ward et al. 2017, 146).
“… words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes …”.
He talks of the way that information and knowledge have come to be treated as equivalent, and of the expectation that knowledge requires a scientific approach, with the consequence that the humanities are viewed as less relevant. He argues that we have shifted away from a worldview in which humanity is at the centre of our universe to one in which impersonal forces – technologies – have become the key determinant.
Emma Bryce (2014) has recently written about her autistic brother’s interest in technology – something that is quite commonly associated with folk on the spectrum. I deliberately wound up a conference audience some years ago by characterising computer use amongst archaeologists as fetishistic, but I’m not about to claim that digital archaeologists are autistic. However, one phrase at the end of her article jumped out at me: that regardless of where we are, on or off the spectrum, we all use technology as a form of comfort and security.
“By its very structure, technology invites us to practice repetitive behaviours and keep familiar habits alive. It transports us to places we feel comfortable…”
Curiously, according to Bloomberg’s recent ‘The 85 Most Disruptive Ideas in Our History’, the microchip comes second to the first-place jet engine. And their justification seems stranger still – the way in which the jet shrank the world is perhaps fair enough, though the claim that for the first time the entire surface of the planet was reachable is open to question. Tell that to the likes of Alcock and Brown (first non-stop transatlantic flight, 1919), Macready and Kelly (first non-stop transcontinental flight, 1923), and Smith and Nelson (first round-the-world flight, 1924) (and see Famous Firsts in Aviation for more in this vein). And yet in almost the same breath it is noted that jet engine technology has become remarkably static.
The digitisation of archaeology over the past twenty years or so could be said to be an unprecedented transformation of the subject. The move from field notebooks (or, quite literally in some cases, the backs of envelopes, receipts, bus tickets and the like) to site databases, from desktop recording and hand logging to digital data capture in the field, and from local databases to distributed databanks; the introduction and development of CAD, GIS, and 3D modelling; and a host of innovations such as agent-based modelling, reflectance transformation imaging, structure from motion, and increasingly refined and ‘intelligent’ search tools … all these would seem to support the idea of a digital transformation of the subject. The democratisation of technology appears to underline this – we have moved from a time when computers were in the hands of a few, usually academic, archaeologists to a situation in which everyone has a computer in their pocket, in their bag, and on their desk.