Social media have been the focus of much attention in recent weeks over their unwillingness, or tardiness, in applying their own rules. Whether it’s Twitter refusing to consider Trump’s aggressive verbal threats against North Korea to be in violation of their harassment policy, or YouTube belatedly removing a video by Logan Paul showing a suicide victim (Matsakis 2018, Meyer 2018), or US and UK government attempts to hold Facebook and Twitter to account over ‘fake news’ (e.g. Hern 2017a, 2017b), there is a growing recognition not only that ‘we’ are the data for these social media behemoths, but also that these platforms are optimised for this kind of abuse (Wachter-Boettcher 2017, Bridle 2017).
Nicholas Carr has just pointed to some recently published research which suggests that the presence of smartphones diverts our attention, using up cognitive resources which would otherwise be available for other activities, so that our performance on those non-phone-related activities suffers. In certain respects, this might not seem to be ‘news’ – we’re becoming increasingly accustomed to the problem of technological interruptions to our physical and cognitive activities: the way that visual and aural triggers signal new messages, new emails, new tweets arriving to distract us from the task in hand. However, this particular study was rather different.
In this case, the phones were put into silent mode so that participants would be unaware of any incoming messages, calls etc. (and if the phone was on the desk, rather than in their pocket or bag or in another room altogether, it was placed face-down to avoid any visible indicators) (Ward et al. 2017, 144). Despite this, the researchers found that
“… the mere presence of one’s smartphone may reduce available cognitive capacity and impair cognitive functioning, even when consumers are successful at remaining focused on the task at hand” (Ward et al. 2017, 146).
One theme that emerged from the CAA 2015 conference in Siena last week circled around the largely unstated question of whether the role of digital technology is to support or to substitute for current (traditional) archaeological practice. This featured particularly strongly in the day-long session organised by James Taylor and Nicolò Dell’Unto, ‘Towards a Theory of Practice in Applied Digital Field Methods’.
Nicholas Carr points to a new essay by Leon Wieseltier, former literary editor at The New Republic who has recently moved to The Atlantic, on the state of culture in the digital age. It’s a wide-ranging commentary on the impact of technology on culture:
“… words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes …”.
He talks of the way that information and knowledge have come to be treated as equal or equivalent, and of the expectation that knowledge requires a scientific approach, with the consequence that the humanities are viewed as less relevant. He argues that we have shifted away from a worldview in which humanity is at the centre of our universe to one in which impersonal forces – technologies – have become the key determinant.
Emma Bryce (2014) has recently written about her autistic brother’s interest in technology – something that is quite commonly associated with folk on the spectrum. I deliberately wound up a conference audience some years ago by characterising computer-usage amongst archaeologists as fetishistic, but I’m not about to claim that digital archaeologists are autistic. However, one phrase at the end of her article jumped out at me: that regardless of where we are, on or off the spectrum, we all use technology as a form of comfort and security.
“By its very structure, technology invites us to practice repetitive behaviours and keep familiar habits alive. It transports us to places we feel comfortable…”
Curiously, according to Bloomberg’s recent ‘The 85 Most Disruptive Ideas in Our History’, the microchip comes second to the first-place jet engine. And their justification seems stranger still – the way in which the jet shrank the world is perhaps fair enough, though the claim that for the first time the entire surface of the planet was reachable is open to question. Tell that to the likes of Alcock and Brown (first non-stop transatlantic flight, 1919), Macready and Kelly (first non-stop transcontinental flight, 1923), Smith and Nelson (first round-the-world flight, 1924) (and see Famous Firsts in Aviation for more in this vein). And yet in almost the same breath it is noted that jet engine technology has become remarkably static.
The digitisation of archaeology over the past twenty years or so could be said to be an unprecedented transformation of the subject. The move from field notebooks (or, quite literally in some cases, the backs of envelopes, receipts, bus tickets and the like) to site databases, the move from desktop recording and hand logging to digital data capture in the field, the move from local databases to distributed databanks, the introduction and development of CAD, GIS, 3D modelling, and a host of innovations such as agent-based modelling, reflectance transformation imaging, structure from motion, and increasingly refined and ‘intelligent’ search tools … all these would seem to support the idea of a digital transformation of the subject. The democratisation of technology appears to underline this – the fact that we have moved from a time when computers were in the hands of a few, usually academic, archaeologists to a situation in which everyone has a computer in their pocket, in their bag, and on their desk.