Pleasing algorithms

[Image: datacenter. CC0, by Kewl via Pixabay]

Social media have been the focus of much attention in recent weeks over their unwillingness, or at best their tardiness, in applying their own rules. Whether it’s Twitter refusing to consider Trump’s aggressive verbal threats against North Korea to be in violation of its harassment policy, or YouTube belatedly removing a video by Logan Paul showing a suicide victim (Matsakis 2018, Meyer 2018), or US and UK government attempts to hold Facebook and Twitter to account over ‘fake news’ (e.g. Hern 2017a, 2017b), there is a growing recognition not only that ‘we’ are the data for these social media behemoths, but that these platforms are optimised for this kind of abuse (Wachter-Boettcher 2017, Bridle 2017).

Sean Parker, founding president of Facebook, recently gave a (former) insider’s view of social networks:

“The thought process that went into building these applications … was all about: ‘How do we consume as much of your time and conscious attention as possible?’ … It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.” (Allen 2017).

And the ultimate objective is to monetise those eyeballs through targeted advertising.
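Parker’s ‘social-validation feedback loop’ can be caricatured in a few lines of Python. The sketch below is purely illustrative: the names (`User`, `notify`, `simulate`) and the numbers are invented, and no real platform’s logic is implied. The point is simply that each reward of ‘likes’ raises the urge to return and post again, and every return generates more of the ad impressions being monetised.

```python
import random

class User:
    """Toy model of a user caught in a social-validation feedback loop."""
    def __init__(self):
        self.posting_urge = 0.2   # baseline probability of posting
        self.ad_impressions = 0   # what the platform ultimately monetises

    def notify(self, likes):
        # Each burst of 'likes' is a small reward that raises the urge to post again.
        self.posting_urge = min(1.0, self.posting_urge + 0.05 * likes)

def simulate(user, sessions=20):
    for _ in range(sessions):
        user.ad_impressions += 1            # every visit shows adverts
        if random.random() < user.posting_urge:
            likes = random.randint(0, 10)   # social validation arrives...
            user.notify(likes)              # ...and tightens the loop
    return user.ad_impressions

u = User()
print(simulate(u), round(u.posting_urge, 2))
```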

Archaeologists, along with everyone else, are caught up in this – and even if not everyone is a social media user, as Sean Parker says, “You know, you will be … We’ll get you eventually”. Colleen Morgan has recently written thoughtfully about whether we are complicit in this through encouraging or requiring students to use such tools as part of their studies (Morgan 2018) and concludes that “For now the pedagogical balance may fall on a structured, critical engagement with social media, but any use in the classroom needs to fully consider the monetization of content and personal information provided.”

Crucially, the implications of monetisation go beyond delivering advertisements and counting eyeballs, clicks, ‘engagements’, interactions, ‘likes’ and so on.

“Unlike older media, Facebook and Alphabet know essentially everything about their users, tracking them everywhere they go on the web and often beyond. By making every experience free and easy, Facebook and Alphabet became gatekeepers on the internet, giving them levels of control and profitability previously unknown in media. They exploit data to customize each user’s experience and siphon profits from content creators” (McNamee 2017).

This is what lies behind the concept of ‘filter bubbles’: the personalised news feeds, search results, and other algorithms used by social media firms shape what each individual sees and experiences, and it is this personalisation that has had such an impact on the ‘fake news’ debate.
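A filter bubble can be sketched in equally miniature form. In the toy personalisation routine below (all names and data are invented, and no actual ranking system is implied), items are scored by their overlap with what the user has previously engaged with; because every click feeds straight back into that profile, the feed converges on an ever-narrower band of content.

```python
from collections import Counter

def rank_feed(items, profile):
    """Order candidate items by overlap with the user's engagement profile."""
    def score(item):
        return sum(profile[tag] for tag in item["tags"])
    return sorted(items, key=score, reverse=True)

def click(item, profile):
    # Engagement feeds straight back into the profile that ranks the next feed.
    for tag in item["tags"]:
        profile[tag] += 1

profile = Counter({"archaeology": 1})
items = [
    {"title": "Dig diary", "tags": ["archaeology"]},
    {"title": "Roman roads", "tags": ["archaeology", "history"]},
    {"title": "Climate report", "tags": ["science"]},
]

for _ in range(3):                      # each session narrows the bubble
    feed = rank_feed(items, profile)
    click(feed[0], profile)             # the top-ranked item gets the click
print(profile)                          # 'science' never accumulates any weight
```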

But there is even more to it than this. The controversies surrounding Logan Paul, ‘fake news’, Twitter policies and so on tend to focus on the content creators, whether Trump, Paul, or the people behind children-oriented YouTube channels such as Toy Freaks or ToysToSee, and lose sight of the fact that the platforms are specifically designed to facilitate, and implicitly to support and encourage, these behaviours. As a channel owner told BuzzFeed,

“In terms of contact and relationship with YouTube, honestly, the algorithm is the thing we had a relationship with since the beginning. That’s what got us out there and popular … We learned to fuel it and do whatever it took to please the algorithm.” (Warzel & Smidt 2017).

Fuelling the algorithm generates huge levels of footfall, engagement, and consequently income for both creator and platform.
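‘Pleasing the algorithm’ is, in effect, optimisation against a black box. In the hypothetical sketch below, the scoring function and its weights are invented for illustration: the creator never sees them, only which upload performed better, and so trial and error gradually moulds their output to whatever the hidden ranking rewards.

```python
import random

def platform_score(video):
    """Stand-in for the platform's opaque ranking signal (weights invented)."""
    return 3 * video["watch_minutes"] + 2 * video["comments"] + video["likes"]

def make_variant():
    # A creator trying out another style of upload, summarised by its metrics.
    return {"watch_minutes": random.uniform(0, 10),
            "comments": random.randint(0, 50),
            "likes": random.randint(0, 200)}

# The creator cannot inspect platform_score; they can only compare outcomes,
# so repeated trial and error converges on whatever the algorithm favours.
best = make_variant()
for _ in range(100):
    candidate = make_variant()
    if platform_score(candidate) > platform_score(best):
        best = candidate
print(best)
```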

So what can we take from this? Leaving aside the ethical and political issues raised, what this highlights is that these algorithms are not simply something ‘done to us’ as end users. Far from it being a passive relationship, we actively feed them and in the process reinforce their demands and outcomes. Paradoxically, we may be powerless in the face of these algorithms, but those same algorithms rely on our actions and interactions to give them their power over us. Nor can we shrug this off as a feature of social media and allied platforms alone: the same can be said of pretty much any piece of software we use. We are increasingly dependent on our digital tools, and that dependence reinforces their power over us, a power fed by our continued use of them. Like the social media platforms, they are for the most part created and maintained by faceless designers and programmers whose objectives are not necessarily congruent with our own. What is true of the Facebook news feed is therefore equally true of ArcGIS, for instance. This could become an argument for using open code wherever possible, but the same circular power relations remain, albeit framed somewhat differently, underlining that transparency alone is not sufficient: there has to be a will to confront and understand algorithmic agency.

These thoughts were in part triggered by a short piece by Michael Sacasas in which he concludes:

“It is true, of course, that we will bend to the shape of any number of external realities: natural, social, and technological. To be human is to both shape and be shaped by the world we inhabit. But what is the shape to which our tools and devices encourage us to conform? Who or what do we seek to please by the way we use them?” (Sacasas 2017)

And, to paraphrase Sacasas, do these tools and devices sustain archaeology or erode it?

References

Allen, M. 2017 ‘Sean Parker unloads on Facebook “exploiting” human psychology’, Axios, 9th November, https://www.axios.com/sean-parker-unloads-on-facebook-2508036343.html

Bridle, J. 2017 ‘Something is wrong on the internet’, Medium, 6th November, https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2

Hern, A. 2017a ‘Can Facebook win its battle against election interference in 2018?’, The Guardian, 28th December, https://www.theguardian.com/technology/2017/dec/28/can-facebook-win-its-battle-against-election-interference-in-2018

Hern, A. 2017b ‘Facebook and Twitter threatened with sanctions in UK “fake news” inquiry’, The Guardian, 28th December, https://www.theguardian.com/media/2017/dec/28/facebook-and-twitter-threatened-with-sanctions-in-uk-fake-news-inquiry

McNamee, R. 2017 ‘How Facebook and Google threaten public health – and democracy’, The Guardian, 11th November, https://www.theguardian.com/commentisfree/2017/nov/11/facebook-google-public-health-democracy

Matsakis, L. 2018 ‘The Logan Paul “Suicide Forest” Video Should Be a Reckoning For YouTube’, Wired, 3rd January, https://www.wired.com/story/logan-paul-video-youtube-reckoning/

Meyer, R. 2018 ‘The Social-Media Star and the Suicide’, The Atlantic, 2nd January, https://www.theatlantic.com/technology/archive/2018/01/a-social-media-stars-error/549479/

Morgan, C. 2018 ‘Is it ethical to use social media for teaching archaeology?’, Middle Savagery, 2nd January, https://middlesavagery.wordpress.com/2018/01/02/is-it-ethical-to-use-social-media-for-teaching-archaeology/

Sacasas, M. 2017 ‘The shape of our tools, the shape of our souls’, The Frailest Thing, 11th December, https://thefrailestthing.com/2017/12/11/the-shape-of-our-tools-the-shape-of-our-souls/

Wachter-Boettcher, S. 2017 ‘How algorithms are pushing the tech giants into the danger zone’, The Guardian, 18th November, https://www.theguardian.com/technology/2017/nov/18/facebook-youtube-revenge-porn-science-and-tech-feature-sara-wachter-boettcher

Warzel, C. & Smidt, R. 2017 ‘Demonetized: YouTubers Made Hundreds Of Thousands Off Of Bizarre And Disturbing Child Content’, BuzzFeed, 11th December, https://www.buzzfeed.com/charliewarzel/youtubers-made-hundreds-of-thousands-off-of-bizarre-and