Source: Brickley, C. (2019). Secret Life of Moderator. The Verge. https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

What Does It Mean to Be an Online Platform Moderator?

Mamma-Mia
3 min read · Dec 30, 2020


Yes, no, yes, no. That is the image that comes to mind when we think of moderating an online platform: individuals at computers, simply clicking approve or disapprove on reported content. Yet Gillespie (2018) writes that this is not quite the case. Moderation is in fact a vast, industrial-scale operation handling an overwhelming amount of content (Twitter users alone generate an average of around 6,000 tweets a second, roughly 500 million a day (Krikorian, 2013)), all while enforcing the codes of conduct that social platforms require users to abide by.

While software and Artificial Intelligence (AI) can filter much of what is posted on these platforms, AI is not always equipped to identify subtle or borderline infringing content (Vincent, 2019). A human workforce is therefore still necessary, one that follows a strict set of moderation guidelines (Vincent, 2019), and social platform companies hire employees specifically to moderate content as a result.
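To make this division of labour concrete, here is a minimal sketch of how such a hybrid AI-plus-human pipeline might work. Everything in it is a hypothetical stand-in: the score_content() helper, both thresholds, and the queue names are illustrative assumptions, not any platform's real system.

```python
# A minimal sketch (assumed, not any real platform's system) of hybrid
# AI-plus-human triage: an automated classifier handles clear-cut cases
# and routes anything ambiguous to a human moderator's review queue.
from dataclasses import dataclass, field
from typing import List

AUTO_REMOVE_THRESHOLD = 0.95   # assumed: near-certain policy violations
HUMAN_REVIEW_THRESHOLD = 0.40  # assumed: too ambiguous for the machine

@dataclass
class Post:
    post_id: str
    text: str

@dataclass
class ModerationQueues:
    removed: List[Post] = field(default_factory=list)
    human_review: List[Post] = field(default_factory=list)
    approved: List[Post] = field(default_factory=list)

def score_content(post: Post) -> float:
    """Hypothetical classifier: returns a 0-1 probability that the post
    violates the code of conduct. A real system would call a trained
    model here; this toy version just counts banned terms."""
    banned_terms = {"violence", "hate"}
    hits = sum(term in post.text.lower() for term in banned_terms)
    return min(1.0, hits * 0.5)

def triage(post: Post, queues: ModerationQueues) -> None:
    """Route one post: auto-remove, human review, or leave up."""
    score = score_content(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        queues.removed.append(post)       # AI is confident: take it down
    elif score >= HUMAN_REVIEW_THRESHOLD:
        queues.human_review.append(post)  # borderline: a person decides
    else:
        queues.approved.append(post)      # clearly fine: leave it up

if __name__ == "__main__":
    queues = ModerationQueues()
    for p in [Post("1", "lovely sunset photo"),
              Post("2", "post spreading hate and violence")]:
        triage(p, queues)
    # Anything scoring between the two thresholds lands on a human's desk.
    print(len(queues.approved), len(queues.human_review), len(queues.removed))
```

The uncomfortable part of this arrangement is the middle branch: by design, everything the machine cannot decide, which is often the most graphic and disturbing material, is passed to a person.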

However, manually assessing this content comes at a cost. Because people attempt to use social platforms to spread damaging material, including radicalising, violent, hateful and pornographic posts, moderators are exposed to horrific images, videos and speech on a daily basis. Platforms such as YouTube and Facebook now require moderators to sign disclosures when they start the job, acknowledging the possible effect the work could have on their mental health (BBC News, 2020a). Many are therefore left with lasting trauma, and for some even scarring PTSD, which has led to content moderation being described as one of the worst jobs in the tech industry (Davidovic, 2019).

Source: IKangai. (2019). Stressed Moderator. Ikangai.com. https://www.ikangai.com/facebook-moderators-break-their-ndas-to-expose-desperate-working-conditions-the-verge/

Newton (2019), reporting for The Verge, originally brought these conditions to light when a former moderator came forward to tell her story. While working as a content moderator she was confronted with gruesome content, including a video of a man being killed; through continuous exposure, many moderators eventually became numb to such videos. The work eventually caused her to suffer panic attacks, among other mental health problems. This is not an isolated case either: another moderator said they now sleep with a gun next to their bed, out of fear, in reaction to what they had seen at work (Newton, 2019). Consequently, companies have recently had to start paying compensation to content moderators who develop mental health issues. In 2020, Facebook itself agreed to pay out $52 million (around £42 million) in compensation to its content moderation workers (BBC News, 2020b).

Developing an understanding of the horrific content and trauma that moderators expose themselves to causes me to ponder: if the reality of keeping online social platforms safe and moderated for users comes at such a cost, are we suggesting that our safety and wellbeing are of greater importance than the safety and wellbeing of those employed to protect us?

References:

BBC News. (2020a). Facebook and YouTube Moderators Sign PTSD Disclosure. Retrieved from https://www.bbc.co.uk/news/technology-51245616 [accessed 31 December 2020]

BBC News. (2020b). Facebook to Pay $52m to Content Moderators over PTSD. Retrieved from https://www.bbc.co.uk/news/technology-52642633 [accessed 31 December 2020]

Davidovic, I. (2019). The People Policing the Internet’s Most Horrific Content. Retrieved from https://www.bbc.co.uk/news/business-49393858 [accessed 31 December 2020]

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media. Yale University Press.

Krikorian, R. (2013). New Tweets per Second Record, and How!. Retrieved from https://blog.twitter.com/engineering/en_us/a/2013/new-tweets-per-second-record-and-how.html [accessed 31 December 2020]

Newton, C. (2019, February 25). The Trauma Floor: The Secret Lives of Facebook Moderators in America. The Verge. Retrieved from https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona [accessed 31 December 2020]

Vincent, J. (2019, February 27). AI Won't Relieve The Misery of Facebook's Human Moderators. The Verge. Retrieved from https://www.theverge.com/2019/2/27/18242724/facebook-moderation-ai-artificial-intelligence-platforms [accessed 31 December 2020]
