Human Content Filters
Behind the Screen

Professor Sarah T. Roberts explores the seldom-seen world of human content moderators who screen images and text to find and delete offensive, violent, pornographic or terrorist content from social media.

Sarah T. Roberts, a tenured professor at UCLA’s Graduate School of Education and Information Studies, focuses on the moderation of social media content. She reveals how social media’s unseen commercial content moderators edit the material on the internet. Roberts delves into the global impact of their efforts and the often unhealthy effect their work has on their mental state. She depicts a behind-the-scenes, unappreciated army of content moderators who sort out what you see and what everyone else sees, too. 

Content Moderators

The social internet depends on an invisible, poorly paid army of content moderators who regulate the user-generated content that appears on social media platforms. In 2018, Facebook indicated it would employ around 10,000 content moderators; Google said it planned to employ double that number.

Professional moderators were the front line of brand and user protection, although…next to nothing was being said about the nature of the work and how the hiring would be undertaken. Sarah T. Roberts

Every day, individual content moderators for social media sites view thousands of images, videos and texts, much of it containing violence, pornography, conspiracy theories, misinformation, and political or terrorist propaganda. Moderators must draw on linguistic and cultural competencies, a strong understanding of the social media company’s posting policies and guidelines, and knowledge of country-specific laws. They make rapid decisions about which content violates a site’s posting policies and which can remain online.

Money Makers

Tech companies tend to downplay and conceal the contributions of content moderators; they cite the need to protect proprietary secrets from competitors and to keep moderation policies opaque to uploaders looking to evade the rules.

Upstart platforms that come to market claiming they will provide a space without content moderation learn very quickly that it is a poor business decision. Sarah T. Roberts

Social media companies employ moderators to protect their brands and to create an enjoyable environment for users – that is, to make money.

Repetitive, Demanding Work

Content moderation work is repetitive, monotonous and never-ending. Moderators face the risk of psychological injury and other negative effects from viewing disturbing texts, images and video content.

It’s factory work, almost. It’s doing the same thing over and over. content moderator “Josh Santos”

At social media companies that make mental health services available, moderators report avoiding these services because they fear being stigmatized. Some report that talking about their experiences worsened their psychological symptoms. A few content moderators have sued Facebook and Microsoft, among other tech giants, for disability and damages.

Algorithms

An automated program called PhotoDNA limits the spread of child sexual exploitation images, but human moderators must first recognize the images, tag them and add them to PhotoDNA’s database. Once an image is in the database, PhotoDNA removes it automatically the next time someone posts it.
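The workflow Roberts describes amounts to checking each new upload against a database of fingerprints of previously tagged images. The short Python sketch below is purely illustrative and rests on assumptions: it is not PhotoDNA’s actual, proprietary algorithm, which uses a perceptual hash that tolerates resizing and other edits. A simple cryptographic hash stands in for that step, and the function and variable names are invented for this example.

import hashlib

# Illustrative only: a real system would use a perceptual hash that matches
# visually similar images; SHA-256 here just demonstrates the database workflow.
known_hashes: set[str] = set()  # fingerprints of images moderators have already tagged

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint (placeholder for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def moderator_tags_image(image_bytes: bytes) -> None:
    """Step 1: a human moderator recognizes the image and adds it to the database."""
    known_hashes.add(fingerprint(image_bytes))

def should_block(image_bytes: bytes) -> bool:
    """Step 2: future uploads are checked against the database and removed on a match."""
    return fingerprint(image_bytes) in known_hashes

The point of the sketch is the division of labor: the automated lookup only works on the second and subsequent uploads, because a human must do the initial recognition and tagging.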

That’s a question we get asked a lot: When is AI going to save us all? We’re a long way from that. Facebook head of global policy management Monika Bickert

A similar initiative, the eGLYPH project, attempts to find and delete terrorist content, but the difficulty of defining terrorism stymies its efforts. Similar issues plague the regulation of hate speech and false information. Users who post hateful content grow ever better at evading detection.

Second-Class Employees

“Max Breen,” a college graduate, worked as a content moderator for “MegaTech,” a California-based international tech giant. “Max Breen” and “MegaTech” are pseudonyms.

I can’t imagine anyone who does [this] job and is able to just walk out at the end of their shift and just be done. You dwell on it, whether you want to or not. content moderator “Max Breen”

Max soon learned he was part of an underprivileged class. Each day he walked through the MegaTech lobby, which featured a rock-climbing wall. Content moderators couldn’t climb it because the company’s health insurance plan didn’t cover the 10 content moderators. MegaTech didn’t even invite content moderators to the annual Christmas party.

Outsourcing

Tech companies rarely employ commercial content moderators directly; they turn to third parties who handle hiring, management and pay. The contracting firm TCX, for example, hired and managed Max and other members of his team. MegaTech outsourced content moderation work on pornography and spam to workers in India, which caused problems. For example, an American would deem a picture of a family beach holiday harmless, but an Indian worker might remove the image because it included people in bikinis.

While MegaTech…has a global reach and user base, its policies regarding user-generated content were developed in the specific and rarefied sociocultural context of educated, economically elite, politically libertarian and racially monochromatic Silicon Valley, USA. Sarah T. Roberts

The Philippines-based MicroSourcing offers workers who speak English and understand North American culture. These moderators work within special economic zones built by private companies. They must adjust to Western work hours, starting shifts in the evening and finishing in the early morning, and they work under the constant threat of losing their jobs to cheaper Indian contractors.

Some groups attempt to organize content moderators, but these workers live in far-flung places and work amid secrecy. And employers can easily find replacements if workers in one locale make demands.

Librarians

In December 2017, UCLA hosted the All Things in Moderation conference, where academics, activists, journalists, students and content moderators discussed freedom of expression, lawsuits, disability and other aspects of content moderation. A documentary about commercial content moderation, The Cleaners, screened at the Sundance Film Festival in 2018. And the Tech Coalition, made up of technology companies such as Apple, Facebook, Google, LinkedIn, Microsoft, PayPal and Twitter, issued the “Employee Resilience Guidebook,” which addresses issues commercial content moderators face.

We need to create and defend [these] vital spaces of information exchange….The future of American democracy depends on it. media scholar Shannon Mattern

Media scholar Shannon Mattern suggests a return to the professionals who traditionally help people meet their information needs: Librarians. According to Mattern, public librarians can help people navigate information overload.

An Unexpected View

Sarah T. Roberts provides an unexpected view into a profoundly influential, mysterious subculture. Interviewing workers on multiple continents, Roberts gained inside knowledge of this workforce. She writes compassionately about their schedules, situations and stresses and how they shape what you see daily. Roberts demystifies the processes by which social media companies exercise gatekeeping over their content – an issue of considerable political concern.

Compelling books about the internet include Global Free Expression by Benjamin Wagner; FutureNet by Sally Richards; and Custodians of the Internet by Tarleton Gillespie.
