Content moderation for user-generated content
Lurking inside every website or app that relies on "user-generated content"--so, Facebook, YouTube, Twitter, Instagram, Pinterest, among others--there is a hidden kind of labor, without which these sites would not be viable businesses. Content moderation was once generally a volunteer activity, something people took on because they were embedded in communities that they wanted to maintain.
But as social media grew up, so did moderation. It became what the University of California, Los Angeles, scholar Sarah T. Roberts calls "commercial content moderation," a form of paid labor that requires people to review posts--pictures, videos, text--quickly and at scale.
Roberts has been studying the labor of content moderation for most of a decade, ever since she saw a newspaper clipping about a small company in the Midwest that took on outsourced moderation work.
One panel directly explored the human costs of this work. It paired two people who had done content-moderation work: Rasalyn Bowden, who became a content-review trainer and supervisor at Myspace, and Rochelle LaPlante, who works on Amazon Mechanical Turk and cofounded MTurkCrowd.com, an organizing platform for the people who work there. They were interviewed by Roberts and a fellow academic, the University of Southern California's Safiya Noble.
Bowden described the early days of Myspace's popularity, when the company was suddenly overwhelmed with inappropriate images, or at least images its staff thought might be inappropriate. It was hard to say what belonged on the platform because there were no actual rules. Bowden helped create those rules, and she held up to the crowd the notebook in which those guidelines had been kept.
"The workers may be structurally removed from those firms, as well, via outsourcing companies who take on CCM contracts and then hire the workers under their auspices, in call-center (often called BPO, or business-process outsourcing) environments," Roberts has written. "Such outsourcing firms may also recruit CCM workers using digital piecework sites such as Amazon Mechanical Turk or Upwork, in which the relationships between the social-media firms, the outsourcing company, and the CCM worker can be as ephemeral as one review."
Each of these distancing steps pushes responsibility away from the technology company and into the minds of individual moderators.