They've... seen things. Horrible things.
At first glance, the role of content moderator at Facebook may seem like a job where you just look at videos and posts all day, and decide if they’re appropriate. Tedious and repetitive, but nothing terrible. That first glance would be wrong. Because: humans. As it turns out, it can be a very mentally and emotionally taxing job.
The content moderators are tasked with reviewing often includes violent and sexually explicit material, such as murders committed by the likes of ISIS and drug cartels, as well as child pornography. The goal is to remove it as quickly as possible so users never see it. Unfortunately, that means someone has to do the viewing. Those "someones" are typically employed by outside contractors, working in poor conditions, many paid as little as $16 an hour (some get a little more).
Platforms like Facebook, Instagram, TikTok, and Twitter use a combination of software and human reviewers to spot and remove videos, photos, and other content that breaches their rules. That covers everything from posts by President Donald Trump to hate speech, terrorism, child abuse, self-harm, nudity, and drug abuse.
As you might imagine, seeing these images of real-life horrors can be extremely hard on the viewer. When one moderator told a supervisor she was having difficulty coping with the things she had seen, she was told there was a counselor she could talk to, but only once a quarter.
In a recent lawsuit settlement, Facebook agreed to pay $52 million in compensation to its current and former moderators. This came only after a moderator who had left the company broke the confidentiality agreement she had signed with Facebook.
To address the contractor issue and create a level of quality control across its workforce, TikTok has created its own in-house moderation group and has been pulling contract Facebook moderators into its fold. Recently, more than 25 people have left roles moderating Facebook content to join TikTok, according to LinkedIn. That number is expected to grow: TikTok has announced plans to add thousands of moderators over the course of 2021, as many as 10,000 by some reports.
This is not without controversy, of course. TikTok was famously embroiled in conflict with the Trump administration in 2020, in part over the influence the Chinese government has on the platform, and in part because TikTok users used the platform to prank a Trump rally earlier that year. Beyond that, internal documents obtained by The Intercept revealed the company has been censoring content that goes well beyond the horrific, including posts from people the company views as poor and/or physically unattractive.
"Livestreamed military movements and natural disasters, video that “defamed civil servants,” and other material that might threaten “national security” has been suppressed alongside videos showing rural poverty, slums, beer bellies, and crooked smiles. One document goes so far as to instruct moderators to scan uploads for cracked walls and “disreputable decorations” in users’ own homes — then to effectively punish these poorer TikTok users by artificially narrowing their audiences."
In some ways, while the moderators are getting full-time roles in what the company claims are better working conditions, it may be a case of out of the frying pan and into the fire.