Behind the Screen: What Happens to TikTok's Content Moderators?

Luis, a 28-year-old Colombian student, works through the night moderating videos on TikTok. He tries to sleep during the day, but the videos sometimes haunt his dreams.

He recalls a video, recorded at a party, showing two people carrying what appeared to be pieces of flesh. When they turned around, it seemed they were carrying flayed human facial skin and gristle. "The scariest part was that they were using the human faces as masks in their games," he said.

Luis listed the kind of content he regularly encounters: “Murder, suicide, paedophilic content, pornographic content, accidents, and cannibalism.”

Carlos, a former TikTok moderator, had nightmares about a video depicting child sexual abuse. He said the video showed a girl of five or six years old, filmed so close that she appeared to be dancing with her back to the camera.

As a father himself, he explained, it hit him extremely hard. He pressed pause, stepped outside to smoke a cigarette, and returned to the video queue a few minutes later.

As part of their daily duties, TikTok moderators in Colombia must sift through horrifying videos like these. They told the Bureau of Investigative Journalism of pervasive workplace stress with little psychological support, demanding or unachievable performance targets, punitive wage deductions, and excessive surveillance. Their ongoing attempts to unionize for better working conditions have been met with opposition.

With an estimated 100 million users in Latin America, TikTok has hired hundreds of moderators in Colombia to fight a never-ending battle against disturbing material. They work six days a week on day and night shifts and are paid as little as £235 per month, compared with around £2,000 per month for content moderators in the United Kingdom.

The workers interviewed by the Bureau were contracted through Teleperformance, a global services outsourcing firm with more than 42,000 employees in Colombia, making it one of the country's largest private employers. The nine moderators would speak only anonymously, for fear of losing their jobs or jeopardizing their future job prospects.

Neither TikTok nor Teleperformance responded to detailed lists of allegations for this article. Both issued statements expressing their commitment to the welfare of their staff.

A traumatizing job

TikTok's recommendation system is widely regarded as one of the most successful artificial intelligence (AI) applications in the world. It learns with almost unnerving precision what each user finds amusing or appealing, then serves them more content they are likely to enjoy.

However, TikTok's AI has its limits. The firm uses both humans and artificial intelligence to keep harmful content off its platform. And when content moderators at TikTok and other platforms flag a piece of content for removal, they do not simply delete it: they also record which specific policies were violated, data that can be used to train the platform's machine-learning models to recognize similar content in the future.
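In rough terms, each human decision doubles as a labelled training example. The sketch below illustrates the idea; all names (the `ModerationDecision` record, the policy labels) are illustrative assumptions, not TikTok's actual schema.

```python
# Hypothetical sketch: a moderator's decision becomes a labelled example
# that a policy-violation classifier could later be trained on.
from dataclasses import dataclass, field


@dataclass
class ModerationDecision:
    video_id: str
    removed: bool
    violated_policies: list = field(default_factory=list)  # e.g. ["graphic_violence"]


def to_training_example(decision: ModerationDecision) -> dict:
    """Turn a human moderation decision into a labelled training record."""
    return {
        "video_id": decision.video_id,
        # Removed videos carry the policies they broke; kept videos carry none.
        "labels": decision.violated_policies if decision.removed else [],
    }


example = to_training_example(
    ModerationDecision("v123", removed=True, violated_policies=["graphic_violence"])
)
print(example["labels"])  # prints ['graphic_violence']
```

Accumulated at scale, records like these are what let a model learn the boundary between allowed and prohibited content.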

The AI systems at some social media companies struggle even with relatively simple tasks, such as recognizing re-uploads of terrorist videos that have already been removed. The task becomes far harder when content that no one has ever seen before must be removed immediately. "The human brain is the most effective instrument for identifying harmful material," said Roi Carthy, chief marketing officer of L1ght, an artificial intelligence business specializing in content moderation. Humans become especially important when harmful content appears in novel formats and contexts that AI may not recognize.
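The "simple" task mentioned above, catching re-uploads of already-removed videos, is typically done by comparing content fingerprints rather than raw files. The toy sketch below assumes fixed-length bit-string hashes and a Hamming-distance threshold; production systems use perceptual hashing schemes such as PDQ or PhotoDNA, which are far more robust.

```python
# Minimal sketch of fingerprint-based duplicate detection: a candidate
# video's hash is compared against hashes of previously removed videos.

def hamming(a: str, b: str) -> int:
    """Count differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))


def is_known_duplicate(candidate_hash: str, banned_hashes: list, max_distance: int = 3) -> bool:
    """Flag content whose hash is near-identical to that of a removed video."""
    return any(hamming(candidate_hash, h) <= max_distance for h in banned_hashes)


banned = ["1010110010101100"]
print(is_known_duplicate("1010110010101101", banned))  # near-duplicate -> True
print(is_known_duplicate("0101001101010011", banned))  # unrelated -> False
```

Matching against a known blocklist is cheap; the hard problem Carthy describes is classifying genuinely new content, where no fingerprint exists to match against.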

Carthy stated, “Nobody understands how to address content moderation comprehensively, period. This does not exist.”

Carthy said that the existence of a low-paid, insecure global workforce may be compounding the problem. Video is more complex than photographs or text and demands far more computing power, which makes AI for video moderation very costly to develop.

“From a financial standpoint, content moderation AI cannot compete with $1.80 per hour,” Carthy added, referring to the average hourly income of content moderators in the global south. “If that’s the only factor you consider, then no AI content moderation business can compete with you.”
