
TikTok sued by moderator for PTSD: Who keeps social media safe?

Amrutha Pagad | Dec 30, 2021 | 17:57


How many hours did you spend on Facebook, Instagram, TikTok, Twitter, or YouTube today? Was your timeline blessed with videos and photos of cute cats and dogs? Did you laugh at a witty joke on Twitter? Did it bring a smile to your face? Chances are, you forgot all about the cute stuff within minutes.

But the internet is not all a happy place, of course. There are all sorts of weirdos and criminals in the same space as you. And the internet is dark and full of horrors.


But do you know about the people who sacrifice their mental health and happiness to keep your timeline clean, to ensure that you see only the happy, funny and cute stuff? You are as unlikely to know or appreciate the work of these internet scrubbers as you are to know who collects the trash from your house or neighbourhood to keep your surroundings clean.

CONTENT MODERATORS

Former content moderator sues TikTok over PTSD caused during work. Illustration: Seemon, DailyO

They are known as content moderators. Social media giants like Facebook and TikTok hire thousands of them, either directly or through outsourcing firms. Content moderators sift through thousands of pieces of multimedia content to determine whether each one stays up or comes down.

They go through the content, apply the policies or rules of the platform they work for, and decide whether something has violated those rules. The content they deal with is violent and graphic, including murder, rape, torture, and animal and child abuse, and it keeps coming for as long as they hold the job.

Sifting through this dark matter for 8 or 12 hours a day, depending on their shift, is not easy. In the latest such case, an outsourced content moderator who worked for TikTok has sued the social media giant and its parent company ByteDance in the US for causing her psychological trauma.


Candie Frazier said in her lawsuit that TikTok failed to protect her mental health while making her watch countless hours of traumatic videos involving rape, cannibalism, suicide and animal mutilation, among other disturbing things. She said that TikTok did not blur the graphic elements or even lower the resolution of the videos she had to watch.

The Washington Post reported that Frazier developed panic attacks, depression and symptoms of post-traumatic stress disorder (PTSD) because of her job.

Frazier is not the only one; several content moderators have spoken up in the past. In 2020, Facebook agreed to pay $52 million to its current and former American content moderators as compensation for mental health issues they developed because of the nature of the work.

In some cases, the job of content moderation has even turned deadly. In 2018, a man in Florida working for a company named Cognizant as a content moderator for Facebook died of a heart attack at his desk, reportedly due to the workload and the nature of the job. In a case from the Philippines, a content moderator hanged himself in his room, in front of his laptop, after his supervisor turned down three requests to transfer him out of the department.


You might forget the cute videos you saw within a few minutes, but these content moderators suffer from the violent things they viewed at work long after they leave the job.

Season 19, Episode 5 of the American satirical animated show South Park, titled Safe Space, portrays the burdens of the content moderation job. Butters, a primary school child on the show, is tasked with filtering comments for his classmate Eric Cartman after online trolls fat-shame Cartman and cause him mental distress.

However, everyone forgets about the distress being shifted onto Butters. What follows is an apt portrayal of the content moderation job: one that keeps users in their safe space while exposing the dirty underbelly of social media to the few doing the work.

The job of content moderation on the internet is important. Without it, you can expect your social media timelines to be filled with pornographic or violent content from top to bottom. Many say this is a job better left to AI, but we do not yet have artificial intelligence powerful or discerning enough to do it.
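To see why, consider a toy sketch of automated moderation (a hypothetical Python example, not how TikTok or any real platform works): a naive filter that flags posts on surface keywords alone has no grasp of context, which is precisely what human moderators provide.

    # A toy keyword-based "moderator" -- a hypothetical sketch, not any
    # real platform's system. It judges posts on surface text alone.
    BLOCKED_WORDS = {"attack", "kill", "blood"}

    def auto_moderate(post: str) -> str:
        """Remove a post if it contains any blocked word, else approve it."""
        words = set(post.lower().split())
        return "REMOVED" if words & BLOCKED_WORDS else "APPROVED"

    # Context-blind results: a harmless gaming post is removed,
    # while a genuinely menacing one sails through.
    print(auto_moderate("Our squad will attack the final level tonight"))   # REMOVED
    print(auto_moderate("Meet me behind the school and bring the others"))  # APPROVED

Real systems use machine-learning classifiers rather than word lists, but they inherit the same core weakness: judging intent and context is exactly the part that still needs a human in the loop.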

So, the next time you report something on any social media platform, do spare a thought for these content moderators. 

On the other hand, the social media giants can ease the burden by not stressing out content moderators, not forcing quotas on them, and not treating them as disposable workers. Perhaps, instead of treating content moderation as an unwanted social necessity, these companies can make it part of their core business and give these employees the safety nets they deserve.
