Content moderation is nearly impossible at scale, and social media companies rely on people to review content when AI falls short. Facebook moderators have already received considerable media attention for the trauma caused by the images of hate and violence they encounter on the job.
Now Candie Frazier, a TikTok moderator, has filed a class-action lawsuit against the company, claiming that constant exposure to graphic videos has left her with post-traumatic stress disorder. She is demanding that the social media juggernaut (perhaps the largest site in the world) set up a medical fund for its moderators.
The class-action lawsuit, originally reported by The Verge, is filed against TikTok and its parent company, ByteDance. Frazier says she watched countless hours of traumatic videos involving cannibalism, rape, animal mutilation, and suicide, among numerous other disturbing acts.
According to Insider, "Despite being aware of how damaging the job of content moderation can be, the suit alleges that ByteDance and TikTok have not implemented industry-wide safety standards, such as disabling audio, minimizing, or blurring parts of the disturbing content. The suit also alleges that the companies do not provide adequate mental health support."
“Nikko – Hear no evil, speak no evil, see no evil” by KidMoxie is licensed under CC BY-SA 2.0.