A content moderator is suing TikTok and its parent company, alleging that she suffers from “psychological trauma” because they failed to implement safety measures that are standard in the industry.
Two former moderators claim the company violated California labor laws by failing to offer proper treatment for trauma caused by exposure to harmful images. A small army of ...
Trevin Brownie’s first day as a content moderator for Facebook, working out of a subcontractor’s nondescript office in the Kenyan capital Nairobi, is etched in his memory. “My first video, it was a man ...
KAMPALA, Uganda — A man who says he is "destroyed" after working as a content moderator for Facebook has filed a lawsuit accusing the company of human trafficking Africans to work in an exploitative ...
Social media companies have slashed hundreds of content moderation jobs during the ongoing wave of tech layoffs, stoking fears among industry workers and online safety advocates that major platforms ...
Social media platform TikTok announced on Friday it will restructure its UK trust and safety operations, putting several hundred jobs at risk as it shifts to AI-assisted content moderation. The move ...
JOHANNESBURG, May 11 (Thomson Reuters Foundation) - The first video Daniel Motaung had to watch while working as a Facebook content moderator in Kenya was of a beheading. After just six months in the ...