TikTok Can’t Filter New Lawsuit

Content moderation

Among social media platforms, TikTok is one of the most popular, particularly with younger users. For those unfamiliar with the platform, the app allows account holders to create and upload videos; users may also view, respond to, or react to the videos of others within a shared community. Despite its appeal, some of the individuals employed to review the platform’s incoming content have come forward to express dissatisfaction with their working conditions. Specifically, they accuse the company of failing to provide adequate mental health programs or support for employees who have developed anxiety and depression.

During the second quarter of 2021, about 81 million videos were deemed inappropriate, offensive, disturbing, or violent and were subsequently removed from the TikTok platform. As stated in the lawsuit filed in March, members of the company’s trust and safety team personally removed the majority of these videos, which they judged to violate company policies intended to protect viewers from illegal or graphic visuals. Only a small portion of the inappropriate content was removed by automated tools. That software is allegedly limited to certain content categories, forcing TikTok to continue relying on the judgment of human content moderators.

These moderators, however, are exposed to the full range of sensitive and disturbing videos, which may include conspiracy theories concerning COVID-19, politics, and world events. The employees are tasked with reviewing each video within 25 seconds. The plaintiffs in the lawsuit, which has reached class action status, also allege that wages have been withheld when shifts are not completed and that the weekly time originally allotted for personal wellness has been cut in half, from an hour to 30 minutes. The employees are seeking damages for the stress and psychological harm they have suffered and are asking the company to implement a medical monitoring program to assess moderator well-being.