A number of charitable and other nonprofit organizations have urged video-sharing platform TikTok to do more to protect children by strengthening its moderation of content relating to suicide and eating disorders.
The call came in an open letter to TikTok’s head of safety, Eric Han. It was signed by almost 30 groups, including the Center for Countering Digital Hate (CCDH), the American Psychological Association, the UK’s National Society for the Prevention of Cruelty to Children, and the suicide prevention organization the Molly Rose Foundation.
TikTok’s algorithm pushes content about self-harm and eating disorders to teenagers within minutes of their expressing interest in the topics, according to research published by the CCDH in December. Within 2.6 minutes, the platform recommended suicide-related content to accounts that CCDH researchers had set up to pose as teenagers; within eight minutes, it served up content related to eating disorders. Videos about body image and mental-health issues were recommended every 39 seconds.
The organizations accuse TikTok of failing to act swiftly enough in response to the concerns raised by the CCDH report. In their letter, they urged the platform to take “meaningful action,” including improving the moderation of content relating to eating disorders and suicide; working with experts to develop a “comprehensive” approach to identifying and removing harmful content; providing support for users who may be struggling with suicidal thoughts or eating disorders; and offering more transparency about, and accountability for, the steps it is taking to address the issues and the effects those steps are having.
The letter noted that TikTok had removed only seven of 56 hashtags related to eating disorders that were highlighted by the CCDH research. Content containing those hashtags had received 14.8 billion views as of January 2023, including 1.6 billion views since the report was published, the center said.
“Since CCDH’s report was released in December 2022, you have chosen to deny the problem, deflect responsibility and delay taking any meaningful action,” the organizations said in the letter.
“You were presented with clear harms but continue to turn your backs on the young users you claim to protect. Your silence speaks volumes.”
This month, TikTok announced that teenagers on the platform would be limited to one hour of use each day. It said the limit was set after consulting the Digital Wellness Lab at Boston Children’s Hospital.
However, users can override the setting when their 60 minutes are up by entering a passcode that allows them to continue using the app, a step TikTok said requires “them to make an active decision.”
“TikTok’s business model is to broadcast content produced by creators to viewers, using algorithms that individually optimize the addictiveness of the content, all so that they can ultimately serve those viewers ads,” said CCDH CEO Imran Ahmed.
“The stakes are too high for TikTok to continue to do nothing, or for our politicians to sit back and fail to act. We need platforms and politicians to have parents’ backs but right now they’re putting profits before people.”
Other organizations that signed the letter included Free Press, the Youth Power Project, the Real Facebook Oversight Board, and the Tech Transparency Project.