Organisations such as the NSPCC have said TikTok is "acting too slow" to tackle self-harm and eating disorder content and must take "meaningful action."
Groups including the Molly Rose Foundation and the NSPCC have urged the firm to strengthen its content moderation policies, following the publication of research suggesting the app's recommendation algorithm surfaces self-harm and eating disorder content to teens within minutes of them showing an interest in these topics.
In a letter to TikTok's head of safety, the organisations have asked the app to improve its moderation of eating disorder and suicide content. The firm has been encouraged to:
- Work with experts to develop a “comprehensive” approach to removing harmful content.
- Support users who may be struggling with eating disorders or suicidal thoughts.
- Regularly report on the steps being taken to address these issues.
The letter is signed by more than 24 organisations, including the American Psychological Association and the US Eating Disorders Coalition. The groups claim TikTok has removed only 7 of the 56 coded eating disorder hashtags highlighted in research published in December 2022 by the campaign group the Center for Countering Digital Hate (CCDH).
In response to the report, TikTok said the research, which used fake accounts to test the responsiveness of the app's recommendation algorithm, did not reflect the experience or viewing habits of its real-life users.
Why does this matter?
The letter was published in the same week TikTok announced it would limit teenage users of the app to an hour of use each day. In the coming weeks, every account belonging to a user under the age of 18 will automatically be set to a 60-minute daily screen time limit.
TikTok said it consulted current academic research and experts from the Digital Wellness Lab at Boston Children's Hospital before choosing this limit. Once the limit is reached, teens will have to enter a passcode to continue using the app. If teens opt out of the 60-minute default and spend more than 100 minutes on TikTok in a day, they will be prompted to set their own daily screen time limit, building on a prompt rolled out last year to encourage teens to enable screen time management. The firm claims its tests found this approach increased the use of screen time tools by 234%.
TikTok says it has “robust” existing safety settings for teen accounts. For example, teens aged 13-15 have their accounts set to private by default, enabling them to make informed choices about what they choose to share. Direct messaging is only available to over 16s, and users have to be at least 18 to host a LIVE.
A spokesperson for TikTok said many people who struggled with, or were recovering from, eating disorders used the app positively for support.
“Our community guidelines are clear that we do not allow the promotion, normalisation or glorification of eating disorders, and we have removed content mentioned in this report that violates these rules. We are open to feedback and scrutiny, and we seek to engage constructively with partners who have expertise on these complex issues, as we do with NGOs in the US and UK,” said the spokesperson.
Full details of TikTok's latest Family Pairing features and screen time controls are available in TikTok's newsroom.