Watching less than 10 minutes of videos on TikTok can negatively impact a woman’s body image, a new study has found, as experts call for stricter regulation of harmful content on social media.
Eating disorder experts said the damaging content they had flagged to TikTok was not covered by the platform’s guidelines and was rarely taken down.
Researchers led by Charles Sturt University surveyed 273 women aged 18 to 28 on their body image and beauty standards, then split the sample into two groups. Each group watched a different eight-minute compilation of TikTok videos.
One group watched TikTok content that included videos glorifying eating disorders, sometimes described as “pro-anorexia” content, as well as “fitspiration” videos of women exercising and promoting weight-loss methods.
The other group watched nature, cooking and comedy videos.
Both groups reported a decrease in body image satisfaction after watching the eight-minute compilation of videos, but those exposed to pro-anorexia content had the biggest decrease and also showed an increase in internalisation of unrealistic beauty standards.
The findings were published in the journal PLOS One on Wednesday, with the authors concerned that young women exposed to pro-anorexia content on TikTok potentially faced an increased risk of developing disordered eating.
The significant harm posed by even short durations of content raised concerns for people using the app for hours every day, according to the study co-author Dr Rachel Hogg, a senior lecturer in psychology.
“You don’t have to consume much content for it to have a negative impact. We know that most people are spending a lot longer on TikTok than participants did in our study,” she said.
A spokesperson for TikTok said the app was inclusive and body-positive and did not allow content depicting or normalising eating disorders. They added that users could reset their TikTok feed and would be redirected to appropriate advice if they searched for such content.
The platform has removed hashtags related to eating disorders and directs users to the helpline for the Butterfly Foundation, a charity offering support for people with eating disorders or body image issues.
The Butterfly Foundation’s head of communications, Melissa Wilton, said TikTok had received advice on appropriate guidelines but rarely removed harmful content when it was flagged by the charity.
“There is harmful content on TikTok’s platform, and I think it’s disingenuous of them to say otherwise,” she said.
“The guidelines are being enforced. The problem, I would say, lies with what the guidelines cover.”
High school student Katya Jaski developed an eating disorder at age 13, which she said was only exacerbated by social media.
“When you come across a video of a slim girl showing an unrealistic ‘what I eat in a day’ video, for example, and you watch that, the content will then appear more frequently on your feed,” Jaski said.
“At my lowest … pretty much every video on my feed would be a very sick, malnourished young girl in hospital,” she said. “It makes it seem like everybody is so much thinner than you.”
Jaski, now 17, has joined the Butterfly Foundation and other experts in calling for greater regulation of social media platforms in recognition of the potentially harmful nature of pro-anorexia and weight loss-focused content.
In 2023 TikTok was warned it was not moving swiftly enough to strengthen its content policies after a study found its recommendation algorithm pushed self-harm and eating disorder content to teenagers within minutes of them expressing interest in the topics.
“TikTok and other platforms [need] to be able to identify what might be problematic,” Wilton said. “If it’s generally going to cause harm, then it needs to come down.”
Governments across Australia are exploring banning social media for some young people to protect them from harmful content and the negative impacts of excessive time online.