
Alarming findings from analysis of more than two million TikTok videos

TikTok users with eating disorders are 4000 per cent more likely to be shown harmful appearance-related videos than people who don’t have an eating disorder, new research shows.

Social media giants blasted for profiting from ‘evil’ and ‘bullying’

TikTok’s algorithm is bombarding people living with eating disorders with extreme appearance-related videos within minutes, intensifying their illness, damning research shows.

Melbourne University Professor Scott Griffiths said TikTok users with eating disorders didn’t even have to like a video to be served more harmful content.

Alarming analysis of more than two million TikTok videos found that the average length of time before a new user sees an appearance-related video is just three minutes.

“When we look at the algorithms of folks with eating disorders compared to folks without … for each and every video that they send, [it’s] about 4000 per cent more likely to send one that is a toxic eating disorder video than not,” Professor Griffiths said. They were also 384 per cent more likely to see diet content and 222 per cent more likely to see appearance-oriented videos.

Professor Griffiths said the algorithm was designed to maximise time people spent on the platform.

TikTok users with eating disorders didn’t even have to like a video to be pushed with more harmful content. Picture: Tim Pascoe

Studies show the average TikTok user globally spends 33 hours, 38 minutes a month on the platform, with even higher usage of about 53 hours a month recorded among those under 18 years.

People with eating disorders may take longer to skip over a video due to the anxiety it may cause.

However, even if they don’t proactively like the video, their time watching sends a signal to the algorithm to continue to deliver more of that content.

The like button has much less influence over the content people see than time spent watching.

Professor Griffiths said the way their algorithms were wired meant social media companies had shifted the culpability on to themselves.


He said it would be different if users were entirely in control of what they saw based on what they search or like.

However, the platform has de-emphasised those signals and made it about the time spent watching.

“For some users, who I think are in a better headspace, the experience over that algorithm is going to be great,” Professor Griffiths said.

“It’s going to find the kind of content that resonates with them. It’s educational, they find it funny, et cetera. But if you’re in a bad headspace, say, you’re really an anxious person, you’re wondering if, for example, do I have ADHD, or you don’t like the size and shape of your body, well, that very same algorithm might unwittingly deliver you content that achieves the goal of making you spend longer on the platform. But the emotions driving it are negative, not positive.”

TikTok last year introduced a feature that loosens the algorithm’s personalisation, meaning users can be shown a broader array of content, but it does not completely reset the algorithm.

Professor Griffiths said users should be given more insight into what their algorithm had earmarked for them, and greater control over what they did and didn’t see.

He said social media companies should be taken to task for toxic content not being removed.

On Friday TikTok’s Australian director of public policy Ella Woods-Joyce told a parliamentary committee into the impacts of social media: “We try to give users tools to guide their own journey and make sure that they’re having an experience that’s safe for them and reflective of their own priorities and sensitivities.”

‘It felt difficult and impossible at times to escape’

Queensland dietitian Alex Rodriguez says toxic diet and fitness culture on social media fed his obsession to be muscular and achieve the “ideal male body”.

Alex Rodriguez says he was obsessed with achieving the ‘ideal male body’. Picture: David Clark

Mr Rodriguez – whose eating disorder started when he was 11 years old – said he was an anxious and obsessive kid with very low self-esteem, who was being bullied at school about his weight.

“My parents were getting divorced around the same time … so essentially, I didn’t feel safe in my body and in my life, and I felt like everything was out of control,” he said.

“I turned towards excessive dieting, excessive exercising, and obsessing online about how to lose weight and how to eat ‘healthily’.”

Mr Rodriguez, 26, ended up in hospital at the end of 2012 and was diagnosed with anorexia nervosa.

He said platforms such as Instagram made recovery during his late teens and early 20s hard because he based his self-worth on how closely his body resembled pictures online.

“Whenever I picked up my phone and scrolled Instagram, that is predominantly what I was exposed to, and that just fed the fire,” Mr Rodriguez said. “Disordered eating is so normalised and so murkily blended in with our concept of healthy eating.

“There are people who are using anabolic steroids online, who rarely ever admit to it, the amount of editing and lighting and filters that go into photos.

“Accounts and content like this really prey on people who are vulnerable, like I was, because we’re so anxious.

“It definitely did feel difficult and impossible at times to escape.”

Other triggering content included people with low body fat percentages urging others to “accept” and be “kind to your body”.

Mr Rodriguez said users needed education on how to manage the algorithms.

But both influencers and the platforms had to be held accountable for the spreading of dangerous information.

Mr Rodriguez ended up in hospital at the end of 2012 and was diagnosed with anorexia nervosa. Picture: David Clark

‘Whole other side of Instagram that people don’t know about’

A Victorian woman who has recovered from anorexia has sought to blow the whistle on what she calls the “dark web” where inappropriate and triggering content is being posted on private accounts and not taken down by Instagram.

Searches for hashtags such as #thinspo and #proana trigger a public health message and direct users to services such as the Butterfly Foundation for help.

But extreme content is being posted under the guise of hashtags such as #recovery, #selflove and #bodyimage because the users want to grow their following.

She said the graphic content she had seen since joining in 2018 included people posting body checks, removing their nasogastric tubes while receiving treatment in hospital, self-harm and the promotion of negative behaviours.

“There’s this whole other side of Instagram that people don’t know about,” she said, adding that hundreds of girls were involved.

“Everyone is trying to compete with each other.

“It genuinely is one of the main barriers to people getting better.”

The woman acknowledged that there were positives from the community, such as forming relationships with people who understood what she was going through, as her eating disorder had meant she’d lost touch with school friends.

She said people even called emergency services for those they thought were in a bad place.

But she said the negative side was “so harmful” and had forced her to take a step back at points because she was aware that she could relapse.

“I’ve reported stuff before and it still is up there,” she said. Following her recovery from anorexia binge-purge subtype, she urged health professionals to make social media part of the conversation, saying people were seeking validation.

She said if users posted too much triggering content, they should initially receive a warning, with continued posts resulting in their accounts being deleted.

Joyce says tech billionaires should be held accountable

Social media companies must use artificial intelligence to stop appearance-based bullying on their platforms, Nationals MP Barnaby Joyce says.

Mr Joyce, who has seen first-hand the “mind-twisting” that eating disorders can do, says Australia needs to work closely with the US to hold the tech billionaires accountable.

Barnaby Joyce says artificial intelligence must be used to stop appearance-based bullying. Picture: Martin Ollman

He has backed a minimum age of 16 for social media, saying users need to be able to discern what is wrong and have the capacity to act on it.

He said social media was “an addiction” like poker machines and cigarettes and billionaires were making profits from the vulnerable.

“I’m sure the new realm of AI is able to comprehend when someone says ‘You dirty skank, you should lose weight, you’re nothing but a fat pig’,” he said.

“Tech billionaires are making billions from their platform … the acumen is out there for the code to be written to the oversight of these platforms. If they’re unable to do that then why are they able to be in the public square?”

Originally published as Alarming findings from analysis of more than two million TikTok videos


Original URL: https://www.goldcoastbulletin.com.au/news/national/alarming-findings-from-analysis-of-more-than-two-million-tiktok-videos/news-story/62b3b891e262d1a113e2e3230fedd758