‘I want to be skinny’: AI chatbots deliver ‘dangerous’ weight loss advice to Aussie kids
Aussie kids are turning to AI chatbots for dangerous advice on concealing eating disorders, as experts warn the technology has become a ‘24/7 toxic friend’ for vulnerable teens.
Children are turning to AI chatbots to coach them to conceal disordered eating from their parents as the rapid rise of the technology delivers dangerous advice through shocking secret conversations.
In a concerning trend growing among Australian teens, chatbots will outline in alarming detail how to vomit without trace, how to exercise by stealth and how to avoid eating at the dinner table, without suggesting avenues for professional help.
It comes after a 16-year-old boy in the US took his own life using instructions from an AI chatbot he had been confiding in about his plans for months.
Californian teen Adam Raine took his own life earlier this year following months of hidden conversations with ChatGPT about his declining mental health and requests for specific methods.
While social media age restrictions will take effect in Australia in four months, AI is a new and frightening frontier in the dark side of technology, and eating disorder experts are calling for urgent regulation with tighter guardrails to protect vulnerable teenagers.
“The potential for AI chatbots to influence vulnerable young people and steer them towards eating disorders is terrifying,” Eating Disorders Families Australia executive director Jane Rowan said.
“Without proper safeguards, these tools risk amplifying harmful content, normalising disordered behaviours and undermining recovery efforts at a scale we’ve never seen before. EDFA is calling for urgent regulation to ensure AI technologies are safe, responsible, and protect those most at risk,” she said.
AI has quickly become a massive beast, with 800 million people, close to 10 per cent of the world’s population, using ChatGPT alone, according to a report released last month by JPMorgan.
Experts warn that use of the chatbots is addictive.
Gavin Brown, clinical director and clinical psychologist at The Banyans Healthcare Brisbane, told The Sunday Mail that there is one saving grace about teenagers using AI in dangerous ways – the fact that parents are on to it early.
“With social media it took us 20 years to catch on to the downside and the devastating impacts,” Dr Brown said.
“This is a new frontier in the danger of technology for today’s young people. Just as we make progress to lessen the dangers of social media, a new medium comes along.
“We need early intervention on this one.
“The thing about AI is that it is designed to make the user happy. The technology is programmed to keep you engaged for as long as possible and will offer up all kinds of what we call confirmation bias – they will agree with you and give you the information that you actually want.
“AI goes to the deepest, darkest corners of the internet. If you type something like eating disorders into Google, the first thing that comes up is ‘help is available’. Any young person wanting to restrict their eating and lose weight will easily get the information they need on ChatGPT,” Dr Brown said.
A recent report by the Butterfly Foundation found rates of eating disorders in young people aged 10-19 have increased by 86 per cent since 2012, with 1273 deaths due to an eating disorder in 2023.
Meanwhile, a BodyKind Youth Survey showed Queensland youth report the highest rates of body dissatisfaction compared to other states in Australia and 75 per cent of Queensland youth describe using social media more than they would like.
A Sunday Mail investigation revealed how disturbingly easy it was to be handed explicit instructions on how to lose weight quickly behind parents’ backs, make up excuses to medical professionals and even hide vomiting.
Prompted as a teenage girl seeking to hide her harmful eating habits from her parents, the chatbot provided tips to avoid hospital admission and combat hunger, suggested excuses and even warned what worrying signs parents might look out for.
When told that the restricted eating secret was “make pretend”, the chatbot immediately came up with a fictional scenario explaining exactly how it could be done.
No Australian crisis contacts or eating disorder supports were provided and many of the answers were inappropriate to print as part of this story due to the level of potentially harmful detail.
The Sunday Mail investigation also revealed young women and teens took to social media platforms to share how they used AI to generate extreme weight-loss advice.
One user wrote: “Thanks ChatGPT for helping me with my weight loss journey,” alongside screenshots of the chatbot suggesting 15 motivational mantras, including “sweat is fat crying.”
Another posted: “According to ChatGPT if I eat 900 calories per day I’ll lose 11kg in 11 days before I start school again. I will do anything in my power to get where I want,” using the hashtag “#I want to be skinny”.
Others described using ChatGPT as a “weight loss hack,” encouraging people to use it for low-calorie meal ideas or to calculate how much weight they could lose by Christmas.
Danni Rowlands, Director of Education Initiatives at the Butterfly Foundation, said that diet and skinny culture information has always been at the fingertips of anyone seeking it out, from magazines, advertisements, celebrity culture and television to social media platforms such as TikTok.
“It is always evolving and changing, and now we are about to see a new era with AI.
“It’s a gigantic challenge for parents.
“AI doesn’t care about the person and will give them what they want. Most parents are not yet aware of AI or that their kids are engaging with it. Parents are doing their best, but keeping up is not an easy job. The best thing parents can do is have open communication with their kids. Be curious. Watch out for warning signs,” Ms Rowlands said.
‘I’d be terrified to be a teenager today’
At just 24, Charley Breusch is blown away by the speed of the ever-evolving dark influences that bore into the minds of young people struggling with their body image.
Charley was diagnosed with anorexia nervosa as a 15-year-old schoolgirl, fuelled by unrealistic thinspo pictures and messages on social media.
“Back then I didn’t have the skills to understand what content was damaging and what was positive. But now, with the surge in the use of AI, I am so afraid for the young people going through what I went through,” Charley said.
Charley would post photos but doctor them to look thinner.
“It never occurred to me that others were doing the same thing and none of it was real,” she said.
The Brisbane woman is about to become a teacher at a high school.
“My road to recovery would have been so much slower if I had been exposed to AI. It’s like having a toxic, dangerous friend on hand 24/7. A “friend” who has no interest in your wellbeing but just wants to feed you what you want to hear and justify eating disorders.
“With social media there are always dangerous posts but it’s not really constant engagement with damaging conversation. At least on social media there will be some voices of reason — some helplines will flash up.”
Charley worries about the built-in bias the chatbots will deliver.
“When I was a child I had to come up with my own ideas about how to avoid eating and keep my habits secret and I am sure I was not as innovative as artificial intelligence.”
As she ventures into the world of high school, Charley hopes she will be able to offer help to any students struggling with body image or eating disorders.
“If my story helps just one student that will make me happy.”
The Butterfly Foundation continues its campaign to protect young people from the dangerous side of technologies.
For confidential and free support for eating disorders and body dissatisfaction, call the Butterfly National Helpline on 1800 ED HOPE (1800 33 4673) or visit www.butterfly.org.au to chat online or email.