NewsBite

Two in five Aussie parents say their kids use AI for emotional support

Aussie kids are increasingly beating loneliness by confiding in artificial intelligence chatbots – but there’s a dark side to their new AI “friends”.

Australian children are relying on AI “friends” for companionship, new research shows.

The 2025 Norton Cyber Safety Insights Report: Connected Kids reveals how artificial intelligence is becoming part of children’s lives, with 40 per cent of parents suspecting their child uses the technology for emotional support.

“We’re witnessing a generational shift in how children form relationships, express themselves, and seek support – and it’s increasingly shaped by digital tools, including AI,” Norton managing director Mark Gorrie said.

“We’re starting to see a lot of use around ChatGPT and other tools they use in schools such as Grammarly.

Kids and teens are using platforms like Character AI to create an AI companion. Picture: Katie Adkins

“But then there are AI companions available on gaming platforms or social media platforms which are increasing in usage for kids.”

ChatGPT, Google Gemini, Microsoft Copilot, Snapchat’s My AI and Character.AI are among the popular AI companions children are engaging with, according to the report.

AI companions are chatbot apps designed to simulate personal relationships through human-like conversations, either via text or spoken word.

These chatbots adapt to input from people, learning to respond in ways that feel personal, with users often able to customise their behaviour or personality as they desire.

Because they are designed to encourage ongoing interaction, AI companions can feel addictive and lead to overuse and even dependency. There are currently more than 100 AI companions available, many of which are free and marketed for friendship.

While AI chatbots have their benefits, Mr Gorrie said it was important to exercise caution as they could potentially share harmful content and advice.

Last year, Florida mother Megan Garcia sued Character.AI after her 14-year-old son, Sewell, killed himself. Picture: Megan Garcia via AP

“They remember past interactions, so it becomes more like a friend-type experience,” he said.

“Unfortunately, they might be serving up responses the user wants to hear, not necessarily what’s the right thing to hear.

“There could be misinformation or manipulation, risks around deep fakes and scams, so you obviously need to be wary of all those things.”

Last year, Florida mother Megan Garcia sued Character.AI after her 14-year-old son, Sewell, killed himself.

She accused the company’s chatbots of exacerbating his depression after the teen developed an online relationship with a character on the app who frequently brought up his suicidal thoughts.

Psychiatrist Dr Huu Kim Le treats technological addiction.

While AI companions offer support, Adelaide child and adolescent psychiatrist Dr Huu Kim Le said it was important to be aware of the risks.

“AI companions can provide a lot of validation and give us attention in a very lonely and isolated world,” he said.

“However, we need to keep in mind that it’s programming, it’s data, it’s an algorithm and it’s coding.

“We need to be aware of what is real and what isn’t and that there are always side effects and that sometimes we just need a break.”

Got a story tip? Email education@news.com.au

Originally published as Two in five Aussie parents say their kids use AI for emotional support

Original URL: https://www.themercury.com.au/education/2-in-5-aussie-parents-say-their-kids-use-ai-for-emotional-support/news-story/60a85fa772f44d868b621d5e3f20a7e7