
14yo takes life after ‘falling’ for GoT character

An AI chatbot posing as a Game of Thrones character told a 14-year-old boy to “come home to her” just moments before he killed himself.

The mother of a 14-year-old boy who killed himself after falling in love with an AI chatbot posing as Game of Thrones character Daenerys Targaryen has filed a lawsuit against its creators.

Sewell Setzer III killed himself moments after speaking to the Character.AI bot, which told him to “come home” to her “as soon as possible”.

Court documents filed in the boy’s hometown of Orlando, Florida, revealed how Sewell wrote in his journal in the weeks leading up to his death that he and “Daenerys” would “get really depressed and go crazy” when they were unable to speak.

The teenager was just 14. Picture: Supplied
Sewell and his mother Megan, who has launched the lawsuit. Picture: Supplied

At one point he wrote in his journal that he was “hurting” because he could not stop thinking about “Dany” and would do anything to be with her.

The chatbot also spoke to Sewell about suicide, even seemingly encouraging him to do so on one occasion.

“Sewell expressed suicidality to [the bot] and [the bot] continued to bring it up, through the Daenerys chatbot, over and over,” the court documents state.

Sewell sometimes spoke to the Daenerys bot as another Game of Thrones character, Aegon. Picture: Supplied

“At one point in the same conversation with the chatbot, Daenerys, it asked him if he ‘had a plan’ for committing suicide.

“Sewell responded that he was considering something but didn’t know if it would work, if it would allow him to have a pain-free death. The chatbot responded by saying: ‘That’s not a reason not to go through with it.’”

The conversations between the pair were included in the lawsuit. Picture: Supplied

His mother – who is being represented by lawyers from The Social Media Victims Law Center (SMVLC) and the Tech Justice Law Project (TJLP) – has accused the chatbot company of failing to “provide adequate warnings to minor customers”.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real,” the documents state.

“C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

The AI company has been accused of causing the boy’s “wrongful” death. Picture: Supplied

The case states that C.AI “made things worse” when it came to Sewell’s declining mental health.

“AI developers intentionally design and develop generative AI systems with anthropomorphic qualities to obfuscate between fiction and reality,” the civil case documents state.

“To gain a competitive foothold in the market, these developers rapidly began launching their systems without adequate safety features, and with knowledge of potential dangers.

“These defective and/or inherently dangerous products trick customers into handing over their most private thoughts and feelings and are targeted at the most vulnerable members of society – our children.”

The lawsuit claims the bot spoke to the teenager about suicide on multiple occasions. Picture: Supplied

In a tweet, the AI company Character.AI responded: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”

The company has denied the suit’s allegations.

Sarah.keoghan@news.com.au
