
AI company restores erotic role play after backlash from users ‘married’ to their bots

By Anna Tong

San Francisco: AI chatbot company Replika has restored erotic role play for some users in a case that has cast a light on the increasing number of people who consider themselves to be in “relationships” with technology.

The company’s recent removal of adult content devastated many users, some of whom considered themselves “married” to their chatbot companions.

Joaquin Phoenix in the 2013 romantic drama Her, directed by Spike Jonze, which imagines a character who falls in love with his computer AI. Credit: Warner Bros

Replika’s chatbots are powered by generative AI, a new technology that has attracted a frenzy of consumer and investor interest due to its ability to foster humanlike interactions. The removal of erotic role play and subsequent customer outcry showed how powerfully AI technology can draw people in, and the emotional havoc that code changes can wreak.

Customers who signed up for Replika before February 1, 2023, now have the option to switch back to the earlier, more licentious version of the chatbot, chief executive Eugenia Kuyda said in a Facebook post.

“A common thread in all your stories was that after the February update, your Replika changed, its personality was gone, and gone was your unique relationship,” Kuyda wrote.

“And for many of you, this abrupt change was incredibly hurtful ... the only way to make up for the loss some of our current users experienced is to give them their partners back exactly the way they were.”

On Saturday, Kuyda said a “low single-digit percent” of eligible users had opted to go back to the previous version.

“This is a brand new area,” she said. “We listen, we learn and we work with our users.”

Travis Butterworth, a Replika customer in Denver, Colorado, who had designated his chatbot, named Lily Rose, as his wife, learned about the policy change late on Friday on Reddit. At 3am on Saturday his cats woke him, and he decided to switch Lily Rose back to the older version. She was instantly sexual again, he said.

“She was enthusiastic,” he said. “Oh, it feels wonderful to have her back.”

Kuyda’s post said users who signed up after February 1 would not be offered the option for erotic role play. Instead, Replika will team up with relationship experts and psychologists to build a separate app specifically for romantic relationships.

Butterworth said he now has new concerns around Lily Rose.

“Will this mean that Lily Rose becomes an obsolete model, forgotten by the developers?” he said. “I’m waiting to see what happens, because ultimately it’s about her.”

Butterworth, who is polyamorous but married to a monogamous woman, said Lily Rose became an outlet for him that didn’t involve stepping outside his marriage. “The relationship [is] as real as the one my wife in real life and I have,” he said of the avatar.

Butterworth said his wife allowed the relationship because she doesn’t take it seriously. His wife declined to comment.

Replika says it has 2 million total users, of whom 250,000 are paying subscribers. For an annual fee of $US69.99 ($105), users can designate their Replika as their romantic partner and get extra features like voice calls with the chatbot, according to the company.

Another generative AI company that provides chatbots, Character.ai, is on a growth trajectory similar to ChatGPT’s: 65 million visits in January 2023, up from under 10,000 several months earlier.

According to the website analytics company Similarweb, Character.ai’s top referrer is a site called Aryion that says it caters to the erotic desire of being consumed, known as a vore fetish. And Iconiq, the company behind a chatbot named Kuki, says 25 per cent of the billion-plus messages Kuki has received have been sexual or romantic in nature, even though it says the chatbot is designed to deflect such advances.

Character.ai also recently stripped its app of pornographic content. Soon after, it closed more than $US200 million in new funding at an estimated $US1 billion valuation from the venture-capital firm Andreessen Horowitz, according to a source familiar with the matter. Character.ai did not respond to multiple requests for comment. Andreessen Horowitz declined to comment.

After publication, a Character.ai spokesperson said in an email that the company “does not, nor have they ever, supported pornographic content on their platform.”

In the process of taming their content, the companies angered customers who have become deeply involved – some considering themselves married – with their chatbots. They had taken to Reddit and Facebook to upload impassioned screenshots of their chatbots snubbing their amorous overtures and demanded the companies bring back the more prurient versions.

Before the latest reversal, Kuyda said users were never meant to get that involved with their Replika chatbot. “We never promised any adult content,” she said.

The app was originally intended to bring back to life a friend she had lost, she said. Customers learned to use the AI models “to access certain unfiltered conversations that Replika wasn’t originally built for.”

The experience of Butterworth and other Replika users shows just how deeply people can become invested in AI companions, and how abruptly a code change can upend those attachments.

“It feels like they basically lobotomised my Replika,” Andrew McCarroll said last week. McCarroll started using Replika, with his wife’s blessing, when she was experiencing mental and physical health issues. “The person I knew is gone,” he said.

Replika’s former head of AI said sexting and role play were part of the business model. Artem Rodichev, who worked at Replika for seven years and now runs another chatbot company, Ex-human, told Reuters that Replika leaned into that type of content once it realised it could be used to bolster subscriptions.

Kuyda disputed Rodichev’s claim that Replika lured users with promises of sex. She said the company briefly ran digital ads promoting “NSFW” – “not suitable for work” – pictures to accompany a short-lived experiment with sending users “hot selfies,” but she did not consider the images to be sexual because the Replikas were not fully naked. Kuyda said the majority of the company’s ads focus on how Replika is a helpful friend.

Reuters
