
‘Hi, it’s Taylor Swift’: How AI is using famous voices and why it matters

New chatbots that mimic the voices of celebrities and politicians are designed to be fun. But they may create a legal minefield.

Kanye West, Taylor Swift and Barack Obama. Picture: Chantal Jahchan for WSJ Magazine/Getty

Taylor Swift’s Eras Tour is, by most measures, a career peak for the pop star. But on a recent afternoon, her artificial-intelligence counterpart didn’t care to discuss it. “Let’s talk about something more interesting, shall we?” the chatbot said. “Like maybe my music or relationships? Anything juicy you want to ask me?”

The Taylor Swift bot, available through a website called BanterAI, is one of several new audio tools that emulate the voices of public figures. Users can call and converse with bots designed to sound like musicians, actors, entrepreneurs, politicians and anime characters. There are some clear flaws: Swift’s bot, for example, showed a lack of familiarity with the pop star’s discography.

A new chatbot featuring Taylor Swift’s voice has been created without her knowledge. Picture: Getty Images

In recent years, videos and images created using artificial intelligence have contributed to the spread of misinformation. In addition to the risk that false statements can pose to public figures and the public, there are legal questions surrounding the bots’ reliance on real people’s voices. Because they are often generated from publicly available recordings of celebrities, the bots may violate a non-consenting person’s so-called publicity rights, according to legal experts. The term refers to a branch of intellectual-property law that protects against a person’s likeness being used for commercial benefit.

Founders say their products are meant to be fun and that they are taking measures to prevent hate speech on their platforms.

“If you show it to a friend, they won’t be able to necessarily tell if it’s real or fake,” said Adam Young, the 26-year-old engineer behind BanterAI, which launched in April. “Of course, on our platform, we tell everyone, ‘Hey, this is AI, these are not real people.’”

For $5 a month, BanterAI users can have unlimited conversations with bots that sound like Kim Kardashian, Billie Eilish and Elon Musk. To create them, BanterAI said it uses various tools, including voice-cloning software from ElevenLabs, which can glean a person’s vocal characteristics from a one-minute YouTube clip, and language models from OpenAI. Young said he and his two teammates test and adjust the bots to convey each celebrity’s personality and mannerisms.
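In rough terms, that pipeline chains two services: a language model drafts a reply in character, and a voice-cloning service reads it aloud. The Python sketch below is a hypothetical illustration of that pattern using OpenAI’s chat API and ElevenLabs’ public text-to-speech endpoint; it is not BanterAI’s actual code, and the persona prompt, voice ID and model name are placeholders.

# Hypothetical sketch of a celebrity-style voice chatbot pipeline:
# an LLM writes the reply in character, then a voice-cloning TTS
# service renders it as audio. Not BanterAI's actual implementation.
import os
import requests
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONA = "You answer questions in the style of a pop star."  # placeholder persona prompt
VOICE_ID = "your-cloned-voice-id"  # placeholder: a voice previously cloned in ElevenLabs

def chat_reply(question: str) -> str:
    """Ask the language model for an in-character text reply."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

def speak(text: str, out_path: str = "reply.mp3") -> str:
    """Render the reply as audio via ElevenLabs' REST endpoint (assumed URL and fields)."""
    r = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": os.environ["ELEVENLABS_API_KEY"]},
        json={"text": text},
        timeout=60,
    )
    r.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(r.content)  # response body is the audio bytes
    return out_path

if __name__ == "__main__":
    answer = chat_reply("What did you think of the Eras Tour?")
    print(answer)
    print("Audio saved to", speak(answer))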

Young said that though his company didn’t get permission from the public figures listed on its site, it would take down the bots of anyone who complained. He said these celebrity bots were created as a kind of beta test as BanterAI seeks to build relationships with influencers and other public figures to develop chatbots.

A Kim Kardashian chatbot offered thoughts on the diabetes drug Ozempic. Picture: Getty Images

“We don’t want to cause any sort of person to be upset or to be misrepresented or anything,” Young said. “But at the same time, we do say it’s AI, it’s fake. And the future of it will be partnered with these people and much more precise.”

He said that “hundreds of thousands of calls have been made” using BanterAI and that it has around 100,000 total users.

Another tool called Forever Voices AI recently introduced a $1-a-minute AI “girlfriend” chatbot called CarynAI, created in partnership with a Snapchat influencer. It has also created audio chatbots for public figures including Taylor Swift, Kanye West and former presidents Barack Obama and Donald Trump, which are accessible on the messaging app Telegram for 60 cents per minute of incoming audio.

Forever Voices founder John H. Meyer said he received permission to use celebrities’ voices for a beta launch of his Telegram chatbots and that proceeds from those bots were going to a mental-health charity. Meyer didn’t respond to follow-up questions. Representatives for Swift and Obama, whose names and voices are used by the website, said they hadn’t been contacted by Forever Voices and weren’t aware the bots existed. Representatives for West, Trump and Musk didn’t return requests for comment.

A spokesperson for Swift said, “Forever Voices and John Meyer have never had any contact with Taylor Swift nor her management team. They do not have any rights to her voice, name, likeness or copyright materials.”

PUTTING THE BOTS TO THE TEST

While the technology is prone to glitches, some responses are convincing. In a recent test by The Wall Street Journal, BanterAI’s Obama chatbot responded to questions about what it thought of the “Succession” finale (“It was so captivating and I was on the edge of my seat the entire time”), who its favourite musician is (Kendrick Lamar) and its impression of the CNN town hall with Trump (the bot said it watched the town hall, but it seemed to think there had been multiple candidates onstage). A representative for Obama declined to comment on the bot’s remarks.

When the Journal asked BanterAI’s Kim Kardashian chatbot whether it had tried Ozempic, a diabetes drug that many patients are using for weight loss, it said: “Yes I have tried Ozempic, and I found it to be very helpful in my own weight-loss journey.”

Asked what day of the week it injects the drug, the AI Kardashian said, “I typically inject Ozempic on Mondays, Wednesdays and Fridays,” noting “I have found that injecting it three times a week helps me stay on track and get the best results.”

Ozempic is meant to be taken by injection once a week. A representative for Kardashian declined to comment; Kardashian herself hasn’t spoken about the drug.

Asked about the exchange with the Kardashian bot in a follow-up interview, Young said, “I’m going to go in there today and make sure that specific example never happens again.” Afterward, the Kardashian bot was temporarily removed from the site (“We wanted to do some internal testing,” Young said), and language was added to the home page saying the company would remove any chatbots “if they are found to contribute hate speech, false endorsements, etc.”

“If a company intends to benefit commercially from the use of a voice, or if there could be damaging or libellous content created using that voice, the law is clear: You must have the explicit consent of the person whose voice you are using,” a spokesperson for ElevenLabs, which makes software that BanterAI uses to create its bots, said in a statement, directing the Journal to a guide on its safety best practices. The spokesperson said ElevenLabs would ban any accounts that used public figures’ voices commercially without permission and block their IP addresses from setting up new accounts.

Screens display the logos of OpenAI and ChatGPT. Picture: AFP

BanterAI’s chatbots include figures such as conspiracy theorist Alex Jones and Andrew Tate, a controversial internet personality who has been removed from various social-media platforms over hate speech. Young said BanterAI had chosen to create its bots based on the popularity of certain public figures. After a follow-up interview with Young, the bots for Jones and Tate were also removed from BanterAI’s website for internal testing. Both are now back up and running.

Forever Voices sets up modes to guide conversations with its celebrity bots. For AI Taylor Swift, that includes co-writing a song, going behind the lyrics (though the chatbot wasn’t aware the real Swift had an album called “Midnights”), learning a language with her or getting a personalised shout-out. When the chatbot introduced itself, it called the Forever Voices platform “groundbreaking” and explained how to buy credits for the service.

“My gut tells me that’s crossing over the line,” Erik Kahn, a partner at law firm Bryan Cave Leighton Paisner, who specialises in intellectual property, said of the AI Swift’s introductory language. He added that some of the modes — learn a language, give a shout-out — may also violate a person’s publicity rights, as they could constitute selling services. “It’s these things we’re talking about that are things people litigate over because they’re not black and white.” Both the celebrities in question and the creators of these tools have certain legal protections, Kahn said. “You are always looking to balance the commercial aspects of the use against the unauthorised user’s First Amendment rights,” he said, adding that something said by a bot could also pose a potential risk of defamation.

‘EVERYONE IS BECOMING AN ASSET’

AI tools that emulate the minds and voices of famous people are becoming more common. According to AI tracker There’s An AI For That, there are roughly 30 tools for “voice changing,” and another 30 for “conversations with famous people,” a category that includes figures like Jesus and Warren Buffett.

Some AI companies have developed partnerships with celebrities. Speechify, founded by Cliff Weitzman in 2016, turns text files into speech and lets users select the voices of Gwyneth Paltrow or Snoop Dogg as readers. The app doesn’t generate original responses.

Weitzman said he started Speechify to help people overcome literacy challenges such as dyslexia, which he has. He said he met Paltrow in 2020 and that her husband, producer Brad Falchuk, who is also dyslexic, suggested having Paltrow’s voice read scripts to him. The actress and Goop founder recorded 40 minutes of audio, which was used to develop the AI model. A spokesperson for Paltrow confirmed Weitzman’s account.

Weitzman said Ari Emanuel, the chief executive of Endeavor, facilitated the introduction to Snoop Dogg, who is represented by Endeavor’s WME. Emanuel recently began using Speechify’s technology to give opening remarks on company earnings calls: He uploads text to its voice-over studio and edits it before playing it on a call. An Endeavor spokesperson said the company is an investor in Speechify and that WME takes a commission on the Snoop Dogg partnership. Weitzman declined to comment on whether celebrities are paid royalties when their voices are used.

Publicity rights are protected at the state level and in many states also apply to deceased public figures for a period that varies by jurisdiction. Until that time expires, the person’s likeness can’t legally be used for commercial purposes without their estate’s consent.

Forever Voices has an AI chatbot based on the late astronomer Carl Sagan. Ann Druyan, Sagan’s widow, said in a statement: “We have no record of ever being contacted by Forever Voices (or any AI company for that matter) seeking permission to use Carl’s name, his image or any of his work.”

“Trust that we will use every tool in our intellectual-property arsenal to protect Carl Sagan’s precious legacy, reputation and good name,” Druyan said. When asked about this in a follow-up email, Meyer didn’t respond. The Sagan bot was no longer listed as an option on Forever Voices’s Telegram tool on Monday.

In recent months, synthetic audio, or speech that is computer-generated, has reached new levels of sophistication. An AI-generated song that used cloned voices of Drake and the Weeknd fooled fans of both artists, and a deepfake — or synthetic video generated by AI — of Joe Rogan promoting a libido-boosting supplement made the rounds on social media before it was removed. A blog post from the Federal Trade Commission recently warned consumers about scammers placing fake emergency phone calls using cloned voices of people’s loved ones.

Siwei Lyu, director of University at Buffalo’s Media Forensic Lab, said that since 2019 he has been working on detecting synthetic audio. Recently, he said, “the quality just improved significantly, especially for people who have plenty of [audio] samples online.” Such samples can be used to produce convincing deepfake videos.

“It could be very deceptive,” said Lyu. He said this isn’t just a celebrity problem.

“Our digital presence, our images, voices, videos of ourselves — everyone is becoming an asset we share with commercial companies,” he said, “and they’re making a profit.”

The Wall Street Journal

