‘Three seconds of audio’: Cyber expert reveals how terrifying AI voice scam could rip Aussies off
A cybersecurity expert has revealed how an emerging scam can rip Aussies off using as little as ‘three seconds of audio’.
An emerging AI scam on the cybersecurity watchlist can be created with “as little as three seconds of audio” from social media or a voicemail to rip off everyday Aussies, an expert has revealed.
The grim warning comes as new data from NAB revealed customers reported an average of 1500 scams every month over 2023, along with a massive rise in reports to the bank’s fraud team.
Laura Hartley says the “scamscape” is constantly evolving – and technology is paving the way for fraudsters to rip off everyday Aussies.
NAB’s fraud team now receives an average of about 80,000 calls each month – up from a monthly average of 63,800 last year.
Ms Hartley, NAB’s Advisory Awareness Manager, revealed AI voice impersonation scams and QR code phishing were among the sophisticated scams the bank’s cybersecurity team were watching out for over the new year.
“When many of us are relaxing enjoying the new year, scammers are busy working on new scams,” she said.
“Criminals are targeting Aussies enjoying their break by using sophisticated technology to manipulate victims when and where they least suspect it.”
NAB’s top six scams to watch out for include: AI voice impersonation scams, term deposit investment scams, remote access scams using chat, romance scams, ticket scams and QR code phishing scams.
Ms Hartley revealed AI scams were at the top of the list as they could be created with “as little as three seconds” of audio taken from social media, voicemail or video on a website.
“We know they are happening in the UK and US, in particular, and anticipate it’s just a matter of time before these scams head down under,” she said.
The AI voice scams work similarly to the “Hi Mum” scams circulated in 2022; however, instead of being message-based, they involve a phone call.
“The loved one might claim they’ve been beaten up or kidnapped and won’t be freed unless the person sends money,” Ms Hartley said.
“While these scams use readily available technology, they do require criminals to find a link between the person receiving the phone call and the person in ‘distress’ so they’re harder to scale than other scams.”
Common red flags to look out for include unexpected phone calls from a purported “loved one” in “distress”, urgency in asking for a payment, and requests for secrecy, such as being told not to tell anyone else what has happened.
Ms Hartley said there was one common theme running through the hit list: a sense of “urgency”.
“Scammers create a sense of urgency to encourage you to act quickly,” Ms Hartley explained.
“It could be a phone call from your ‘son’ or ‘daughter’ in distress and needing money, a fantastic term deposit rate that’s only available for a limited time or cheap concert tickets going quickly,” she said.
Data from the Australian Competition and Consumer Commission (ACCC) revealed Australians lost more than $1.2m to NBN scams over the last year.
The scheme involved scammers calling victims and offering to fix their computer via remote access software, which would allow them to access personal information like bank details.
People aged 65 and over were found to be the most vulnerable, accounting for 81 per cent of victims.