How disturbing AI technology could be used to scam online daters
If you see this woman pop up on Tinder, do not start talking to her. She may look like a pretty girl, but you will regret ever meeting her.
The majority of internet users probably like to think they could spot a dating scam from a mile away and just can’t understand how anyone would fall for such a trick.
But there is a reason so many people fall for catfish or online dating scams, and it isn’t because they are dumb or desperate.
Most people are aware of the major indicators of a dating scammer: asking for money, never wanting to video chat and sharing very few pictures of themselves.
But scammers are constantly figuring out new ways to make their stories seem more believable and to get people to trust them.
Take this woman, for example. She is young and attractive, and it’s unlikely many potential love interests would think twice about chatting with her on a dating app.
But this woman isn’t real. And I don’t mean in the sense that someone has stolen her picture from social media and is using it without her knowledge on dating apps.
She doesn’t exist.
The image was generated by a website called ThisPersonDoesNotExist.com that uses AI technology to randomly generate realistic-looking human faces.
Each time you refresh the page a new “person” is created.
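For the technically minded, the mechanics are simple. The underlying system is a generative adversarial network (the site is widely reported to run Nvidia’s publicly released StyleGAN model): each refresh samples a random “latent” vector and maps it through a trained generator network. A minimal Python sketch of that loop, using a tiny untrained stand-in network rather than the real pretrained model, might look like this:

```python
import torch
import torch.nn as nn

# Stand-in for the real pretrained generator (reportedly a StyleGAN model
# trained on tens of thousands of face photos). This tiny untrained network
# only illustrates the mechanics, not the output quality.
generator = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 3 * 64 * 64),
    nn.Tanh(),
)

def new_face() -> torch.Tensor:
    # One page refresh: sample a random "latent" vector and map it
    # through the generator. With a trained model, every sample is a
    # brand-new, never-photographed face.
    z = torch.randn(1, 512)             # random point in latent space
    with torch.no_grad():
        flat = generator(z)
    return flat.view(1, 3, 64, 64)      # (batch, channels, height, width)

image = new_face()
print(image.shape)  # torch.Size([1, 3, 64, 64])
```

Swap the stand-in for the released model and those same few lines produce an endless stream of convincing faces.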
While a single picture on its own might not seem like a big threat, combine it with the constant advances in deepfake technology and there is real cause for concern.
Deepfakes are hyper-realistic, AI-generated images and videos of events that never happened.
The technology is increasingly being used to blackmail people with fabricated videos that place them in embarrassing or sexual situations.
These videos look so realistic it is hard to prove they are fake.
A recent example of the problems this technology can cause came last year, when a video made the rounds of Barack Obama appearing to call Donald Trump a “dipsh*t”.
At certain points you can see blurring or distortion that gives the video away, but it shows just how dangerous this technology can be.
With this in mind, there is increasing potential for scammers to use AI-generated images to create a whole new person.
Phillip Wang, the man behind the website ThisPersonDoesNotExist.com, told news.com.au he created it to prove a point to friends about AI technology.
“I then decided to share it on an AI Facebook group to raise awareness for the current state of the art for this technology. It went viral from there,” Mr Wang said.
When asked if he had any concerns about people using the images to catfish or scam others, he said that concern already existed long before the website was made.
“Anyone can download the code and the model and instantly start generating faces on their own machine,” he said.
Mr Wang said the site, by showing just how easy it was to make a fake person, helped raise awareness of the implications this kind of technology might have in the future.
He said it was becoming increasingly difficult to tell deepfakes from reality, and it was “beyond something that simple Photoshop forensics can help defeat”.
HOW IT IS BEING USED NOW
A growing number of cases involve deepfakes being used to create fake revenge porn or celebrity porn.
Zach, a senior reputation analyst at Internet Removals, an organisation that helps people get sensitive content taken down, said the company first encountered deepfakes in 2017.
“One of our staff was alerted to naked pictures of this A-list celebrity being shared around the internet. We looked it up and there were tonnes of images, and we just couldn’t wrap our heads around how it was being done,” he told news.com.au.
“We didn’t know what we were dealing with. We initially thought it was a group of sick individuals manually photoshopping each picture, which would take a very long time.”
Unfortunately, there is very little people can do to protect themselves from becoming targets of these online attacks. And even getting the photos taken down once they are created can be difficult.
“The person who created the image is often protected, as they are seen as the author of the work because the image is technically created by them,” Zach said.
“It can already be a tricky process to get images removed from the internet, but it becomes even harder when deepfake is involved.”
There are already signs of how scammers are using this technology to their advantage.
Zach said his team came across a scammer on Tinder who encouraged people to video chat with them. Usually this is something a scammer or bot tries to avoid, as the person they are talking to will realise they aren’t real.
But when a match accepted the video chat, it showed a woman undressing and encouraging the other person to do the same.
The only indication that something was wrong was that the audio didn’t match the movement of the woman’s mouth.
HOW IT COULD BE USED IN THE FUTURE
One of the first things that Zach and his team do when people tell them they think they have fallen for a dating scam is reverse image search the pictures used by the scammer.
This shows whether the same picture has been used anywhere else on the internet, revealing whether the scammer was using someone else’s images.
But the person in an AI-generated picture doesn’t exist, so the image can’t be traced anywhere or proved stolen.
However, this tactic doesn’t always help even if the pictures are stolen.
“If people steal a photo of a real person and mess around with one or two pixels or metadata then it is considered a different image, and our search can’t pick it up,” Zach said.
“This makes it nearly impossible to figure out if it is a deepfake or someone’s stolen photo.”
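That fragility is easy to demonstrate. An exact-match search effectively compares cryptographic hashes of the image data, which change completely when a single pixel is edited; a perceptual hash, built for near-duplicate matching, barely moves. A short Python sketch (the file name profile.jpg is a placeholder, and the hand-rolled “average hash” is an illustration, not Internet Removals’ actual tooling):

```python
import hashlib
from PIL import Image

def exact_hash(img: Image.Image) -> str:
    # Cryptographic hash of the raw pixels: editing even one pixel
    # changes it completely, so exact-match lookups fail.
    return hashlib.sha256(img.tobytes()).hexdigest()

def average_hash(img: Image.Image, size: int = 8) -> int:
    # Hand-rolled perceptual "aHash": shrink to 8x8 grayscale and
    # threshold each pixel against the mean. Tiny edits barely move it.
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two hashes.
    return bin(a ^ b).count("1")

original = Image.open("profile.jpg").convert("RGB")  # placeholder file
tampered = original.copy()
tampered.putpixel((0, 0), (1, 2, 3))  # "mess around with one or two pixels"

print(exact_hash(original) == exact_hash(tampered))             # False
print(hamming(average_hash(original), average_hash(tampered)))  # ~0: still a match
```

A perceptual hash can catch the doctored copy, but with a wholly AI-generated face there is no original to match against in the first place.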
Another issue is that unless something immediately seems off, many people on dating apps never even consider that the person they are talking to may not be real.
“The people who use these dating apps, as much as they say it is there to find love, many of them are just looking for a sexual encounter,” Zach said.
“So when they start talking to someone, they aren’t really thinking with the mindset of ‘is this person real or not’.
“We have never had a client who has matched with someone and then tried to reverse image search them to see if they were who they say they were.”
Zach said people would have to be “increasingly careful” as this type of technology was likely to be used a lot more to scam others.
“Any tool that can create these types of believable images is a major disadvantage to dating app users,” he said.
“We are probably going to start encountering deepfakes more and more without even realising it.”
Originally published as How disturbing AI technology could be used to scam online daters