Aussie teacher says Airbnb AI discriminated against her

A Melbourne maths teacher’s explanation of why she couldn’t make an Airbnb account has disgusted a panel of tech experts.

A panel of tech experts has ripped into Airbnb after a Melbourne teacher said she could not create an account on the accommodation platform because of the colour of her skin.

Francesca Dias said her white partner had to do it for her.

“So recently I found I couldn’t activate an Airbnb account basically because the facial recognition software couldn’t match two photographs or photo ID of me, and so I ended up having my white male partner make the booking for me,” she told ABC Q+A’s panel on Monday night.

Ms Dias wanted to hear from the experts on how society could “avoid AI bias and reinforcing discrimination”.

Melbourne maths teacher Francesca Dias said she couldn’t make an account on Airbnb. Picture: ABC Q+A

Catriona Wallace, a future technology expert and founder of the Responsible Metaverse Alliance, explained the problem was the data sets AI starts with.

“To train the algorithms, data sets have to be collected from somewhere, usually society, and often society does not have good representation of the full population in its data sets because that’s how biased we’ve been historically,” Dr Wallace said. “And those historical sets are used to train the machines that are going to make decisions of the future, like what happened to Francesca.

“It is staggering that this is still the case and it’s Airbnb, right? You’d think a big international global company would get that s*** sorted but they still haven’t and we’re seeing that over and over again.

“Even with big brands and tech companies, they are still using data sets that haven’t been properly transformed to reflect the population and it continues on.”

Tech expert Catriona Wallace said it was “staggering”. Picture: ABC Q+A

Q+A host Patricia Karvelas appeared equally appalled.

“If you look at something like Airbnb you’d think capitalism would sort it out,” Karvelas said. “Just in terms of consumer numbers. There are lots of brown women.”

Technology journalist Angharad Yeo said the problem was that diverse data sets were not front of mind for companies investing in new technology.

“Because the technology is still new, I think it’s very easy for them to get very excited that it’s being implemented at all,” she said.

However, she offered an “optimistic” view for the future.

“I think it’s one of those areas that really puts a spotlight on these biases because you have real examples when something doesn’t work,” Ms Yeo said.

“It’s very easy to ignore a bias when it’s just like ‘oh maybe I didn’t get promoted over a co-worker and maybe that was because of bias and maybe it wasn’t’.

“When it’s a little bit more hidden, it’s easier to ignore, but when it’s ‘I literally cannot use this service because the AI isn’t working’, then that really makes you go, hang on a second we have a real problem here.”

Q+A host Patricia Karvelas was shocked by the situation. Picture: ABC Q+A

Toby Walsh, chief scientist at the UNSW AI Institute, said it wasn’t a technical problem or a problem with capitalism, but a regulatory one.

He said racial discrimination laws existed and should be applied “forcefully” so it was in Airbnb’s best interest “to get it right”.

“We know actually technically how to fix this,” he said.

Airbnb Australia directed news.com.au to a page about the photo matching process on its website.

“No facial matching process is completely accurate every time,” the website states.

“The effectiveness of this process can vary based on the quality and resolution of the photos — and changes in a person’s appearance between the two photos (for example, change in age, change in weight or different outfits).

“As a result, this process may sometimes ‘match’ photos that are not, in fact, of the same person or fail to match photos that are of the same person.”

In a crackdown on house parties, Airbnb is now also using AI globally to weed out guests likely to throw parties.

The AI looks for red flags such as how recently a user created their account, whether they are trying to book a property in the same city where they live, and the length of the stay.

“If someone is booking a room during New Year’s Eve for one night only, and they are from the same city as the host, that’s likely to be a party,” Naba Banerjee, head of safety and trust at Airbnb, told the BBC.

In the week after this story was published, Airbnb told news.com.au it had looked into Ms Dias’ case.

“Airbnb is built on trust. The verification process for this guest’s ID did not include the use of any facial recognition technology,” Airbnb Australia manager Susan Wheeldon said.

“In this case, an expired government ID was uploaded. We’re disappointed to hear about our guest’s experience and our team reached out directly to offer to help them complete the ID verification process.”


Original URL: https://www.news.com.au/travel/travel-updates/incidents/aussie-teacher-says-airbnb-ai-discriminated-against-her/news-story/8b1e0b9f5b15e40742dc6c29e2b98dcd