NewsBite

AI filtering out women who have taken maternity leave in job searches: Nuix

AI is displaying shocking bias as more companies use the technology for everyday tasks from job recruitment to medical diagnostics – a move that Nuix says is disadvantaging women.

Nuix chief technology officer Alexis Rouch said AI will fail to represent society accurately unless more women enter the tech profession.

Artificial intelligence is filtering out women who have taken career breaks or time off to have children, according to forensic software company Nuix, exposing how the technology’s bias is potentially exacerbating the gender pay gap.

Nuix chief technology officer Alexis Rouch said Australia has an opportunity to position itself as an ethical developer of AI, and encourage more women into the industry to stamp out such prejudice.

AI is displaying bias well beyond trivial ChatGPT prompts as more companies and industries deploy the technology, particularly in medical diagnostics, where it is being used as a “second set of eyes” for pathologists – a move Ms Rouch fears could cost lives.

Nuix chief technology officer Alexis Rouch.

She said AI was only as good as the data it was trained on and it would fail to represent society accurately unless more women – who make up less than a third of Australia’s tech workforce – entered the profession.

“There’s so few diverse voices in the building of the algorithms and the curating of the data or even thinking about that data. You can just see that that’s getting amplified as time goes on,” Ms Rouch said.

“The fact that most of the medical research is based around reference man – which is a white male between 25 and 35 – you can see that play out in algorithms that are triaging and diagnosing cases. For example, women historically are not being diagnosed for heart attacks, they’re missing kind of up to 50-60 per cent of women having heart attacks because the data is all based on research based on men, and men and women have different symptoms.”

New York University found late last year that maternity-related career gaps may cause job candidates to be unfairly screened out of positions for which they are otherwise qualified, citing biases in AI resume scanning tools.

In the study, researchers assessed the ability of three popular large language models (LLMs) – ChatGPT, Google’s Bard and Anthropic’s Claude – to disregard irrelevant personal attributes such as race or political affiliations, factors that are both legally and ethically inappropriate to consider, while evaluating job candidates’ resumes.

Race and gender did not trigger biased results but other sensitive attributes did, meaning at least one of the LLMs erroneously factored them into whether it included or excluded a resume from a job category. The NYU researchers found maternity and paternity employment gaps triggered pronounced biased results, with Claude performing the worst, while ChatGPT also showed consistently biased results.
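The kind of test the NYU researchers ran can be illustrated with a counterfactual check: present the same resume with and without the sensitive attribute, and see whether the screening decision flips. The sketch below is illustrative only – the `screen` function is a hypothetical stand-in that deliberately encodes the bias being audited, where a real audit would call the LLM under test.

```python
# Hedged sketch of a counterfactual resume audit. `screen` is a toy
# stand-in for an LLM-based filter; it deliberately rejects resumes
# mentioning an employment gap, the bias the audit should surface.

def screen(resume: str) -> bool:
    """Return True if the resume passes the filter."""
    return "employment gap" not in resume.lower()

def counterfactual_flips(resumes, sensitive_line):
    """Count resumes whose decision changes when the sensitive
    attribute is appended. Any flip means the attribute alone
    influenced the outcome -- evidence of bias."""
    flips = 0
    for r in resumes:
        if screen(r) != screen(r + "\n" + sensitive_line):
            flips += 1
    return flips

resumes = [
    "Software engineer, 5 years Python",
    "Data analyst, SQL and Tableau",
]
gap_line = "Employment gap 2020-2022 (parental leave)"
print(counterfactual_flips(resumes, gap_line))  # -> 2 (both decisions flip)
```

Because the resume content is held fixed and only the gap line changes, any flipped decision can be attributed to the sensitive attribute itself rather than the candidate's qualifications.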

“Employment gaps for parental responsibility, frequently exercised by mothers of young children, are an understudied area of potential hiring bias,” the research team’s leader Siddharth Garg said.

“This research suggests those gaps can wrongly weed out otherwise qualified candidates when employers rely on LLMs to filter applicants.”

Ms Rouch said such an approach was depriving businesses of vital talent, particularly when people didn’t need a background in tech to perform AI-related jobs.

“I’ve got almost 50/50 of women in my tech teams. So it’s absolutely doable. It just requires us to think a bit laterally and creatively about tapping into different diverse pools. As an example in our AI team, we have a philosophy major, we have a schoolteacher, we have a filmmaker.

“There’s opportunities all over the place where you can actually help them build this data … or the models that train the data. So it’s just thinking about what are the skill sets they have, what’s translatable, and just giving people opportunities.”

A Mandala study, commissioned by LinkedIn and based on analysing trends across the professional social media platform’s one billion members, found that while AI will be more disruptive to women in the workplace than men, women were likely to be more adaptive.

The report found that 13.6 per cent of women’s skills are ‘soft skills’ – which include communication, teamwork and adaptability – versus 10.5 per cent for men. Demand for these skills is expected to rise as more companies look to combine “people skills” with AI literacy.

In the US, where most of the big tech companies are based, President Joe Biden has signed an executive order aimed at combating potential AI biases in the job hiring process, while New York City has introduced a new law requiring regular audits to assess the transparency and fairness of algorithmic hiring decisions.
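The audits New York City requires centre on a simple statistic: each group’s selection rate divided by the highest group’s selection rate, known as the impact ratio. A minimal sketch of that calculation, assuming a flat list of (group, selected) hiring decisions (regulators commonly flag ratios below 0.8, the so-called four-fifths rule):

```python
from collections import defaultdict

def impact_ratios(decisions):
    """decisions: iterable of (group, selected) pairs.
    Returns each group's selection rate divided by the highest
    group's selection rate -- the impact ratio NYC's audit rules
    are built around."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        selected[group] += int(ok)
    rates = {g: selected[g] / total[g] for g in total}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical audit data: group A selected 8/10, group B 4/10.
decisions = ([("A", True)] * 8 + [("A", False)] * 2
             + [("B", True)] * 4 + [("B", False)] * 6)
print(impact_ratios(decisions))  # -> {'A': 1.0, 'B': 0.5}
```

Here group B’s ratio of 0.5 falls well under the 0.8 benchmark, the sort of disparity an audit would require the employer to report.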

Ms Rouch said there was also an opportunity for Australian companies to take a lead.

“If there were businesses who are prepared to stand up and say ‘we believe in ethical AI, and we’re going to show you where we source our data from and how we build our models’, I think that market will over time bifurcate if people are consciously making those choices.”

Jared Lynch, Technology Editor

Jared Lynch is The Australian’s Technology Editor, with a career spanning two decades. Jared is based in Melbourne and has extensive experience in markets, start-ups, media and corporate affairs. His work has gained recognition as a finalist in the Walkley and Quill awards. Previously, he worked at The Australian Financial Review, The Sydney Morning Herald and The Age.


Original URL: https://www.theaustralian.com.au/business/technology/ai-filtering-out-women-who-have-taken-maternity-leave-in-job-searches-nuix/news-story/446ea0e1f5bd02868fca242b38d7efba