Lawyers caught citing fake cases; warned against using ChatGPT

The legal profession has been sternly warned against using AI to develop integral case documents, after lawyers were caught submitting a ChatGPT-developed brief including case citations that didn’t exist.

The Australian legal profession has been sternly cautioned against the use of AI to develop integral case documents, after lawyers were caught submitting a ChatGPT-developed brief including case citations that didn’t exist.

The lawyers were acting for plaintiff Roberto Mata, who has sued US airline Avianca after an employee hit his knee with a serving cart during a 2019 flight from El Salvador to New York.

When the airline filed a motion to dismiss the case, which is currently being heard in a New York district court, Mr Mata’s lawyers staunchly objected, submitting a 10-page brief that cited more than half a dozen relevant court decisions.

But neither the airline’s lawyers nor the judge himself could find the decisions or the quotations cited and summarised in the brief, because ChatGPT had made them all up.

Law Council of Australia president Luke Murphy told The Australian the case was a “timely reminder” that although evolving technologies may create opportunities for the industry, “lawyers have overriding obligations to deliver their services competently and diligently”.

“This means they must still meet their rigorous professional and ethical obligations and use their sound judgment when deciding what tools to use – and when and how – to assist in their work,” Mr Murphy said.

While Mr Murphy conceded there was “transformative potential” in AI, he recognised it also came with significant legal issues, concluding that lawyers must “ensure these benefits are optimised and risks minimised”.

“The profession is approaching the introduction of AI with caution and where these tools are utilised by lawyers, they must do so with care,” he said.

“There is a potential role for AI/ChatGPT to be utilised by lawyers to speed up work that has previously been time and labour intensive, such as large-scale document review/discovery.

“We must continue to shape how we engage with AI in the future and how AI is incorporated into legal practice will be a vital area of focus in the coming years.”

Mr Murphy’s warning comes as Australian BigLaw firms turn to AI as a means to catapult them out of a plummeting market.

The Australian revealed earlier this year that PwC would introduce an AI tool to help its Australian lawyers conduct research and analysis, manage claims and potentially offer legal advice.

KPMG has also gained access to its own private version of ChatGPT, named KymChat, through a partnership with Microsoft.

Ashurst and Lander & Rogers have both backed the adoption of AI and machine learning to varying extents, while Gilbert + Tobin identified emerging technologies as necessary to keep the firm moving forward.

“Whatever we do, we will need to make investments in innovation such as technology and AI to ensure we are prepared for the future,” Gilbert + Tobin chief operating officer Sam Nickless told The Australian.

MinterEllison chief talent officer Alissa Anderson said that, despite the economic headwinds facing the legal industry, AI presented a significant opportunity to reduce the firm's costs and increase its productivity.

“We have an innovation mindset, and over the past 18 months our people have worked with AI solutions to improve productivity, reduce costs, and allow them more time to focus on solving complex problems commercially and practically,” she said.

Ellie Dudley, Legal Affairs Correspondent

Ellie Dudley is the legal affairs correspondent at The Australian covering courts, crime, and changes to the legal industry. She was previously a reporter on the NSW desk and, before that, one of the newspaper's cadets.

Original URL: https://www.theaustralian.com.au/nation/lawyers-caught-citing-fake-cases-warned-against-using-chatgpt/news-story/6628d2d89d7069e9e879ebc1f900705a