Could Minter Ellison's use of AI to speed up the discovery process mean fewer graduate lawyer jobs?
Minter Ellison is now using AI to accelerate discovery in court cases involving large volumes of documents.
One of Australia’s oldest law firms, Minter Ellison, is using generative artificial intelligence to accelerate a key legal step – discovery – in one of the few known examples of the technology being used in modern practice.
The industry’s union warns AI should not replace the work of graduates.
Perth-based Minter Ellison partner Michael Hales lodged an affidavit in a blockbuster court fight brought by iron ore miner Fortescue against Element Zero, where it was revealed the firm is using AI.
“Minter Ellison also utilises tools such as artificial intelligence and technology assisted review tools to accelerate its discovery search processes in matters containing high numbers of documents,” he said.
Discovery – the compulsory identification of documents that could be relevant to a dispute – is significant in the Fortescue-Element Zero case, in which Andrew Forrest’s company has accused his former staffers, including chief scientist Bart Kolodziejczyk, Bjorn Winther-Jensen and Michael Masterman, of stealing trade secrets to launch their own green iron start-up. They deny the allegations, and the bitter dispute remains before the courts.
According to one estimate noted in a related court document, a review rate of 20-25 documents per hour with six full-time reviewers would cost between $1.2m and $1.5m.
Hundreds of thousands of documents are expected to be reviewed in the case, potentially taking six months or more to sift through.
Minter Ellison chief digital officer Gary Adler said the firm has been using an “early form” version of AI endorsed by the courts since 2016, and an in-house generative AI tool called Lantern since last year.
“Earlier forms of AI rely on training the machine on a per-document basis (meaning that you must review documents before the machine is smart enough to understand what you want),” he said.
“Lantern lets us instruct the machine the same way we would instruct junior reviewers, meaning that we can scale the benefits of AI much more quickly.”
Mr Adler said Lantern was developed in late 2023 and can process an average of 3500 documents per hour, making it 58 times faster than manual review.
“We spent several months testing its accuracy and defensibility before releasing it into production in early 2024. We were one of the first firms in the world to release an AI-enabled document review capability directly into client service,” he said.
Australian Services Union national secretary Emeline Gaske told The Australian the union welcomed the use of AI as a support tool to boost productivity, but not to replace workers.
“We’re aware AI is being incorporated into the practice of law firms and are in discussion with firms about making sure workers have a say on how that happens,” Ms Gaske said.
“AI should be used to make workers’ jobs easier, but it can’t replace human judgment or decision-making. It is also critical that graduate and junior lawyers develop their skills in research and analysis as part of their development,” she said.
“We won’t allow firms to use AI to eliminate roles, they will need to invest in training to upskill workers to use AI.
“Clients and workers both need assurance that the foundations of AI, including how data is sourced and used, are open and transparent, and align with legal and ethical standards.”
Last week, Federal Court Chief Justice Debra Mortimer said the court is considering releasing a practice note (guidance) about the use of generative AI after consulting with lawyers and court users.
“In the meantime, the court expects that if legal practitioners and litigants conducting their own proceedings make use of generative artificial intelligence, they do so in a responsible way consistent with their existing obligations to the court and to other parties,” she said in a statement.
“Further, it is also expected that parties and practitioners disclose such use if required to do so by a judge or registrar of the court.”
In Queensland and Victoria parties must disclose if they used AI to help prepare their case.
In NSW, generative AI is not prohibited outright, but Chief Justice Andrew Bell banned lawyers from using AI to generate affidavits, witness statements, character references or material to be used in cross-examination.
“Gen AI must not be used for the purpose of altering, embellishing, strengthening or diluting or otherwise rephrasing a witness’s evidence when expressed in written form,” he said.
Separately, the brother of a man accused of using counterfeit cash to buy illegal drugs was reprimanded by a judge for using ChatGPT to write a character reference.
A Victorian solicitor was also caught using AI after he tendered a list of fake cases to a family court judge. He was referred to the legal watchdog.
Originally published as Could Minter Ellison using AI to speed up discovery process mean less graduate lawyer jobs?