
New rules require lawyers to disclose AI use

Lawyers will be required to tell the judge and other parties if they use AI during a case, as Supreme Courts release the first sweeping guidelines for the use of emerging technologies in the courtroom.

New guidelines will require lawyers to disclose to other parties and the judge if they use artificial intelligence during a case.

Lawyers will be required to disclose to other parties and the judge if they use artificial intelligence to help form an argument or write a document in a case, as Supreme Courts release sweeping guidelines for the use of emerging technologies in the courtroom.

The Victorian Supreme Court earlier this month became the first to issue comprehensive guidelines on the use of AI in court, saying “parties and their practitioners should disclose to each other the assistance provided by AI programs to the legal task undertaken”.

“The use of AI programs by a party must not indirectly mislead another participant in the litigation process (including the court) as to the nature of any work undertaken or the content produced by that program,” the guidelines read.

“Ordinarily, parties and their practitioners should disclose to each other the assistance provided by AI programs to the legal task undertaken. Where appropriate (for example, where it is necessary to enable a proper understanding of the provenance of a document or the weight that can be placed upon its contents), the use of AI should be disclosed to other parties and the court.”

The guidelines caution practitioners against using AI unless “they have an understanding of the manner in which those tools work, as well as their limitations”.

“Parties and practitioners should be aware that the privacy and confidentiality of information and data provided to an external program that provides answers generated by AI may not be guaranteed and the information may not be secure,” they read.

Further, the guidelines indicated anyone signing or filing a document for the court “remains responsible for the accuracy of the content”.

“Whether a court document is signed by an individual or on behalf of a firm, the act of signing a document that is filed with the court is a representation that the document is considered by those preparing it to be accurate and complete,” the guidelines read.

“Reliance on the fact that a document was prepared with the assistance of a generative AI tool is unlikely to be an adequate response to a document that contains errors or omissions.”

Australia’s courts have been grappling with the advent of AI over the past 18 months as the technology continues to infiltrate the courtroom.

In February, an ACT Supreme Court judge dealt with the first known use of AI in an Australian court, when the brother of a convicted man used ChatGPT to write a character reference for him.

The Queensland Supreme Court last week released guidelines for the use of AI for non-lawyers appearing in court, reminding them that generative chatbots “cannot understand the unique fact situation in your case … or understand your cultural and emotional needs”.

“Information provided by Generative AI chatbots may be inaccurate, incomplete, or out of date,” the guidelines read.

“It may also be based on overseas law that does not apply in Australia. Generative AI chatbots can make up fake cases, citations and quotes, or refer to legislation, articles or legal texts that do not exist.”

The Australian Institute of Judicial Administration in January released the first guidelines for how judges should approach the use of AI, accepting that the technology has the capacity to “entirely” replace judicial discretion, determine whether someone will reoffend and predict case outcomes.

Australian courtrooms are already looking to AI to optimise efficiency, with the NSW Judicial Commission developing an automated “Bail Assistant” – which could soon be used to predict bail decisions – to help untangle complex legislation.

The guidelines, compiled by the government-funded AIJA in conjunction with the University of NSW, suggest some AI systems may impinge on open justice, and that any AI program must be thoroughly interrogated before being deployed in a courtroom.

Ellie Dudley, Legal Affairs Correspondent

Ellie Dudley is the legal affairs correspondent at The Australian covering courts, crime, and changes to the legal industry. She was previously a reporter on the NSW desk and, before that, one of the newspaper's cadets.


Original URL: https://www.theaustralian.com.au/business/legal-affairs/new-rules-require-lawyers-to-disclose-ai-use/news-story/90a276a23198ee65f8a4724943019d24