First known case of an Australian Supreme Court judge facing ChatGPT
For the first known time since its creation, ChatGPT has infiltrated an Australian courtroom and forced a Supreme Court judge to answer the question many had been dreading: How do I deal with this?
The case comes as a leading ethics barrister and senior UNSW lecturer says judges will soon realise there are “appropriate uses” for artificial intelligence in the courtroom, even as courtrooms across the world remain divided over what those uses are.
The matter came before ACT Supreme Court judge David Mossop last week when Majad Khan, 22, was found guilty of paying for $63,000 worth of illegal vapes with cash that “was in fact simply paper” and then running off with the goods.
Justice Mossop found the scheme, pulled off with the help of two high school mates and another friend, was premeditated and “thoroughly dishonest”, ruling that Khan was particularly liable for his role in driving the getaway car.
To mitigate his sentence, Khan provided the judge with a series of character references from family members. Yet therein lay the problem: when Khan’s brother used ChatGPT to write the reference for his sibling, it praised his “strong aversion to disorder”.
“I have known Majad both personally and professionally for an extended period, and I am well‑acquainted with his unwavering commitment to his faith and community,” the reference read.
Justice Mossop smelled a rat.
“One would expect in a reference written by his brother that the reference would say that the author was his brother and would explain his association with the offender by reference to that fact, rather than by having known him ‘personally and professionally for an extended period’,” his sentencing judgment reads.
The reference continued, praising Khan’s “commitment to cleanliness and order as another facet of his character that stands out. He maintains a meticulous approach to his surroundings, expressing a strong aversion to disorder. His proactive attitude towards cleaning, both inside the house and in the community, reflects a sense of responsibility and respect for the environment.
“His efforts extend to keeping the streets and driveways clean, a testament to his commitment to a well-maintained and orderly community.”
Again, Justice Mossop wasn’t buying it. “It is certainly possible that something has been lost in translation. He may well be committed to cleanliness,” he wrote in the judgment. “However, the non-specific repetitive praise within the paragraph which places such an emphasis on his proactive attitude towards cleaning and strong aversion to disorder is strongly suggestive of the involvement of a large language model.”
Khan’s brother, through defence counsel, denied using a large language model to generate the reference, saying it had been prepared with “the assistance of computer translation”.
Ultimately, Justice Mossop ruled the “use of language within the document is consistent with an artificial intelligence generated document” and determined he would place “little weight” on the reference when considering the sentence.
Khan was given a suspended sentence of 21 months and 15 days’ imprisonment and was fined $6000.
Courtrooms across the world have become divided over how to employ artificial intelligence to boost efficiency while safeguarding the administration of justice.
Level 22 Chambers barrister and UNSW law senior lecturer Brenda Tronson said generative AI tools were “here to stay, including in the law … As with all tools, the important thing is that we learn to use them appropriately. That includes the need to think critically about the output before adopting it, whether in advice or in material presented to a court,” she said.
Ms Tronson said as ChatGPT became “normalised”, it was likely judges would eventually come to the conclusion that there were “appropriate uses” for it. “Judges need to know material presented in court is truly coming from the person presenting it, whether as evidence or submissions made by advocates,” she said.
“Given some of the examples we’ve seen of the misuse of ChatGPT in preparing materials for courts, it’s not surprising many judges are wary. But some are adopting it themselves … As the tools become normalised, more judges are likely to come to the view that there are appropriate uses for them.”
Her comments come off the back of NSW Law Society chief Brett McGrath saying, in opening the law term in January, that he had deep concerns about the impact AI would have on the legal profession. Mr McGrath has announced the establishment of a taskforce to help solicitors navigate the advent of AI, understand the “ethical issues” surrounding it, and advise government as the space becomes more regulated.
Last year, Federal Court judge Melissa Perry said the legal profession should remain a long way from depending on technology for deep research or critical decision-making. Former Federal Court chief justice James Allsop has also dismissed the idea of artificial intelligence replacing humans in courtrooms.