Legal experts split as Australian mayor Brian Hood readies ChatGPT lawsuit
A legal bid alleging defamation by the AI chatbot, likely to be brought by Brian Hood, may shape up as a world-first test case.
Legal experts are divided about a lawsuit being prepared by Victorian mayor Brian Hood, who alleges AI chatbot ChatGPT defamed him when it incorrectly identified him as an individual involved in a foreign bribery scandal, in what’s shaping as a world-first test case.
Hepburn Shire Council Mayor Brian Hood alleges OpenAI’s ChatGPT has incorrectly depicted him as a “criminal” who had a role in the RBA’s banknote scandal and was sentenced to 30 months in jail.
Mr Hood had not been charged with any offences; rather, he was the whistleblower who alerted authorities to the scandal.
Legal experts are divided over whether Mr Hood will succeed, with some questioning whether the chatbot's responses count as publication and whether they have caused serious harm to his reputation.
The banknote scandal took place while Mr Hood was working at Note Printing Australia, a subsidiary of the Reserve Bank of Australia.
Gordon Legal sent a concerns notice to OpenAI on March 21, demanding a rectification. The firm has not said to which address the notice was sent.
“Mr Hood was not charged with any offences; instead, he was the person who alerted the authorities to the wrongdoing and was praised for his bravery in coming forward,” a statement from Gordon Legal read.
According to Gordon Legal, ChatGPT, when asked about the scandal, gave a response claiming that Mr Hood was accused of bribing officials in Malaysia, Indonesia, and Vietnam between 1999 and 2005; that he was sentenced to 30 months in prison after pleading guilty to two counts of false accounting under the Corporations Act in 2012; and that he authorised payments to a Malaysian arms dealer acting as a middleman to secure a contract with the Malaysian Government.
Partner James Naughton said Gordon Legal believed serious damage had been done to Mr Hood’s reputation and that his case may become “a test case”.
Mr Hood was working on a council presentation about one month ago when he first learned of the responses ChatGPT was giving when asked about the scandal.
“I couldn’t believe it. I heard what people were saying and I did some searches. I was quite shocked, quite stunned. It was totally unexpected and out of the blue. I felt a bit numb,” he said.
Mr Hood had read about ChatGPT’s public release late last year, but had yet to use the chatbot.
After friends and acquaintances told him what they had found, he tried ChatGPT himself, entering a query to the effect of “tell me about Brian Hood’s role in the RBA foreign bribery scandal”, he said.
“What was having a real impact was that some of the paragraphs were completely accurate with times, dates, facts and names,” he said.
“And then you’d strike a paragraph with total nonsense. Total bulls**t.”
Justin Quill, media lawyer and partner at major law firm Thomson Geer, said Mr Hood’s case may struggle to prove publication.
“One of the major hurdles Mr Hood will have in bringing this claim is proving that there was a publication to an audience of some description,” he said.
“Because ChatGPT is responsive to what it is asked, the queries may have some impact on the way any publication is viewed. That’s somewhat of an unknown in a case such as this.”
Queries put to ChatGPT are stored in a history tab on the left-hand side of the platform’s website, where a user can view them again.
Mr Quill said Mr Hood may also find it difficult to prove serious harm to his reputation.
“It’s worth noting that Mr Hood hasn’t, at this stage, actually sued,” he said.
“Perhaps the main hurdle for Mr Hood is that he has to demonstrate he has suffered serious harm as a result of the publication. Given these alleged publications are likely to have been made to a very limited number of people, it’s hard to see that the high bar of serious harm has been reached.”
Asked what he sought from OpenAI, Mr Hood said he wanted an apology, a correction, and for OpenAI to improve its product so that errors like those he encountered would not happen again.
“There needs to be an adequate response from them. If this happened to me, it can happen to everyone,” he said.
Mr Naughton said Mr Hood’s query was the first the firm had received regarding ChatGPT.
“I think it has some interesting and unique features about the way that defamation law will apply to this new technology,” he said.
Gordon Legal had recently dealt with dozens of requests for defamatory content to be removed from social media platforms, he said.
“I think one of the fundamental differences is that (ChatGPT) acts like an oracle and if you ask it a question, it uses its algorithm to give you an answer in quite definitive terms sometimes and it’s devoid of the usual references or ability to check differences of opinion.”
OpenAI has 28 days to respond to the concerns notice.