Victorian information regulator’s warning for teachers on ChatGPT
The Office of the Victorian Information Commissioner says it wants to ‘nudge the needle slightly against the AI hype’ and has aired concerns about the use of ChatGPT in schools.
The Office of the Victorian Information Commissioner has warned teachers against using ChatGPT to write student reports, saying it would be “inappropriate” to do so.
OVIC Privacy and Data Protection Deputy Commissioner Rachel Dixon used a state parliamentary committee hearing on Monday to air her concerns about the use of the AI chatbot launched by OpenAI.
“There are things I would prefer not be used, that have been authorised to be used,” she said.
“You will note that the education ministers nationally made a statement allowing the teachers to use ChatGPT for example.
“I’m sympathetic to the workload that teachers have … It would be inappropriate to use ChatGPT to write a report card, and the reason for that is that information on the students’ performance is travelling to California (and) is held by OpenAI, and it never comes back. I’m not being a conspiracy theorist, but should OpenAI have 10 years of educational prowess or failings on all Victorian students, I think that’s probably an undesirable outcome.”
The nation’s education ministers formally backed the Australian Framework for Generative AI in Schools last year.
Victoria’s Information Commissioner Sean Morrison also shared his reservations about the increasing use of AI, saying his office wanted to “nudge the needle slightly against the AI hype”.
“We work with … DGS (Department of Government Services) to comment on whole-of-government guidance and so we’re working in the background about lifting that literacy around AI and use of the appropriate tools,” he told the committee.
“We’re trying to ensure that there’s no use of AI in high-risk environments like child protection. We’ve also commented on federal regimes, and we’re trying to nudge the needle slightly against the AI hype.”
As well as generative AI, Mr Morrison said increased service demands and threat actor activity were among the main challenges facing OVIC.
OVIC itself does not use AI, apart from one cyber security tool, and will soon publish a dedicated internal policy prohibiting its use except at Mr Morrison’s discretion.