Sydney Uni students allowed to use AI in radical reversal of cheating policy
Sydney University will allow students to use artificial intelligence in a radical reversal of its cheating policy, conceding bans on technology such as ChatGPT are untenable.
The plan, to be phased in next year, will mean students can use AI for homework or assignments but will be tested without access to the tools “at key points” in their degrees.
Pro-vice-chancellor Professor Adam Bridgeman said the university realised AI was here to stay and students needed to be prepared to use it well in their future careers.
“What we need to do is make sure that we’re not fooling ourselves when we set a piece of homework. AI will help the students complete their homework, and that’s fine,” he said.
“The reality is that the students are using it and we’re not able to detect it. So to just tell students ‘don’t use it’ is untenable.”
Stage one of the plan was signed off by the university’s academic board on Tuesday, reversing the current position that AI must not be used “unless expressly permitted”; the default assumption will now be that its use is allowed.
From semester 2, students can use AI in all “non-secure” – or unsupervised – assessments, and coordinators cannot ban its use.
Universities have scrambled to update cheating policies and keep up with the ever-evolving technology since ChatGPT emerged two years ago.
Many initially banned AI, then gradually loosened policies to allow it in some circumstances.
Generative AI has challenged universities’ academic misconduct units. Hundreds of Sydney University students were accused of using AI to cheat in 2023, while there were 166 substantiated AI cheating cases at UNSW the same year.
Universities are concerned AI detection tools are ineffective and easy to get around.
Engineering and commerce student Angad Chawla, 20, and recent pharmacy graduate Helia Nateghi Baygi, 27, are part of a Sydney University AI working group and regularly use AI in their coursework.
“I do believe there are issues of overreliance – just walking through the library, everyone has that [ChatGPT] tab open,” Chawla said.
But he said students ultimately realised they could not succeed by relying on AI tools, as their understanding would eventually be tested, either in an exam or a job interview.
Academics also became more savvy when designing assessments, Chawla said. “There are times when the AI doesn’t cut it, plain and simple,” he said.
Baygi said some students were confused about when AI use was allowed, with different rules for different units. She welcomed a university-wide approach.
“Calculators never killed mathematics skills,” she said. “It’s better that we embrace this technology and empower students rather than banning it altogether.”
Sydney University is one of the first Australian universities to expressly allow the use of AI in non-secure assessments.
Melbourne University has not banned AI but requires students to disclose its use; representing generated material as their own ideas could be considered misconduct.
At UNSW, teachers set a level of acceptable AI use for each assessment.
The Sydney University changes will require a major shake-up of the institution’s assessments, which will all be categorised as either “lane one”, held under exam-like conditions, or “lane two”, which will be open book and integrate the use of AI.
Bridgeman said there would be parts of degrees, such as program majors, in which students would be required to complete secure “lane one” assessments, with AI use controlled or banned. All other marked work would be considered “lane two”.
Secure assessments could include tasks such as interactive oral assessments and supervised pen and paper examinations.
“I think it’s actually strengthening our position on integrity – it’s certainly not giving up,” Bridgeman said.
“It’s realising that AI is here to stay but that we expect our students to do the things they say they can do.”
Bridgeman said the changes to assessment, requiring key skills be assessed in person, would help fight contract cheating, in which students paid others to do their work.
Deakin University cheating detection expert Professor Phillip Dawson said Sydney University’s new policy showed an acceptance that you couldn’t control a student’s AI use unless you were watching them directly.
“[The policy] is probably where the sector is going to go in the long term, but it’s going to take a while to get here,” he said.
“Everyone needs to look at what restrictions on AI they’re setting and how feasible they are to enforce. Students know when we’re setting pretend rules.”
Dawson said people wrongly assumed they could accurately spot when someone used AI, or that AI detectors would find students using the tools.
He said Sydney University’s new policy needed to be carefully implemented, so students were tested on whether they had key skills, in a secure environment, at important moments in degrees.