Chatbots can steal students’ personal data but cheats can’t be caught, AI expert warns
Schools are feeding children’s personal data into AI engines that enrich ‘white male billionaires’, as teachers and uni lecturers make a startling admission.
Schools are feeding children’s personal information into artificial intelligence (AI) engines that enrich “white male billionaires’’, an education conference has been warned.
Six months after OpenAI launched ChatGPT, Australian teachers and university lecturers have revealed they still can’t prove when students are cheating with the chatbot to write assignments and exams, due to a technical and regulatory void.
Deakin University education academic Dr Lucinda McKnight railed against a reliance on AI in teaching.
“There is no ethical way to use large language models, because they are not trained in ethical ways,’’ she told Informa’s AI in Education conference in Sydney yesterday.
“They’re trained on a stolen corpus … of materials that have been lifted without copyright permission, (created) by human beings from time immemorial, that are out there on the internet.
“They have built-in biases, assumptions and hate speech.’’
Dr McKnight, who has an Australian Research Council (ARC) grant to explore the nature of writing in a digital world, said AI was concentrating wealth “in the hands of a tiny number of non-diverse billionaires – white, middle-class, able-bodied males’’.
“(AI information) is refined and corrected by … actual humans, many of them black and brown people who have been harmed and traumatised by (content),’’ she said.
Dr McKnight chided conference speakers for describing AI as a “tool’’ for teaching and learning.
“They are not a tool … they are a data exchange service,’’ she said.
“Do not feed AI any private personal information – names, email addresses, or student ID numbers.
“Do not feed in any copyright materials, no images, texts, sounds, computer code designs, trademarks or patents.’’
Australian Institute for Teaching and School Leadership (AITSL) chief executive Mark Grant warned of privacy problems flowing from the use of AI.
“For example, if a teacher puts information into ChatGPT, how does OpenAI then get to use that data?’’ he said yesterday.
“The teacher could put in students’ assessment data, or have AI mark a student’s work.
“Is the AI then training itself on that information and data?
“Does the school, teacher or student retain ownership of the data, or are they effectively giving the AI tool free licence to use it as they wish?’’
Mr Grant said education departments, rather than individual teachers, must ensure that ChatGPT or other generative AI does not breach students’ privacy.
Australian Curriculum, Assessment and Reporting Authority chief executive David de Carvalho said the use of AI in teaching and assessment was a matter for state and territory education departments, religious schooling systems and individual private schools.
“We are monitoring the impact of ChatGPT and other AI tools on education and are in the process of developing support materials to assist teachers in thinking about how to teach students about AI,’’ he said.
Several conference speakers revealed that students are routinely using ChatGPT or similar AI apps to research and write assignments.
Describing AI as the “challenge of our age’’, Macquarie University’s interim dean of education, Professor Matt Bower, said “ChatGPT could easily be used by a failing student to pass’’.
He said schools and universities might need to switch to spoken assessments, exams and video presentations to assess students, due to difficulties in detecting the use of AI in assignments.
Professor Bower said ChatGPT had scored 47 out of 100 marks in his university’s teaching course, but had “hallucinated’’, providing fake links in its reference list.
He said AI was creating a “tidal wave of information junk’’ and called for a focus on fundamental learning.
“The risk is that students won’t learn,’’ he said.
“Don’t think we are going to throw away spelling and arithmetic.
“Students will need the fundamentals – you need to know it so you can work with AI.’’
Professor Bower, who is helping draft guidelines for the use of AI in schools, to be considered by the nation’s education ministers in July, said it was futile to ban the technology.
“An arms race of detection and subversion is just … time wasting,’’ he said.
“AI is going to keep on growing so we’ve got to work furiously to get ahead of it, or at least stay on top of it.’’