NewsBite

Chatbots ‘grooming children’, parliamentary inquiry finds

A parliamentary inquiry into artificial intelligence in schools warns of ‘many privacy risks around students’ personal data, including for profiling and grooming’.

A parliamentary inquiry says AI chatbots can be ‘study buddies’ for students. Picture: iStock

Students should adopt artificial intelligence (AI) as a “study buddy’’ despite alarming evidence of child grooming, cheating and deep fakes, a federal parliamentary inquiry has recommended.

Generative AI “chatbots’’ used in schools should be trained only on Australian curriculum data to shield children from dangerous, wrong or biased information harvested from the internet, the inquiry found.

It warns that AI tools are being used to monitor children’s moods, create convincing deep fakes, rig research results and present fiction as fact.

Students are using AI to cheat in assignments as teachers struggle to detect the use of chatbots.

The committee calls for legislative safeguards to prevent children being exploited by AI companies that cash in on student data, including data gathered through facial recognition technologies.

It also warns that pedophiles can use AI to “groom’’ children for abuse.

“There are many privacy risks around students’ personal data, including for profiling and grooming,’’ the House of Representatives standing committee on employment, education and training says in its report, tabled in parliament on Tuesday.

“Chatbots may have age-inappropriate conversations or display content that is sexual or violent to children.

“GenAI chatbots may present with ‘human-like’ qualities to children, including mimicking common conversational traits that imply a personal or trusted relationship with the student.

“An emerging concern is the introduction of facial recognition technology in the classroom.’’

The committee calls for a ban on the use of generative AI to “detect emotion’’ among schoolchildren, as well as privacy protections for students using AI.

“If managed correctly, GenAI in the Australian education system will be a valuable study buddy and not an algorithmic influencer,’’ its report states. “To make GenAI fit-for-purpose in Australian schools … (it) should be trained on data that is based on the national curriculum.

“An over-reliance on GenAI can adversely affect students’ problem-solving skills, interpersonal skills and decision-making skills, and lead to complacency and disengagement from teaching material.’’

The inquiry found AI can be a “study buddy’’ to reword information, create personalised quizzes or act as a personal tutor to give students immediate feedback and help them identify areas for improvement.

Teachers could use the technology to create lesson plans, study materials and classroom activities, or tailor lessons to individual students’ age, ability or learning styles.

The inquiry said AI “may actually increase the workloads of teachers rather than reduce it … Teachers may need to run AI-detection software, fulfil authentication requirements, and double mark work.”

The universities’ watchdog, the Tertiary Education Quality and Standards Agency, told the inquiry AI “has the capacity to not only generate fake data and images, but entire studies and journal articles’’ and “can compromise the integrity of research’’.

Original URL: https://www.theaustralian.com.au/nation/politics/chatbots-grooming-children-parliamentary-inquiry-finds/news-story/bd9680150bb231fdf2ad5b16ff9df020