National roundtable on AI-generated child exploitation to tackle stretched police resources
Two former child abuse investigators have lifted the lid on how Aussie cops have “one hand tied behind their backs” while hunting down pedophiles, arguing there’s a simple fix.
Australian police have “one hand tied behind their backs” while hunting down the pedophiles and predators who create and consume child abuse material, with fewer than 20 experts nationwide specialising in victim identification, a former investigator has claimed.
Colm Gannon, CEO of the Australian arm of the International Centre for Missing & Exploited Children, is a former specialist investigator for New Zealand’s Department of Internal Affairs.
He will host a roundtable of national leaders in child protection at Parliament House on Thursday, calling on lawmakers to put AI-facilitated child sexual exploitation on the national agenda.
“When we go to the government … we’re going to provide solutions that are sovereign to Australia … so that Australian law enforcement are not going to be held with one hand tied behind their back,” he said.
He said the recent departures of two of Australia’s most senior victim identifiers have left the nation with as few as “17 to 18” specialists in the field.
“The resources and expertise are so stretched, and (when they leave) they’re not being replaced,” he said.
An AFP spokesperson declined to “provide details on numbers for operational reasons”.
It comes as the nation faces a reckoning with the rampant physical and sexual abuse of children in early learning settings.
Two female childcare workers at a centre in Western Sydney were on Wednesday charged with the alleged assault of a 17-month-old boy in their care. Those allegations are not sexual in nature, and the women will face court in August.
Federal Education Minister Jason Clare has supported calls from the sector and child safety advocates to establish a national register for early childhood educators, admitting “we need to accelerate the work to stand that up”.
Meanwhile, an explosion in AI-generated child abuse material is pushing law enforcement resources to their limit, Mr Gannon warned, with investigators finding it increasingly difficult to tell real victims and AI-generated ones apart.
While their counterparts in the US are able to use AI tools to track down offenders and their victims, Australian police are hamstrung by data privacy restrictions.
It’s an issue that ex-Queensland police officer Jon Rouse is “consumed” with, especially since he recently discovered that sexual abuse material depicting young Queensland boys rescued by his team in 2003 is now being used to train AI and generate new abuse images.
“That’s the world that we now live in, and we do not have an effective tool at the moment where our victim ID people can just drop the image in, and it says ‘fake’ or ‘real’,” he said.
The former detective inspector is among those calling for federal investigators to be given access to tools like Clearview, controversial AI-based facial recognition software created in the US and blacklisted by the Office of the Australian Information Commissioner.
Mr Rouse argues that with safeguards in place – including restricting access to a small number of highly trained specialists – Clearview could help solve countless child abuse “cold cases”.
“I think we need to recognise that the internet is actually a public space, and if you put it on the internet … that’s all there is to it,” he said.
Mr Gannon said that while privacy concerns are valid, Australian authorities and decision-makers must come up with solutions, not just roadblocks.