Tech giants to be brought to heel on child abuse, terror content
Cloud storage services and messaging platforms will be required to ‘take meaningful steps’ to tackle child sexual abuse material and pro-terror content.
Cloud storage services and messaging platforms will be required to “take meaningful steps” to tackle child sexual abuse material and pro-terror content by new industry rules overseen by the eSafety Commissioner.
The eSafety Commissioner has hailed the “world-leading industry standards” as a “significant stride forward in the online protection of children”, saying they may have a “global impact”.
The rules will require cloud-based storage services like Apple’s iCloud, Google Drive, and Microsoft OneDrive, as well as messaging services like Meta’s WhatsApp to prevent their services from being used to store and distribute harmful material.
Infractions could attract penalties of up to $49.5m.
The standards will also clamp down on “nudify” programs and on vendors offering software – often powered by generative artificial intelligence – that generates nude deepfakes of people from supplied images of them.
The rules were drafted by eSafety Commissioner Julie Inman Grant and presented to parliament in June after she rejected draft codes proposed by associations representing the companies.
“At their heart, the standards are designed to encourage innovation, investment and deployment of systems and technologies that improve safety, by preventing the hosting, sharing, the re-sharing and re-traumatisation of survivors, and the synthetic generation of this reprehensible child abuse content,” she said.
The new rules compel a “designated internet service” – which could be cloud storage services or messaging platforms – to “prevent end-users from distributing known child sexual abuse material” and to identify “phrases and words commonly linked to child sexual abuse material” for detection and deterrence purposes.
Furthermore, the rules compel platforms with more than one million Australian users to produce a “development plan”, which must include investment and activities to detect and identify offending material on their platforms.
The new rules, Ms Inman Grant said, would require changes by companies “no matter where they are headquartered”.
“We are challenging big tech businesses and the industry as a whole, to harness their collective brilliance, vast financial resources and sophisticated tools to help address profoundly damaging content hosted on and distributed by their services,” she said.