Australia’s industry code will force big tech to proactively look for and remove child-abuse material

Australia's eSafety Commissioner, Julie Inman Grant.

We’ve reached a true watershed moment in the global fight to protect children online, and history will mark that this was the first major salvo launched from right here in Australia.

For the first time anywhere in the world, mandatory industry codes will be put in place requiring tech companies to take meaningful action to detect, remove and prevent the proliferation of illegal and harmful content, including child sexual abuse material.

Where the industry codes submitted to me did not provide appropriate safeguards, eSafety will move to develop standards requiring tech companies in those sectors to put those protections in place.

This will inevitably have far-reaching implications for the tech sector. Because most of the companies affected by these codes and the impending standards are international, what is required of them in Australia will likely have a knock-on effect, prompting them to act globally.

Until now, most of the world’s biggest tech companies, of which almost all are based in the US, have only been required under US law to report images of child abuse to the not-for-profit US National Center for Missing and Exploited Children (NCMEC). But there is no requirement for them to proactively look for those images.

If companies aren’t obliged to look for child abuse material, it stands to reason they’re also less likely to find it, but the problem doesn’t just go away because you’re too scared to turn on the lights.

Over the past two decades this has resulted in an online world in which child sexual abuse material is far too frequently accessed, shared and stored online. And Australia is far from immune.

In the first three months of this year, I have seen a 285 per cent increase in reports of child abuse material coming into my office. This follows consecutive year-on-year doublings of reports during the worst of the Covid-19 pandemic.

And in 2022, NCMEC received around 32 million reports of child abuse material from tech companies. The vast majority came from Meta, owner of Facebook, WhatsApp and Instagram, which reported around 27 million instances of online child abuse.

But juxtapose this with other tech behemoths like Apple, and you start to see the problem. In the same year that Meta reported its 27 million child abuse images, Apple, with its billions of handsets, hugely popular iMessage service and iCloud file and photo storage services, reported just 234.

These vast inconsistencies have hampered efforts to get a true handle on the size and scope of the problem and allowed this material to multiply and spread.

But meaningful change begins now.

Back in November last year, industry submitted final drafts of eight codes for registration, covering the full online ecosystem: social media services, websites, search engines, app stores, internet service providers, device manufacturers, hosting services, and services such as email, messaging, gaming and dating.

In February this year I informed industry that I did not intend to register any of those codes and asked them to resubmit amended versions which addressed the issues I identified. Substantially revised codes were submitted on March 31.

In my capacity as Australia’s eSafety Commissioner and under Australia’s Online Safety Act, I have now determined that five codes (Social Media Services, Internet Carriage Services, App Distribution Services, Hosting Services and Equipment) contain the appropriate community safeguards required under the Act and will be registered.

But two key codes fail to meet this high bar and I will be rejecting them.

These are the Designated Internet Services code, covering apps, websites, and file and photo storage services such as Apple iCloud, Google Drive and Microsoft OneDrive; and the Relevant Electronic Services code, covering dating sites, online games, and direct and instant messaging services, including encrypted services.

These two codes have been rejected for failing to include adequate protections, including a broadly applicable commitment to proactively detect and remove known child abuse material and pro-terror content.

We know file and photo storage services like iCloud, Google Drive, and OneDrive are used by paedophiles to store and share child sexual abuse material. And we also know that email services and partially encrypted messaging services are widely used by these predators to share this illegal content.

eSafety will now draft mandatory and enforceable standards for those sections of the online industry.

And to show you just how fast technology is moving, I will be reserving my decision on a third draft code covering search engines. This code, resubmitted by industry to eSafety in late March, is no longer fit for purpose following recent announcements about the integration of generative AI into search engine functions.

I have requested that a revised search engine code be submitted within the next four weeks to address the specific concerns I have raised, including the ability of the code to protect against the production of deepfake child sexual abuse material and terrorist and extremist propaganda material.

Our work on this code dovetails with the Australian government’s efforts to achieve “safe and responsible AI”. Industry and Science Minister Ed Husic has just released a report on emerging technologies by the National Science and Technology Council and announced a public consultation on this important issue.

The potential harms posed by AI will not only require an all-Australian effort, but indeed global coherence and collaboration.

These industry codes and standards will be backed up by a range of powers to ensure compliance, including injunctions, enforceable undertakings, and maximum financial penalties of nearly $700,000 per day for the most egregious continuing breaches.

Australians should feel proud of the leading role we are taking on this most serious of issues, finally forcing the tech industry to turn on the lights and help address the problem it has collectively done too little to tackle.

Julie Inman Grant is the eSafety Commissioner.

Original URL: https://www.theaustralian.com.au/business/technology/australias-industry-code-will-force-big-tech-to-proactively-look-for-and-remove-childabuse-material/news-story/62fe901576c19ab6735b4ca049db9c95