Opinion: Search engines latest target of laws to protect children
While the world watches Australia’s social media ban, another revolutionary reform is quietly emerging, writes Julie Inman Grant.
It’s fair to say all eyes are squarely fixed on Australia right now as we zero in on one of the most monumental online reforms to take place anywhere in the world in decades.
Of course, I’m talking about Australia’s social media minimum age legislation which from next Wednesday will require some of the world’s most popular social media platforms to take reasonable steps to prevent Australian children under 16 from holding accounts.
Unless you’ve been living under a very big rock, this probably won’t come as much of a surprise to you. What might, though, is the realisation that another transformative online reform is under way in Australia, one that will fundamentally change how the online industry protects children.
Today, eSafety is publishing new regulatory guidance ahead of another key December date – the 27th, to be exact – which is when the first tranche of Australia’s age-restricted-material codes come into force.
What makes these mandatory codes so important is that they put the onus not just on social media companies but on many different types of services across the online industry to do more to protect children from exposure to age-inappropriate material such as pornography, high-impact violence, and content dealing with suicide, self-harm and eating disorders.
We know more and more young people are encountering age-inappropriate content unintentionally at a very young age.
Our own research supports this: one in three young people told us their first encounter with pornography came before the age of 13, and that the exposure was frequent, accidental, unavoidable and unwelcome, with many describing it as disturbing and in-your-face.
There were good reasons why, before the advent of the internet, there were restrictions on children walking into adult shops or purchasing sexually explicit magazines and videos: children are not developmentally or emotionally ready to process such content. But no such restrictions have applied online – until now.
We know a high proportion of this accidental exposure happens through search engines, and once a child sees a sexually violent video – for instance, of a man aggressively choking a woman during sex – they cannot cognitively process, let alone unsee, that content.
So, from December 27, search engines will be obliged to blur image results for online pornography to protect children from this incidental exposure, in much the same way safe search mode already operates on services like Google and Bing when enabled.
Crucially under the code, search engines will also now be required to automatically redirect searches related to suicide, self-harm or eating disorders to mental health support services.
It gives me some comfort that if there is an Australian child out there thinking about taking their own life, thanks to these codes they won’t be sent down harmful rabbit holes or to specific information about lethal methods, but will instead be directed to professionals who can help and support them.
If this change saves even one life, as far as I’m concerned it’s worth the minor inconvenience it might cause some Australian adults. Suicide reverberates devastatingly across families and communities, and represents a point of no return.
And I think most Australians would agree, viewing these commonsense changes as reasonable and proportionate, and might even wonder why they are not already in place.
But let’s be clear: this code won’t require Australians to have an account to search the internet, or notify the government that they are searching for porn.
And while images in search results might be blurred, adults who wish to view that content can still click through to view it without restriction.
While Big Brother-style stories about government overreach might make for good clickbait and short-term political point scoring, they ultimately diminish the world-leading and life-changing reforms Australia is introducing.
But how can little old Australia make these hugely powerful, rich and influential international companies play ball and comply with Australian law?
Firstly, these codes were written by the industry, so they already have skin in the game.
My role as eSafety Commissioner is to ensure that the codes industry drafted meet key safety requirements set out in Australia’s Online Safety Act.
But you need only look at our Unlawful Material Codes and Standards, which tackle the worst-of-the-worst unlawful online material, including child sexual exploitation, to find your answer.
Since these codes and standards came fully into force in June, they have started to make a meaningful impact, forcing riskier companies either to change how they do business in Australia or to withdraw their services altogether.
Take the example of gaming platform Roblox, which has faced numerous accusations of turning a blind eye to predatory adults grooming children.
While many have questioned why Roblox has not been captured under the government’s social media delay, under the codes and standards the company has committed to major changes in Australia that will see it age-verify its entire Australian user base to prevent adult predators from contacting children.
No other country in the world has been able to compel Roblox to make such a sweeping change to its operating model, but we hope what the company has committed to do here will be a blueprint for how it operates throughout the rest of the world.
We’ve also seen Apple and Google deplatform a notoriously dangerous chat-roulette-style app called OmeTV, which Australian law enforcement informed us was enabling the grooming of children by matching paedophiles with children in randomised video chats.
Following a formal warning to OmeTV for failing to comply with its obligations, eSafety reminded Apple and Google of their own obligations under the App Store code not to enable distribution of services that placed children in danger.
Since its deplatforming, OmeTV has also informed us it has withdrawn access in Australia to its web-based services.
And most recently, a UK-based company that owns three of the world’s most widely used “nudify” websites withdrew its services from Australia after eSafety issued another warning of non-compliance under the codes and standards, pointing out how these services were being used to create child sexual exploitation material of mostly female students in Australian schools.
As an Australian (who hasn’t quite been able to ditch her American twang), it makes me immensely proud that as a nation we have decided to take back control of the safety of our children and I hope you, my fellow Aussies, share this pride, too.
Julie Inman Grant is the Australian eSafety Commissioner