


We’ve been worried about kids on the internet for 30 years. Is it time to toughen up on tech?

By Natassia Chrysanthos

In 1995, the cover of Time Magazine spurred an early moral panic over children and the internet. Under an image of a wide-eyed child hovering over a keyboard and the bold headline, “Cyberporn”, it posed the question: “Can we protect our kids — and free speech?”

Almost 30 years later, we are still figuring it out – although it’s not just porn that sounds alarms. Alongside fears that children as young as 10 are watching violent porn online, Australia’s young people can be exposed to child abuse material, gambling ads and pro-terrorism content. Social media addiction and cyberbullying have also fed into a broader conversation about children’s mental health.

All this was brought into focus this month when a teenager allegedly stabbed a Sydney bishop in an attack that was streamed live on Facebook. Seven teenagers allegedly belonging to a terror cell were later arrested, two of whom it is claimed had graphic videos of Islamic State beheadings.

The events have reignited Australian alarm at the realities of online harm, reopening a debate about how to protect children from graphic content and the dangers of radicalisation.

The live-streamed stabbing has spurred debate about how to protect Australians from harmful content online. Credit: x - @AustralianJA

The ensuing conversation can at times be repetitive. This week’s flashpoint, over the Wakeley attack, played into the binary from Time’s cover last century: Australia’s eSafety Commissioner, Julie Inman Grant, ordered X to pull graphic content from the internet because it could potentially radicalise people; its billionaire owner, Elon Musk, called the commissioner a “censorship commissar” seeking authority over all countries on Earth.

Children and online safety experts say the debate must move on because the intervening decades have shown tech companies are self-interested players – meaning it’s time to get tougher on them. Australia was an early pioneer in online safety but its reactive approach, focused on removing harmful content, is not enough. They say Communications Minister Michelle Rowland can no longer allow the industry to write its own rules; she must force it to show how it plans to protect Australian children from harm.

The Time Magazine cover that spurred a moral panic in 1995. While the research behind the article was eventually discredited, underlying concerns about children’s safety and censorship have persisted.

Australia’s first eSafety commissioner, Alastair MacGibbon, says it’s time to turn up the dial. “We should be proud of what the Australian government has done over the last 10 years: we were the first in the world to start asking these questions and introducing legislation. [The government] has continued slowly, judiciously [and] sensibly to increase pressure on those companies to protect Australians,” he says.

“But it’s patently clear it has been insufficient. These companies are refusing to do what is right. As they become more powerful in our daily lives, so, too, should their responsibility to do the right thing.

“The average family out there struggles on a daily basis to get a handle on this stuff, to teach their kids how to be safe online. For the last 20 years, the dominant philosophy has been to push responsibility onto the individual and families, and criticise them when we fail. I think we’re reaping what we’ve sown.”


Legal professor Elizabeth Handsley, president of the Australian Council on Children and the Media, has been researching online safety since the late 1990s. She says there are perennial aspects to the debate over children and the internet, with common traps: turning down solutions because they won’t be 100 per cent effective; dismissing regulation as censorship; an inclination to sit back and let technology companies run the show.


But other elements of the conversation have evolved. “Twenty years ago, we were talking about pornography and violent content – the things children could see – but not their interactions with other people. Social media put this whole thing on steroids,” she says. More recently, those concerns have expanded to include privacy and addiction.

“There’s a growing awareness we need to do something about this. It’s not just a moral panic. There are serious issues we need to address.”

Experts now see a window of opportunity for this to happen. A major review of the federal Online Safety Act is due later this year, when a separate misinformation bill will also be introduced. The events of the past few weeks have only heightened the stakes, inviting a policy debate about the best way to protect children online – not because they are the only group at risk, but because they are often the most vulnerable.

“This is a very live issue and it’s one that has been the subject of discussion and work across departments and also with eSafety,” Rowland said on Wednesday, in an interview on ABC’s Radio National.

Experts say it’s time for Communications Minister Michelle Rowland to get tougher on social media giants through mandatory codes of conduct. Credit: Alex Ellinghausen

“There’s not a single parent who isn’t concerned about what their children are seeing. Part of the review that we’re doing of the Online Safety Act is to make sure that the regulator has the necessary tools. At the same time, we’re looking at other issues, the harms to children, including by having recommender systems that push content like misogynistic rubbish and eating disorder videos.

“Of course, there is a litany of new and emerging harms, ranging from artificial intelligence, deep fakes, sextortion, child sexual exploitation material, and scams. As a government, we have a program of work across all of these areas.”


But it could do more. One key measure on the table is age verification, which means taking steps to ensure that internet users are who they claim to be, and that they meet minimum age requirements under laws and regulations. Rowland last year rejected the eSafety commissioner’s advice to pilot an age verification program, saying it would distract the industry from its existing work developing codes of conduct. Opposition Leader Peter Dutton, however, has promised parents their under-age children would be barred from online porn, sports betting and alcohol delivery services through an age verification plan.

The Coalition doubled down on that policy this week. “We urgently need to back the eSafety commissioner and get moving on age verification for children on social media,” communications spokesman David Coleman said. This was backed by the nation’s top spy, ASIO boss Mike Burgess, who told the National Press Club on Wednesday that such a tool would make his job easier.


Critics of age verification – which ranges from an ID system, as mooted for France, to technology that assesses a user’s facial characteristics – claim the technology is still maturing and there are privacy implications for adult users. Proponents argue it is essential to protect children, particularly from violent pornography. Others say the debate gets bogged down in detail, and if we admit it’s possible to create a safer internet for children, why not do so for everyone?

Rowland, when asked this week, said her department was scoping “what can be done at the moment on age assurance mechanisms”.

“But again, I will detail what eSafety has said in terms of a number of these issues: there’s no silver bullet when it comes to these matters. We do require collective effort, and that includes media literacy,” she said.

Independent MP Allegra Spender said crossbenchers were watching the online safety space closely, particularly around privacy and social media. She pointed to a proposed law in California that would require companies to shut off addictive algorithms for users under 18, as one example of several options that should be examined. “I think you could get real agreement across the parliament about mental health and protecting young people on social media. This is the time to act,” she said.


MacGibbon, now a cybersecurity adviser, agrees the conversation should be broader than age verification. “[It] will play some role in some parts of this, but I don’t think it’s the panacea for this much broader problem,” he says. Instead, there needs to be tougher laws for tech companies, including mandatory codes that force them to show the regulator how they are mitigating harm on their platforms. To date, under the Online Safety Act, technology giants are allowed to write their own voluntary codes of conduct about how they deal with issues like disinformation and porn.

Former eSafety commissioner Alastair MacGibbon said technology giants were “refusing to do what is right” and it was time for more regulation. Credit: Alex Ellinghausen

“It’s very clear that we’ve passed the high watermark of tech companies co-operating with governments. Meta is well-considered to have failed in the protection of individuals. It’s very clear X has completely dismantled its efforts to co-operate with the government and reflect society,” MacGibbon says.

“They are consciously choosing to allow inflammatory and violent material to stay on their websites because they get eyeballs and money. Of course, it’s time for the Australian government to do more. Mandatory codes, rather than voluntary codes, that tell these companies what their obligations are in the Australian community. Flip the onus, from the eSafety commissioner to the providers. Force the companies to prove they are actively enforcing the law on their sites. And that’s not censorship.”

That’s also the position of Reset.Tech Australia, a research and policy organisation specialising in online harms. “There’s no incentive for platforms to implement meaningful rules,” says Rys Farthing, the director of children’s policies. “The industry representative group, DIGI [on behalf of X and Meta], writes the rules. And, of course, it’s in their best interest to write them as weakly as possible.”

She points to the European Union’s approach as a positive example, where a new digital services act will force tech companies to identify potential risks to minors and then mitigate them, whether through parental controls, age verification or accessible helplines. “It will [force them to answer] things like: is your design addictive? Does it cause problems with compulsive use? Does it have features that create grooming risk?” Farthing says.


“When we do that, we see loads of changes: platforms realise we shouldn’t make under-18-year-olds searchable to over-18s because that creates a grooming risk. [Just this week] in Europe, they turned off a TikTok feature where they reward users to watch videos because that creates a risk of compulsive use and addiction. Platforms have to change how they build their platforms and products, and make them safer in the first instance.”

On the other hand, Australia’s Online Safety Act focuses on harmful content that can be reported and removed, “not the risks that might lead to the production of such content”.

“It doesn’t give our regulator the scope to meaningfully look at things like whether a platform is addictive or has a grooming risk,” Farthing says.

She thinks the conversation is moving that way, particularly after this week. “It’s really heartening to see this moment in Australian policy, where we’re starting to talk about how we can do that.”

But there will be obstacles. A big one is the “complexity trap”, where conversations about making platforms safer descend into debates about technical detail. “It’s in platforms’ best interest to keep us talking about these small fixes and how complicated they all are. No one wins but the platforms, if we fall into that,” Farthing says.

The lobbying power of social media players shouldn’t be underestimated, either, says Reset.Tech’s executive director, Alice Dawkins. “Government and civil society need to be prepared to scrutinise industry narratives. I think we lack a bit of critical analysis in tech policy. When a highly sophisticated tech company makes a claim about their products and services, it will be believed because it is hard to understand. Tactics like this are routinely deployed by the tech industry to spook governments.”

Early implementation data will also start coming out of Europe, Britain and Canada. “We do not lack information on how to craft these policies, or how they function,” Dawkins says.


Australia’s Children’s Commissioner, Anne Hollonds, says we should look abroad to what other countries are doing and be part of the international community that pushes the boundaries. She strongly believes in regulatory safeguards and wants age verification to be one of those guardrails. “But I do get frustrated that everyone just points to social media as the baddie,” she says. “It lets everyone else off the hook. It’s a multilayered issue that means reimagining the role of the school, for example. We’re still having old conversations about this and throwing our arms up saying: ‘Where are all these mental health issues coming from?’

“We don’t know for sure, but we do know what helps: strong family relationships, a sense of belonging at school, a strong connection with your peers. I think we could be doing more before things go wrong, to understand child development and what kids need.”




Original URL: https://www.brisbanetimes.com.au/politics/federal/we-ve-been-worried-about-kids-on-the-internet-for-30-years-is-it-time-to-toughen-up-on-tech-20240423-p5fm37.html