
Wake-up call: Children’s tech policy poses test of character for nation

We can demand that our children’s interests take precedence over foreign corporate interests.

The conversation about AI regulation and child safety comes at a critical moment.

If we look at whether we are keeping our children safe, the answer is a resounding no.

The US court case over the tragic death of 14-year-old Sewell Setzer, who died by suicide in February 2024 after Character.AI’s chatbot encouraged him to “come home to me as soon as possible”, demonstrates the lethal consequences when children’s interests are subordinated to corporate priorities and bureaucratic timelines. And let’s not get sucked into the tobacco lobby playbook about causation – we know where that ends up.

But if we look at the dedicated work of advocates, researchers and organisations committed to children’s digital rights, then yes – there are pockets of excellence fighting for children’s agency in our digital world. Yet their evidence is not moving the dial. Tech companies measure growth and usage because it’s easier than measuring kids’ wellbeing.

The truth lies somewhere between safe and not safe, and that’s precisely the problem. In a rapidly evolving digital landscape where AI can influence vulnerable young minds in real time, somewhere in between isn’t good enough.

Children can’t see how their data is used and cannot mediate their behaviour accordingly. Neither can their parents.

The conversation about AI regulation and child safety couldn’t come at a more critical moment. Fresh research from the Alan Turing Institute and the LEGO Foundation, drawing on 1700 interviews with children and parents, has delivered a wake-up call that should jolt policymakers into action: children aren’t passive observers of the AI revolution, they’re active participants who deserve a seat at the table.

One in four children aged eight to 12 is using generative AI tools such as ChatGPT and Snapchat My AI, with most operating without safeguards. This isn’t some distant future scenario we need to prepare for; it’s happening right now in classrooms, bedrooms and playgrounds across Australia. Research and personal experience tell us some of this tech can be safer with better design, but we have to be prepared to say there is some that shouldn’t be built at all.

Children’s fundamental interests mirror the goals of democracy itself: agency, choice, privacy and the pursuit of a better life. These principles, which also underpin healthy capitalism, recognise that agency is to a person’s free will what choice is to free markets and what the individual’s vote is to democracy.

Once a child enters any system – healthcare, childcare, school, sports clubs – they are immediately ‘datafied’.

A child is no different from an adult in deserving agency, but children are treated radically differently, stripped of involvement and agreement in decisions that shape their digital lives. The rallying cry “Nothing about us without us” – from its origins in Poland’s parliamentary tradition in the 1500s to disability reformers in the 1970s – applies to children more urgently than ever.

While adults can choose their level of tech engagement or even go off-grid, children are tethered to corporations without choice from early childhood.

Once a child enters any system – healthcare, childcare, school, sports clubs – they are immediately “datafied”. Files are created, data is collected and digital profiles are built. These aren’t the child’s files; they belong to organisations, governments, businesses and “secondary users”. Children don’t know this exists, can’t see how their data is used and cannot mediate their behaviour accordingly. Neither can their parents.

The most powerful voices in children’s tech policy are not children, parents, educators or child development experts. They are international corporate representatives with the deepest pockets and loudest megaphones.

Right now, TikTok’s government relations campaign fills airport screens across Australia, selling its narrative about teen benefits. Meanwhile, the organisations holding evidence of TikTok’s harmful impacts cannot afford such advertising budgets.

One in four children aged eight to 12 is using generative AI tools such as ChatGPT and Snapchat My AI. Picture: Richard A. Brooks/AFP

This creates a fundamentally unrepresentative policy environment in which the richest voices command the largest share of influence. Their industry lobby groups wield overwhelming advantages: multibillion-dollar budgets for sustained lobbying campaigns, claims of implementation expertise that exclude children’s voices, direct access to decision-makers and regulatory discourse, and the ability to frame child safety measures as business taxes and red-tape barriers.

These organisations have clear primary fiduciary duties to shareholders and profit maximisation, not children’s best interests or Australia’s national interests.

When Australian industry bodies represent foreign companies in policy development, they must be transparent about where those companies’ activities contravene Australia’s economic, security, sovereignty and duty of care interests.

The Sewell Setzer tragedy isn’t an unlucky anomaly – it’s a predictable outcome of a system that prioritises corporate profits over children’s wellbeing.

When we allow tech companies to regulate themselves, when we accept that children’s safety is secondary to innovation and market access, we create the conditions for such devastating losses. AI marks a historic inflection point. However, unlike past generations, we possess the foresight to anticipate its transformative impact.

Anthony Albanese has vowed to adhere to his election promise to crack down on social media.

We can continue allowing foreign corporate interests to dominate policy discussions about our children’s digital futures or we can demand that children’s voices, rights and interests take precedence.

The choice we make will determine whether tragedies such as Sewell’s become cautionary tales that drive meaningful change or merely footnotes in a longer story of regulatory failure.

Our children deserve better. Our children are not invisible rounding errors in a US or Chinese corporate balance sheet. They deserve agency in the systems that shape their lives, protection from exploitation, and adults who put their interests first.

The test of our character as a nation lies in whether we’re brave enough to make that choice.

Chloe Shorten is a writer and director of non-profit organisations; an advocate for mothers, children and people with disabilities; and author of two books on families.


Original URL: https://www.theaustralian.com.au/inquirer/wakeup-call-childrens-tech-policy-poses-test-of-character-for-nation/news-story/d683a63f2cddfd9e373b92f1f7668724