Big tech is exploiting Australian kids for profit
My child comes home from school, Snapchats friends, has dinner and sits down to start homework. It’s winter in Melbourne, the sky is darkening at 5.30pm, but the curtains are still open. The window faces out to the street.
Imagine a car parked outside, its occupant employed to record everything the child is doing. She downloads the apps she needs for online study, socialising and organisation, and the tech vendors, in the main, take this as consent to watch, use and pass on her private world.
Kids’ information is enriched from enough data sources to build a profile the FBI would be proud of. It is then sold to the highest bidder. Night after night, in millions of homes.
The data trading industry is a huge, low-profile, billion-dollar machine. Paired with the latest generative artificial intelligence technology, it now has superpowers to build and sell detailed personal profiles about children.
These profiles can include their friend groups, favourite movies and games, music, clothes, illnesses, study habits, school reports and even religious practices.
It infiltrates every aspect of their lives. There are no longer any boundaries.
Anyone who grew up before cameras on phones and social media will tell you how grateful they are that there is no evidence of their youthful misdemeanours. Our kids now have a digital rap sheet that follows them around their entire lives – it is neither fair nor reasonable.
Young people have readily given up their data via social media apps, Fitbit, Apple Watch, Google Home and Alexa, and now education technology (edtech).
Luci Pangrazio from Deakin University, widely considered Australia’s foremost researcher on the datafication of children, surveyed 2000 people with more than 5000 internet-connected devices in their homes.
Pangrazio warns of the impact of smart technologies in households with young children and “Google getting a real foothold in the family home”.
The growth of AI kicks it up a thousand notches. AI’s face and image recognition technology and our constant sharing of pictures on social media allow these platforms to see everything we do and with whom we do it. We don’t even have to “like” something for tech to figure out our preferences.
Improved privacy laws for children and updated education about how to safely use digital tools such as AI are crucial to protect our kids’ future.
The Growing Up Digital Australia study conducted by the eSafety Commissioner showed children aged eight to 17 were unaware of the data being collected about them and how that data was being used.
The study found that 38 per cent of children shared highly personal information online, 29 per cent didn’t know how their data was being used and 42 per cent were exposed to targeted advertising.
Children may give a fleeting thought to the security of their personal details but don’t fully understand the risks down the track. They want the likes, the new stuff, the access to their friends – now. They’re kids.
Western Sydney University’s Amanda Third, a long-time researcher on children’s views of technology, tells me kids don’t generally believe it’s a fair trade to give up their data without knowing what happens to it.
“And they don’t generally read, let alone understand, the complex agreements they sign up to when they join a new platform or service. They’re demanding much more from platforms,” Third says.
“Some are quite outraged at what they say is the giant riddle of data collection, use and storage.”
If I requested all the information multiple data firms hold on my children, it would likely run to thousands of data points. I found that just one of my children’s education apps connects to 21 digital services. Each requires the disclosure of personal information.
One even requests her precise location before she can use it. Why would an edtech product need to capture that level of detail?
The consent to collect her data is somewhere in the agreements that are made up of hundreds of pages of barely understandable technical and legal jargon.
Let’s be honest – consent forms are not for consumers. They are an indecipherable, pro-forma legal firewall designed to stop consumers ever successfully suing the vendor. Nothing more. It is inconceivable that children could provide informed consent, and adults, rarely more savvy in this department, often consent on their behalf. So, we all unwittingly opt in.
The protection of our children’s privacy in the face of big data collection is a global hot issue. In April, the US House oversight and investigations subcommittee held a hearing to critically examine what was described as the “staggering amount of information … collected on Americans every day frequently without their knowledge or consent”. An incredible cavalcade of Democrats and Republicans came forward to call for urgent action to protect children from corporations and data brokers monetising their private information.
When bipartisan voices speak as one, it is incredibly powerful.
Here in Australia, strict privacy laws do not exist and consent is not required for all data collection.
I’m not fearmongering when I say young people are at risk of being exposed and exploited for corporate profits.
There is some hope, with proposals for improved data protection standards in the federal Attorney-General’s Privacy Act review, a key part of which was to strengthen privacy protections for children and vulnerable people and to give people more control over their personal information.
I spoke with Jordan Wilson-Otto, a Melbourne-based privacy expert at cyber security consultancy elevenM, who said changes to the Privacy Act were critical for the protection of children.
“There are two key proposals coming out of the Privacy Act review that might help create digital spaces that are safer for kids,” Wilson-Otto says.
“One shifts the burden of guarding against privacy harms from children and their parents to the developers and platform owners – the so-called fair and reasonable test. The other puts some long-needed limits on direct marketing targeted at children and the sale of children’s data.”
Velislava Hillman, a world authority on education technology from the London School of Economics, has said there are far-reaching impacts of privacy invasion for children.
“Much has been written about the importance of data privacy and the risks emanating from its loss,” Hillman says. “Privacy in education specifically should be seen as the condition and space for a child to learn and practise basic freedoms and rights to create, express and develop critical thought.”
Privacy loss can lead to a wide range of tangible and intangible harms – from embarrassment to the closing off of certain career pathways. We have a unique opportunity to shape our children’s future, safely, and set a better standard for their privacy.
We parents, schools and teachers have an important role to play in advocating for better protections. It is crucial that we have full transparency and disclosure from the tech sector so we can understand exactly what these tools do and what risks they pose. If schools choose to deploy technology that has the potential to breach privacy, especially if it connects to external computers, they must obtain real consent from parents and provide safer alternatives for those who choose to opt out.
Parents should be able to see what’s in the products and have the option to opt in, not just opt out.
Families can also drive change within the industry by organising boycotts of suppliers that trade in children’s information.
Good governance requires that schools engage parents and children before rolling out apps that connect to other computers outside the school.
Ultimately, schools have a crucial role to play in educating children about digital literacy and data rights. But they are under pressure and need government support.
Digital literacy for educators and their students will help shape a safer online environment. Kids should be taught early that digital media is a tool that can do many things, and we must build their awareness of the risks, just as we teach them about crossing the road safely or not chatting to strangers.
Parents and teachers don’t have the time or energy to engage in a David and Goliath battle with companies whose revenues are larger than those of some countries. We must not be resigned to data misuse. Children’s privacy matters and must be protected. It’s time to send a clear message that our children’s privacy is not – and should not be – for sale.
Chloe Shorten is an advocate for children’s privacy, a director of Alfred Health and the chair of the Centre for Digital Wellbeing. She is mum to three tech-enamoured kids and author of two books on contemporary families.