Experts slam Instagram’s announcement on limiting Aussie teen access
Australian child safety advocates have blasted Instagram’s new measures for underage teens, warning it’s an attempt to “deflect” from the real issue.
Child safety advocates have criticised Instagram’s move to introduce separate “teen accounts”, warning it’s a tactic to “deflect” from the social media ban debate.
The new feature, which will be rolled out over the next 60 days, will automatically set all new and existing accounts of users under 18 to private mode.
Users under 16 will require parental permission to remove the most restrictive settings, which include automatically setting accounts to private, removing the ability to message strangers, and limiting access to “sensitive content”.
Once set up, parents can use Instagram’s in-app supervision tools to monitor which accounts their kids are interacting with, and the topics they are seeing in their feeds.
Parents can also limit the amount of time spent in the app under “sleep mode”, which either mutes notifications or makes the app entirely inaccessible.
The move by parent company Meta to automatically opt more than 100 million accounts into strict parental controls comes after years of public scrutiny, pressure from mental health and child protection experts, and News Corp’s Let Them Be Kids campaign.
However, Eating Disorders Families Australia executive director Jane Rowan said the changes do “nothing to alleviate” the urgent need for a social media age limit of 16+.
She acknowledged the move is an “effort to safeguard young people”, but said the measures don’t provide adequate protection.
“They are relying on the young people themselves to tell the truth about their age,” Ms Rowan said.
“We know this doesn’t happen now, so pressure remains on parents to closely monitor their young person’s accounts and we also know this is often unsuccessful.
“A higher social media age limit is also essential to shield young people from harmful content on other platforms such as Snapchat and TikTok.”
Skyrocketing rates of anxiety, depression, eating disorders and other harms among children and teens were revealed by News Corp’s “Let Them Be Kids” campaign, which called for a minimum age of 16 to be introduced for social media access in Australia.
The eSafety Commission, Australia’s online safety watchdog, has called on Meta to release more information about the “efficacy of these settings and controls” for teen accounts on Instagram.
“One of the ongoing challenges has been limited information and transparency, including around the measures companies are and are not taking to protect users from harm,” acting eSafety Commissioner Toby Dagg said.
“At the moment, we do not even have a clear picture of how many children are using Meta’s services and what age they are.”
Prime Minister Anthony Albanese last week promised to introduce new laws setting a minimum age of between 14 and 16 for platforms like Instagram, Facebook, TikTok and Snapchat, saying his personal view was to “err on the side of a higher limit”.
Collective Shout founder Melinda Tankard Reist criticised Instagram’s changes as “conveniently timed”, coinciding with Australia’s plans to restrict children’s access to social media.
She said it may give parents a “false sense of security”, leading them to become less vigilant and leaving their children more vulnerable.
“We would normally welcome any app applying the strictest setting to sensitive and sexually suggestive content,” Ms Reist said.
“But given the hundreds of complaints we have made about porn themed content on Insta and dismissal of our complaints, we question who at Insta will define this. They don’t even act on the exploitative content we point out now.
“Experts are on the side of delaying social media exposure at all and preventing harm, Meta has shown it is more concerned with profits.”
Coalition communications spokesman David Coleman was scathing of the Instagram changes, accusing Meta of attempting to circumvent effective age verification.
“By its own admission, Meta isn’t changing anything on age verification,” he said.
“So if a 10 year old signs up to Instagram in the future, the system is exactly the same as it is today and they can just lie about their age.
“Meta will do everything it can to avoid a real system of age verification, because it will lead to them losing huge numbers of underage users over time.”
‘BEYOND TRUSTING THEM’: META WHISTLEBLOWER
A Meta whistleblower says the social media giant has known about features that could protect children on their platforms “for years,” but chose not to do anything until countries like Australia stood up and said “it’s time”.
Frances Haugen, a former Meta product manager who in 2021 leaked thousands of documents exposing the inner workings of the company, said despite the positive announcement of Instagram teen accounts there remained some “pretty serious missing features”.
“There aren’t time limits, they still send notifications during the school day - these are intentional choices to drive more usage,” she told the ABC.
“We should not lose steam just because they’ve gotten this part of the way there.”
Ms Haugen said it was not yet clear how effective Meta’s age verification would be for Instagram teen accounts.
“That’s why we need to pass regulations beyond just trusting them,” she said.
Ms Haugen said social media users already give platforms vast amounts of data: the videos they share of themselves, the topics they engage with, who they message, and whether they turn up at a school each day.
“We know even off the shelf technologies today can find 97 or 98 per cent of under 16s, let alone a company like Facebook that has all this other data,” she said.
“So we can find the kids and stop them, if that’s a priority for society.”
Ms Haugen said tech platforms were being dragged toward protections for underage users thanks to governments in Australia, Canada, the UK and parts of the US, noting it wasn’t a coincidence those same countries would be the first to get the Instagram teen accounts.
“Facebook knows more serious regulations are coming,” she said.
“These changes really are symptoms of ‘does Facebook feel that the winds are changing?’ and the winds are changing.”
WHAT YOU NEED TO KNOW
How will the Instagram age limit work?
The sweeping update will automatically place teens into Teen Accounts, and those under 16 will then require a parent’s permission to change any privacy and restriction settings.
Instagram will implement these changes on accounts where a date of birth is listed, as well as on accounts it attempts to verify as belonging to teens.
What is a ‘teen account’?
Instagram Teen Accounts will be set to private by default, and teen users will need to accept new followers individually. Just like with the regular private account setting, people who don’t follow you can’t see your content or interact with you.
What are the parental controls?
Turning off any of the new restrictions will require a parent’s account to be linked to the teen’s account, and the parent to approve the change.
What is the TikTok age limit in Australia?
TikTok requires users to be aged 13 years and over.
What is the Facebook age limit in Australia?
Facebook also requires users to be aged 13 years and over.