NewsBite

Roblox to require facial scans to protect children from adult conversations online

Gaming giant Roblox has unveiled sweeping new safety measures for its 380 million users amid growing scrutiny over dangers for children on the hugely popular online platform.

One of the world’s largest gaming platforms will now require users to submit to facial age-estimation scans in a global push to stop children being exposed to adult conversations online.

Roblox announced the launch of its new age-estimation system, with the checks initially available as a voluntary feature before becoming mandatory for Australian users in the first week of December, just days before the nation’s under-16 social media ban comes into force on December 10.

A global expansion of the system will follow in January 2026.

The platform, which counts an estimated 380 million players, will be the first major gaming service to mandate age checks for communication features.

The system, provided by US firm Persona, uses artificial intelligence to scan a user’s face and guess their age before assigning them to one of six age categories, from under 9 through to over 21.

After the scan, users will only be allowed to chat with peers in their own age group or adjacent brackets — a move Roblox said is critical to reducing the risk of inappropriate contact between children and adults.

“This helps ensure users are able to socialise with others in their age groups safely, while also limiting contact between minors and adults that they do not know,” said Raj Batia, vice-president and head of User and Discovery Products at Roblox.

Chief safety officer Matt Kaufman said the technology had proven accurate within one to two years but users could correct results by uploading government ID.

“We decided that simply going with age estimation was the path of least friction and the path that provided the best outcome and that minimised the collection of personal information,” he said.

Roblox stressed that neither it nor Persona stores the facial image or links it to identity.

The company will also introduce a new Safety Centre for parents to set Parental Controls and access guidance on managing their children’s online behaviour.

“[We] determine, one, that you’re a real person and, two, estimate your age, and then we immediately delete the video and image that are taken during that process,” Mr Kaufman said.

“There is no dating. There is no allowance for sexual content, no profanity, and we also don’t allow people to share images and video when they’re talking to each other.”

Roblox — which faces more than 35 lawsuits in the US, including allegations it allowed child sexual exploitation on its platform — will roll out age restrictions in Australia, New Zealand and the Netherlands in early December ahead of its global deployment.

With Australia’s social media ban for under-16s taking effect on December 10, a government spokesperson credited Roblox’s move to Labor’s tough line on digital safety.

“Platforms must ensure users are safe and any platform is free to implement age assurance as part of their work to ensure this,” the spokesperson said.

“Roblox is not exempt from their responsibilities under the Online Safety Act or their social responsibility to young Australians.

“And because we have been so strong on online safety, Roblox has committed to new safety measures including making accounts for users aged under 16 private by default and introducing tools to prevent adult users from contacting under 16s without parental consent.

“We’ll continue to work closely with industry as we progress our significant online safety reforms, including a digital duty of care, which consultation has now opened for.”

Earlier this year, Australian politicians called for Roblox to be included in the country’s under-16 social media ban, but eSafety Commissioner Julie Inman Grant confirmed it would be exempt, as would Discord and Steam.

Despite Roblox’s chat feature, Mr Kaufman said it was not a social media platform, but hoped the company could help regulators “lift online standards”.

Mr Kaufman said the changes marked a major step in creating safer online experiences.

“These new systems that we’re creating and launching to do age estimation before somebody gains access to chat, we believe, will make this platform safer and more civil for everybody on the platform,” he said.

- with NewsWire

Originally published as Roblox to require facial scans to protect children from adult conversations online

