Facebook, Instagram launch tool to block sharing of children’s explicit images
Facebook and Instagram have unveiled a new platform to let young Australians stop their nude images being spread on social media networks in an attempt to stem the tide of child sexual abuse material online.
The system, called Take It Down and run by the US non-profit National Center for Missing & Exploited Children (NCMEC), does not require the child to locate the images they want removed online or to send in a copy, which can be traumatic and legally fraught.
Instead, the platform generates a digital fingerprint called a “hash” from an image or video the user submits, without the content ever leaving the user’s device. The hash is shared with the participating social networks, which match it against images already online and against anything others try to upload to their websites. Matching posts are then reviewed by each social media service’s moderators and removed if they breach its rules, triggering a report back to the NCMEC, which can share it further with law enforcement.
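The article does not describe the hashing algorithm the service uses, but the flow it outlines can be sketched in a few lines of Python. In this rough illustration, a cryptographic hash (SHA-256) stands in for whatever fingerprinting Take It Down actually applies, and `should_flag` is a hypothetical stand-in for a platform’s matching step; only the hex string ever leaves the device.

```python
import hashlib


def hash_media(data: bytes) -> str:
    """Fingerprint media content locally; only this hex string is shared."""
    return hashlib.sha256(data).hexdigest()


# On the user's device: hash the image without uploading it.
# "image_bytes" stands in for the raw file contents.
image_bytes = b"\x89PNG...raw image data..."
reported_hash = hash_media(image_bytes)

# On a platform's side: compare each new upload's hash against
# the set of hashes received from the Take It Down service.
reported_hashes = {reported_hash}


def should_flag(upload: bytes) -> bool:
    """Hypothetical matching step a participating platform might run."""
    return hash_media(upload) in reported_hashes


print(should_flag(image_bytes))  # True: an exact copy matches the report
```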
Josh Machin, the head of public policy in Australia for Facebook’s owner Meta Platforms, said the new tool built on StopNCII, an existing platform designed to stop the sharing of adults’ non-consensual intimate images.
“It’s a pretty significant legal challenge, actually, to be able to develop a portal like this because possession of child sexual abuse material is obviously a crime,” Machin told this masthead. “So we needed to work through not just some pretty significant technical hurdles, but also a legal framework in order to be able to deliver a platform like this.”
Adult video services Pornhub and OnlyFans, and live video app Yubo, are also part of the program, which is designed to be interoperable with any company’s systems. No other technology giants have yet joined the project. WhatsApp, which Meta also owns and which encrypts user messages so they are not visible to the company, is not participating either.
eSafety Commissioner Julie Inman Grant, who has long pushed technology companies to do more to tackle what she has previously called a “paedophile’s paradise” online, hailed Take It Down as an “extremely important” tool to help victims and survivors. The commission is already referring minors who report image-based abuse to the tool.
“We’d like to see more companies put their hands up to take part,” Inman Grant said.
She has already used Australian law to demand answers from a host of technology giants about what steps they are taking to pull down abusive material.
A key limitation of the hashing system used by Take It Down is that it only matches the exact image or video that was hashed, so cropped, filtered or otherwise manipulated copies may go undetected or require the user to make multiple reports.
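To see why exact-match hashing is so brittle, again assuming a cryptographic hash like SHA-256 purely for illustration: altering even a single bit of a file, as any crop, filter or re-encode would, produces a completely unrelated digest, so the altered copy no longer matches the reported hash.

```python
import hashlib

# Stand-in for an image's raw bytes; a real file would be read from disk.
original = b"example image data"
# Simulate a minimal manipulation: flip one bit in the last byte.
altered = original[:-1] + bytes([original[-1] ^ 0x01])

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
# The two digests share no structure, so an exact-match system
# treats the manipulated copy as an entirely different file.
```

Tolerating such edits is what perceptual hashing schemes, such as Meta’s open-source PDQ, are designed for: they produce similar fingerprints for visually similar images.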
Meta has a separate system designed to tackle that problem, and Machin said WhatsApp had other measures in place to find and remove abusive material.
OnlyFans’ chief operations officer Keily Blair said the company invested heavily in keeping its platform adult-only, and that the Take It Down initiative built on that work.
A spokesman for Pornhub’s parent company MindGeek echoed those points and urged other sites to sign up. “We encourage all image-sharing platforms to follow our lead and participate in Take It Down,” the spokesman said.