Meta’s encryption to ‘conceal the worst crimes’ against kids
Child abuse investigators are bracing for expanded encryption on Facebook and Messenger to have crushing impacts, with one expert convinced it will protect criminals and sex offenders.
Child abuse investigators are bracing for expanded encryption on Facebook and Messenger to have crushing impacts that may be disguised by a rise in meaningless tips that lead nowhere.
Tech giant Meta revealed in December it had begun a global rollout of default end-to-end encryption for messages and calls across its Facebook and Messenger platforms, but would not detail progress in individual countries.
While privacy advocates welcomed the move, crime-fighting agencies and charities said it would have a devastating effect on the ability to detect and prevent child abuse.
Hetty Johnston, founder of child protection organisation Bravehearts, said investigators expected a sharp decline in actionable leads originating from Meta about offenders sharing child abuse images.
Police could instead be provided with reports that contained information with no evidentiary use.
“End-to-end encryption is protecting criminals and sex offenders,” Ms Johnston said.
“It will conceal the worst crimes and (Meta will) come up with a whole bunch of inactionable rubbish that will blow out their numbers.”
An experienced child abuse investigator also told The Australian: “Meta may counter public criticism by increasing reporting to law enforcement with information that is not of investigative value … We’re all expecting that. The proof will be in the data”.
An Australian Federal Police spokeswoman said early statistical data indicated the “upward trend of rising numbers of reports relating to online child sexual exploitation has continued in 2024”.
However, the AFP-led Australian Centre to Counter Child Exploitation “does not yet have any substantive data relating to the impact of E2EE (end-to-end encryption) on these numbers”.
“E2EE is extremely problematic for all law enforcement, because it removes our ability to see the content of material,” the spokeswoman said.
“For example, if one person sent an encrypted message containing child abuse material to another person on the same platform, law enforcement would not be able to see or identify that the content was unlawful and therefore would not be able to investigate the conduct.”
The ACCCE’s triage unit received 40,232 reports of online child sexual exploitation in the 2022-23 financial year. The “large majority” of these were received from the National Centre for Missing and Exploited Children in the US, the spokeswoman said.
US-based social media companies are required by law to report child sexual abuse material to the NCMEC.
Of the 32 million reports to the NCMEC in 2022, 27 million, or 84 per cent, were from Mark Zuckerberg’s Meta platforms, led by Facebook (21.16 million), Instagram (5 million) and WhatsApp (1.01 million).
Meta has advised users default end-to-end encryption will mean “nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us”.
Among the many arrested following a tip from the NCMEC is former Queensland University of Technology law lecturer Gordon Douglas Chalmers, who allegedly posed on Facebook as singer Justin Bieber and duped children into sending him explicit images.
Investigators say Mr Chalmers’s alleged indecent communications with about 200 children may have remained concealed if end-to-end encryption had been in place.
The NCMEC said in December “Meta’s choice to implement end-to-end encryption on Messenger and Facebook with no ability to detect child sexual exploitation is a devastating blow to child protection”.
A spokeswoman said on Sunday that “as of now, NCMEC’s CyberTipline has not seen a decrease in reporting by Meta as a result” of end-to-end encryption.
Meta, which declined to answer questions from The Australian, maintains encryption improves safety and security for users. It said it had spent five years working to protect privacy and promote safety in end-to-end encrypted messaging, and it expected to continue to provide more reports to law enforcement than its peers.
But Ms Johnston said that as well as expanding encryption, Meta had joined other social media firms in slashing “trust and safety” teams, including those overseeing Australia, and was “not interested” in monitoring child welfare.
“They are actually working to protect criminal gangs and criminal behaviour such as pedophiles. They’re a criminal organisation in themselves,” she said.
“How can a company who knows this stuff is going on, on their networks, who knows children are being raped and abused, do that? It’s clearly an absolutely heartless, dangerous organisation. Everybody is thrown to the wolves here by Meta.”