‘Dystopian dragnets’: Apple warns proposed online safety standards could turn private companies into ‘arms of the state’
The US tech titan says child abuse is abhorrent but believes proposed new standards to stamp out online exploitation will expose more Australians to cyber crime.
Apple has warned the eSafety Commissioner’s proposed standards to tackle online child abuse threatens to turn private companies into “arms of the state” and create “dystopian dragnets”.
The $US2.8 trillion tech titan says child exploitation is “abhorrent” but believes online safety can be strengthened without introducing new standards that would compel tech companies to screen customer data and hand it over to government agencies without a warrant or court order.
“Forcing providers to comb through the private storage and communications of all its users, without any particularity, reason for suspicion, or other constraint, improperly turns private companies into arms of the state and would up-end the trusted relationship between a provider and its users,” Apple said in its submission about the new standards.
“There is evidence from other platforms that innocent parties have been swept into dystopian dragnets that have made them victims when they have done nothing more than share perfectly normal and appropriate pictures of their babies.”
But eSafety Commissioner Julie Inman Grant dismissed Apple’s warning, saying the new standards would not force companies to indiscriminately search protected communications; the obligation is to detect and remove only verified child sexual abuse material or known pro-terror material.
A spokesman for Ms Inman Grant said the new rules would not require what are known as “backdoors” in encryption services to enable law enforcement to access such material.
But Apple said this sentiment is not expressed in the draft standards.
“We recommend that eSafety adopt a clear and consistent approach expressly supporting end-to-end encryption so that there is no uncertainty and confusion or potential inconsistency across codes and standards.”
“Encryption provides an essential layer of additional security because it ensures that a malicious actor cannot obtain access to a user’s data even if the actor is able to breach a service provider’s networks.
“It shields everyday citizens from unlawful surveillance, identity theft, fraud, and data breaches, and it serves as an invaluable protection for journalists, human rights activists, and government employees who are constantly targeted by malicious actors.”
Crucially, Apple warned that adopting the standards in their current form could spur other countries to follow — even those which “lack the robust legal protections afforded to Australians”.
“If such governments know that service providers have put into place scanning systems pursuant to mandates from the Australian government, they will seek to use those systems for their own purposes: if we are forced to build it, they will come. So too will criminal actors, drawn to where protections are weaker and preying on innocent users is easier.”
But an eSafety spokesman rejected claims the standards would be used as a tool for government surveillance.
“Tech companies control the technology they deploy and the implementation of those technologies. These companies can clearly indicate in their policies that scanning is confined to known CSAM (child sexual abuse material) and regularly resist attempts by undemocratic governments to use tools for broad surveillance,” he said.
“Child sexual abuse material (CSAM) is spreading at a pace, scale and volume we have not seen before. eSafety’s recent transparency reports have also revealed the biggest tech companies aren’t doing enough to tackle the proliferation of this horrific and harmful material.”
Apple director of user privacy and child safety Erik Neuenschwander also said the company had adopted several crime-fighting initiatives and encryption did not act as a shield for pedophiles.
“People who quite unfortunately are seeking to reach out and exploit children on our platform, or any platform … it’s difficult for them to remain hidden,” he said.
“There’s now a potential victim there on the other side of that communication. And so by creating tools that empower those potential victims to take action to get themselves out of that situation, and then highlight to authorities that these people are seeking to exploit children, that is a point of intervention that we think can be quite powerful and prevent future victimisation.”
Apple has a Communication Safety feature, which is enabled by default for account holders under the age of 13 and automatically intervenes when pictures containing nudity or other inappropriate material are sent or received. Apple also has a sensitive content warning that intervenes when users perform searches related to child exploitation.
“The goal is to disrupt the grooming of children by making it harder for predators to normalise this behaviour and to create a moment for a child to think when facing a critical choice,” Apple said in its submission.
“We believe there are alternative ways to achieve the goal of combating abhorrent content that do not require undermining the privacy and security of all Australians and we urge eSafety to allow providers flexibility to pursue those means.”