Apple facing $1.2 billion lawsuit for failing to stop distribution of child pornography
Victims of child sexual abuse in the US have sued the tech giant for $1.2 billion in damages, claiming it failed to stop the distribution of illegal material.
Victims of child sexual abuse in the US have sued Apple for $US1.2 billion in damages, over claims the tech giant failed to stop the distribution of illegal material.
The lawsuit against the trillion-dollar company is being brought by a 27-year-old woman who was molested by a male family member as an infant, The New York Times reported. The relative, who was eventually arrested and sent to prison, took photographs of the abuse and shared them online with other offenders.
The woman, who spoke to The Times on the condition of anonymity, said she and her mother were reminded of the abuse nearly every day due to multiple notifications from law enforcement that someone had been charged with possessing the images.
“It was hard to believe there were so many out there,” she said.
“They were not stopping.”
In late 2021, the woman received a notification that the images were found on a Vermont man’s MacBook, with authorities later confirming they’d also been stored in Apple’s iCloud.
She ultimately decided to sue the company, she told The Times, because it had broken its promise to protect victims like her.
As many as 2689 victims could be eligible for compensation as part of the lawsuit, which was filed in Northern California over the weekend.
Under US law, child sexual abuse victims are entitled to a minimum of $US150,000 ($234,683) in damages, meaning Apple’s payout could exceed $US1.2 billion ($1.87 billion) if it’s found liable by a jury.
The filing refers to NeuralHash, a tool unveiled by Apple in 2021 that was designed to scan for illegal images of sexual abuse on its iPhones. Each device would store a database of distinct digital signatures (known as hashes) associated with known child sexual abuse material.
Those signatures would be compared against photos in a user’s iCloud storage service, and any matches of suspected sexual abuse material would be flagged and reported to authorities.
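In broad terms, hash-matching systems of this kind work as sketched below. This is a minimal illustration only, not Apple’s implementation: the function and variable names (perceptual_hash, KNOWN_HASHES, flag_matches) are hypothetical, the placeholder hash entries are invented, and a real perceptual hash (such as NeuralHash’s neural-network signature) is far more sophisticated than the stand-in used here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-bad image signatures (placeholder entries, not real hashes).
KNOWN_HASHES = {
    "a3f1c2d4e5b60718",
}


def perceptual_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hashing function.

    A real system derives a signature that stays stable under resizing,
    re-compression and small edits; here a truncated cryptographic hash
    keeps the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()[:16]


def flag_matches(photo_paths: list[Path]) -> list[Path]:
    """Return the photos whose signature matches a known entry."""
    flagged = []
    for path in photo_paths:
        signature = perceptual_hash(path.read_bytes())
        if signature in KNOWN_HASHES:
            flagged.append(path)  # a real pipeline would report these matches
    return flagged
```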
NeuralHash never came to fruition – after cybersecurity experts said the technology could open the door to other government surveillance requests, Apple dropped the plan, saying it was impossible to scan iCloud photos without “imperilling the security and privacy of our users”.
By abandoning NeuralHash, the lawsuit argues, Apple has been selling defective products that harmed a class of customers, because it briefly introduced “a widely touted improved design aimed at protecting children” but “then failed to implement those designs or take any measures to detect and limit” child sexual abuse material.
The complainants are seeking not only compensation but also changes to Apple’s practices.
In response to the lawsuit, Apple spokesperson Fred Sainz described the material as “abhorrent”, and said the company is “committed to fighting the ways predators put children at risk”.
“We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users,” he told The Times in a statement.
The lawsuit comes amid increased scrutiny of Apple’s failure to effectively monitor its platforms or scan for images and videos of child sexual abuse.
Following a 2019 investigation by The Times that revealed tech companies had failed to rein in abusive material, Apple executive Eric Friedman, who was responsible for fraud protection, messaged a senior colleague to say he thought the company was underreporting child sexual abuse material.
“We are the greatest platform for distributing child porn,” he wrote in the 2020 text exchange, attributing this to Apple prioritising privacy over user trust and safety.
In August, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused the company of vastly undercounting how often the material appears on its products, saying Apple had been implicated in 337 recorded offences of child abuse images in England and Wales between April 2022 and March 2023, The Guardian reported at the time.
While Facebook and Google each filed more than one million reports of suspected child sexual abuse material to America’s National Center for Missing & Exploited Children (NCMEC), Apple submitted just 267.
“There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” NSPCC head of child safety online policy, Richard Collard, told The Guardian.
“Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety.”