Big tech faces ‘reckoning’ over online harms, says eSafety Commissioner Julie Inman Grant
Technology companies can no longer ignore the horrific and real-world consequences of online harms, and must do more to protect users, especially children.
Tech giants could be held accountable for “the horrific and real-world consequences of online harms” and might be forced to do more to protect users of their social media platforms after a landmark ruling by a US court, according to Australia’s eSafety Commissioner Julie Inman Grant.
Last week a US federal appeals court ruled a lawsuit could proceed against TikTok over the “blackout challenge” – a dare that encouraged people to choke themselves until they passed out.
The case was brought by Tawainna Anderson, whose 10-year-old daughter Nylah died in 2021 while attempting the challenge. The appeals court said TikTok could potentially be responsible because it exercised editorial judgment and wasn’t merely acting as a content host when its algorithm recommended blackout-challenge content on the girl’s “For You” page.
“Had Nylah viewed a blackout challenge video through TikTok’s search function” rather than seeing it because the site was suggesting it, “then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content”, the appeals court said.
Social media companies have long been legally protected from lawsuits over content under section 230 of the Communications Decency Act of 1996, which shields providers from liability for content others upload to their platforms.
Ms Inman Grant said the case of Nylah Anderson reinforced that the onus should be on the tech giants to eliminate harmful content on their platforms and to “proactively and thoughtfully build in safety guardrails”.
“For too long, the tech sector has been effectively shielded from accountability for conduct and content enabled on their platforms through section 230,” Ms Inman Grant told The Australian.
“As someone who was in DC shaping section 230 in 1996 when the internet was in its infancy, none of us in the industry would have dreamt that this ‘intermediary immunity provision’ would have remained untouched for 28 years, given how much the industry has changed and how much harm technology with no care, responsibility or guardrails has wrought on humanity.
“If this ruling stands, technology companies can no longer ignore the horrific and real-world consequences of online harms. They must do more to protect users, especially children, and enforce their own age restrictions.
“Algorithms and recommender systems can operate to promote and effectively normalise a range of harmful activities including dangerous challenges, self-harm, unhealthy eating habits, hate, racism and violence. These attitudes and beliefs can spill over into the real world, sometimes with very tragic consequences.
“As we outlined in our recommender systems and algorithms position paper, there are key steps companies can and should take to make these systems less dangerous. We will continue to ask tough questions of all platforms to better understand what they’re doing to prevent children and young people from being drawn into a vortex of harmful content.”
Politicians in the US want section 230 to be revoked or changed because they say it gives tech companies too much control over what Americans see online. Some Democrats have said social media platforms let too much negative content spread, while Republicans often argue the platforms censor conservative content.
The Communications Decency Act was created to protect children from accessing sexually explicit content online. Section 230 was intended to be a “good Samaritan” provision encouraging companies to proactively curate online activity and give them immunity from lawsuits if they chose to block content.
TikTok, in its argument to the appeals court in July, said its algorithm was protected speech under the first amendment to the US constitution. Ms Anderson, in her appeal over the lower court’s dismissal, argued TikTok shouldn’t be protected if it sent a self-harm video to a 10-year-old through its algorithm.
Anderson family lawyer Jeffrey Goodman said on Wednesday: “This ruling ensures that the powerful social media companies will have to play by the same rules as all other corporations.”
TikTok, owned by Chinese company ByteDance, declined to comment.