
TikTok exposed by deadly content

A US federal appeals court ruling has raised questions about a law that says social-media companies aren’t responsible for the content their users post.

The TikTok logo is seen at the booth during the media day at the Gamescom video games trade fair in Cologne. A US lower court must redetermine if TikTok is liable for the 2021 death of a 10-year-old.

A federal appeals court ruled that a mother’s lawsuit against TikTok over the “blackout challenge” could proceed, raising questions about a law that says social-media companies aren’t responsible for the content their users post.

The Philadelphia-based Third US Circuit Court of Appeals revived the lawsuit on Tuesday and said a lower court must redetermine if TikTok is liable for the 2021 death of 10-year-old Nylah Anderson.

The appeals court said TikTok could potentially be responsible because it exercised editorial judgment, rather than merely acting as a content host, when its algorithm recommended blackout-challenge content on the girl’s “For You” page. Under federal law, social-media companies are generally shielded from lawsuits over the content their users post.

“Had Nylah viewed a Blackout Challenge video through TikTok’s search function,” rather than seeing it because the site was suggesting it, “then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content,” the appeals court said in its decision.

The blackout challenge, which the girl died attempting, encourages people to choke themselves until they pass out.

Social-media companies are immune from lawsuits over content they host under Section 230 of the Communications Decency Act of 1996, a federal law that offers broad protections to tech companies hosting user-generated content. Politicians across the spectrum have criticised the foundational internet law in recent years, saying it gives tech companies too much power.

A lower court cited Section 230 in 2022 when it dismissed the suit against TikTok over Anderson’s death.

Her mother, Tawainna Anderson, alleged in her 2022 lawsuit that TikTok repeatedly pushed dangerous videos to her daughter.

Politicians want the law revoked or changed because they say it gives tech companies too much control over what Americans see online. Some Democrats have said social-media platforms let too much harmful content spread, while Republicans often argue the platforms censor conservative content.

The Communications Decency Act was originally created to protect children from accessing sexually explicit content online. Section 230 was intended as a “good Samaritan” provision that encouraged internet companies to proactively curate online activity by granting them immunity from lawsuits if they chose to block content.

The law has stood despite challenges. The Supreme Court left Section 230 intact when it ruled last month that social-media companies are allowed to moderate content under the First Amendment. Changes to Section 230 could narrow the protections social-media platforms currently enjoy.

TikTok, in its argument to the appeals court last month, said its algorithm was protected speech under the First Amendment. Anderson, in her appeal over the lower court’s dismissal, argued TikTok shouldn’t be protected if it sent a self-harm video to a 10-year-old through its algorithm.

The Anderson family said Wednesday that social-media companies should use their technology to prevent children from consuming dangerous content.

“Nothing will bring back our beautiful baby girl,” the family said. “But we are comforted knowing that — by holding TikTok accountable — our tragedy may help other families avoid future, unimaginable suffering.”

Jeffrey Goodman, a lawyer for the Anderson family, said Wednesday, “This ruling ensures that the powerful social media companies will have to play by the same rules as all other corporations.”

TikTok, owned by the Chinese company ByteDance, didn’t immediately return a request for comment Wednesday.

Some Democratic-led states have moved to bar social-media companies from using algorithms to show content to children without parental consent. New York passed a law in June that would show minors only content from accounts they follow, rather than videos selected by an algorithm. Lawmakers in California have proposed a similar law.

The Wall Street Journal

Original URL: https://www.theaustralian.com.au/business/the-wall-street-journal/tiktok-exposed-by-deadly-content/news-story/c016f86bfd3fc57c5bec92146e5a76f9