
TikTok struggling to remove graphic suicide video shared from Facebook livestream

A close friend of the man whose suicide video went viral on TikTok says his friend has become a “meme for his darkest moment”. The friend has also set up a campaign calling on social media networks to ban accounts that post violent videos.


A close friend of the man whose suicide video went viral on TikTok and other social networks this week has called for urgent changes, arguing his friend’s life could have been saved if Facebook had responded when the incident was first reported.

And Josh Steen, who said Facebook took more than two hours to remove the distressing video after his friend died, welcomed Prime Minister Scott Morrison’s public call for reform on Wednesday, telling News Corp that putting pressure on powerful social networks was “the only way I believe we can get change”.

Deeply disturbing footage of 33-year-old American man Ronnie McNutt ending his life went viral on popular video platform TikTok this week, shocking users when it automatically played in their “For You” video stream without warning.

Its spread was particularly harmful given TikTok’s large audience of young teens.

But Mr Steen, a graphic designer who collaborated on a podcast with Mr McNutt, said it wasn’t just TikTok that failed to remove the video from the internet, as he and his friends had “watched this thing literally work its way across the globe, and no matter how many times we report(ed) accounts, posts, or harassment, it wouldn’t go away”.

“I watched my friend die after I attempted contact multiple times,” he said. “Now children across the globe are watching this, without context.

“My friend has become a meme for his darkest moment. I can’t let that just slide.”

TikTok is struggling to remove a disturbing video from its network today. Picture: Getty

Mr Steen said he reported the live video to Facebook while his friend was still alive, hoping the company would shut it down and give Mr McNutt time to reconsider his actions.

But he said Facebook did not respond until hours later, stating the video did not violate its ‘community guidelines’.

“Had there been an intervention of any kind, even digitally, the outcome would be vastly different,” he said.

“They did not respond to my initial support request until almost two hours later, and almost an hour after Ronnie had (suicided).”

The livestream video remained on Facebook for more than two hours, by which time it had been viewed by more Facebook users, captured, and shared to other social media.

A Facebook spokeswoman said the company removed the graphic video from its platform “on the day it was streamed and have used automation technology to remove copies and uploads since that time”.

“Our thoughts remain with Ronnie’s family and friends during this difficult time,” she said.

But since the footage first appeared on Facebook on August 31, parts of the six-minute video have spread to sites including TikTok, Instagram, Twitter, and YouTube, some of which are still hosting it.

Australian Vanessa Pappas is the new CEO of TikTok. Picture: Supplied

A TikTok spokeswoman said its “systems, together with our moderation teams, have been detecting and removing these clips for violating our policies” since they appeared.

Prime Minister Scott Morrison also joined calls for TikTok to take urgent action against the disturbing video, saying “no child should be exposed to horrifying content like this”.

In a video, he said the incident showed the “serious side” of social media, and called for the powerful companies behind the networks to take action to prevent harm.

“Those who run these organisations have a responsibility to those who are watching it and particularly when it comes to children,” he said.

“The rules in the real world, how you behave in the real world, how you talk to each other, the protections you put in place in the real world have to be the same in the social media world. There’s not a special set of rules.”

Mr Morrison said it was important that algorithms designed to unearth popular videos did not accidentally promote “this very damaging material,” and tech companies with huge reach should take responsibility for content on their platforms.

“You need to be accountable. You need to be responsible for making sure that your product does not harm Australians,” he said.

“And my Government will be doing everything to make sure we hold you to account for that.”

Julie Inman Grant, eSafety Commissioner. Picture: Supplied

Mr Steen said he was “glad one world leader has made a statement” on the issue, and said this kind of pressure was “the only way I believe we can get change”.

He and other friends have organised a grassroots campaign called Reform For Ronnie, calling for social networks to accept responsibility for content on their platforms, respond “efficiently” to harmful content, and ban accounts posting violent videos.

Schools across the country sent warnings to parents on Tuesday about their students’ use of social media, after the footage was recommended and played automatically to many unsuspecting TikTok users.

Australian eSafety Commissioner Julie Inman Grant warned the footage could harm young and vulnerable people who viewed it, and said TikTok and other social media platforms needed to work faster to delete it.

Ms Inman Grant said parents shouldn’t raise the issue with their children unnecessarily, but should “keep an eye on those who are more vulnerable and at-risk,” monitor their interactions both online and off, and have “open and ongoing discussions” with their children about suicide and self-harm.

Cyber safety expert Susan McLean said parents should suggest their kids take a break from TikTok and other social media for the next two days while the video was being removed.

“TikTok are now scrambling to ban people who post it but it’s too little too late,” Ms McLean said.

“Had their algorithms flagged it, we wouldn’t be having this situation.

“Please ensure you do not allow your older teens on the app today, if they have it.”

Ms McLean said TikTok’s response to the emergence of the video was far too slow, and should make parents think twice before letting younger teenagers use the network, particularly those under the recommended age of 13.

St Vincent’s College in Sydney passed on Ms Inman Grant’s warning in a letter to parents, encouraging them to avoid raising the issue with daughters who may not have heard of or been exposed to the video.

“Drawing undue attention to the issue may cause unnecessary worry or distress and increase exposure,” their letter stated.

“We advise that they monitor their daughters who are more vulnerable and at-risk, and check in with them about their interactions on and offline. If a young person is presenting with signs of distress, parents should follow their own procedures for managing and supporting their daughter’s risk.”

A TikTok spokeswoman said the social network was trying to remove the clip and accounts that shared it.

“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family,” she said.

But young TikTok users have continued to upload videos saying they were traumatised after seeing the footage.

Roy Morgan research found TikTok was the second most popular social network for Australian children aged 14 and under, and that more than two thirds of the 1.6 million Australians using the platform were aged under 25.

This incident is not the first time Facebook’s platform has been used to stream violent videos, with murders and suicides appearing on the site since it introduced Facebook Live in 2016.

The company promised to crack down on users broadcasting violent videos after the Christchurch mass murder was streamed to the social network last year.

WHAT TO DO IF YOU NEED HELP:

• If a life is in danger, ring triple-0 (000)

• Kids Helpline: 1800 55 1800. Online support 8am-midnight

• Suicide Call Back Service: 1300 659 467

• eHeadspace: 1800 650 890

• Lifeline: 13 11 14. Online support 7pm-4am daily

• Beyondblue: 1300 22 4636. Online support 3pm-midnight

