
Thai gunman continued posting to Facebook as he carried out country’s worst mass shooting over the course of several hours

A terrorist’s live stream prompted Facebook to explore new methods to prevent it ever happening again. Then it happened again.


In March last year an Australian terrorist massacred worshippers at two mosques in Christchurch, New Zealand.

The crime itself and its motivations were chilling enough, but many were shocked to find footage of the event, eerily similar to a first-person shooter video game, had been livestreamed on the world’s biggest social media website.

The video stayed up for hours, showing up in the feeds of Facebook users as they trawled through advertisements and autoplaying viral videos to connect with their friends and families.

By the time some figured out what they were watching it was too late to look away.

New Zealand Prime Minister Jacinda Ardern hugs a mosque-goer days after the worst mass shooting in New Zealand's history. Picture: Hagen Hopkins / Getty Images

Outrage followed and questions began circulating about how something like this could be allowed to happen.

“People are looking to understand how online platforms such as Facebook were used to circulate horrific videos of the terrorist attack, and we wanted to provide additional information from our review into how our products were used and how we can improve going forward,” Facebook’s vice president of product management Guy Rosen said in the days following the terrorist attack.

The video was first reported to Facebook almost half an hour after it began (12 minutes after it had ended), which Facebook said meant it wasn’t removed as quickly as it might have been had it been flagged while still live.

“During the entire live broadcast, we did not get a single user report,” Mr Rosen said.

“This matters because reports we get while a video is broadcasting live are prioritised for accelerated review. We do this because when a video is still live, if there is real-world harm we have a better chance to alert first responders and try to get help on the ground.”

As is so often the case, things got much worse when a link to download the video was shared on 8Chan, the somehow worse offshoot of the already horrible 4Chan message board, populated mainly by toxic incels, racists and child pornographers (“and some, I assume, are good people”).

That helped the video spread: it was reuploaded to, and subsequently removed from, Facebook a further 1.2 million times in the first 24 hours.

Two weeks after that attack, Facebook finally addressed the issue in a letter to the people of New Zealand from COO Sheryl Sandberg, telling Kiwis the company was “exploring” a number of options to stop the same thing happening again.

RELATED: Facebook’s free speech fail

RELATED: ‘Is Google capable of telling the truth?’

Facebook COO Sheryl Sandberg wrote a letter to New Zealanders in the wake of the massacre. Picture: Lino Mirgeler / AFP

While Facebook “explored” its options, Australia changed its laws.

The Commonwealth Criminal Code was amended in April last year, creating two new offences relating to “abhorrent violent material”, which includes audio or video depicting terrorism, murder, attempted murder, rape, kidnapping or torture.

These laws mean websites hosting this content have to quickly take it down if contacted by Australia’s internet watchdog, the Office of the eSafety Commissioner.

If they refuse to, there are other options.

In September a notice was issued to telcos to block eight websites that were still hosting the Christchurch video or the shooter’s manifesto.

Around the same time it was also revealed that 413 reports of abhorrent content had been made since the new powers came into force.

More than 90 per cent of the reports related to child pornography or other sexual abuse material.

Seven per cent related to “abhorrent violent content”, including livestreams of torture, kidnapping and murder.

“Perpetrators attempt to use the internet to amplify and promote their terrorist agendas and violent crimes. Removing this abhorrent violent material from online access prevents a range of social harms. These include the trauma and suffering of victims and their family members, the radicalisation of other potential perpetrators and the use of such material to threaten, harass or intimidate Australians or specific community groups,” eSafety Commissioner Julie Inman Grant told news.com.au.

She added that Australians deserve to be protected from the potential harm caused by this content, but that the threshold for what content we need protecting from must be very high “in a society that values freedom of expression”.

“There is a range of material on the internet, including terrorist and violent criminal material, that is capable of causing harm, particularly to children. However, exposure to such material may be restricted or limited by using filtering and other control tools.”

The eSafety Commissioner website hosts guides on how to use these tools.

RELATED: YouTube boss’ hypocritical rule

RELATED: Social media remove ads, but not for misinformation

Australia's eSafety Commissioner Julie Inman Grant was given new powers following the Christchurch massacre, but they weren’t used in the latest incident.

One of the options “explored” by Facebook was a one-strike policy that could take away your livestreaming privileges if you break Facebook’s rules.

The flaw in this approach is obvious: mass shooters who broadcast their crimes on Facebook rarely get the chance to do it again anyway after they’re either captured or killed by police.

Last weekend, a Royal Thai Army officer who killed 29 people and wounded 58 others posted repeatedly to Facebook during his hours-long massacre, including in a livestream in which he asked viewers whether or not he should surrender.

A screenshot of the video posted to Facebook by the Thai shooter during his massacre last weekend. Picture: AFP Photo / Social Media
Another image on his page showed him masked and holding a gun. He also posted that he was looking for “vengeance” in the hours leading up to the massacre.

He was eventually killed by Thai commandos, a development Thailand’s public health minister Anutin Charnvirakul confirmed in a post on Facebook.

Facebook was quick to clarify to news.com.au that the “very short” live stream by the shooter didn’t contain any actual depictions of violence and as such was not classed as abhorrent violent material.

Facebook briefed the eSafety Commissioner, who agreed with that classification.

“We have removed the gunman’s presence on our services and have found no evidence that he broadcasted this violence on FB Live. We are working around the clock to remove any violating content related to this attack. Our hearts go out to the victims, their families and the community affected by this tragedy in Thailand,” the company that knows all of our names, friends and relatives told news.com.au through an unnamed spokesperson.

Thailand is in mourning following its worst mass shooting ever. Picture: Lauren DeCicca / Getty Images

Facebook has a policy of removing content that praises, supports or represents mass shooters, and this policy was eventually used to remove the content once it was reported to the platform.

When a post is reported to Facebook it is reviewed by a team of 15,000 content reviewers (whether these reviewers are contractors or employed directly by Facebook is unclear; the company didn’t answer when we asked).

These 15,000 people review content in more than 50 languages, meaning they can’t all review every piece of reported content, unless Facebook has somehow found 15,000 people fluent in over 50 languages who are willing to use that skill in a repetitive, traumatising and not especially well paid job.

This means Facebook could have as few as 300 moderators for each language (though they are presumably spread in proportion to the platform’s most dominant languages).

Facebook has 2.4 billion users around the world and one moderator for every 160,000 of them.
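The arithmetic behind those figures is simple enough to check. A quick back-of-envelope sketch, using only the numbers quoted in this article (the even split across languages is our simplifying assumption, not Facebook’s):

```python
# Back-of-envelope check of the moderation figures cited above.
reviewers = 15_000            # content reviewers, per Facebook
languages = 50                # languages reviewed, per Facebook
users = 2_400_000_000         # Facebook's global user base

# If reviewers were divided evenly across languages (an assumption):
per_language = reviewers // languages
print(per_language)           # 300 moderators per language

# Users covered by each individual moderator:
users_per_moderator = users // reviewers
print(users_per_moderator)    # 160000 users per moderator
```

Either way you cut it, each moderator is notionally responsible for a small city’s worth of users.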

What do you think Facebook should do to stop terrorists and mass shooters live streaming while they carry out their atrocities? Let us know in the comments below.


Original URL: https://www.news.com.au/technology/online/social/thai-gunman-continued-posting-to-facebook-as-he-carried-out-countrys-worst-mass-shooting-over-the-course-of-several-hours/news-story/f2a8274e739905736a861b8f584a242d