Opinion: Social media giants actively fan flames of division, disorder for profit
Social media companies can and should proactively detect and block material that could incite real-world violence. They avoid responsibility by claiming to be neutral, but their algorithms are anything but, writes David Tuffley.
In the aftermath of the Wakeley church stabbings, the role of social media in the violent events that ensued has come under scrutiny.
While the attack on Bishop Mar Mari Emmanuel was horrific, the aftermath proved even more destabilising as footage of the scene rapidly spread across social media.
Within minutes, the viral videos had inflamed tensions, sparking riots involving hundreds of angry protesters. The situation spiralled out of control as protesters clashed with police and property was vandalised. All of this played out in real time on people’s social feeds.
This episode underscores the power social media companies like Meta (Facebook’s parent) and X (formerly Twitter) now wield in shaping narratives and potentially inciting conflict. When tragic events are captured on video and quickly disseminated online without filters or context, the footage can become a spark that ignites civil unrest. While the eSafety Commissioner’s take-down orders are an effective mechanism for removing explicitly violent content once it is out there, the core problem will not be solved until social media platforms implement more robust filtering at the source.
These companies have both the capability and the moral obligation to proactively detect and block material that could incite real-world violence, and to do so before anyone sees it. Tech giants avoid responsibility by claiming to be neutral. But their algorithms determine what content goes viral and reaches mass audiences. By allowing violent, inflammatory videos to spread, they are actively fanning the flames of division and disorder for profit.
Removing the News tab, as Meta has done, makes the situation even worse: it leaves a vacuum of reliable information that is filled by random, unverified content and misinformation spreading through people’s social circles. There must be accountability for this.
This is not about censorship, but a basic duty of care. Just as broadcasters must meet standards for what can be aired in the public interest, social media platforms should be held to the same standards now that their reach and influence are so great.
Increased transparency around content curation algorithms and processes is also essential. Users have a right to understand how their feeds are shaped and what principles guide decisions about what people see, or don’t see, online.
Most urgently, there needs to be an independent inquiry into the events at Wakeley to determine precisely how social media activity accelerated the conflict. Lessons must be learned, and meaningful reforms enacted.