SA Law Society calls for new laws to force big tech to remove harmful content
The creation of a legislated duty of care would force social media giants to remove harmful content from their platforms, the Law Society of South Australia says.
Social media companies should be forced to remove harmful content, such as cyber-bullying posts and online abuse, from their platforms via the creation of a legislated duty of care, the Law Society of South Australia says.
Such a legal obligation would help protect people from harmful content by putting the onus on online platforms – rather than individual users – to take down damaging posts, pictures and videos, the society argues.
Under the proposal, penalties would apply to social media companies that do not comply with the rules.
The society this month put its case in a submission to the Law Council of Australia, which asked state-based law societies for their suggestions on how to improve online safety.
The Law Council will then consider the ideas in its own submission to the federal government’s fast-tracked review of the Online Safety Act.
It comes amid well-publicised challenges authorities have faced in trying to have harmful content removed from the internet.
Society president Alex Lazarevich said the UK already had laws to tackle damaging content and Australia could follow suit.
Mr Lazarevich said a duty of care regime would “place obligations on companies to safeguard users from harmful content, such as scams, violent content, child exploitation material, hate speech and material that undermines democracy”.
“Further, a regime may impose additional obligations to protect young people from accessing harmful content such as pornography or gambling material,” he said.
“From a practical perspective, placing the onus of removing harmful content on online platforms, rather than individuals who post obscene material, would likely be far more effective and efficient in removing harmful content.”
The UK’s parliament last year passed laws that require social media platforms to remove illegal content, such as revenge porn and material promoting self-harm, from their sites. Companies must also take steps to stop such content from appearing on their platforms.
Mr Lazarevich said if the Australian government decided to pursue a duty of care, it would need to extensively consult on the technicalities of the framework.
“For example, what type of content should meet the definition of ‘harmful,’ to what extent should online platforms be legally responsible for facilitating the distribution of harmful content …, to what degree should online platforms be expected to monitor content, and what are appropriate penalties for noncompliance,” he said.
Under the existing Online Safety Act, the eSafety Commissioner has the power to issue removal notices to certain media providers, compelling them to take down cyber-bullying and cyber-abuse material. However, Commissioner Julie Inman Grant has met resistance when exercising her authority.
Earlier this month, she abandoned her bid to force social media platform X to remove graphic footage of a stabbing attack at a church in Sydney after numerous setbacks, including the platform's refusal to remove the videos and its failure to comply with a Federal Court order to temporarily hide the material.
Federal Communications Minister Michelle Rowland fast-tracked a statutory review of the Online Safety Act by 12 months to make sure the laws were fit for purpose. She encouraged stakeholders and individuals to have their say during the review.