Andrew Forrest backs ALP’s online safety law reforms
Tech and social media giants including Google and Meta face fines up to $50m if they allow hate, terrorism and crime to flourish on their Australian platforms.
Big tech and social media giants including Google and Meta face fines of up to $50m if they allow hate, terrorism and crime to flourish on their Australian platforms, after a Labor-commissioned review found penalties against tech companies are too weak.
The review has recommended a raft of changes that would up-end the Online Safety Act 2021, going as far as fining search engines for “surfacing, selling or distributing” apps that can be used to stalk others or create deepfakes, including nude images.
It has also proposed changes that could disrupt online adult-only and entertainment services, pushing for an overarching safety approach that puts the best interests of children first.
The maximum fine for companies that fail to comply with content removal notices would be increased to $10m, while court-enforced civil penalties should be raised to the greater of 5 per cent of global annual turnover or $50m, the review recommends.
The government has also been urged to consider business disruption powers and the feasibility of requiring online companies to have a local presence “for the purpose of facilitating enforcement action”.
Andrew Forrest – who is suing Meta for failing to take down scams featuring his likeness – said the government’s move was an “important step” and that now was “the time for action”.
Dr Forrest has been urging Meta to remove scams – some of which have fleeced Australians of their life savings – for the past decade. He wrote a letter personally appealing to Facebook founder and Meta chief executive Mark Zuckerberg, who snubbed his pleas.
Dr Forrest has since won the right to sue Meta on its home turf in California, testing a 30-year-old law that shields tech giants from liability for content published on their platforms.
The review, commissioned by Communications Minister Michelle Rowland’s office, made some 67 recommendations and determined the Online Safety Act 2021 was not fit for purpose in a world where emerging technologies pose significant harm to the wellbeing of Australians.
“Our government has been proactive in ensuring our legislative framework remains fit for purpose. That’s why we’ve wasted no time in committing to legislate a digital duty of care to place the onus on online services to keep their users safe,” Ms Rowland said.
The review also calls for increased transparency from online services and more power for the Office of the eSafety Commissioner.
Under the proposed changes, the eSafety Commissioner would be able to issue a formal removal notice on behalf of adults and children facing cyber abuse or cyberbullying 24 hours after the victim first lodges a complaint.
Dr Forrest’s lawyer, Simon Clarke, told The Australian the online safety review recommended “key measures” his client had long called for, but that the government now needed to start walking the talk.
“Today the Australian government took another important step in regulating social media and ensuring they are accountable for the scams and other harms that are being experienced in Australia because of these platforms,” Mr Clarke said.
“The review of the Online Safety Act recommends some key measures we have long been calling for: requiring major platforms to have a local presence in Australia, a social media licensing scheme, and a duty of care that includes contact points for those who are wronged in Australia.
“While the review is welcome, now is the time for action. We call on both sides of politics to move as quickly as possible to protect innocent Australians by immediately committing to implementing those critical recommendations in full.”
The review has called for big tech platforms and social media companies to be required to share data with “authorised researchers” to determine compliance under an overarching duty-of-care model.
One recommendation proposes major fines for search engines and social media platforms that surface harmful apps or services in response to search queries, citing a devastating case in which a 14-year-old girl from NSW took her own life after she was targeted in a deepfake nude scandal at her school.
“The government should explore how best to prohibit search engines and app stores from surfacing, selling or distributing ‘nudify’ apps and undetectable stalking apps,” recommendation 27 reads.
The review admits the nation’s decade-old online safety regime – including its 2021 update – has failed to keep up with emerging technologies, conceding “we have been overtaken” and “it has not been able to cope with the scale of problems in the online world”.
Services that reach more than 10 per cent of the population would face “additional mandatory responsibilities”, including annual risk-assessment reports submitted to the online regulator, the review recommends. The regulator could also deem services with lesser reach but potential for harm subject to the same requirements.
The review has called for the nation to adopt a “singular overarching duty of care” that would apply to all online services accessible by Australians.
It should be built with the best interests of children as the “primary consideration”, and risks to children should determine how services may be used.