Australian Human Rights Commissioner Lorraine Finlay warns government legislation has ‘not struck right balance’
Australia’s Human Rights Commissioner says Labor’s proposed misinformation laws could undermine democracy and erode public trust.
Australia’s Human Rights Commissioner Lorraine Finlay says Labor’s proposed laws to combat online misinformation could undermine democracy, erode public trust and jeopardise free speech, as she warned the bill in its current form does not “strike the right balance”.
Ms Finlay said that while there was a pressing need to tackle the growing problem of unchecked misinformation and disinformation, the draft legislation risked targeting “different perspectives and opinions”, with the labels being invoked around the world to “delegitimise alternative opinions and justify censorship”.
“If you go too far, you effectively censor a wide range of materials and it means that people don’t get to have free and open discussions about really important policy issues,” she told The Australian.
“In Australia, obviously, we have some really challenging complex issues that we’re negotiating in terms of a whole range of things and we need people to be able to speak frankly and freely about their thoughts or views.”
The draft legislation would empower the Australian Communications and Media Authority to fine social media giants millions of dollars for online misinformation and content it deems “harmful”, and seek information from platforms about the measures they have in place. The watchdog would also be able to enforce an industry standard if self-regulation failed.
The AHRC said in its submission to the government that the draft bill did not adequately define the term “misinformation” and that the definition of harm was “extremely broad and not clearly defined”.
“For example, it is not clear how the definition of harm as including the ‘disruption of public order or society in Australia’ accommodates the lawful exercise of the right to protest and where that balance will be struck,” the submission, seen by The Australian, reads.
“Similarly, categories such as ‘harm to the health of Australians’, ‘harm to the Australian environment’ and ‘economic or financial harm to Australians, the Australian economy or a sector of the Australian economy’ are each categories about which reasonable people may legitimately have different perspectives and views.”
Ms Finlay said there was a risk of information being “opportunistically labelled” as misinformation or disinformation to delegitimise alternative opinions and justify censorship.
“One of the real dangers that we’ve certainly seen overseas is that it’s not just when democratic processes are undermined, but when people perceive that they’re being undermined and that loss of trust is really critical,” she said.
“People not only need to be able to speak freely, but they need to believe that they are able to speak freely.”
She said if Labor’s laws were progressed, the government would need to build in an oversight mechanism allowing Australians to see what was being moderated and giving them pathways to appeal ACMA decisions.
While acknowledging the argument that governments and tech platforms were not doing enough to combat misinformation, Ms Finlay also pointed to examples where content moderation had potentially gone too far.
“There have been recent revelations through the Freedom of Information process showing that in fact content moderation may be greater than people understood it to be,” she said.
Data released under FOI applications earlier this year showed more than 4000 social media posts had been secretly censored by governments during the height of the Covid-19 pandemic, including posts that simply stated “no masks”.
Opposition communications spokesman David Coleman said it was significant that the draft bill had now drawn criticism from the AHRC and the Media, Entertainment and Arts Alliance.
“Australians have different views, they are entitled to those views, and the last thing we want is for the government to decide what is and is not an acceptable opinion,” he said.