
commentary

Meta’s disregard for the public interest is ‘galling’, says independent MP Zoe Daniel

Independent member for Goldstein Zoe Daniel. Picture: Martin Ollman

We’ve let Big Tech write their own rules for long enough. The government needs to take a clear position.

The social media platforms’ audacity was once again on full show last week, with Meta’s reappearance at the Joint Select Committee on Social Media and Australian Society.

Meta was called back after a remarkable first appearance where company representatives claimed, straight-faced, that their products do not harm children. It’s easy to make such bold claims when you’re marking your own homework.

Of course, Australia does have a regulatory framework for online safety and an industrious eSafety Commissioner. But our Online Safety Act was written in a world where removing online content was the key tool for remediating online harms. That tool is still necessary, and vital for survivors of harms such as image-based abuse, but it is undercooked for the full breadth of issues in which tech platforms play a role: from our kids’ addiction to the “endless scroll”, which feeds everything from anxiety to eating disorders, to your nan being nearly swindled by a convincing online scam.

Big Tech’s fingers are all over a range of issues in the public interest that go far beyond occasional Facebook posts or what some of us once called “tweets”. Their powerful underlying systems actively and knowingly influence kids’ social lives, play kingmaker in online public debates, and run a data free-for-all in the name of advertising.

We can’t tackle the power of these highly diversified companies with the game of whack-a-mole that is content moderation alone – especially when it’s self-evident that the platforms are motivated by profit, not community benefit, or even safety.

The performative way in which they talk about public interest is at best galling.

Because Meta and its peers not only deny and defy existing regulation but also actively lobby against attempts at policy reform. In the US, they work both in their own voice and through various intermediaries, such as industry associations they pay handsomely with membership dues and research groups they fund.

Australia runs on a similar playbook. Meta is one of the top-tier members of DIGI, an industry association that has the rare privilege of holding the pen on Australia’s online safety codes and the misinformation code. This week we learned the real effect of DIGI’s shoddy work on online safety.

DIGI drafted an industry code last year that set the minimum standards for Australian children’s default privacy protections. Yet despite warnings from child rights advocates, DIGI decided default privacy should be switched off at 16 years old, rather than 18, knowingly exposing teenagers to established, serious risks such as grooming.

Meta says its “standards are a floor and not a ceiling” and claims to go above and beyond expectations with various “safety safeguards”. But when I asked about one feature known to be harmful to teens, the “People You May Know” (PYMK) feature, Meta admitted it was still switched on for under-18s in Australia.

The PYMK feature recommends accounts to one another, and in internal research leaked by whistleblower Frances Haugen, Meta documented that 75 per cent of inappropriate adult-minor contact – aka grooming – happened because of PYMK. In the words of a Meta employee at the time, “how on earth have we not turned off PYMK between adults and children?”

This alone is a striking example of why we need to know much more about the hundreds of features in use on tech platforms that carry real-world risks to the public.

But we also need to take from this the obvious point that tech simply can’t be trusted to write its own rules. And the occasional parliamentary hearing is not enough to get timely and sufficient accountability out of these enormous companies, which couldn’t care less about Australia’s national interests.

We need to reshape our vision of what online safety looks like and follow the models that are achieving meaningful behavioural change – namely the Digital Services Act and Digital Markets Act in Europe – which regulate the underlying systems of digital platforms (ie “the algorithm”) and impose a duty of care on the companies to do no harm.

Surely that’s a basic starting point? Big Tech must be on notice to manage the diverse risks to the public, ranging from mental health to electoral integrity.

These are big reforms and a tough set of adversaries – but the future of our democracy, and our young people, are at stake.

Zoe Daniel is the federal member for Goldstein.

