
‘Don’t kneecap artificial intelligence with needless rules’, Productivity Commission warns

The Productivity Commission has warned governments against designing new regulations for AI technologies, arguing that new rules could stifle productivity benefits.

Industry and Science Minister Ed Husic says the government expected to finalise a new regulatory framework – including mandatory guardrails for high-risk AI applications – by the end of 2024. Picture: NCA NewsWire / Martin Ollman

The Productivity Commission has encouraged governments not to design “unnecessary and confusing” new regulations to govern artificial intelligence applications, warning that “new technology does not necessarily imply the need for new rules”.

Research from the independent economic advisory body has suggested that AI could contribute to major productivity gains, but concluded it was impossible to know whether the technology would deliver on this promise in the long term.

In three papers being released on Friday, the Productivity Commission found that low levels of public trust in AI technologies could pose a “significant barrier” to uptake, and that governments should act as “exemplars” to demonstrate the safe and effective use of AI in service delivery.

This was critical, it said, given AI had the potential to address some of Australia’s most enduring productivity challenges by filling gaps in areas that required specialised knowledge and skills.

The PC found this was especially the case in services, which account for 80 per cent of GDP and 90 per cent of the workforce but have historically experienced poor productivity growth.

While the PC found that AI presented both opportunities and challenges, it warned against governments resorting to new legislation or regulations as a kneejerk response to new challenges.

Getting the regulatory response right would be central to unlocking the beneficial productivity gains from AI usage, while also providing strong safeguards against possible adverse outcomes, the PC said.

“Many potential harms have been encountered with past technologies and adequately dealt with by existing regulatory frameworks in areas such as consumer protection, privacy, anti-discrimination, negligence and sector-specific and profession-specific requirements,” the PC said. “AI is no different.”

Commissioner Stephen King told The Australian: “Only if the current rules and regulations are clearly not fit for purpose, only then do you think about new rules and regulations.

“Nothing is worse than passing technology-specific regulation and finding it’s obsolete within five years,” Mr King said. “We’ve got this discourse that AI is big and scary and the robots are coming. No, they are not. It’s not big and scary. It’s already here.”

In January, Labor revealed it would consider new “mandatory safeguards” for AI systems in high-risk areas – including a new dedicated legislative framework – as it moves to realise a $600bn-a-year boost to the national economy by building trust and confidence in the technology.

Industry and Science Minister Ed Husic said the government expected to finalise a new regulatory framework – including mandatory guardrails for high-risk AI applications – by the end of 2024 and that it would be far-reaching in its scope.

Mr King suggested Australia was in a better position than Europe to capitalise on AI technologies because the EU’s AI Act “started much more from the perspective that AI is scary and we have to stop it”.

“Within the EU there are concerns about the potential for that sort of very specific, very broad AI regulation stopping innovation, investment and harming productivity … We are much better placed than that.”

In its research papers, the PC encourages governments to ask whether any potential risks from AI usage can be adequately addressed by “existing regulation” or whether “modifications to this regulation, or improvements to its enforcement, are required”.

If a new regulatory instrument is needed, the PC has suggested that a technology-neutral approach should be considered in the first instance. It also said that Australia would need to ensure it had the right digital infrastructure and skills to harness the benefits of new AI technologies, including cloud computing services that could use and store large datasets.

“Ensuring that there are adequate levels of digital literacy in the community would support both a more productive workforce and AI uptake,” it said.

Given the reliance of some AI systems on vast amounts of data to learn and make predictions, the PC said governments should establish “clear and functional mechanisms for data collection, curation, sharing and use”.

However, this could have implications for the rights of data holders over their personal information.

The PC concluded that productivity gains would depend, in part, on whether the quality of Australian data holdings could be raised and then used for AI applications without “undermining the incentives of data holders or increasing risks to individuals”.


Original URL: https://www.theaustralian.com.au/nation/politics/dont-kneecap-artificial-intelligence-with-needless-rules-productivity-commission-warns/news-story/31eb0efc950232e6bcc6288acc557941