
ASIC chair Joe Longo cautions banks, brokers over AI

The corporate watchdog is calling on banks and brokers to help safeguard market integrity and avoid any ‘unintended consequences’ in their rush to adopt artificial intelligence.

Joe Longo, chair of the Australian Securities & Investments Commission, warns that recent developments in generative AI “potentially create new and different risks and issues”. AI is set to be a “high and important priority” for the regulator, he says.

“To be clear, my and ASIC’s interest is – and will always be – the safety and integrity of the financial ecosystem,” Mr Longo will tell a financial markets forum on Tuesday.

“As with any new technology, to the extent that AI affects that ecosystem, to that extent we will be involved. As we realise the potential of tech, we have to do all we can to avoid negative disruption, learned market abuse, misinformation, discrimination, and bias – whether intended or unintended.”

ASIC’s new focus on AI will extend beyond its impact on the operation of wholesale markets.

It will also look at the role of AI in “the whole economy, including consumers and small business”, he says.

“The point is, the fear of missed opportunities cannot be allowed to drive poor decisions, outcomes, or controls,” Mr Longo will say.

“While the potential in this field is enormous, our vigilance must be unwavering, and the industry will look to you to lead the way.”

To support this, ASIC plans to consult with the financial industry in the next financial year on expanding automated order processing rules to futures markets to reflect developments in AI.

The regulator also plans to update its electronic trading guidance for the same reason.

“ASIC will also continue to scan the environment to understand how AI is being applied and the risks and opportunities attached to those methods of application,” Mr Longo will say.

It will also look at the role of AI in the digitisation of assets and in carbon markets.

Moreover, ASIC’s expectation for the financial industry is that “appropriate controls” on the use of AI technology will be “part of the design phase and in place before new tech is switched on”.

“There is a very real danger here that entities may rush too quickly into innovations without applying appropriate controls and proper governance,” Mr Longo says.

“It’s important that the whole financial market ecosystem works to uplift controls – just as a convoy must go at the pace of its slowest vessel, so too is the financial ecosystem reduced to the strength of its weakest link.”

He sees a “danger” that the fear of being “left behind” will drive some uses of tech that have “unintended consequences”, and notes the recent cyberattack on service provider ION as a “reminder of just how interconnected global markets are, and the implications a bad actor can indirectly have on market intermediaries – and the markets themselves”.

But while calling for “robust governance and operational resilience measures”, Mr Longo notes that there’s “as yet no real consensus on how to regulate AI, if at all”.

The European Commission has proposed an AI law that takes a risk-based approach and prohibits certain uses outright. In Britain, a “pro-innovation”, devolved regulatory model is proposed. China and Canada are proposing laws directed at regulating particular uses of AI.

The Australian government’s recent discussion paper, “Safe and Responsible AI in Australia”, sought input on how Australia should approach the question of AI regulation.

“These are very important questions – some would say existential – and they are as yet unanswered,” Mr Longo says.

“But there is another, very practical issue … if you can’t explain how a particular system works … how can you justify using it?”

He will call on intermediaries to “take a leadership role as gatekeepers of the financial industry”.

Mr Longo wants them to “ensure that every use of AI” within their organisation “is understood and the reason a particular decision, recommendation or prediction is made can be explained.”

David Rogers, Markets Editor

David Rogers began writing about financial markets in 1987. He has worked for Standard & Poor's, Thomson Financial, BridgeNews, Tolhurst Noall, Dow Jones Newswires and The Wall Street Journal. David has extensive real-time reporting experience in economics, foreign exchange, equities, commodities and bonds.

Original URL: https://www.theaustralian.com.au/business/financial-services/asic-chair-joe-longo-cautions-banks-stockbrokers-over-ai/news-story/a6f602234ee3794c83de895f90180124