Commonwealth Bank to process financial crime alerts faster
Commonwealth Bank has begun operating an AI-driven system to help speed up investigations into financial crimes.
The system, launched by Australia’s largest bank, could also plug in generative AI to produce quick summaries in the future.
The Commonwealth Bank is set to replace its current method of investigating financial crimes, which involves about 12 separate applications to detect and triage alerts relating to sanctions, fraud and people linked with transactions.
Instead, the bank soft-launched a purpose-built “alert and investigation management” system, hosted on a cloud platform, in recent weeks.
It can produce a visual map of entities commonly linked with laundered transactions – for example, businesses and bank accounts that previously appeared in unconnected alerts – and connect them in a single investigation. Historically, this has been a manual job.
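To illustrate the idea of connecting previously unconnected alerts, the sketch below groups alerts that share an entity (an account or business) into one investigation using a union-find structure. All alert IDs, entity names and fields are hypothetical; this is a minimal illustration of the general technique, not CBA's implementation.

```python
from collections import defaultdict

# Hypothetical alert records: each alert lists the entities (businesses,
# accounts) it touches. All names here are illustrative.
alerts = {
    "A1": {"acct-100", "Shell Co Pty Ltd"},
    "A2": {"acct-200", "Shell Co Pty Ltd"},  # shares an entity with A1
    "A3": {"acct-300"},                      # shares nothing, stays separate
}

def group_alerts(alerts):
    """Union-find: merge alerts that share at least one entity."""
    parent = {a: a for a in alerts}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}  # entity -> first alert that mentioned it
    for alert_id, entities in alerts.items():
        for entity in entities:
            if entity in seen:
                union(alert_id, seen[entity])
            else:
                seen[entity] = alert_id

    investigations = defaultdict(set)
    for a in alerts:
        investigations[find(a)].add(a)
    return list(investigations.values())

print(group_alerts(alerts))  # A1 and A2 merge into one investigation
```

Here A1 and A2 share “Shell Co Pty Ltd”, so they collapse into one investigation, while A3 remains its own case.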
Speaking at the PegaWorld conference in Las Vegas, operations technology general manager Ben Donnelly said CBA was driven to a new approach due to an increase in complex alerts, as well as the need to comply with regulatory changes.
“We’ve been able to now process a much higher volume of alerts in a much bigger and much more effective way,” he said.
“We’ve got a single point of entry to those alerts coming into the process.
“We can apply classification priorities. We can identify false positives, we can match cases with existing alerts.”
An estimated 70 million transactions flow through CBA, and Mr Donnelly said tackling financial fraud was one of the bank’s biggest priorities, as it was for banking institutions globally.
He declined to reveal how many alerts were being flagged by the new system, but said the “nature of what we’re moving towards is the accuracy of that process rather than just the volume”.
Pega financial services senior director Jonathan Tanner said the new CBA system – which he said was being looked at by other Australian and international banks – could keep pace with the rapidly evolving nature of financial crime, which has already adopted generative AI to scam people on social media with fake videos.
Mr Tanner said the technology meant accounts suspected of fraudulent transactions could be shut down more quickly. It was easier to identify patterns for a higher success rate, and generative AI could be used as an add-on to create “suspicious matter reports” in seconds.
“From a regulator point of view, you’re getting those suspicious matter reports to the regulator a lot quicker,” he said.
“We can pull that information from the case, populate the report (and) show it to the analysts; we always believe human in the loop, they’ll do a quick read through to make sure everything’s okay.”
Having previously worked in process automation roles at NAB and ANZ, Mr Tanner said generative AI was not a threat to workers.
“It’s more about we don’t need to hire more because the problem they have today is that every time these crimes ramp up a level, really the only way the banks can respond to that today is to hire more people,” he said.
“At this stage of (AI’s) evolution, we believe that human in the loop is important because you have to have that accountability that sits with somebody.”
The journalist travelled as a guest of Pega.