How criminals are getting help from AI hacking tools

A baffling incident in Hong Kong, where a worker was fooled by deepfake colleagues into sending fraudsters $US25 million, shows the risks companies now face.

Nick Bonyhady, Technology writer

One of the first malicious generative AI tools closed down not with a bang or a whimper, but a warning.

The creators of WormGPT – which was designed as a ChatGPT equivalent freed of ethical constraints to aid hacking, phishing and fraud – posted a screed on their private chat group in August blaming the media for its demise. But amid the self-justifications and complaints about the scrutiny it attracted, the five anonymous creators made clear how easily anyone else could create a criminal AI.

Nick Bonyhady is a technology writer for The Australian Financial Review, based in Sydney. He is a former technology editor, industrial relations and politics reporter at The Sydney Morning Herald and The Age. Connect with Nick on Twitter or email him at nick.bonyhady@afr.com.

Original URL: https://www.afr.com/technology/one-ai-hacking-tool-has-fallen-others-will-rise-to-take-its-place-20240305-p5fa1l