We are locked into an arms race that no one wants to happen, global researchers warn
A CHILLING letter claims the world is on the cusp of opening a dangerous Pandora’s box — and there is no going back.
A LEADING Sydney robotics and artificial intelligence expert is spearheading a boycott of a top South Korean university over fears around the development of killer robots.
Professor Toby Walsh from UNSW is among more than 50 of the world’s leading artificial intelligence and robotics researchers from 30 countries who have declared they will cease contact with KAIST (formerly the Korea Advanced Institute of Science & Technology), South Korea’s leading research facility, over the issue.
It comes after the university — which was established by the Korean government in 1971 — opened an AI weapons lab in collaboration with a major arms company that builds cluster munitions in contravention of UN bans.
The university’s collaboration with the munitions maker, Hanwha Group, has rankled global researchers because of the conglomerate’s involvement in manufacturing cluster bombs. These weapons, usually dropped from planes or launched from the ground, release or eject smaller sub-munitions typically designed to kill personnel and destroy vehicles.
The boycott comes ahead of a meeting next Monday in Geneva, Switzerland, where 123 member nations of the United Nations will discuss the challenges posed by lethal autonomous weapons. Twenty-two of those nations have already called for an outright and pre-emptive ban on such weapons.
The open letter announcing the boycott against the South Korean university said autonomous weapons are the “third revolution in warfare” and warned about letting the genie out of the bottle.
“At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons,” the letter said.
“We therefore publicly declare that we will boycott all collaborations with any part of KAIST until such time as the President of KAIST provides assurances, which we have sought but not received, that the Center will not develop autonomous weapons lacking meaningful human control,” the researchers said.
“If developed, autonomous weapons will be the third revolution in warfare. They will permit war to be fought faster and at a scale greater than ever before. They have the potential to be weapons of terror. Despots and terrorists could use them against innocent populations, removing any ethical restraints. This Pandora’s box will be hard to close if it is opened.”
Professor Walsh organised the boycott, which involves researchers from 30 countries and includes three of the world’s top deep learning experts, as well as Professor Stuart Russell from the University of California, Berkeley, who authored the leading textbook on AI, and roboticist Professor Wolfram Burgard, winner of the Gottfried Wilhelm Leibniz Prize, the most prestigious research prize in Germany.
“Back in 2015, we warned of an arms race in autonomous weapons,” Professor Walsh said in a statement alongside the letter. “We can see prototypes of autonomous weapons under development today by many nations including the US, China, Russia and the UK. We are locked into an arms race that no one wants to happen.
“KAIST’s actions will only accelerate this arms race. We cannot tolerate this.”
Professor Walsh has long campaigned against the development of autonomous weapons.
He has previously travelled to speak in front of the United Nations in an effort to have the international body prevent the proliferation of so-called killer robots with the ability to think for themselves.
Speaking to news.com.au last year, he said “the arms race is already starting”.
He believes it’s no longer a question of whether military weapons will be imbued with some level of autonomy, but how much — which poses a number of worrying scenarios, particularly if such weapons fall into the wrong hands.
“They get in the hands of the wrong people and they can be turned against us. They can be used by terrorist organisations,” he warned.
“It would be a terrifying future if we allow ourselves to go down this road.”