
Opinion

Civilian deaths from killer drones are this generation’s ‘Oppenheimer moment’

Artificial intelligence and emerging technologies are beginning to have a profound impact on the way war is waged. Some forms of autonomous weapons systems have existed for years, but the types, duration of operation, geographical scope, and environment in which they operate have been limited.

An increasing number of civilians have been killed by armed drones using laser-guided bombs and other munitions in Burkina Faso, Ethiopia, and other war-torn countries. In January, three Burkina Faso military drone strikes that the government claimed targeted Islamist fighters in fact killed at least 60 civilians at two crowded markets and a packed funeral. Loitering munitions, or so-called one-way attack drones that stay in the air searching for a target before attacking, have been used since 2020 in Libya, Nagorno-Karabakh, Ukraine, and other conflicts, attracting scrutiny from the United Nations and others.

Armed drones are changing the way war is waged. Credit: iStock

These weapons systems are not completely autonomous. They are still controlled by human operators who decide and supervise the selection of targets and use of force. But meaningful human control is being put to the test as militaries race to embrace technology that advances remote warfare. Concerns are mounting over the development of killer robots – weapons that would select and use lethal force against targets based on sensor processing rather than human inputs.

Some call it this generation’s “Oppenheimer moment”. The geopolitical tensions and challenges that physicists and other scientists faced 70 years ago in their search for nuclear arms control are now being replicated in the 21st century. Military investments in autonomy and other emerging technologies are sending humanity down a dangerous path, and nations need to act together to meet the challenge with foresight and courageous political leadership.

Australia often highlights the benefits of emerging technologies in the military, but seems less interested in elaborating on what its officials call “potential” risks, particularly when it comes to autonomous weapons systems. But without action, the risks will become real. Australia should heed calls to enshrine the principle of meaningful human control in international law.

Fielding weapons systems that operate without meaningful human control effectively delegates life-and-death decisions to machines. This crosses a dangerous line. Autonomous weapons systems raise an array of fundamental ethical, humanitarian, legal, operational, moral, security, and proliferation concerns.

Israeli soldiers lift a drone near the border with the Gaza Strip in April. Credit: Getty

All governments, including Australia’s, have a responsibility to act: to put in place rules that protect humanity. Autonomous weapons systems are a grave problem that can affect any country in the world, so clear, strong, global rules are crucial.

For the first time, discussion of “lethal autonomous weapons systems” has been added to the provisional agenda of the annual session of the UN General Assembly, which opens in September. UN Secretary-General António Guterres will provide a report compiling the views of governments on the challenges and concerns raised by autonomous weapons systems and how to address them through regulation.


Last year, Australia told the General Assembly that “the time is not yet ripe” for negotiating a legally binding instrument to prohibit or restrict autonomous weapons systems. It has said the same thing for the past decade, calling proposals to draw up legal rules “premature” and instead focusing attention on how the existing laws of war apply.


Australia instead supports voluntary measures such as a 2023 US political declaration aimed at ensuring “responsible” military use of artificial intelligence and autonomy. Measures that fall short of new international law are insufficient and provide no restraint, instead paving the way for a future of automated killing.

Australia’s narrow national-security approach to arms control has resulted in the country sitting out initiatives driven by humanitarian imperatives, such as the negotiation of the 2017 Treaty on the Prohibition of Nuclear Weapons. Australia has increased its defence spending and, through the AUKUS program, is now directly involved in the development of nuclear-powered submarines and related technologies.

These moves stand in stark contrast to Australia’s previous role in bringing about the international treaties banning antipersonnel landmines and cluster munitions, as well as biological and chemical weapons. Australia continues to dedicate diplomatic and financial resources that help ensure these humanitarian disarmament treaties succeed. It needs to act on killer robots before it is too late.

Australian foreign policy should prioritise preventing human suffering and protecting humanity. The government should respond positively to calls from the Australian Human Rights Commissioner and civil society organisations to work toward an explicit prohibition on lethal autonomous weapons systems.

Australia should use this coming session of the UN General Assembly to express its strong commitment to work with urgency and with all interested stakeholders toward an international treaty on autonomous weapons systems.

Mary Wareham is deputy director of the Crisis, Conflict and Arms Division at Human Rights Watch.

