AI Leaders Push for Killer Robot Ban at UN

A disturbing film illustrates the unsettling possibilities of lethal autonomous weapons, and how “slaughterbots” could forever alter the nature of warfare and national security.

GENEVA, SWITZERLAND (November 13, 2017) – Drone technology today is very close to having fully autonomous capabilities. And many of the world’s leading AI researchers worry that if these autonomous weapons are ever developed, they could dramatically lower the threshold for armed conflict, ease and cheapen the taking of human life, empower terrorists, and create global instability.

On Monday, November 13, during an event at the United Nations Convention on Conventional Weapons hosted by the Campaign to Stop Killer Robots, artificial intelligence (AI) researcher Stuart Russell explained why so many AI researchers oppose lethal autonomous weapons, illustrating his argument with a short and highly disturbing film that depicts a future in which mass-produced, easily accessible autonomous weapons create a world where no one can feel safe.

Russell, a renowned AI expert and professor at UC Berkeley, explains: “I participated in the making of this film because it makes the issues clear. While government ministers and military lawyers are stuck in the 1950s, arguing about whether machines can ever be ‘truly autonomous’ or are really ‘making decisions in the human sense’, the technology for creating scalable weapons of mass destruction is moving ahead. The philosophical distinctions are irrelevant; what matters is the catastrophic effect on humanity.”

“It’s time for countries to move from talking about the ethical and other challenges raised by lethal autonomous weapons to taking preventative action,” said Mary Wareham, coordinator of the Campaign to Stop Killer Robots. “Endorse the call of the scientific community and non-governmental organizations to preemptively ban weapons that would select and fire on targets without meaningful human control.”

For years now, the US and other nations have used drones to carry out attacks against their enemies, blurring the line between human and robot warfare. But the autonomous weapons in question are far different from today’s remote-controlled UAVs: fully autonomous weapons would require no human guidance as they operate. Armed with explosives, facial recognition algorithms, and other currently existing technology, these weapons systems could be custom-programmed to find, identify, select, and kill targets automatically.

While robotic weapons usually conjure images of the Terminator and other unrealistic science-fiction scenarios, the film makes clear that fully autonomous drones could be far cheaper, smaller, and more effective than humanoid robots. Their low cost could make them the next class of weapons of mass destruction. The film also depicts how difficult it will be for governments and militaries to keep the weapons out of the wrong hands.

Toby Walsh, an AI professor at the University of New South Wales, explains: “The academic community has sent a clear and consistent message. Autonomous weapons will be weapons of terror, the perfect tool for those who have no qualms about the terrible uses to which they are put. We need to act now before this future arrives.”

Instead of deploying an army of Terminators, a terrorist group or a fringe country could buy and clandestinely deploy small, insect-like AI-drones capable of infiltrating buildings, exploiting personal data, and causing mass casualties at very low cost in a way that is difficult to defend against or even deter.

This past August, Walsh authored an open letter exhorting scientists to support a ban on lethal autonomous weapons, and it garnered the support of directors and founders of over 100 leading AI and robotics companies from 28 countries. And just this past week, over 200 Canadian scientists and over 100 Australian scientists in academia and industry penned open letters to their respective prime ministers, Justin Trudeau and Malcolm Turnbull, urging them to support the ban. These letters follow a 2015 open letter released by the Future of Life Institute and signed by more than 20,000 AI and robotics researchers and others, including Elon Musk and Stephen Hawking, who also signed Walsh’s 2017 open letter.

These letters indicate both grave concern and a sense that the opportunity to curtail lethal autonomous weapons is running out. As the Campaign to Stop Killer Robots’ website explains: “Several nations with high-tech militaries, particularly the United States, China, Russia, Israel, South Korea, and the United Kingdom are moving toward systems that would give greater combat autonomy to machines. If one or more chooses to deploy fully autonomous weapons, a large step beyond remote-controlled armed drones, others may feel compelled to abandon policies of restraint, leading to a robotic arms race. Agreement is needed now to establish controls on these weapons before investments, technological momentum, and new military doctrine make it difficult to change course.”

More information about these concerns can be found at autonomousweapons.org.

Given these unprecedented risks, the CCW’s 2016 Fifth Review Conference at the UN decided to establish a Group of Governmental Experts (GGE) on lethal autonomous weapons. The GGE meeting takes place this week, November 13 – 17, and will be chaired by Ambassador Amandeep Singh Gill of India.

The final version of the film, which was produced with support from the Future of Life Institute and will be publicly released on YouTube in time for Monday’s GGE meeting at the UN, includes Stuart Russell making the following closing statement:

“This short film is more than just speculation. It shows the results of integrating and miniaturizing technologies that we already have. I’m Stuart Russell, a Professor of Computer Science at Berkeley.

“I’ve worked in A.I. for more than 35 years. Its potential to benefit humanity is enormous—even in defence. But allowing machines to choose to kill humans will be devastating to our security and freedom. Thousands of my fellow researchers agree. We have an opportunity to prevent the future you just saw, but the window to act is closing fast.”
