
Death By Robot - Who Will Be to Blame?



Monday, 10 June 2013, 2:38 pm
Press Release: University of Canterbury


Autonomous weapon systems – commonly referred to as killer robots – will, in the near future, come from the air, not the ground, a University of Canterbury (UC) robot expert says.

The issue of whether robots should be allowed to take a human life, without direct supervision or command, has been raised recently by the United Nations.

The UN’s Human Rights Council has recently heard that countries are developing armed robots that can kill without the need for human choice or intervention.

UC robot researcher and acting director of UC’s HIT Lab, Dr Christoph Bartneck, says it is important to make a clear distinction between science and fiction.

``Many of the recent reports and articles on killer robots use imagery from the Terminator movies to illustrate human-like killing machines.

``Using fiction to talk about real world problems is misleading at best. This may seriously harm the research and development of androids. The autonomous weapon systems we will be dealing with in the near future come from the air, not the ground.

``It is a requirement for a just war that participating agents must be responsible for their actions. For an autonomous killing machine to fight a just war, we would need to give it the legal status of a person, so that it could take responsibility for its actions.

``But the idea of a just war may have fallen out of fashion. We no longer declare or end wars. We directly invade or attack from the air. Maybe the only glimmer of hope is that autonomous weapon systems, left to their own devices, will quickly run out of battery or fuel. Land mines, however, remain a deadly threat long after the original conflict has ended.


``We systematically use tools to kill each other, and even autonomous machines are a tried and tested method of killing humans, both soldiers and civilians. Land mines are perhaps one of the best examples of such autonomous killing machines, although they are of course rather simple.

``But even the simplicity of killing machines can be their greatest asset. The Soviet Union maintained an autonomous system during the Cold War, called the Dead Hand, that, once activated, would automatically launch intercontinental missiles if a nuclear attack on its territory was detected.

``Today’s weapon systems have become more complex, such as the Phalanx Gun System, which caused the death of a soldier in 1989. Despite the increase in complexity, the fundamental questions remain the same. Who will take responsibility for a non-deterministic weapon system? And how do we define autonomy?’’

Dr Bartneck says the legal requirement of responsibility becomes even more pressing in a civilian context. In the near future, society will have to deal with more fatalities from autonomous cars than from autonomous war machines. The first documented death caused by a robot occurred in 1979, when a factory worker was hit by a robotic arm.

``Isaac Asimov, the famous writer, wrote: `Violence is the last refuge of the incompetent.’ By that definition, autonomous killing machines are hopelessly incompetent, since all they know is violence,’’ Dr Bartneck says.

ENDS

© Scoop Media

