Ban the killer robots before it's too late
updated 11:09 AM EDT, Wed April 3, 2013
STORY HIGHLIGHTS
- UK robotics professor leading calls for a worldwide ban on autonomous weapons
- We can't rely on robots to conform to international law, says Noel Sharkey
- Sharkey is chairman of an NGO leading a campaign to "Stop Killer Robots"
- Autonomous robots could destabilize world security and trigger unintentional wars
Editor's note: Noel Sharkey is Professor of Artificial Intelligence and Robotics and Professor of Public Engagement at the UK's University of Sheffield and Chairman of the International Committee for Robot Arms Control.
(CNN) -- As wars become increasingly automated, we must ask ourselves how far we want to delegate responsibility to machines. Where do we want to draw the line?
Weapons systems have been evolving for millennia and there have always been attempts to resist them. But does that mean that we should just sit back, accept our fate and hand over the ultimate responsibility for killing to machines?
Over the last few months there has been a growing debate about the use of fully autonomous robot weapons: armed robots that, once launched, can select their own targets and kill them without further human intervention.
"We just shouldn't grant machines the decision about who lives or dies
Noel Sharkey, Chairman of ICRAC
Noel Sharkey, Chairman of ICRAC
Professor Noel Sharkey
Some have argued that robots could be more accurate on the battlefield than human soldiers and save more civilian lives. But this is speculation based on assumptions about future developments of computer hardware and software. It is no more than "hopeware" -- since the 1950s, Artificial Intelligence has moved at a snail's pace compared to what proponents have predicted.
Others argue that even if robots could be more accurate under some restricted circumstances at some unknown time in the future, we just shouldn't grant machines the decision about who lives or dies.
At this point, we cannot rely on machines having the independent capacity to conform to international law. Current sensing systems are not up to the task. And even if machines had adequate sensing mechanisms, they would still be missing the vital components of battlefield awareness and common-sense reasoning needed to decide whom it is appropriate to kill, and when.
Robots do not have the agency to decide if striking a target is proportional to the expected military advantage. There is no metric for this. Much of war is art and not science. A military commander must make a qualitative decision about the number of civilian lives that can be risked for a particular military objective. And that commander can be held accountable.
A robot doesn't have the moral agency to be held accountable. Some would argue that the commander who sends a robot on a mission would be responsible (the last point of contact). But that could be unfair, since it could be the fault of the mission programmer, the manufacturer or one of dozens of little companies providing components. Maybe it should be the senior staff or policy makers who had the idea to use robots. Or the device could have been tampered with in the industrial supply chain or even damaged in action. Forensics are extremely difficult with such complex devices.
"Is anyone considering how autonomous weapons could destabilize world security and trigger unintentional wars?
Noel Sharkey, Chairman of ICRAC
Noel Sharkey, Chairman of ICRAC
Yet a recent U.S. DoD directive (November 2012) gives a green light to research and development of autonomous weapons systems, while presenting a cautious route to their deployment.
This is the culmination of U.S. military road maps dating back to 2002, and it is a bad move. It sends the wrong message to other nations. As the most militarily advanced nation on the planet, the U.S. has the opportunity to take the lead in halting these developments.
Thanks to the U.S.'s use of drones, more than 70 other countries have acquired the technology in a new arms race. It is simply blinkered to think that they will not follow suit with autonomous weapons. Is anyone thinking about how an adaptive enemy will exploit the weaknesses of robot weapons with spoofing, hacking or misdirection?
Is anyone considering how unknown computer programs will interact when swarms of robots meet? Is anyone considering how autonomous weapons could destabilize world security and trigger unintentional wars?
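To make that swarm-interaction worry concrete, here is a minimal toy simulation in Python. It is purely my own illustration, not anything from Sharkey's article: two autonomous agents each follow a rule that looks reasonable in isolation (match the other side's observed force posture, plus a small safety margin). Neither rule ever says "attack first," yet the feedback loop between the two programs escalates to maximum force within a few steps.

# Toy escalation model (hypothetical illustration only).
# Each agent's rule: respond to the opponent's observed posture
# with a posture one step higher, capped at a maximum of 10.

def react(observed_level: int) -> int:
    """Match the opponent's posture plus a safety margin."""
    return min(observed_level + 1, 10)  # 10 = maximum force

a, b = 0, 1  # agent B starts with a trivial, non-hostile posture
for step in range(12):
    a, b = react(b), react(a)  # both agents update simultaneously
    print(f"step {step}: A={a}  B={b}")
    if a == 10 and b == 10:
        print("Both agents at maximum force; neither was programmed to start a war.")
        break

The point is not this particular rule but the feedback loop: each program is simple and predictable on its own, while the joint behavior is something neither designer chose.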
In April this year in London, a group of prominent NGOs will launch a large civil society campaign to "Stop Killer Robots." They are seeking a new legally binding preemptive international treaty to prohibit the development and deployment of fully autonomous robot weapons.
The aim is to stop these weapons getting into the arsenals of the world's militaries while there is still an opportunity. Once nations have made large investments in the technology, it may be too late.
Do you think autonomous robot weapons should be outlawed? Leave your comments below.
end quote from: "Ban the killer robots before it's too late" (CNN)
Historically, every new weapon developed has eventually been used on human beings somewhere on earth, so this is something to think about regarding killer robots. Movies like "The Terminator" were popular, and when I saw it in the 1980s, with my programming background, I saw this coming even then. It was all a matter of when.
Computer programs running in autonomous machines will likely make very different mistakes than humans do, and on the battlefield this will likely, to a greater or lesser degree, always allow humans to win. All a human has to do is be aware of the presence of a killer robot in the air, in the water or on land, and know its weaknesses, in order to defeat it.
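As a hypothetical illustration of how machine mistakes differ from human ones (my own sketch, not anything from the article above), consider a brittle rule-based "threat detector" that keys on two sensor features and has no common sense. A human observer would never make its mistakes, and anyone who knows its rule can exploit it:

# Toy rule-based detector (hypothetical illustration only):
# it flags anything metallic and moving as a threat.

from dataclasses import dataclass

@dataclass
class Track:
    metallic: bool
    moving: bool
    description: str

def is_threat(track: Track) -> bool:
    # The machine's entire "reasoning": two features, no context.
    return track.metallic and track.moving

observations = [
    Track(True, True, "armored vehicle"),    # flagged: plausibly correct
    Track(True, True, "farmer's tractor"),   # flagged: an obvious human "no"
    Track(False, True, "person on foot"),    # ignored: context a human would weigh
]

for t in observations:
    print(f"{t.description}: threat={is_threat(t)}")

The machine's errors are systematic in a way human errors are not, which is exactly the weakness an adaptive opponent can learn and defeat.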
But the problem is that drones in the air are often quiet, and we don't even know they are there watching us. So people are completely defenseless against armed drones they can't hear or see. You could be working in your garden, perfectly happy outside in the sun one day, only to be blown apart completely by a Hellfire missile. You wouldn't have time to feel distress or even pain; you would just be scattered all over your yard in pieces, here one moment and gone the next. That is likely the actual future of killer robots, and it is the one the world needs to address.
Here, for me, is the nightmare scenario: right now we have the sectarian war between Sunni and Shia Muslims in Syria. What if one side or the other got flying drones to target and exterminate the other sect?
This isn't limited to Shias and Sunnis. It could be Christians versus Muslims, or Muslims versus Hindus; it could be any group that didn't like another group, even politically. The main problem I see with flying drones in the future is the genocide of whole groups worldwide.
Drones make possible the genocide of whole nations or groups of people without any accountability whatsoever. Whoever had enough money to do it might get away with it, if the weapons couldn't be traced back to them.
Ungoverned, flying drones could reduce the human population by half by 2100 through various means.
I'm saying this as an intuitive looking into the future right now.