Lethal autonomous weapon
- "Killer robot" redirects here. For articles dealing with the concept of robots and/or artificial intelligence killing or eradicating humans and/or other living beings, see Existential risk from artificial general intelligence, AI takeover, and grey goo.
Lethal autonomous weapons (LAWs) are a type of autonomous military robot that can independently search for and engage targets based on programmed constraints and descriptions.[1] LAWs are also called lethal autonomous weapon systems (LAWS), lethal autonomous robots (LAR), robotic weapons, or killer robots. LAWs may operate in the air, on land, on water, under water, or in space. The autonomy of current systems as of 2018 is restricted in the sense that a human gives the final command to attack, though there are exceptions with certain "defensive" systems.
Being autonomous as a weapon[edit]
Being "autonomous" has different meanings in different fields of study. In engineering it may refer to a machine's ability to operate without human involvement. In philosophy it may refer to an individual being morally independent. In political science it may refer to an area's capacity for self-government. In terms of military weapon development, the identification of a weapon as autonomous is not as clear as in other areas.[1] The specific standard entailed in the concept of being autonomous can vary hugely between different scholars, nations and organizations.
Scholars like Peter Asaro and Mark Gubrud try to set the threshold lower and judge more weapon systems as autonomous. They believe that any weapon system capable of releasing lethal force without the operation, decision or confirmation of a human supervisor can be deemed autonomous. According to Gubrud, a weapon system operating partially or wholly without human intervention is considered autonomous. He argues that a weapon system does not need to be able to make decisions completely by itself in order to be called autonomous. Instead, it should be treated as autonomous as long as it is actively involved in one or more parts of the "preparation process", from finding the target to finally firing.[2][3]
Other organizations, however, set a higher standard for what counts as an autonomous weapon system. The Ministry of Defence (United Kingdom) defines autonomous weapon systems as "systems that are capable of understanding higher level intent and direction. From this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state. It is capable of deciding a course of action, from a number of alternatives, without depending on human oversight and control - such human engagement with the system may still be present, though. While the overall activity of an autonomous unmanned aircraft will be predictable, individual actions may not be."[4]
As a result, drafting a treaty between states requires a commonly accepted definition of what exactly constitutes an autonomous weapon.[5]
Automatic defensive systems[edit]
The oldest automatically triggered lethal weapons are the land mine, used since at least the 1600s, and the naval mine, used since at least the 1700s. Anti-personnel mines are banned in many countries by the 1997 Ottawa Treaty, which the United States, Russia, and much of Asia and the Middle East have not joined.
Some current examples of LAWs are automated "hardkill" active protection systems, such as the radar-guided guns used to defend ships since the 1970s (e.g. the US Phalanx CIWS). Such systems can autonomously identify and attack oncoming missiles, rockets, artillery fire, aircraft and surface vessels according to criteria set by the human operator. Similar systems exist for tanks, such as the Russian Arena, the Israeli Trophy, and the German AMAP-ADS. Several types of stationary sentry guns, which can fire at humans and vehicles, are used in South Korea and Israel. Many missile defense systems, such as Iron Dome, also have autonomous targeting capabilities. Automatic turrets installed on military vehicles are called remote weapon stations.
The main reason for not having a "human in the loop" in these systems is the need for rapid response. They have generally been used to protect personnel and installations against incoming projectiles.
Autonomous offensive systems[edit]
Systems with a higher degree of autonomy would include drones or unmanned combat aerial vehicles, e.g.: "The unarmed BAE Systems Taranis jet-propelled combat drone prototype may lead to a Future Offensive Air System that can autonomously search, identify and locate enemies but can only engage with a target when authorized by mission command. It can also defend itself against enemy aircraft" (Heyns 2013, §45). The Northrop Grumman X-47B drone can take off and land on aircraft carriers (demonstrated in 2014); it is set to be developed into an Unmanned Carrier-Launched Airborne Surveillance and Strike (UCLASS) system.
According to The Economist, as technology advances, future applications of unmanned undersea vehicles might include mine clearance, mine-laying, anti-submarine sensor networking in contested waters, patrolling with active sonar, resupplying manned submarines, and becoming low-cost missile platforms.[6] In 2018 the U.S. Nuclear Posture Review alleged that Russia is developing a "new intercontinental, nuclear-armed, nuclear-powered, undersea autonomous torpedo" named "Status 6".[7]
The Russian Federation is actively developing artificially intelligent missiles,[8] drones, unmanned vehicles, military robots and medic robots.[9][10][11][12]
Israeli Minister Ayoub Kara stated in 2017 that Israel is developing military robots as small as flies in order to assassinate leaders of Hezbollah and Hamas, and that these robots may be operational within as few as three years.[13][14]
Ethical and legal issues[edit]
Standard used in US Policy[edit]
Current US policy states: "Autonomous … weapons systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."[15] However, the policy requires that autonomous weapon systems that kill people or use kinetic force, selecting and engaging targets without further human intervention, be certified as compliant with "appropriate levels" and other standards; it does not hold that such weapon systems cannot meet these standards and are therefore forbidden.[16] "Semi-autonomous" hunter-killers that autonomously identify and attack targets do not even require certification.[16] Deputy Defense Secretary Robert Work said in 2016 that the Defense Department would "not delegate lethal authority to a machine to make a decision", but might need to reconsider this since "authoritarian regimes" may do so.[17] In October 2016 President Barack Obama stated that early in his career he was wary of a future in which a US president making use of drone warfare could "carry on perpetual wars all over the world, and a lot of them covert, without any accountability or democratic debate".[18][19]
Probable violations of ethics and international acts[edit]
Stuart Russell, professor of computer science at the University of California, Berkeley, has stated that his concern with LAWs is that they are unethical and inhumane, the main issue being that it is hard for such systems to distinguish between combatants and non-combatants.[20]
There is concern (e.g. Sharkey 2012) about whether LAWs would violate International Humanitarian Law, especially the principle of distinction, which requires the ability to discriminate combatants from non-combatants, and the principle of proportionality, which requires that damage to civilians is proportional to the military aim. This concern is often invoked as a reason to ban "killer robots" altogether - but it is doubtful that this concern can be an argument against LAWs that do not violate International Humanitarian Law.[21]
LAWs are said to blur the boundaries of who is responsible for a particular killing,[22] but Thomas Simpson and Vincent Müller argue that they may make it easier to record who gave which command.[23]
Campaigns on banning LAWs[edit]
The possibility of LAWs has generated significant debate, especially about the risk of "killer robots" roaming the earth - in the near or far future. The group Campaign to Stop Killer Robots formed in 2013. In July 2015, over 1,000 experts in artificial intelligence signed a letter warning of the threat of an arms race in military artificial intelligence and calling for a ban on autonomous weapons. The letter was presented in Buenos Aires at the 24th International Joint Conference on Artificial Intelligence (IJCAI-15) and was co-signed by Stephen Hawking, Elon Musk, Steve Wozniak, Noam Chomsky, Skype co-founder Jaan Tallinn and Google DeepMind co-founder Demis Hassabis, among others.[24][25]
According to PAX, fully automated weapons (FAWs) will lower the threshold of going to war as soldiers are removed from the battlefield and the public is distanced from experiencing war, giving politicians and other decision-makers more latitude in deciding when and how to go to war.[26] They warn that once deployed, FAWs will make democratic control of war more difficult. Daniel Suarez, IT specialist and author of Kill Decision, a novel on the topic, has issued a similar warning: in his view, FAWs might recentralize power into very few hands by requiring very few people to go to war.[26]
Some websites protest the development of LAWs by presenting alarming visions of a possible future should research into applying artificial intelligence to weapons continue. These sites regularly post updates on ethical and legal issues, so that visitors can keep up with recent news about international meetings and research articles concerning LAWs.[27]
References[edit]
- ^ ab Crootof, Rebecca (2015). "The Killer Robots Are Here: Legal and Policy Implications". 36 Cardozo L. Rev: 1837 – via heinonline.org.
- ^ Asaro, Peter (2012). "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making". RED CROSS. 687: 94.
- ^ "Autonomy without Mystery: Where do you draw the line?". 1.0 Human. 2014-05-09. Retrieved 2018-06-08.
- ^ "Unmanned aircraft systems (JDP 0-30.2)". GOV.UK. Retrieved 2018-06-08.
- ^ "Taylor & Francis Group". www.taylorfrancis.com. Retrieved 2018-06-08.
- ^ "Getting to grips with military robotics". The Economist. 25 January 2018. Retrieved 7 February 2018.
- ^ "US says Russia 'developing' undersea nuclear-armed torpedo". CNN. 2018. Retrieved 7 February 2018.
- ^ http://www.newsweek.com/russia-military-challenge-us-china-missile-own-decisions-639926
- ^ https://www.rbth.com/defence/2017/08/09/comrade-in-arms-russia-is-developing-a-freethinking-war-machine_819686
- ^ https://www.rbth.com/defence/2017/06/06/rise-of-the-machines-a-look-at-russias-latest-combat-robots_777480
- ^ https://www.rbth.com/science_and_tech/2016/02/10/is-terminator-back-russians-make-major-advances-in-artificial-intelligence_566553
- ^ https://www.rbth.com/news/2017/05/15/virtual-trainer-for-robots-and-drones-developed-in-russia_763016
- ^ http://www.jpost.com/Israel-News/Politics-And-Diplomacy/Kara-I-wasnt-revealing-state-secrets-about-the-robots-482616
- ^ https://www.cnbc.com/2017/03/17/mini-nukes-and-inspect-bot-weapons-being-primed-for-future-warfare.html
- ^ US Department of Defense (2012). "Directive 3000.09, Autonomy in weapon systems" (PDF). p. 2.
- ^ ab Gubrud, Mark (April 2015). "Semi-autonomous and on their own: Killer robots in Plato's Cave". Bulletin of the Atomic Scientists.
- ^ "Pentagon examining the 'killer robot' threat". Boston Globe. 30 March 2016.
- ^ "Barack Obama on 5 Days That Shaped His Presidency". Daily Intelligencer. Retrieved 3 January 2017.
- ^ Devereaux, Ryan; Emmons, Alex. "Obama Worries Future Presidents Will Wage Perpetual, Covert Drone War". The Intercept. Retrieved 3 January 2017.
- ^ Russell, Stuart (27 May 2015). "Take a stand on AI weapons". International weekly journal of science. 521.
- ^ Müller, Vincent C. (2016). "Autonomous killer robots are probably good news". Ashgate.
- ^ Nyagudi, Nyagudi Musandu (2016-12-09). "Doctor of Philosophy Thesis in Military Informatics (OpenPhD #openphd ) : Lethal Autonomy of Weapons is Designed and/or Recessive". Retrieved 2017-01-06.
- ^ Simpson, Thomas W; Müller, Vincent C. (2016). "Just war and robots' killings". Philosophical Quarterly.
- ^ "Musk, Hawking Warn of Artificial Intelligence Weapons". WSJ Blogs - Digits. 2015-07-27. Retrieved 2015-07-28.
- ^ Gibbs, Samuel (27 July 2015). "Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons". The Guardian. Retrieved 28 July 2015.
- ^ ab "Deadly Decisions - 8 objections to killer robots" (PDF). p. 10. Retrieved 2 December 2016.
- ^ "Front page". Ban Lethal Autonomous Weapons. 2017-11-10. Retrieved 2018-06-09.
Further reading[edit]
- Heyns, Christof (2013), ‘Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions’, UN General Assembly, Human Rights Council, 23 (3), A/HRC/23/47.
- Krishnan, Armin (2009), Killer robots: Legality and ethicality of autonomous weapons (Aldershot: Ashgate)
- Müller, Vincent C. (2016), ‘Autonomous killer robots are probably good news’, in Ezio Di Nucci and Filippo Santoni de Sio (eds.), Drones and responsibility: Legal, philosophical and socio-technical perspectives on the use of remotely controlled weapons, 67-81 (London: Ashgate).
- Stuart Russell's campaign against LAWs
- Sharkey, Noel E (2012), ‘Automating Warfare: lessons learned from the drones’, Journal of Law, Information & Science, 21 (2).
- Simpson, Thomas W and Müller, Vincent C. (2016), ‘Just war and robots’ killings’, The Philosophical Quarterly 66 (263), 302-22.
- Singer, Peter (2009), Wired for war: The robotics revolution and conflict in the 21st Century (New York: Penguin)
- US Department of Defense (2012), ‘Directive 3000.09, Autonomy in weapon systems’.
- US Department of Defense (2013), ‘Unmanned Systems Integrated Road Map FY2013-2038’. <http://www.defense.gov/pubs/DOD-USRM-2013.pdf>
- The Ethics of Autonomous Weapons Systems (2014) Seminar at UPenn <https://www.law.upenn.edu/institutes/cerl/conferences/ethicsofweapons/schedule-required-readings.php>