Ethics and Responsibility: The Issue of Autonomous Weapons Systems

As Indians, we have all heard of Lord Krishna's Sudarshana Chakra, Lord Rama's Agni Bann, Lord Brahma's Brahma Astra, and many more. But what unites them all? The answer lies in their autonomous, self-directed nature. However, this article is not about these mythological weapons, but about the autonomous weapons systems (AWS henceforth) of the twenty-first century and why, even though the idea of self-directed weapons has existed since antiquity, such systems remain controversial.

To understand that, the article first explains what "autonomous weapons systems" means and then discusses both sides of the debate. Next, it examines India's progress in developing such systems, and finally, it closes with the way forward.

What is an autonomous weapons system?

According to the Arms Control Association, autonomous weapons systems, also referred to as "slaughterbots," "lethal autonomous weapons systems (LAWS)," and "killer robots," are lethal tools that can survey their surroundings, identify potential enemy targets, and autonomously choose to attack those targets on the basis of complex algorithms. Further, as per the Association, such systems incorporate a number of fundamental elements: a mobile combat platform; various types of sensors to examine the platform's surroundings; processing systems to classify the objects discovered by the sensors; and algorithms directing the platform to launch an attack when a permissible target is detected (Klare, 2019).

Though these weapons appear futuristic and have significant military advantages, a number of humanitarian advocates have expressed ethical concerns.

Arguments in favour of AWS

As previously stated, many people support AWS for their military advantages, but many also support them on ethical grounds. One such argument is that AWS are ethically preferable to human fighters. But how? Army University Press expands on this by referencing roboticist Ronald C. Arkin (Etzioni & Etzioni, 2017). According to Arkin, autonomous robots will behave more humanely than human soldiers: first, they need not be programmed with a self-preservation impulse; second, their judgements will not be clouded by emotions such as fear or hysteria; third, they will be able to handle far more incoming sensory information than humans without discarding or distorting it to fit preconceived notions; and finally, they may be more trusted to report the ethical transgressions they detect than a team of people. The same report presents further reasons in support of AWS, such as reduced stress for soldiers and a lower incidence of crime: it emphasises that stressed-out soldiers are more likely to lose control of their emotions and commit offences such as sexual assault, crimes that AWS could help prevent (Etzioni & Etzioni, 2017).

Arguments against AWS

As previously indicated, some defend AWS on moral grounds, while others criticise them on those very same grounds. Critics ask how an autonomous weapons system will distinguish between combatants and civilians; a single error could produce many innocent casualties. They therefore argue that it is preferable to restrict such systems, if not prohibit them outright.

Another important source of concern is accountability when autonomous weapons systems are used. Ethics expert Robert Sparrow draws attention to this moral dilemma by pointing out that an essential tenet of international humanitarian law demands that someone be held accountable for the killing of civilians. He argues that any weapon or other means of war that makes it difficult to assign responsibility for the casualties it causes does not meet the requirements of jus in bello and therefore should not be used in battle (Etzioni & Etzioni, 2017).

India and autonomous weapons systems

Although autonomous weapons feature prominently in India's mythology, the country's progress in actually developing and deploying such systems is still in its early stages, as India has only recently begun taking proactive steps in this regard. For instance, the Ministry of Defence established a 17-member artificial intelligence task force to provide recommendations on AI and AWS. The task force later advocated improving AI to raise the armed forces' operational preparedness, and it identified new applications for such systems, including unmanned aerial vehicles (UAVs), unmanned watercraft, and unmanned tanks (Nair, 2020).

Additionally, as a step forward in this area, India demonstrated offensive drone technology during the Army Day Parade in 2021. Similarly, in locations where nuclear and bioterrorism concerns exist, the country has deployed MUNTRA, an unmanned, remotely operated tank that comes in three variants: surveillance, mine detection, and reconnaissance.

Is there any international law that regulates AWS?

Given the potential cascading impacts of these systems, numerous international bodies, groups, and individuals have been working to develop an international framework to regulate AWS. One starting point is the law of armed conflict, also referred to as "international humanitarian law" (IHL henceforth). It primarily consists of a set of rules that specify the limitations and prohibitions to be observed in armed conflicts, both international and non-international. According to SIPRI, the IHL rules that restrict the development and use of AWS can be divided into three categories. The first and second categories state that: a) any weapon is inherently illegal if it is already prohibited by a particular treaty; and b) any weapon or means of warfare is outlawed if it is designed to inflict unnecessary harm or suffering. The third category covers general rules on use, such as that employing a weapon or AWS would be prohibited when: a) the attack strikes military targets and civilians without discrimination; or b) the attack may be expected to cause incidental civilian casualties or injuries that would be excessive in relation to the anticipated military advantage (Boulanin & Bruun, 2021).

What do we need to do?

As the Future of Life Institute points out, the first step should be a new international law on autonomous weapons. Second, certain autonomous weapons, notably those that target humans and are highly unpredictable or operate beyond meaningful human control, must be prohibited (Future of Life Institute, n.d.). Furthermore, as recommended in the United Nations' 2013 report, member states should declare and implement moratoria on the testing, manufacture, transfer, and deployment of lethal autonomous robotics (LARs) until an internationally agreed-upon framework for LARs is developed (Etzioni & Etzioni, 2017).


References

  1. Etzioni, A., & Etzioni, O. (2017, May-June). Pros and Cons of Autonomous Weapons Systems. Retrieved from Army University Press: https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/May-June-2017/Pros-and-Cons-of-Autonomous-Weapons-Systems/
  2. Future of Life Institute. (n.d.). The push to prohibit Slaughterbots. Retrieved from Future of Life Institute: https://futureoflife.org/project/lethal-autonomous-weapons-systems/
  3. Klare, M. T. (2019, March). Autonomous Weapons Systems and the Laws of War. Retrieved from Arms Control Association: https://www.armscontrol.org/act/2019-03/features/autonomous-weapons-systems-laws-war
  4. Nair, J. (2020, April 09). Lethal Autonomous Weapon Systems – A Challenge to Humanity. Retrieved from Indian Defence Review: http://www.indiandefencereview.com/news/lethal-autonomous-weapon-systems-a-challenge-to-humanity/
  5. Boulanin, V., & Bruun, L. (2021, June). Autonomous Weapon Systems and International Humanitarian Law. Retrieved from SIPRI: https://www.sipri.org/sites/default/files/2021-06/2106_aws_and_ihl_0.pdf


(The views expressed are those of the author and do not represent the views of CESCUBE.)