
Journal of Law and Society Law College Vol. L, No. 75 University of Peshawar July 2019


KILLER ROBOTS AND THEIR COMPLIANCE WITH THE PRINCIPLES OF LAW OF WAR

Asif Khan1, Muhammad Abid Hussain Shah Jillani2, Maseehullah3

ABSTRACT

Robots are no longer fictional as they used to be years ago. Technological advancements and developments in artificial intelligence (AI) have allowed the innovation of robots that carry out diverse functions. Among these are robots aimed at replacing soldiers on battlefields. Some scholars have argued that these robots are more ethical and clinical than human soldiers. Others have argued that the increasing level of autonomy in these robots, leading to fully autonomous weapons, means they should be banned. They argue that these weapons are unable to differentiate between civilians and combatants and may thus cause unnecessary deaths of civilians. This paper discusses how difficult it is for the law of war (international humanitarian law) to be applied by an algorithm, by showing how killer robots (also known as autonomous weapons systems) cannot comply with basic law of war principles such as distinction, proportionality and precaution. These principles call for unquantifiable decisions that require human-like characteristics which killer robots do not possess. The paper also argues that humanitarian law assigns responsibility to a human agency, making it difficult to determine responsibility in cases involving killer robots. A qualitative research methodology has been applied in this article.

1 Doctoral Student at Zhengzhou University, School of Law, China. LL.B (5 years) AWKUM / Master of Law in International Law, IIUI, Pakistan. ORCID ID https://orcid.org/0000-0002-5059-5976. He is serving as an advocate of the High Court at Peshawar High Court, Peshawar, Pakistan. The author may be contacted at [email protected]. 2 Doctoral Student at Zhengzhou University, School of Law, China. LL.B (Bahauddin Zakariya University, Multan, Pakistan). He is serving as an advocate of the High Court at Punjab High Court, Pakistan. The author may be contacted at [email protected]. 3 Doctoral Student at BIT University, School of Law, China. LL.B (University of Peshawar) / LLM International Law (International Islamic University, Islamabad, Pakistan). He is serving as an advocate of the High Court at Peshawar High Court, Peshawar, Pakistan. The author may be contacted at [email protected].

KEYWORDS: Killer Robots, Geneva Convention, Artificial Intelligence

INTRODUCTION

Killer robots (also known as Autonomous Weapons Systems (AWS)) can be described as weapons that have emerged in recent years and are being developed at an increasing pace using artificial intelligence and technologies associated with robotics (Aoun, 2017). These weapons operate by their ability to select and attack targets without human intervention. Many countries are in the process of developing these weapons for use in future armed conflicts. This paper aims to analyze the legal implications associated with the development and use of these weapons in relation to international humanitarian law.

The history of the manufacture and use of autonomous weapons can be traced back to the First and Second World Wars. Since then, weapons have been developed with capabilities to perform different tasks with less control from human beings. A good example is automated air defence systems, which can fire at targets without human control. It is beyond reasonable doubt that nearly all war-related weapons now have autonomous versions, and improvements are being made day by day to make them more lethal (Department of The U.S. Army, 2010). The weapons are there already, and more are being made. The questions are thus what they can do, in what missions, and what implications will arise from their use.

The application of military force in war is always governed by international humanitarian law, which ascertains the ethical and humanitarian acceptability of how wars are fought. International humanitarian law thus takes into account human conduct as well as weaponry, and can condemn the use of certain weapons and war tactics (Aoun, 2017). The use of killer robots, which are incapable of making sound decisions on the life or death of human beings, has raised legal questions that necessitate intervention by international humanitarian law, because of the hostilities associated with the weapons as well as the fundamental ethical and legal issues in the development and deployment of these systems (Bhuta, 2016).

A qualitative research approach has been deployed, focusing mainly on publications related to killer robots and books written by distinguished writers, scholars of international law and lawyers. The analysis draws, for instance, on Professor Sharkey, a scholar of AI and robotics; Peter Asaro, a philosopher of technology, AI and robotics; and Michael N. Schmitt, professor at the United States Naval War College.

KILLER ROBOTS

The gradually increasing autonomy in modern weapons has raised serious concerns, as discussed above. As the description of autonomy goes, killer robots or autonomous weapons systems (AWS) are expected to carry out duties without human supervision or control. The use of these weapons poses a very fundamental threat regarding the predictability of how well they can operate within the requirements of the law (Busuttil, 1998). The purpose and importance of this research are to unfold the new trend in the development of autonomous weapons, such as unmanned ground and aerial vehicles, about which information remains limited (Aoun, 2017). The paper also aims to present the challenges killer robots face in complying with the law of war and to explain how compliance can be achieved, so as to avoid the extreme consequences that may result from autonomy. Minor themes, such as who should be held responsible in case of mishaps by killer robots, will also be touched on. Since the scope of the law of war covers war-related crimes such as breaches of its principles and the indiscriminate killing of civilians, there should be a clear description of, and laws governing, the manufacture and use of killer robots, and the paper will discuss this too (Sharkey, 2010).

There are many existing works on the topic. Among these is research by the Geneva Academy of International Humanitarian Law and Human Rights. This research dates back to 2014 and aimed to unfold the legal and policy-related implications of the topic for the relevant authorities. The research therefore looked deep into the legality of killer robots in light of the law of war (Roach, 1984). It examined the numerous advances being made and the doubts about their ethical compliance, which raise societal concerns with regard to international humanitarian law. The research concluded that there were several international legal implications associated with the use and development of killer robots.

Another body of work related to this paper consists of reports from debates by expert committees on autonomous weapons. Experts convened in Switzerland in March 2014 to deliberate on the ethical, technical, military and humanitarian concerns the issue had raised internationally. Another expert meeting was held under UN frameworks in 2014 (UNCCW, 2001). Further meetings were conducted by the experts in 2015 and 2016. These discussions aimed at a better understanding of the issues raised concerning the manufacture and use of autonomous weapons, so as to be able to advise the different governments. The meetings looked deep into what characterizes autonomous weapons, their functionality, and their ethical requirements. The reports indicated that there were challenges associated with the autonomy of weapons, and advised that a committee be formed under IHL to come up with approaches to address those challenges.

Erika Steinholt Mortensen also conducted very detailed research on the topic in 2016. The research sought to unfold how the law of war can manage and control the making and use of killer robots in the event of armed conflict. It recommended that there should be a meaningful degree of human control over autonomous weapons to ensure compliance with the established laws. If technology advances to the point where killer robots can comply with the law of war autonomously, the author concludes, human control can then be withdrawn (Leveringhaus, 2016).

A review of the above works shows that the manufacture and use of killer robots remains a burning issue in international law and in other legal instruments related to weapons. Despite governments continuously increasing funding for research and development of autonomous weapons, the systems are deviating further from the law. This paper aims to give a general overview of autonomous weapons, to explain the main challenges they pose to IHL principles and, at the end, to make recommendations for filling the gap, for example what States Parties can do to solve the problem and either ban these autonomous weapons or bring them under the close control of human beings.

The definition of “autonomous”, according to the Oxford dictionary, is “having the freedom to govern itself or control its own affairs”. In military use, the term was first used by US defence officials during research on autonomous weapons (Pocar et al., 2013). They defined a killer robot or AWS as a system that, once activated by a human, is capable of selecting and engaging targets without any further human intervention. Another definition is given by Human Rights Watch (HRW), which classifies autonomy into three classes, namely human-out-of-the-loop, human-on-the-loop and human-in-the-loop (Bhuta, 2016). In HRW's terms, out-of-the-loop weapons, the class of greatest present concern, are those robotic weapons systems that are “capable of selecting targets and delivering force without any human input or interaction”, whereas on-the-loop weapons are those that can “select targets and deliver force under the oversight of a human operator who can override the robots' actions”.

Both of these can be called robotic weapons systems in which human control is so restricted that the weapon can, in effect, be labelled “human out of the loop”. At present, military personnel are using various kinds of weapons with specific capacities for defending positions and attacking targets. Some existing defensive arms, for instance, include autonomous features for intercepting incoming missiles, rockets, artillery shells and cluster munitions.
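HRW's three-way classification can be sketched, purely for illustration, as a small taxonomy in code. The class names follow HRW's terminology; the helper function and its name are hypothetical and correspond to no real control interface:

```python
# Illustrative only: HRW's three classes of weapon autonomy expressed as a
# simple Python taxonomy. "requires_human_command" is a hypothetical helper
# for discussion, not any real weapons-control API.
from enum import Enum


class Autonomy(Enum):
    HUMAN_IN_THE_LOOP = "human selects targets and commands each attack"
    HUMAN_ON_THE_LOOP = "weapon selects and attacks under human oversight with override"
    HUMAN_OUT_OF_THE_LOOP = "weapon selects and attacks with no human input"


def requires_human_command(level: Autonomy) -> bool:
    # Only the in-the-loop class makes every attack contingent on a fresh
    # human decision; the other two classes delegate that decision.
    return level is Autonomy.HUMAN_IN_THE_LOOP


print(requires_human_command(Autonomy.HUMAN_OUT_OF_THE_LOOP))  # False
```

The point of the taxonomy, as the paper notes, is that the on-the-loop and out-of-the-loop classes differ only in whether a human *may* intervene, not in whether one *must*.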

According to a report by a UN Special Rapporteur, killer robots are weapons that, once activated, are able to choose whom to attack and to conduct the attack without being directed by anyone. Under the UN framework, killer robots have been discussed since 2013 with a view to regulating them, as they are classified as dangerous or excessively injurious weapons. Governmental experts were then chosen to form a Group of Governmental Experts (GGE) (Solis, 2016). The group is mandated to act legally against killer robots. Many countries have, however, not joined this effort by the UN, as they feel they have not yet advanced to the point of using autonomous weapons and are thus not ready for the discussion. IHL remains in crisis over how to monitor these weapons, given that not all countries have contributed to its efforts to manage and control them, despite some of these countries being classified as important manufacturers and developers of these weapons (Liu, 2012).

This raises concerns about how compliant these weapons are with IHL. The following part of this paper looks at the challenges facing autonomous weapons concerning their compliance with IHL.

KILLER ROBOTS AND THEIR CURRENT CHALLENGES TO LAW OF WAR

There are several reasons why killer robots find it difficult to comply with the principles of the law of war. These principles aim to protect civilians and those who are unable to fight or are not directly involved in the war. For instance, children are protected from being hurt during war by these principles because they are not able to fight (Nehal et al., 2016). The principles also protect neutral parties who aim to help during the war without taking part in it, such as members of the Red Cross and other well-wishers who may be willing to help treat the injured, among other activities. These principles include proportionality, distinction, military necessity, and the general precautions which should be observed during armed conflict to make sure innocent people are not targeted. The rules of the law of war also govern and control the manufacture and use of any weapons that, when used, may cause superfluous injuries. This follows Article 36 of Additional Protocol I to the Geneva Conventions, adopted in 1977, after the Second World War had seen the bombing of Hiroshima and Nagasaki with a weapon classified among those causing unnecessary injuries.

Principle of Distinction and Killer Robots

The article therefore requires countries to assess the legality of new weapons. As new technological advancements outpace most developments in the world, the law of war has had a challenge in assessing the legality of the new technological weapons being developed day by day. These are weapons based on different development programs and built using artificial intelligence. The weapons have had several compliance issues with the law of war, among them the inability to comply with the principle of distinction and the principle of proportionality. According to Article 36, States parties are required to comply with these principles before they consider developing or deploying any weapon (Kastan, 2013). States parties must make sure that they do not develop weapons that are not allowed under the principles of the law of war.

According to the first rule of customary international humanitarian law, “the parties to the conflict must at all times distinguish between civilians and combatants. Attacks may only be directed against combatants. Attacks must not be directed against civilians.” (Customary IHL, Rule 1). The rule was set out for the first time in the St. Petersburg Declaration, which states that “the only legitimate object which States should endeavour to accomplish during war is to weaken the military forces of the enemy” (St. Petersburg Declaration, 1868). It was later restated in the major conventions and in the Additional Protocols to the Geneva Conventions (AP-I & AP-II). The principle of distinction, a core principle of the law of war, requires that parties to an armed conflict “distinguish at all times between civilians and combatants, as well as between military objectives and civilian objects; their operations shall therefore only be directed against military targets”. This means prohibiting indiscriminate attacks and the use of indiscriminate battle tactics.

Currently, the most significant concern with the autonomous weapons being developed, and those that may come later, is how they will be able to comply with this principle, because most are being developed for mass destruction (Marchant et al., 2011). This shortcoming threatens IHL's legal authority to protect innocent civilians. It cannot be ascertained that autonomous weapons are able to follow IHL's requirements.

In modern conflicts, combatants do not necessarily identify themselves by uniforms or bear any other form of identification. Soldiers must therefore identify combatants by assessing their behaviour in relation to the situation, so as to associate them with acts that make them combatants (Carnahan, 1998). It is very difficult at present to explain how killer robots would assess a person's behaviour or emotional state in order to determine who is a combatant and who is not. This means that they may be unable to distinguish between combatants and civilians, and may thus target or attack innocent people. Robots would need human-like characteristics to differentiate between civilians and combatants. Programming such robots is not easy, and this makes killer robots' ability to comply with this IHL principle questionable (Crootof, 2018).
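To make the difficulty concrete, the distinction problem can be caricatured as a classification function. Everything in this sketch is invented for illustration: the function, its inputs, the weights and the 0.9 threshold correspond to no real perception system, and a real model would face vastly noisier inputs:

```python
# Hypothetical sketch: the distinction principle recast as a classification
# problem. "classify_person" stands in for any perception model.
def classify_person(observed_behaviour: dict) -> float:
    """Return an invented probability that the observed person is a combatant."""
    score = 0.0
    if observed_behaviour.get("carrying_weapon"):
        score += 0.6  # arbitrary weight, chosen by a programmer in advance
    if observed_behaviour.get("wearing_uniform"):
        score += 0.3  # likewise arbitrary
    return min(score, 1.0)


# The legal problem: IHL gives no threshold at which a probability may be
# treated as certainty, and in case of doubt a person must be presumed
# civilian. Any cut-off hard-coded here is a programmer's guess, made long
# before the situation it will decide.
p = classify_person({"carrying_weapon": True, "wearing_uniform": False})
print(p >= 0.9)  # False: under this strict invented threshold, no attack
```

The sketch shows why the paper's objection is structural rather than an engineering detail: even a perfect classifier outputs a probability, while the law demands a categorical human judgment.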

Principle of Proportionality and Killer Robots

It is a requirement of IHL that attacks do not cause unnecessary loss of civilian life. State parties should therefore ensure they avoid causing death or suffering to civilians during armed attacks. Any loss of civilian life should be incidental and in proportion to the anticipated goals of the military. The rule of proportionality is contained within these requirements. Article 57 of Additional Protocol I insists on this, prohibiting “an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated”. The US Air Force, in its aim to observe proportionality, holds that applying the principle of proportionality in an attack necessarily requires a judgment to be made case by case (Carty, 2017). This means that a survey and assessment of proportionality must be done before an attack is carried out.

The development and use of killer robots raise issues with complying with this principle, both in programming the weapons and in deploying them for attacks. It is difficult to program robots that can assess every essential situation, as there is an infinite number of situations. Proportionality of attacks, a very important principle in the law of war, depends fully on human judgment and the ability to make the right decisions. It is not clear how able killer robots are to make correct judgments concerning proportionality in attacks (Carpenter, 2016).

A good example is a situation where militants are assumed dead after they fail to set off red smoke or some other agreed signal at the agreed time. To do away with the combatants, an autonomous jet may be sent to bomb the place. The militants may, however, manage to set the signal after the jet has been released. The autonomous jet will nevertheless proceed and bomb the militants, who would not have died if someone had been controlling the jet, as he would have seen the signal and aborted the mission. A killer robot is unable to adjust to constantly changing situations, which must be considered to ensure proportionality (Cassese et al., 2011). Similarly, a killer robot may be sent into a house believed to be housing combatants, for instance terrorists, with instructions to shoot to kill anyone it detects. Unknown to the launchers, there might be innocent people inside, kidnapped by the terrorists. Because of the inability of the autonomous weapon to assess the situation and differentiate between the terrorists and their hostages, the innocent kidnapped people are likely to be shot and killed, increasing the number of people killed innocently during the mission and rendering it disproportionate. I would insist that the lack of human judgment remains the greatest disadvantage of killer robots in complying with the IHL principle of proportionality. To comply with this principle, States must address this lack of human judgment before developing or deploying such weapons, for example by designing all killer robots as human-in-the-loop weapons.
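The argument that proportionality resists encoding can be illustrated with a deliberately naive sketch. What a machine-executable proportionality rule would have to look like is trivial to write down; the difficulty, as the paper argues, is that both of its inputs are unquantifiable value judgments. Every name and number below is invented and stands in for no real system:

```python
# Hypothetical sketch (not a real weapons API): what a machine-executable
# proportionality rule under AP I Article 57 would have to look like.
from dataclasses import dataclass


@dataclass
class StrikeAssessment:
    expected_civilian_harm: float  # no agreed metric exists for this value
    military_advantage: float      # "concrete and direct" advantage is a human judgment


def proportionate(strike: StrikeAssessment) -> bool:
    # The rule itself reduces to a one-line comparison; the legal difficulty
    # is that both operands are value judgments, so any numbers fed in here
    # merely encode a programmer's opinion formed long before the attack.
    return strike.expected_civilian_harm <= strike.military_advantage


print(proportionate(StrikeAssessment(expected_civilian_harm=2.0,
                                     military_advantage=5.0)))  # True, under invented numbers
```

In other words, the comparison is computable but its operands are not: the sketch shows where the human judgment the paper insists on would have to be smuggled in.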

Principle of Precaution and Killer Robots

This principle requires various precautions to be taken before and during an attack, to ensure there is no unnecessary damage to property or loss of life, especially among civilians or parties with no interest in the conflict. The principle of precaution was first laid down in Article 2(3) of Hague Convention (IX) of 1907, which provides that when military necessity requires immediate action against military or naval objectives within an undefended town and no delay can be allowed to the enemy, the commander shall take all due measures so that the town suffers as little harm as possible; it is now stated more clearly in Article 57(1) of Additional Protocol I (Regulations, 1907). The convention raised the concern that collateral damage during armed conflicts needed to be avoided. Under the requirements of IHL, militaries must carry out their objectives and identify targets located far away or in populated areas with great precaution. In situations of uncertainty, great care is required to avoid damage (UNCCW, 1980). An excellent example is a situation where combatants are near important property such as oil pipelines, tanks or planes; attacks in such situations must be carried out with great precaution. The inability of killer robots to act with precaution in attacks limits their compliance with this principle: in complex situations, these machines are unable to take the necessary precautions. Under this principle, States Parties must, before developing a killer robot, ensure that such weapons remain under human control or that they can take precautions before entering the war and when the situation changes during it.

Principle of Military Necessity and Killer Robots

IHL aims to ensure that the conflicting parties do not act in ways that international law cannot justify. This principle can be traced back to Francis Lieber's code of 1863 (Fleck, 2013). The code compelled parties engaged in a war to measure what is lawful within the laws of war and to act within it. Objectives carried out by militaries should remain within legal and legitimate military objectives. Militaries should therefore avoid the use of force that goes beyond the goals they want to achieve (ICRC, 2014). A good example of such force is the atomic bombs used in bombing Hiroshima and Nagasaki: their effects persist even today, going far beyond the military goals the mission aimed to achieve. IHL aims to stop such actions through the principle of military necessity (Ferreira et al., 2015). Attacks should only be carried out where necessary and within the allowable use of force. If a legal force wants to kill a particular criminal or terrorist in some confinement, it is illegal to use weapons that may end up damaging nearby buildings or harming other people around the place. Killer robots cannot assess the outcomes of an attack in advance and are therefore, in most cases, unable to comply with this principle.

ARTICLE 36 OF ADDITIONAL PROTOCOL-I OF GENEVA CONVENTION AND KILLER ROBOTS

The drafters of the Geneva Additional Protocols came up with a provision that guides the development of weapons, intended to regulate the methods and tactics used in future wars and ensure compliance with the laws. This resulted in the formation of Part III of AP I, whose Article 36 provides: “In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party” (Pilloud et al., 1987). This article is meant to prevent the development and deployment of weapons that are not in line with the law of nations. Countries, however, assume that the law of nations permits the creation of weapons systems, while forgetting this additional protocol, which regulates what weapons are made by requiring compliance with international humanitarian law (Kelleher & Dombrowski, 2015).

The issue of responsibility also raises questions about killer robots' compliance with the law of war. It is unavoidable that killer robots will, in one way or another, threaten or kill civilians or other parties not involved in the conflict. During wars, if mishaps or violations of the laws of war happen, someone should be held accountable. When someone is held responsible, the victims feel cared for, and this also helps prevent a future recurrence of the same or related errors. However, it is not clear who is to be held liable for mishaps when a killer robot is used (Krishnan, 2009). Many people are involved in operations involving killer robots: the commander, producer, manufacturers, programmers and even the commissioners and operators of the machine. None of these people can currently be held responsible, because as yet there are no legal bases regarding autonomous machines on which to hold anyone responsible.

RESPONSIBILITY

There are three types of responsibility to be accounted for: individual, state and command responsibility. State responsibility can be traced back to the Peace of Westphalia of 1648 and was further stressed in the doctrine of fundamental international laws. When a state uses an autonomous weapon, it can be held responsible for the breaches of international law that the machine may commit. Article 8 of the Draft Articles on Responsibility of States for Internationally Wrongful Acts holds that the “conduct of a person or group of persons shall be considered an act of a State under international law if the person or group of persons is acting on the instructions of, or under the direction or control of, that State in carrying out the conduct” (Marchuk, 2015). The second part of the provision's commentary says that “Such conduct will be attributable to the State only if it directed or controlled the specific operation and the conduct complained of was an integral part of the operations” (Marchuk, 2015). This means that people will be held accountable for personal mistakes, while the state takes responsibility independently. Because states run programs, such as training, that include instruction in international law, they can, as things stand, be held responsible for errors of war that breach international law.

Under personal accountability, Article 36 requires a state to assess the legality of a weapon to be used in an attack and its adherence to international rules through all stages of the attack. The article was interpreted during the Third Committee of the Diplomatic Conference to mean that “It should be noted that article 36 is intended to require States to analyze whether the employment of a weapon for its normal or expected use would be prohibited under some or all circumstances. A State is not required to foresee or analyse all possible misuses of a weapon, for almost any weapon can be misused in ways that would be prohibited.” (Levie, 1980). States, however, can hold their own troops responsible. If a pilot, for instance, miscalculates a distance and ends up bombing the wrong place, it is the pilot's responsibility to be accountable, even though the country will be called on under IHL for explanations (Pocar et al., 2013).

Recent advancements in the law of war now hold different people responsible for crimes committed during armed conflicts. Those who can be held individually responsible for a killer robot's mistakes include the commander, the programmer, the manufacturer or corporation, or even the robot itself (Schmitt, 2011).

Commanders will be held responsible because they are responsible for giving directions to their subordinates during an attack; the failure of the subordinates can therefore be attributed to wrong instructions from the commanders. Programmers can be held responsible if the right commands were given but the machine was programmed so poorly that it could not obey them, or obeyed them wrongly. Corporations can be held accountable for the poor manufacture or development of a robot. Yet even if the commander gives the right commands at the right time, the programming is correct and the manufacture is sound, breaches of international law may still happen. For instance, a robot may misjudge a situation, leading to a breach of the principles of international law discussed in this paper. In such a case, it will not be easy to determine who should be blamed, because all the parties played their parts well, but the robot failed, lacking the human characteristics required to make the right quick judgments. This raises the question of who should be held accountable in such instances. It would be wrong to hold the state responsible if it had assessed the situation and determined that the killer robots would work perfectly in the mission (Solis, 2016).

RECOMMENDATIONS

After exploring the nature and range of influences that affect the development of killer robots, this research has uncovered several insights about these weapons: their abilities, where they may be used on battlefields, and the scale of their use. Bearing these insights in mind, the research recommends the following.

Killer robot development should be limited so that the weapons remain within the loop of human control. It is very difficult for an AWS, or killer robot, to make correct judgments across different circumstances and situations; remaining under human control will help these machines obey law of war principles.


Human intervention should be possible when the weapons are selecting or engaging targets, because wrong targeting or engagement cannot be reversed once it happens.

States should draw up principles and laws to govern their weapons before deploying them to battlefields, stating clearly who can be held responsible under the law of war in case mishaps happen.

The bodies that administer the law of war should be aware of what weapons are being developed by which country, and should ban lethal weapons that are considered superfluously injurious (Anderson & Waxman, 2013).

States developing killer robots need to hold talks, draw lines that must not be crossed, and agree on guidelines for these weapons (Anderson & Waxman, 2013).

The international community needs to stay ahead of technology so that it can impose bans in time to prevent the development of weapons it considers dangerous.

CONCLUSION

In the wake of killer robots, many countries are now striving to develop fully autonomous weapons. This, however, raises three main concerns: first, how these killer robots will be controlled and contacted while thousands of miles away; second, how these machines will comply with basic law of war principles; and lastly, who will be accountable for breaches or mishaps caused by these robots (Fry, 2013).

After analysing different articles, it can be concluded that complete autonomy will take humans out of the loop. States should therefore consider banning these weapons, or ensure they are brought within the confines of the international law that protects civilians and parties not taking part in armed conflicts (Sharkey, 2010). Weapons deployed into wars must be able to stay and act within the requirements of law of war principles. Unfortunately, the research in this paper established that it is very difficult for killer robots to act within law of war requirements. It is also not clear who will be held accountable whenever these killer robots cause violations during armed conflicts.

NOTES & REFERENCES

Anderson, K., & Waxman, M. C. (2013). Law and ethics for autonomous weapon systems: Why a ban won't work and how the laws of war can. Retrieved from https://scholarship.law.columbia.edu/cgi/viewcontent.cgi?article=2804&context=faculty_scholarship

Aoun, J. (2018). Robot-proof: higher education in the age of artificial intelligence. Cambridge, MA: The MIT Press.

Bhuta, N. (Ed.). (2016). Autonomous weapons systems: law, ethics, policy. Cambridge University Press.

Busuttil, J. J. (1998). Naval Weapons Systems and the Contemporary Law of War. Oxford University Press.

Carnahan, B. M. (1998). Lincoln, Lieber and the laws of war: the origins and limits of the principle of military necessity. Am. J. Int'l L., 92, 213.

Carpenter, J. (2016). Culture and human-robot interaction in militarized spaces: A war story. Routledge.

Carty, A. (2017). Philosophy of international law. Edinburgh: Edinburgh University Press.

Cassese, A., Acquaviva, G., Fan, M., & Whiting, A. (2011). International criminal law: cases and commentary. Oxford University Press.

Crootof, R. (2018). Autonomous Weapon Systems and the Limits of Analogy. Harv. Nat'l Sec. J., 9, 51.

Customary international humanitarian law. Retrieved from https://ihl-databases.icrc.org/customary-ihl/eng/docs/v1_cha_chapter1_rule1

Department of the U.S. Army. (2010). U.S. Army Weapons Systems 2010. Arlington, VA: OASA (ALT). Retrieved from https://fas.org/man/dod-101/sys/land/wsh2010/wsh2010.pdf

Ferreira, M. I. A., Sequeira, J. S., Tokhi, M. O., Kadar, E., & Virk, G. S. (2015). A world with robots. In International conference on robot ethics: ICRE (Vol. 84).

Fleck, D., & Bothe, M. (2013). The handbook of international humanitarian law. Oxford, United Kingdom: Oxford University Press.


Fry, J. D. (2013). The XM25 Individual Semi-Automatic Airburst Weapon System and International Law: Landing on the Wrong Planet. UNSWLJ, 36, 682.

ICRC. (2014). Autonomous Weapon Systems - Q & A. Retrieved from https://www.icrc.org/en/document/autonomous-weapon-systems-challenge-human-control-over-use-force

Kastan, B. (2013). Autonomous weapons systems: a coming legal singularity. U. Ill. JL Tech. & Pol'y, 45.

Kelleher, C. M., & Dombrowski, P. (Eds.). (2015). Regional Missile Defense from a Global Perspective. Stanford University Press.

Krishnan, A. (2009). Killer robots: legality and ethicality of autonomous weapons. Ashgate Publishing, Ltd.

Leveringhaus, A. (2016). Ethics and autonomous weapons. Springer.

Liu, H. Y. (2012). Categorization and legality of autonomous and remote weapons systems. Int'l Rev. Red Cross, 94, 627.

Marchant, G. E., Allenby, B., Arkin, R., & Barrett, E. T. (2011). International governance of autonomous military robots. Colum. Sci. & Tech. L. Rev., 12, 272.

Marchuk, I. (2015). Fundamental Concept of Crime in International Criminal Law. Springer.

Nehal, B., Beck, S., Geiss, R., Liu, H. Y., & Kress, K. (2016). Autonomous Weapons Systems: Law, Ethics, Policy. Cambridge: Cambridge University Press.

Pilloud, C., Sandoz, Y., Swinarski, C., & Zimmermann, B. (Eds.). (1987). Commentary on the additional protocols: of 8 June 1977 to the Geneva Conventions of 12 August 1949. Martinus Nijhoff Publishers.

Pocar, F., Pedrazzi, M., & Frulli, M. (Eds.). (2013). War crimes and the conduct of hostilities: challenges to adjudication and investigation. Edward Elgar Publishing.


Hague Regulations. (1907). Convention (IV) respecting the Laws and Customs of War on Land and its Annex: Regulations concerning the Laws and Customs of War on Land. Geneva: International Committee of the Red Cross.


Report to the Third Committee on the Work of the Working Group, Committee III, Doc No CDDH/III/293, in Levie, Howard S. (1980). Protection of War Victims: Protocol I to the 1949 Geneva Conventions. Oceana Publications.

Roach, J. A. (1984). Protection of War Victims: Protocol I to the 1949 Geneva Conventions. Vol. IV. By Howard S. Levie. Dobbs Ferry: Oceana Publications, Inc., 1981. Pp. xiii, 535. Index. $37.50. American Journal of International Law, 78(2), 540-541.

Schmitt, M. N. (2011). Essays on law and war at the fault lines. Springer Science & Business Media.

Sharkey, N. (2010). Saying 'no!' to lethal autonomous targeting. Journal of Military Ethics, 9(4), 369-383.

Solis, G. D. (2016). The law of armed conflict: international humanitarian law in war. Cambridge University Press.

St Petersburg Declaration. (1868). First International Agreement Prohibiting the Use of Certain Weapons. Retrieved from https://hhr-atlas.ieg-mainz.de/articles/jevglevskaja-st_petersburg

The UN. (10 October 1980). Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects. Geneva. Retrieved April 23, 2020, from https://www.un.org/en/genocideprevention/documents/atrocity-crimes/Doc.38_CCW.pdf