Autonomous Weapons and the Future of Warfare: Legal and Ethical Implications

Jyoti Bhakta

Vellore Institute of Technology, VIT Chennai

This blog is written by Jyoti Bhakta, a first-year law student at Vellore Institute of Technology, VIT Chennai.

Abstract

Autonomous weapons systems (AWS) represent a paradigm shift in modern warfare, offering unprecedented capabilities while raising profound ethical and legal questions [1]. This article explores the multifaceted implications of AWS on the future of conflict, examining their potential applications, challenges, and societal impact. AWS, defined as weapons capable of selecting and engaging targets without human intervention, are rapidly evolving from semi-autonomous systems to fully independent platforms. Their development is reshaping military strategies, with emerging trends pointing towards unmanned and AI-driven warfare [2]. Future conflicts may involve large-scale deployments of autonomous systems, including drone swarms and AI-powered decision-making tools, potentially leading to combat scenarios that unfold at speeds beyond human reaction times. The integration of AWS into military operations promises increased precision, reduced human casualties, and enhanced operational efficiency [3]. However, these advantages come with significant risks, including vulnerabilities to cyber attacks, unpredictable behavior, and the potential lowering of barriers to conflict initiation. Legally, AWS challenges existing frameworks of international humanitarian law, particularly in areas of accountability, distinction between military and civilian targets, and proportionality in the use of force. Efforts to create new legal structures governing AWS use are ongoing but face obstacles in achieving global consensus. Ethically, the delegation of lethal decision-making to machines raises critical questions about human dignity, moral responsibility, and the fundamental nature of warfare. The removal of human judgment from battlefield decisions could lead to a dehumanization of conflict, with profound implications for military ethics and global security. 
The article also examines the strategic implications of AWS, including its impact on military doctrine, force structure, and the global balance of power. It explores the technological challenges in ensuring the reliability and control of autonomous systems, as well as the potential for breakthrough advancements in AI to further complicate these issues. International perspectives on AWS vary, with major powers investing heavily in their development while smaller nations and non-state actors grapple with the implications of this technological shift.

The role of international organizations and NGOs in shaping the debate and advocating for ethical guidelines is also discussed. Ultimately, the article emphasizes the need for proactive governance and ongoing international dialogue to address the challenges posed by AWS. It calls for balanced approaches that combine regulation, ethical oversight, and technological innovation to ensure that the development and use of autonomous weapons align with international law and humanitarian principles.

Keywords: Autonomous Weapons, Future Warfare, International Humanitarian Law, Human Dignity, Reduced Human Risk.

1. Introduction

Autonomous weapons systems (AWS) are advanced military systems that can operate independently, making decisions and selecting and engaging targets without human intervention [4]. These systems use artificial intelligence (AI) and machine learning algorithms to make critical decisions in battle, raising serious ethical and legal concerns. To put AWS in perspective, they represent a major technological advance over earlier weapons such as landmines and automated defense systems, as well as over remotely operated systems like the U.S. Predator drone. Countries such as the United States, Russia, China, and Israel are investing heavily in AWS research and development today [5], and several systems are already in use, such as Israel's Iron Dome missile defense system [6]. While most current AWS rely on human intervention for critical decisions, the trend is toward fully autonomous systems. This trend is transforming warfare, as battlefield decisions become increasingly reliant on data, automation, and AI, lessening the need for human soldiers. Machine learning algorithms enable AWS to improve and adapt over time as they detect new threats. This machine-driven combat introduces a landscape of rapid warfare that fundamentally transforms military strategy, in which speed and precision are key. AI, robotic, and cyber capabilities will increasingly dominate warfare: unmanned systems such as aerial drones and robotic ground vehicles already perform tasks ranging from surveillance to precision strikes. As further advances unfold, autonomous systems might handle a broader spectrum of operations, including logistics and offensive strikes, cutting down on casualties while enhancing operational efficiency [3]. However, these advancements raise significant concerns about unintended consequences, such as accidental escalation or system malfunctions in high-stakes scenarios.
Future conflicts may involve large-scale deployments of AWS, including drone swarms and autonomous cyber warfare tools, with systems engaging in combat at speeds beyond human reaction time. AI will play a critical role in analyzing vast amounts of data, predicting enemy movements, and optimizing resource allocation. The combination of cyber warfare and AWS also creates new vulnerabilities, since autonomous systems may themselves be targets of cyber attack, complicating future warfare dynamics.

2. Autonomous Weapons in Future Warfare

The future of AWS involves more than aerial drones. Autonomous systems could be deployed across all domains: air, land, sea, and space [7].

The deployment of ground-based robotic soldiers, autonomous submarines, and self-flying drones represents only some of the applications [8]. These systems could conduct surveillance, carry out reconnaissance missions, and engage in direct attack operations on behalf of human operators [9]. Even though fully autonomous weapons are still on the drawing board, there are many indications that the future of autonomous systems will involve coordination with human soldiers and hybrid operations. Drone air support may be provided to infantry units, while autonomy in logistics will relieve the strain on human personnel. The combination of AI with traditional weaponry will amplify combat effectiveness by pairing human judgment with mechanical precision. Swarm tactics employ multiple autonomous systems working in concert toward a common goal. Such a swarm could consist of drones or robotic units that attack an enemy's defenses from multiple angles, overwhelming them. Swarm tactics could change military strategy by adding speed, complexity, and saturation to an attack, making it difficult for human adversaries to respond effectively. These tactics are already being simulated in military exercises and could occupy the heart of warfighting in the future. AWS will also change the tempo of battle. Autonomous systems can sense and act orders of magnitude faster than humans, compressing the gap between decision and engagement from minutes or hours to seconds, with targeting adjusted in real time as the tactical picture evolves.
Increased speed could lead to larger conflicts as well: autonomous systems can function for long stretches without fatigue or rest, opening up extended durations of combat operations.
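To make the saturation idea concrete, the geometry behind a converging swarm can be sketched in a few lines of Python. This is a purely illustrative toy, not a model of any real military system; every function name and parameter here is invented for the example. Units start evenly spaced on a circle and step straight toward a common point, so they all arrive from different directions at once.

```python
import math

def launch_swarm(n_units, radius, target=(0.0, 0.0)):
    """Spread n_units evenly on a circle so they approach from all angles."""
    return [
        (target[0] + radius * math.cos(2 * math.pi * i / n_units),
         target[1] + radius * math.sin(2 * math.pi * i / n_units))
        for i in range(n_units)
    ]

def swarm_step(positions, target, speed=1.0):
    """Move each unit one step of length `speed` straight toward the target."""
    moved = []
    for (x, y) in positions:
        dx, dy = target[0] - x, target[1] - y
        dist = math.hypot(dx, dy)
        if dist <= speed:          # within one step: arrive exactly
            moved.append(target)
        else:
            moved.append((x + speed * dx / dist, y + speed * dy / dist))
    return moved

# Eight units converge simultaneously from eight directions.
swarm = launch_swarm(8, radius=10.0)
for _ in range(12):
    swarm = swarm_step(swarm, (0.0, 0.0))
```

Because every unit closes the same distance per step, a defender faces eight simultaneous arrivals rather than one: the saturation effect the paragraph describes.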

3. Legal Framework

3.1 Existing International Laws and Treaties

International Humanitarian Law (IHL) governs the conduct of war, aiming to reduce civilian harm and requiring combatants to adhere to the principles of distinction and proportionality [10].

The most important instruments in this regard are the Geneva Conventions and the UN Convention on Certain Conventional Weapons (CCW) [11]. But these instruments predate the existence of AWS and therefore contain large gaps, with little or no direct applicability to autonomous systems.

3.2 Challenges in Applying Current Laws to Autonomous Weapons

AWS pose challenges to many elements of IHL [12]. For example, the principle of distinction obliges parties to a conflict to distinguish military from civilian targets. How will AWS make this distinction without direct human input? Similarly, proportionality, which prohibits harm exceeding the anticipated military advantage, is difficult to assess when AWS make targeting decisions themselves. Ensuring accountability for autonomous systems' actions also remains unresolved under current international law.

3.3 Efforts to Create New Legal Frameworks

International organizations and NGOs have called for new laws to regulate AWS. Some suggest that fully autonomous weapons be banned, as chemical and biological weapons are. Others argue for guidelines ensuring meaningful human control over AWS [8]. Discussions have taken place in the UN Group of Governmental Experts on Lethal Autonomous Weapons Systems, but reaching worldwide agreement remains a challenge.

3.4 Potential Need for New Laws Governing AI and Autonomous Systems in War

As AWS enter mainstream use, there will be a growing need for specific laws to govern them. Such laws will have to address the unique challenges posed by AI, including responsibility, transparency, and the limits on machine decision-making. A new legal framework could ensure that autonomous systems are developed and operated within the bounds of IHL while respecting human rights.

4. Ethical Considerations

4.1 Moral Responsibility and Accountability

One of the most prominent ethical issues attached to autonomous weapons systems is accountability: determining who is responsible for their actions [13].

If an autonomous weapon system commits a violation of the laws of war, who is responsible: the programmer of the system, the commander who activated it, or the system itself? AWS challenge the traditional notion of accountability because machines cannot bear moral or legal responsibility for the consequences of their actions. This raises concerns that those deploying such systems could act with impunity.

4.2 Human Dignity and the Value of Human Decision-Making

At the heart of these considerations is whether it is ethically acceptable for machines to make life-and-death decisions. Critics argue that delegating those decisions to machines diminishes human dignity [14], because machines cannot exercise moral reasoning or empathy amid the intricacies of moral judgment on the battlefield. Removing human decision-makers from the act of killing is arguably the most fundamental moral issue [15].

4.3 Potential for Reduced Human Casualties versus Lowering Barriers to Warfare

AWS proponents argue that these systems would save lives by keeping soldiers out of harm's way and greatly improving the precision of military strikes. But there is a fear that AWS could lower the barriers to war: without the need to risk human lives, governments might enter conflicts far more readily than before, including wars that would not seem justified if considerable loss of troops were involved.

4.4 Ethical Implications of Transferring Lethal Decision-Making to Machines

Delegating lethal decisions to machines raises profound ethical questions. Can a machine be entrusted with the discretion to follow ethical rules, particularly in ambiguous circumstances with no unambiguous solution, where emotional intelligence is called for? The use of AWS could lead to dehumanization, in which human life is weighed merely in terms of military efficiency.

5. Military and Strategic Implications

AWS offer benefits on the battlefield, including greater accuracy, reduced human error, and the ability to undertake missions in environments unsafe for human life [16]. AWS can also operate continuously, 24/7 and without rest, supporting sustained operations.

Their speed and autonomy may offer a significant upper hand in high-intensity conflicts, where rapid decision-making is crucial. That said, AWS carry substantial risks. Autonomous systems are susceptible to being remotely hijacked or jammed by hackers, potentially handing these advanced technologies to the enemy. The unpredictability of AI systems compounds the danger: small glitches in programming may produce unacceptable results on the battlefield. Overreliance on AWS may also create strategic weaknesses as militaries become dependent on these systems. The rise of AWS has forced militaries to rethink doctrines, strategies, and force structures [17]. Age-old strategies built around human soldiers will have to make way for autonomous systems; new military units dedicated to autonomous warfare may be established, along with shifts in the prioritization of military asset allocation. AWS might also tilt the global balance of power, as nations with capable AI and robotics would gain military superiority. Such tools could create new forms of deterrence, threatening any potential aggressor with a formidable autonomous arsenal [18]. Yet an autonomous arms race could emerge as countries compete for the most advanced AWS technology.

6. Technological Considerations

6.1 Present Capabilities and Limitations of Autonomous Systems

Research on technologies and systems that can be regarded as largely autonomous shows that AWS have made remarkable progress [19]. However, AWS continue to face considerable limitations. Most current systems rely on pre-programmed algorithms, which greatly hinders their capacity to adapt to unanticipated circumstances. Moreover, complex and dynamic environments, especially in warfare, demand ethical judgment and contextual understanding that AI systems making autonomous decisions still lack.

6.2 The Role of Artificial Intelligence and Machine Learning in Future Weapons Systems

AI and machine learning will play a major role in developing AWS [20]. Machine learning algorithms allow a system to learn and adapt with experience [21].

That is, such systems could improve their performance over time. For instance, an autonomous drone sent on an intelligence-gathering mission might learn to navigate obstacles in a hostile environment and to identify enemy targets more reliably. The biggest challenge is ensuring that these systems remain predictable and reliable, even in high-stakes situations.
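The "improves with experience" idea can be illustrated with a deliberately toy example: a perceptron-style classifier whose weights are updated after every observation, so its decisions sharpen with accumulated data. The two features, the labels, and the data below are entirely hypothetical and bear no relation to any real targeting system.

```python
import random

def train_online(samples, epochs=20, lr=0.1, seed=0):
    """Perceptron-style online learning: adjust weights after every example."""
    random.seed(seed)
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        random.shuffle(samples)              # experience arrives in varying order
        for features, label in samples:      # label: 1 = hostile, 0 = benign
            score = sum(wi * xi for wi, xi in zip(w, features)) + b
            pred = 1 if score > 0 else 0
            error = label - pred             # -1, 0, or +1
            w = [wi + lr * error * xi for wi, xi in zip(w, features)]
            b += lr * error
    return w, b

def accuracy(samples, w, b):
    """Fraction of samples the learned weights classify correctly."""
    correct = 0
    for features, label in samples:
        score = sum(wi * xi for wi, xi in zip(w, features)) + b
        correct += (1 if score > 0 else 0) == label
    return correct / len(samples)

# Invented two-feature readings (speed, heat signature) -> hostile or benign.
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.9), 1),
        ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.3, 0.2), 0)]
w, b = train_online(list(data))
```

The point of the sketch is the update rule, not the data: each mistake nudges the weights, which is the mechanism by which such a system "learns" — and also why its behavior depends on the experience it happens to accumulate, feeding the predictability concerns discussed below.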

6.3 Challenges in Maintaining Reliability, Predictability, and Control

Perhaps the greatest challenge facing AWS is how to make them reliable and predictable on the front line. AI systems are by their very nature opaque, which makes it exceedingly difficult to predict how they will behave in complex environments. Human oversight and control are important for staving off unintended consequences, but maintaining them becomes an extremely challenging technological question as systems grow highly automated.

6.4 Potential Technological Breakthroughs and Their Implications

As AI technology continues to advance, future breakthroughs may allow for greater autonomy, enhanced decision-making, and improved reliability in AWS. Such advances could yield systems capable of independent reasoning, moral judgment, and complex decision-making. There are, however, concerns that AI could grow beyond human control, to the point where machines make decisions that humans cannot fully understand or anticipate.

7. International Perspectives

7.1 Positions of Major World Powers on AWS

Nations have taken different positions on AWS development [22]. The United States and Russia, each seeing AWS as a path to technological and military superiority, are investing heavily in their development [23]. China aims to become the leading force in AI and machine learning, while the EU prefers a more cautious approach, calling on states to restrict the use of fully autonomous lethal systems [24].

7.2 Concerns of Smaller Nations and Non-State Actors

Faced with an AWS race among great powers, smaller nations are left to grapple with a security dilemma. Lacking the resources to keep pace in an AWS arms competition, they risk falling under the technological dominance of foreign powers.

On the other hand, non-state actors, such as terrorist groups, could wage asymmetric warfare using autonomous systems, for instance by deploying relatively cheap drones or by hacking into AWS platforms.

7.3 Role of International Organizations and NGOs

International organizations such as the UN, and NGOs such as Human Rights Watch, are directly involved in shaping and expanding the debate over AWS. Their work generally centers on proposing global norms to safeguard the moral use of autonomous systems, with some urging an outright ban on fully autonomous weapons. One prominent effort is the Campaign to Stop Killer Robots.

8. Societal and Economic Impact

8.1 Probable Changes in the Labour Market and Needs of Military Personnel

The introduction of AWS might radically change roles and responsibilities in the military sector [25]. As automation increases, demand for regular soldiers will fall, and preference will shift toward technical specialists who can design, manage, and maintain autonomous systems under human supervision. People with skills in AI, robotics, and cybersecurity may become highly valued, while traditional military personnel may face redundancy.

8.2 How AWS Would Achieve Society's Approval and Acceptance

Society's perception and acceptance of AWS will bear heavily on the future use of these systems [26]. Public opinion may support the technology if it is argued that it would reduce the risk of soldiers becoming casualties of war; conversely, the very idea of leaving life-or-death decisions to machines is deeply troubling to many. Public support for AWS will depend on transparency, accountability, and oversight consistent with ethical principles.

8.3 Economic Impact of Investment in Autonomous Military Technologies

Investing in AWS could have wide and profound economic implications for nations and their militaries [27], particularly in defense spending allocations. Developing and maintaining advanced autonomous systems will require substantial investment, which could shift priorities across defense budgets. It is also worth noting, however, that AWS could offer offsetting savings through reduced personnel costs and reduced risk to human life.

9. Proposed Solutions and Regulations

9.1 International Bans or Restrictions on Autonomous Weapons

Some propose complete bans on fully autonomous weapons, akin to those on chemical and biological weapons [28]. Proponents of this view argue that the moral dilemmas and unintended consequences of AWS far exceed the gains. Notably, achieving international consensus on such a ban is immensely difficult, given the ambitions of certain states opposed to restricting their use of autonomous systems [19].

9.2 Regulatory Frameworks for the Development and Use of AI in Warfare

Instead of a ban, some experts favor a regulatory framework under which AI could operate in warfare. By laying down rules of conduct ensuring human oversight, accountability, and ethical decision-making in AWS, nations following this model could continue to develop autonomous systems in compliance with international law and ethical norms [29].

9.3 Transparency and Confidence-Building Measures

Transparency is key to building trust among nations regarding the use of AWS [30]. Confidence-building measures could include sharing information about AWS development, joint military exercises, and verification mechanisms to reduce the risk of misunderstanding and escalation. Such measures would bolster international cooperation on AI safety and the responsible use of autonomous technology.

10. Conclusion

AWS are a game-changer poised to revolutionize the future of warfare. Despite the advantages AWS offer in terms of efficiency, precision, and minimized exposure to risk, they pose crucial ethical, legal, and strategic challenges that must be addressed. As AWS continue to progress, these challenges are best met through international cooperation and the establishment of robust legal frameworks, including enforcement mechanisms of the kind long used to address treaty violations, which could benefit nations and corporations alike.

The rapid advancement of the military-industrial sector, especially AWS, underlines the need for a continued conversation among nations on the ethical and legal conception of autonomous warfare. The future of AWS will be driven by the moves of nation-states; the world would benefit from a balanced approach combining regulation, ethical oversight, and development, with the principles underlying technological regulation widely consulted and gradually embraced. Somewhat paradoxically, a radical transformation of the soldier's role may be forthcoming, necessitating stronger international regulation to avert entirely autonomous warfare, since the choice may amount to the difference between a more humane global order and mechanized warfare that turns a blind eye. At the end of the day, a human being still sits in the cockpit: warfare may see increasing autonomy in the future, but machines cannot be held accountable for it.

References

1. Asaro, Peter. "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-making," International Review of the Red Cross 94, no. 886 (2012): 687-709.

2. Scharre, Paul. Army of None: Autonomous Weapons and the Future of War (New York: W.W. Norton & Company, 2018), 3-5.

3. Anderson, Kenneth, and Matthew C. Waxman. "Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can," Stanford University Hoover Institution Research Paper (2013): 1-32.

4. Work, Robert, and Shawn Brimley. "20YY: Preparing for War in the Robotic Age," Center for a New American Security Report (2014): 7-9.

5. Horowitz, Michael C. "The Promise and Peril of Military Applications of Artificial Intelligence," Bulletin of the Atomic Scientists 97, no. 1 (2019): 1-3.

6. United Nations Institute for Disarmament Research, "The Weaponization of Increasingly Autonomous Technologies," UNIDIR Resources no. 2 (2020), https://unidir.org/technology-and-weapons, accessed October 18, 2024.

7. Roff, Heather M. "The Strategic Robot Problem: Lethal Autonomous Weapons in War," Journal of Military Ethics 13, no. 3 (2014): 211-227.

8. NATO Science & Technology Organization, "Autonomous Systems – Issues for Defence Policymakers," STO Technical Report (2023), https://www.sto.nato.int, accessed October 18, 2024.

9. Ilachinski, Andrew. "AI, Robots, and Swarms: Issues, Questions, and Recommended Studies," CNA Analysis & Solutions (2017): 35-40.

10. Heyns, Christof. "Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions," U.N. Doc. A/HRC/23/47 (April 9, 2013): 7-10.

11. Geneva Academy of International Humanitarian Law, "Autonomous Weapon Systems under International Law," Research Paper Series no. 8 (2023), https://www.geneva-academy.ch/research/our-project/detail/22, accessed October 18, 2024.

12. Sharkey, Noel. "The Evitability of Autonomous Robot Warfare," International Review of the Red Cross 94, no. 886 (2012): 787-799.

13. Asaro, Peter. "The Liability Problem for Autonomous Artificial Agents," AAAI Spring Symposium Series (2016): 190-194.

14. IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, "Ethically Aligned Design," Version 2 (2023), https://ethicsinaction.ieee.org, accessed October 18, 2024.

15. Campaign to Stop Killer Robots, "Critical Analysis of Autonomous Weapons Systems," Global Review Report (2023), https://www.stopkillerrobots.org/learn, accessed October 18, 2024.

16. Horowitz, Michael C. "When Speed Kills: Lethal Autonomous Weapon Systems, Deterrence and Stability," Journal of Strategic Studies 33, no. 1 (2019): 5-7.

17. SIPRI Military Expenditure Database, "Global Military Spending on Autonomous Systems," Annual Report (2023), https://www.sipri.org/databases/milex, accessed October 18, 2024.

18. Belfer Center for Science and International Affairs, "The AI-Military Nexus," Strategic Analysis Report (2023), https://www.belfercenter.org/project/artificial-intelligence-and-security, accessed October 18, 2024.

19. Cummings, M.L. "Artificial Intelligence and the Future of Warfare," Chatham House Research Paper (2017): 5-7.

20. Russell, Stuart, and Peter Norvig. Artificial Intelligence: A Modern Approach, 3rd ed. (Upper Saddle River: Prentice Hall, 2010), 27-28.

21. DARPA, "AI Next Campaign Technical Report," Defense Advanced Research Projects Agency (2023), https://www.darpa.mil/work-with-us/ai-next-campaign, accessed October 18, 2024.

22. Human Rights Watch, "Losing Humanity: The Case against Killer Robots," Report (2012): 3-5.

23. United Nations Office for Disarmament Affairs, "Convention on Certain Conventional Weapons Review Conference Report" (2023), https://www.un.org/disarmament, accessed October 18, 2024.

24. Stockholm International Peace Research Institute, "Emerging Military and Security Technologies," Annual Review (2023), https://www.sipri.org/research/emerging-military-and-security-technologies, accessed October 18, 2024.

25. Manyika, James, et al. "Jobs Lost, Jobs Gained: Workforce Transitions in a Time of Automation," McKinsey Global Institute Report (2017): 53-55.

26. World Economic Forum, "The Future of Jobs Report 2024," Global Analysis (2023), https://www.weforum.org/reports/the-future-of-jobs-report-2023, accessed October 18, 2024.

27. Brookings Institution, "AI and Future Warfare: Societal Implications," Policy Brief (2023), https://www.brookings.edu/research/ai-and-future-warfare, accessed October 18, 2024.

28. Scharre, Paul, and Michael C. Horowitz. "An Introduction to Autonomy in Weapon Systems," Center for a New American Security Report (2015): 12-14.

29. Arms Control Association, "Autonomous Weapons Systems: Technical, Military, Legal and Humanitarian Aspects," Policy Analysis (2023), https://www.armscontrol.org/subject/55/date, accessed October 18, 2024.

30. United Nations Institute for Disarmament Research, "The Weaponization of Increasingly Autonomous Technologies: Regulatory Frameworks," UNIDIR Publication (2023), https://unidir.org/publications/autonomous-weapons, accessed October 18, 2024.