AI in warfare: Loitering Munitions – Current Applications and Legal Challenges

  26 February 2024

By: Francesco Ancona – Junior Researcher G.E.O, Area Difesa&Sicurezza

Abstract

With the rapid evolution of artificial intelligence and machine learning, most states are prioritizing the development of new and improved autonomous systems, intended to carry out a variety of missions with faster reaction times, reduced costs, and less manpower. Modern warfare is becoming increasingly autonomous, with several functions, including combat, now delegated to ever more complex programs and systems. The purpose of this article is to analyze the military applications of AI-enabled weapon systems, specifically loitering munitions: their current capabilities, their employment in the Nagorno-Karabakh conflict (2020) and in the more recent Russia-Ukraine war (2022), as well as the legal and ethical challenges that current and future systems will face. The article combines a security perspective on the current applications and future trends of different autonomous systems with a legal and ethical approach aimed at discerning the main challenges.

Introduction

Throughout history, war has always been conducted primarily as an act of friction by and between men, “a physical contest between people, each using force to compel our enemy to do our will” (Clausewitz, 1832). In this contest, technology has played a key role in warfare, often re-shaping how combat is conducted through its successful doctrinal employment. Indeed, technological breakthroughs such as the invention of firearms and trains, the development of mechanized, and later armored, vehicles, as well as combat aircraft, and finally the creation of nuclear weapons, have each shifted not only the dynamics of warfare tactics, but also the strategy of how wars should be waged (Fuller, 1961; Currie, 2022). In the last couple of decades, technological achievements in the field of Artificial Intelligence (AI) have allowed newer systems to be increasingly integrated into many aspects of our daily lives, with current uses in the domain of information and communication technologies (navigation, social media algorithms, etc.), industry (process automation and optimization), marketing and sales, and even healthcare (Burke, 2022; Limbasiya, 2023). AI systems have also been making their way into the military-industrial complex, both as platforms providing supporting functions, such as intelligence, surveillance, navigation, and enhanced Command and Control (C2) capabilities (Lingel et al., 2020; Smith, Savage, 2023), and as platforms assisting with the complex tasks of identifying and selecting targets and carrying out strikes. This latter function of AI, its current applications, and the challenges such capabilities pose to International Humanitarian Law will be the subject of this paper.
However, before exploring their military capabilities, some clarification is needed as to what Artificial Intelligence is, what differentiates autonomous systems from remote-controlled platforms, and how humans and machines interact with each other.

Different levels of autonomy and current applications of AI in warfare

There are many different definitions of AI. At its simplest, AI can be defined as a system, program, or machine capable of quickly performing different complex tasks with human-like intelligence (Laskowski, 2023; Council of Europe, 2022; IBM, 2023). Because of the widespread reach and possible applications (and implications) of AI technologies across all domains, it is no surprise that most States’ militaries have implemented their own AI strategies to leverage the technology’s inherent advantages, such as faster reaction times, reduced costs, and better defense against cyber-threats (UK Ministry of Defence, 2022; US DoD, 2023). In terms of integration with weapon systems, a distinction should be made between two different categories: a) “automated”; b) “autonomous”. According to the International Committee of the Red Cross (ICRC), one of the leading entities in this field, these differ greatly from one another in terms of level of autonomy, functionalities (i.e. the tasks they can perform, as well as their complexity), and, most importantly, their level of human control or supervision (ICRC, 2014). Automated systems can be defined as weapons that are “not remotely controlled, but function in a self-contained and independent manner once deployed”. Automated sentry guns, sensor-fused munitions and certain anti-vehicle landmines fall into this category (Kellenberger, 2011). Based on this definition, some military bodies have argued that Unmanned Air Systems (UAS), such as drones, should be considered neither fully “automated”, as they can be remotely piloted, nor fully remote-controlled, as functions such as navigation, take-off and landing can be “automated” (House of Commons, 2014; UK MoD, 2017).
The UK MoD attempted to provide an all-encompassing definition of automated systems, i.e. those that “in response to inputs from one or more sensors, are programmed to logically follow a pre-defined set of rules in order to provide a predictable outcome” (Ibid; House of Lords, 2023).

On the other side of the coin, there are “autonomous” weapon systems, which in the future could be integrated with more advanced forms of AI. Although no single agreed definition exists, many military manuals and international bodies seem to agree on some core features. For example, the ICRC defines an autonomous weapon system as “a weapon that can select and apply force to targets without human intervention” (ICRC, 2015; ICRC, 2022); the US DoD defines them as “weapon systems that, once activated, can select and engage targets without further intervention by a human operator” (US DoD, 2023), while the UK MoD provides the following definition: “systems capable of understanding higher level intent and direction; from this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state; it is capable of deciding a course of action, from a number of alternatives” (UK MoD, 2017). Thus, the common features of autonomous weapon systems that can be discerned from these definitions are: a) such systems are able to perform several functions; b) they can operate without the need for human supervision and/or approval; c) (at least in the future) they are able to dynamically adapt to changes occurring in their surrounding environment.

Some military departments have also provided a further categorization of autonomous weapon systems based on their complexity. The U.S. DoD, for example, identifies three major categories of autonomous weapon systems, dividing them according to their level of automation and control: 1) “autonomous weapon system” (as defined above); 2) “human-supervised autonomous weapon system”: designed to provide human operators with the ability to intervene and terminate engagements; 3) “semi-autonomous weapon system”: once activated, intended to engage only individual targets or specific target groups that have been selected by a human operator (US DoD, 2023). Finally, autonomous weapon systems can also be subdivided into three different categories depending on their level of human-machine integration, control and supervision: 1) “Human-in-the-Loop Weapons”: systems that can select targets and deliver force only with a human command; 2) “Human-on-the-Loop Weapons”: systems that can select targets and deliver force under the oversight of a human operator who can override their actions; and 3) “Human-out-of-the-Loop Weapons”: systems that are capable of selecting targets and delivering force without any human input or interaction (Human Rights Watch, 2012; US Air Force, 2010; ICRC, 2014).
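The three human-machine control models above can be sketched, purely as an illustration, as a minimal decision rule. The enum and function names below are hypothetical, invented for this example, and are not drawn from any cited doctrine; real systems encode far richer states, timings, and safeguards.

```python
from enum import Enum


class ControlModel(Enum):
    """The three levels of human-machine integration (HRW, 2012)."""
    HUMAN_IN_THE_LOOP = "in"        # force delivered only on a human command
    HUMAN_ON_THE_LOOP = "on"        # system acts; a human overseer may override
    HUMAN_OUT_OF_THE_LOOP = "out"   # no human input or interaction at all


def may_engage(model: ControlModel, human_approved: bool, human_vetoed: bool) -> bool:
    """Illustrative rule: does an engagement proceed under a given control model?"""
    if model is ControlModel.HUMAN_IN_THE_LOOP:
        return human_approved        # requires an explicit human command
    if model is ControlModel.HUMAN_ON_THE_LOOP:
        return not human_vetoed      # proceeds unless the overseer overrides
    return True                      # out-of-the-loop: no human input considered
```

The sketch makes the legal distinction concrete: in-the-loop systems default to *not* engaging absent a human command, on-the-loop systems default to engaging absent a human override, and out-of-the-loop systems ignore the human entirely.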

Some examples of currently fielded weapon systems with semi-autonomous capabilities, falling into the “Human-on-the-Loop” category, include the Phalanx Block 1B Close-In Weapon System (CIWS), a ship-based 20 mm gun system that autonomously detects, tracks and attacks targets (Raytheon, 2023), and counter-rocket and anti-air platforms such as the Israeli Iron Dome and the German Oerlikon Skyshield, both capable of autonomously detecting, tracking, selecting and engaging targets (Rafael, 2023; Rheinmetall, 2023). Another category of weapon systems which currently possesses some degree of autonomous function is loitering munitions, whose capabilities and employment will be analyzed in the next sections.

Loitering Munitions: current capabilities

The proliferation and complexity of loitering munitions have seen a rapid increase in the past decade. Today, more than 20 States are producing and employing such systems, and the trend is expected to grow in the coming years (Gettinger, 2022; Gettinger, 2023; Gettinger, Michel, 2017). Unlike reusable drones, loitering munitions are a type of unmanned aerial vehicle designed to identify, track, and engage targets beyond visual range, detonating an explosive warhead of varying weight upon impact with the target (Gettinger, Michel, 2017). Loitering munitions are designed to be portable, easy to launch, and single-use, making them a cost-effective, safer, and more flexible alternative to artillery and complex missile systems. Indeed, thanks to these features, they are capable (according to the manufacturers) of conducting multiple types of missions (intelligence, surveillance, reconnaissance, precision strike, counter-battery, etc.) while loitering above a certain area for an extended period, allowing for greater decision-making options (U-Vision, 2023; IAI, 2023; Gettinger, Michel, 2017). While most tasks of currently employed loitering munitions, such as take-off and landing, are automated, more advanced systems possess different levels of autonomous capability in functions such as navigation, target detection, tracking, and, in some cases, engagement. Indeed, loitering munitions such as the Israeli Harpy and Harop drones, the Russian Lancet-3, the Turkish Kargu-2, as well as the American Switchblades (300 and 600 series), are all equipped with GPS guidance and electro-optical and infrared sensors with image processing, allowing them (to varying degrees) to autonomously identify and track objects (AeroVironment, 2023; Atherton, 2023; IAI, 2023).
Most notably, the Harpy and Harop loitering munitions are considered by many analysts to be prime examples of autonomous weapon systems capable of engaging targets (in this case, radar signatures stemming from air defenses) with limited to no human intervention (Watts, Bode, 2023; House of Lords, 2023; Sauer, 2021). One of the most recent systems is the Kargu-2. Developed in 2020 by STM for Turkey’s armed forces, it is a small quadcopter with a range of 10 km, or 30 minutes of remote-controlled or autonomous flight time, and it is allegedly equipped with electro-optic (EO) and infrared (IR) cameras and an automatic target recognition (ATR) system which uses a machine-learning algorithm for identification. It can also be employed in a swarm with other units (Daily Sabah, 2019; STM Youtube, 2020; STM, 2020; Susar, 2020). While the company describes its system as using a “Human-in-the-Loop principle” (STM, 2023), an investigation carried out by a UN Panel of Experts seemed to suggest that a Kargu-2 drone might have been used by Libya’s Government of National Accord to autonomously “attack targets without requiring data connectivity between the operator and the munition” (UNSC, 2021).

Employment and effectiveness in recent conflicts

Only in recent years have loitering munitions been extensively employed in conflict scenarios. This section analyzes two case studies where such systems have been utilized, as well as their military impact.

a. The Nagorno-Karabakh Conflict (2020)

In September 2020, Azeri forces launched an offensive operation in the eastern part of Nagorno-Karabakh, a disputed region between Azerbaijan and the now-defunct Armenian Republic of Artsakh. During this offensive, Azeri forces extensively used previously purchased Israeli Harop and Harpy-2 munitions, along with other Turkish-made drones (Gibbons-Neff, 2016; Hauer, 2021), to systematically neutralize targets. These light “suicide drones” have a range of up to 200 kilometers, are capable of operating either manually or autonomously, and are equipped with a 16-kilogram explosive warhead (IAI, 2023). According to OSINT sources, these systems were utilized in the opening days of the September offensive to great effect. Indeed, thanks to their anti-radiation capabilities, a series of well-coordinated attacks was carried out by Azerbaijan during and after the 30th of September against Armenian T-72 tank columns, as well as against Armenian Surface to Air Missile (SAM) defense systems and artillery sites (APA, 2020), which paved the way for the Azeri air force to strike in some areas, and for its ground forces to capture strategic locations such as Shusha (Khan, 2020; Hambling, 2020; Dixon, 2020). The impact that these systems had on Armenian military equipment and personnel is impressive, considering the extensive arsenal that the Armenian army fielded. Official figures vary, but according to OSINT-based research, in roughly forty days of firefights and clashes, combined Azeri drones and loitering munitions managed to find and destroy a significant number of tanks, armored personnel carriers (APC), artillery pieces, radar and SAM sites (9K33 Osa, long-range S-300, and at least one Tor-M2KM) and EW equipment (Oryx, 2020).
Arguably, the success of the Azerbaijani military offensive was largely due to the coordinated employment of both reconnaissance drones and suicide munitions, which, thanks to their small radar signature, managed to elude Armenian anti-air defenses. Although Armenian SAMs did manage to shoot down a number of these munitions, considering the relative tally of losses, it is clear that these systems played a decisive role in efficiently crippling Armenian defensive capabilities (Davies, 2020; Vallée, 2023).

b. The Ukraine-Russia conflict (2022)

The war in Ukraine has seen the most extensive operational use of “suicide drones” to date, particularly by the Russian Armed Forces. During the first year of the conflict, after its Air Force lost its edge in the battle for air superiority due to an unsustainable number of aircraft losses at the hands of Ukrainian air defenses (Oryx, 2022), Russia reportedly began to put more emphasis on a mix of various types of reconnaissance drones, such as the Zala 421 and the Orlan and Eleron series, as well as loitering munitions like the Lancet-3 and, in much greater numbers, the Iranian-made Shahed-136. The latter two have been employed to engage specific high-value targets, such as air defenses, artillery sites and other static targets (Watling, Reynolds, 2023; Hambling, 2022; Army Recognition, 2022), as well as sensitive civilian infrastructure (power stations, transmission lines, water reservoirs, etc.) several kilometers behind the frontlines (Kramer, 2022; ISPI, 2022).

The Shahed-136, in particular, has proven to be a valuable asset for the Russian Armed Forces, and a headache for Ukrainian Surface to Air Missile (SAM) and artillery sites. It was introduced into the Russian arsenal in September 2022 as a temporary, and relatively cheap, solution aimed at filling the capacity gap caused by the depletion of Russia’s drone fleet and cruise missile arsenal, combined with heavily reduced local production capacity caused by long-standing Western sanctions on the high-tech components essential for manufacturing these systems (ISPI, 2022; Alberque, Barrie, et al., 2023). The Shahed-136, which Russia is domestically producing under Iranian license as the Geran-2 (Albright, Burkhard, Fargasso, 2023), is a long-range “one-way-attack” loitering munition, equipped with photo and video equipment for reconnaissance and a 30-to-50-kilogram explosive warhead. It has a claimed range exceeding 2,000 kilometers, a cruise speed of up to 180 km/h, and the ability to loiter for several hours. Several of these systems can also be launched in “salvo” from a common truck. In terms of autonomous functions, the Geran-2 is relatively simple. Most notably, thanks to its navigation system, which combines satellite and radio signals, the Geran-2 can be pre-programmed to fly to and autonomously attack a specific pre-established location (Army Recognition, 2023; Albright, Burkhard, 2023). The latest models are also equipped with Russian Komet-M digital receivers for improved navigation, signal reception, and anti-jamming (Altman, 2023; Sparacino, 2023). Some analysts have not excluded the possibility that a variant of the suicide drone might also mount an infrared camera for striking hard targets, allowing the system to fly directly, more accurately, and autonomously to the heat source during its terminal phase (Hambling, 2022). This capability, however, has not been confirmed.
Similarly to the Nagorno-Karabakh conflict, the impact that loitering munitions like the Geran-2 are having at both the tactical and strategic level is considerable. From an operational perspective, the longer range provided by the Iranian munitions, combined with their low cost, low radar detectability, and the inherent ability to be launched in large numbers from anywhere across the frontline, is increasingly creating wider gaps in Ukrainian air defenses, thus leaving other high-value targets, such as artillery emplacements, supply and communication networks, as well as critical infrastructure, more vulnerable to deep attacks (Wilkins, 2023; Vallée, 2023; ISPI, 2022; Watling, 2023; Hindustan Times, 2023). Although most of these attacks are effectively countered by kinetic means (such as ZSU anti-aircraft guns and missile-based air defenses) (Hawkinson, 2023; Hambling, 2022), due to the sheer number of systems launched at any time, some have managed to get through Ukraine’s air defenses, destroying 4 self-propelled howitzers and 2 APCs, as well as several electrical power installations (Terror in the Details, 2023; Cicurel, 2022; Wilkins, 2023; Hambling, 2022; Arhirova, 2023). Their number, versatility, range, and endurance make them well suited both for low-cost SEAD operations and for exploring and identifying gaps in defenses, paving the way for cruise missile strikes (Watling, Reynolds, 2023; Vallée, 2023; Alberque, et al., 2023).

Legal challenges of increased autonomy

In the context of International Humanitarian Law (IHL), the body of Public International Law limiting the detrimental effects of war on non-combatants, autonomous weapon systems, including loitering munitions, have been the subject of debate among scholars of this field of law. Legal scholars have identified at least three major legal challenges posed by the potential development and use of more independent autonomous systems. These relate specifically to the rules of distinction, proportionality, and precaution under Additional Protocol I to the Geneva Conventions of 1949, namely:

1) their ability to discriminate between lawful military targets on the one hand, and civilian objects and civilians on the other;

2) their risk of incidentally injuring civilians and damaging civilian objects;

3) the ability of a human operator to understand the system and to verify that it can operate in compliance with international humanitarian law (Boulanin, Davidson, 2020; Boulanin, 2015; ICRC, 2014).

Regarding the first point, it is argued that for systems to be able to distinguish lawful from unlawful targets by themselves, they would have to be equipped with scanners and sensors allowing them to distinguish between civilian objects and military objectives. However, in warfare the context can often change quickly; hence these systems, designed and programmed beforehand under specific parameters based on specific conditions, would not be able to consider all the changing factors and variables that occur in the battlespace over time and adapt their engagement parameters accordingly, leading to potentially unpredictable outcomes and indiscriminate attacks (Boulanin, Davidson, 2020; ICRC, 2021). Even in systems where humans retain control over the trigger, studies suggest that in fast-paced, stressful, and uncertain conditions, the operator might simply and uncritically over-rely on what the system suggests, a condition known as “automation bias” (Watts, Bode, 2023; Turek, Moyes, 2021; Cummings, 2017).

The second point is related to the principle of proportionality in warfare, which obliges military commanders to take all feasible precautions before an attack in order not to cause damage that is excessive and disproportionate in relation to the anticipated military advantage (Additional Protocol I, Art. 51(5)(b)). So, when deciding to carry out an attack autonomously, these systems would need to be programmed to comply with this rule through a qualitative analysis judging whether an attack against a lawful target would be proportionate, and whether all feasible precautions have been taken. Such a contextual assessment, it is argued, always requires human judgement (Schmitt, Thurnher, 2013; Boulanin, Davidson, 2020; ICRC, 2014).
The third and final point concerns the predictability of autonomous systems. Indeed, in order to comply with the rules of proportionality and precaution, the commander must be sure that the weapons chosen will work in a certain way and will have predictable and reliable effects. If a weapon’s effects cannot be controlled, or are not fully foreseeable, in any environment or circumstance, then the commander runs the risk of violating international humanitarian rules (Ibid). The ICRC has been particularly vocal about the legal and ethical risks associated with increasingly autonomous systems infused with more advanced and independent forms of AI. It has been advocating for a comprehensive, and binding, set of norms and rules for the development and use of autonomous weapon systems, for example by limiting the types of targets, the geographical scope, and the context in which they may be employed, as well as by rendering human supervision mandatory (ICRC, 2021). In this sense, a Group of Governmental Experts (GGE) was established in 2016, within the framework of the UN Convention on Certain Conventional Weapons (CCW), to discuss issues related to technologies in the area of lethal autonomous weapons systems (LAWS) (Digwatch, 2023). In 2019, the GGE adopted a list of guiding principles intended to help member States find common ground in discussing the legal and ethical risks of LAWS (UN, 2019). While the GGE convenes regularly to discuss such issues, a regulatory framework on such systems still seems very far away, mostly because a commonly agreed definition of LAWS does not exist, as it would need to encompass the much broader topic of human-machine interaction (Sauer, 2021).

Conclusion: what lies ahead?

The development, use, and capabilities of loitering munitions have increased over the past years. The recent conflicts in Nagorno-Karabakh and Ukraine have demonstrated their effectiveness in striking high-value targets far behind the frontlines at a relatively low price, often acting as substitutes for cruise/SEAD missiles and artillery systems, and in some cases even outclassing them in terms of range (Vallée, 2023). However, as other authors have argued, it is possible that loitering munitions will also be operationalized for other purposes, such as early warning and close air support (Vallée, 2023; Wilkins, 2023). Over the last few years, different European States have become increasingly interested in procuring existing systems or developing local solutions (IISS, 2024), while China, one of the technological leaders in this field, is drawing lessons from the war in Ukraine in order to develop both offensive and defensive doctrines for employing and defending against loitering munitions (Goldstein, Waechter, 2023). In terms of capabilities, the development trend seems to be leading towards increased system autonomy, catalyzed in great part both by AI advancements in the field of machine and deep learning and by the need to circumvent the radio-signal interference caused by EW equipment, to which autonomous weapon systems would be immune (Rogoway, 2023; Wilkins, 2023). Indeed, commercially available advanced AI software and hardware using different sensors, capable of automatically recognizing and categorizing objects by processing large quantities of data, are already being used in some autonomous systems (Rogoway, 2023).
The Ukrainian Armed Forces, for example, have recently introduced an AI-infused drone called Saker, capable both of First-Person-View (FPV) attacks and of identifying, and potentially engaging, targets autonomously under human supervision, with the idea of limiting reaction time and nullifying the effects of jamming that would otherwise prevent direct control by the operator (Hambling, 2023). Finally, with regard to the conundrums related to IHL, while some Western States, most notably the U.S. and the UK, have been implementing guidelines and regulations for the development and use of “responsible” AI and autonomous systems (Flournoy, 2023; Albon, 2023; House of Lords, 2023), such as comprehensive reviews and constant human supervision during the engagement phase, other States might not be so inclined. The reasoning is that, given their contained production costs and versatility, combined with the ease of integration with increasingly advanced forms of AI, mass-producing a system capable of carrying out attacks autonomously, even if less efficient, would be more advantageous, as can already be seen in Ukraine.

Information Credibility

1 – Confirmed: Confirmed by other independent sources; logical in itself; consistent with other information on the subject.

2 – Probably True: Not confirmed; logical in itself; consistent with other information on the subject.

3 – Possibly True: Not confirmed; reasonably logical in itself; agrees with some other information on the subject.

4 – Doubtful: Not confirmed; possible but not logical in itself; no other information on the subject.

5 – Improbable: Not confirmed; not logical in itself; contradicted by other information on the subject.

6 – Cannot Be Judged: No basis exists for evaluating the validity of the information.

Source Reliability

A – Reliable: No doubt of authenticity, trustworthiness, or competency; has a history of complete reliability.

B – Usually Reliable: Minor doubt about authenticity, trustworthiness, or competency; nevertheless has a history of valid information most of the time.

C – Fairly Reliable: Doubt of authenticity, trustworthiness, or competency; however, has provided valid information in the past.

D – Not Usually Reliable: Significant doubt about authenticity, trustworthiness, or competency; however, has provided valid information in the past.

E – Unreliable: Lacking in authenticity, trustworthiness, and competency; history of invalid information.

F – Cannot Be Judged: No basis exists for evaluating the reliability of the source.

Bibliography

Von Clausewitz C., “Vom Kriege”, 1832 [A-1]

Fuller Maj-Gen. J.F.C., “The conduct of war, 1789-1961: a study of the impact of the French, Industrial, and Russian revolutions on war and its conduct”, 1961 [A-1]

Currie C., “The Evolution of War: How AI has Changed Military Weaponry and Technology”, MAIEI, May 2022: https://montrealethics.ai/the-evolution-of-war-how-ai-has-changed-military-weaponry-and-technology/ [B-1]

Burke J., “How significant is AI's role in Industry 4.0?”, Tech Target, December 2022: https://www.techtarget.com/searchenterpriseai/tip/How-significant-is-AIs-role-in-Industry-40 [B-1]

Limbasiya J., “AI and generative AI are revolutionizing manufacturing…here’s how”, CIO, December 2023: https://www.cio.com/article/1260026/ai-and-generative-ai-are-revolutionizing-manufacturingheres-how.html#:~:text=Generative%20AI%20can%20assist%20less,to%20reduce%20the%20learning%20curve.&text=Manufacturing%20operations%20are%20inherently%20prone,mitigate%20these%20often%20serious%20risks [B-1]

Lingel S., et al., “Joint All-Domain Command and Control for Modern Warfare - An Analytic Framework for Identifying and Developing Artificial Intelligence Applications”, RAND, 2020: https://www.rand.org/pubs/research_reports/RR4408z1.html [B-1]

Smith J., Savage S., “Enhance command and control with AI and machine learning”, Microsoft, June 2023: https://www.microsoft.com/en-us/industry/blog/government/2023/06/20/enhance-command-and-control-with-ai-and-machine-learning/ [B-1]

Laskowski N., “What is Artificial Intelligence (AI)?”, Tech Target, November 2023: https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence [B-1]

Council of Europe, “What is AI?”, 2022: https://www.coe.int/en/web/artificial-intelligence/what-is-ai

IBM, “What is Artificial Intelligence?”, 2023: https://www.ibm.com/topics/artificial-intelligence#:~:text=What%20is%20artificial%20intelligence%20(AI)%3F,capabilities%20of%20the%20human%20mind [A-1]

UK Ministry of Defence, “Defence Artificial Intelligence Strategy”, June 2022 [A-1]

US Department of Defense, “Data, Analytics, and Artificial Intelligence Adoption Strategy”, June 2023 [A-1]

US Department of Defense, “DOD Directive 3000.09, Autonomy in Weapon Systems”, January 2023: https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf [A-1]

ICRC, “ICRC Expert Meeting Report, “Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian aspects”, Switzerland, 26-28th March, 2014 [A-1]

Kellenberger J., “International humanitarian law and new weapon technologies”, 34th Round Table on Current Issues of International Humanitarian Law, San Remo, 8–10th September 2011 [A-1]

House of Commons, Defence Committee, “Remote Control: Remotely Piloted Air Systems – Current and Future UK use”, 10th report of session 2013-2014, Vol. 1, March 2014 [A-1]

UK Ministry of Defence, “Unmanned Aircraft Systems, Joint Doctrine Publication 0-30.2 (JDP 0-30.2)”, August 2017 [A-1]

House of Lords, “Proceed with Caution: Artificial Intelligence in Weapon Systems”, AI in Weapon Systems Committee, Report Session 2023-24, December 2023: https://publications.parliament.uk/pa/ld5804/ldselect/ldaiwe/16/16.pdf [A-1]

ICRC, “International humanitarian law and the challenges of contemporary armed conflicts”, Report 32IC/15/11 for the 32nd International Conference of the Red Cross and Red Crescent, Oct. 2015, Geneva, 8–10 December 2015 [A-1]

ICRC, “International Committee of the Red Cross (ICRC) position on autonomous weapon systems: ICRC position and background paper”, January 2022: https://international-review.icrc.org/articles/icrc-position-on-autonomous-weapon-systems-icrc-position-and-background-paper-915 [A-1]

Human Rights Watch, “Losing Humanity: the Case against Killer Robots”, International Human Rights Clinic, November 2012: https://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf [A-1]

US Air Force, Report AF/ST-TR-10-01-PR, “Technology Horizon, a Vision for Air Force Science & Technology During 2010-2030”, May 2010 [A-1]

Raytheon, “Phalanx Weapon System”: https://www.rtx.com/raytheon/what-we-do/sea/phalanx-close-in-weapon-system [A-1]

Rafael, “Iron Dome Family”, 2023: https://www.rafael.co.il/worlds/air-missile-defense/short-range-air-missile-defense/ [A-1]

Rheinmetall, “Stationary Air Defense”, 2023: https://www.rheinmetall.com/en/products/air-defence/air-defence-systems/stationary-air-defence [A-1]

Gettinger D., “Phoenix Ghosts are part drones, part missiles. How does that change combat?”, Bulletin of the Atomic Scientists, June 2022: https://thebulletin.org/2022/06/phoenix-ghosts-are-part-drones-part-missiles-how-does-that-change-combat/ [B-1]

Gettinger D., “One-Way Attack Drones: Loitering Munitions of Past and Present,” Vertical Flight Society, May 2023: https://vtol.org/news/press-release-vfs-publishes-study-on-one-way-attack-drones [B-1]

Gettinger D., Michel A., “Countries with Loitering Munitions”, Drone Center, February 2022: https://dronecenter.bard.edu/files/2017/02/CSD-Loitering-Munitions.pdf [B-1]

UVision Smart Loitering Systems, “About Loitering Munitions”, 2023: https://uvisionuav.com/loitering-munitions/ [A-1]

IAI, “Loitering Munitions for Air Force”, 2023: https://www.iai.co.il/defense/air/loitering-munitions [A-1]

IAI, “Harop Loitering Munition”, 2023: https://www.iai.co.il/p/harop [A-1]

IAI, “Harpy Autonomous Weapon for All Weather”, 2023: https://www.iai.co.il/p/harpy [A-1]

Atherton K., “Everything to know about Switchblades, the attack drones the US gave Ukraine”, Popular Science, July 2023: https://www.popsci.com/technology/switchblade-drones-explained/ [B-1]

AeroVironment, “Switchblade 600 Loitering Munition”, 2023: https://www.avinc.com/lms/switchblade-600 [A-1]

Watts T., Bode I., “Loitering munitions: flagging an urgent need for legally binding rules for autonomy in weapon systems”, ICRC Humanitarian Law & Policy, June 2023: https://blogs.icrc.org/law-and-policy/2023/06/29/loitering-munitions-legally-binding-rules-autonomy-weapon-systems/ [A-1]

Sauer F., “Stepping back from the brink: Why multilateral regulation of autonomy in weapons systems is difficult, yet imperative and feasible”, International Review of the Red Cross, March 2021: https://international-review.icrc.org/sites/default/files/reviews-pdf/2021-03/stepping-back-from-brink-regulation-of-autonomous-weapons-systems-913.pdf [A-1]

Daily Sabah, “Domestically-developed kamikaze drones to join Turkish army’s inventory as of 2020”, September 2019: https://www.dailysabah.com/defense/2019/09/12/domestically-developed-kamikaze-drones-to-join-turkish-armys-inventory-as-of-2020 [B-1]

STM Youtube, “Kargu - The Kamikaze Drones Getting Ready For The Swarm Operation”, 2020: https://www.youtube.com/watch?v=3d28APIfwSI [A-1]

STM, “Tactical Mini UAV Systems”, 2020: https://www.stm.com.tr/uploads/docs/1689676408_tacticalminiuavsystems.pdf [A-1]

STM, “Kargu”, 2023: https://www.stm.com.tr/en/karg... [A-1]

S., “KARGU UAV System”, Defence Turk, April 2020: https://en.defenceturk.net/kar... [A-1]

United Nations Security Council, “Final report of the Panel of Experts on Libya established pursuant to Security Council resolution 1973 (2011)”, S/2021/229, March 2021: https://documents.un.org/doc/undoc/gen/n21/037/72/pdf/n2103772.pdf?token=zG460fUs5yhCakzS6r&fe=true [A-1]

United Nations, “Report of the 2019 session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems”, Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons System, September 2019: https://documents.un.org/doc/undoc/gen/g19/285/69/pdf/g1928569.pdf?token=LT8hRfJbrbFW4XWbEt&fe=true [A-1]

Gibbons-Neff T., “Israeli-made kamikaze drone spotted in Nagorno-Karabakh conflict”, The Washington Post, April 2016: https://www.washingtonpost.com/news/checkpoint/wp/2016/04/05/israeli-made-kamikaze-drone-spotted-in-nagorno-karabakh-conflict/ [B-1]

Hauer N., “Turkish and Israeli military tech in the Nagorno-Karabakh war”, Times Now News, January 2021: https://www.timesnownews.com/columns/article/turkish-and-israeli-military-tech-in-the-nagorno-karabakh-war/706217 [B-1]

APA, “The enemy's military equipment was destroyed in the direction of Jabrayil region”, September 2020: https://apa.az/en/nagorno_garabagh/The-enemy's-military-equipment-was-destroyed-in-the-direction-of-Jabrayil-region-colorredVIDEOcolor-331600 [B-2]

Khan N., “How Azerbaijan's drone power brought Armenia to its knees. Is India ready for modern warfare?”, Times Now News, November 2020: https://www.timesnownews.com/india/article/how-azerbaijans-drone-power-brought-armenia-to-its-knees-is-india-ready-for-modern-warfare/686609 [B-1]

Hambling D., “The ‘Magic Bullet’ Drones Behind Azerbaijan’s Victory Over Armenia”, Forbes, November 2020: https://www.forbes.com/sites/davidhambling/2020/11/10/the-magic-bullet-drones-behind--azerbaijans-victory-over-armenia/?sh=2bd6657c5e57 [B-1]

Hambling D., “How Can Ukraine Counter Russia’s ‘Swarm’ Drone Offensive?”, Forbes, September 2022: https://www.forbes.com/sites/davidhambling/2022/09/28/how-can-ukraine-counter-russias-swarm-drone-offensive/?sh=457ec21649bf [B-1]

Hambling D., “Ukraine’s AI Drones Seek And Attack Russian Forces Without Human Oversight”, Forbes, October 2023: https://www.forbes.com/sites/davidhambling/2023/10/17/ukraines-ai-drones-seek-and-attack-russian-forces-without-human-oversight/?sh=23b8388e66da [B-1]

Dixon R., “Azerbaijan’s drones owned the battlefield in Nagorno-Karabakh and showed future of warfare”, The Washington Post, November 2020: https://www.washingtonpost.com/world/europe/nagorno-karabkah-drones-azerbaijan-aremenia/2020/11/11/441bcbd2-193d-11eb-8bda-814ca56e138b_story.html [B-1]

Oryx, “The Fight For Nagorno-Karabakh: Documenting Losses On The Sides Of Armenia And Azerbaijan”, September 2020: https://www.oryxspioenkop.com/2020/09/the-fight-for-nagorno-karabakh.html [A-1]

Oryx, “List Of Aircraft Losses During The Russian Invasion Of Ukraine”, March 2022: https://www.oryxspioenkop.com/2022/03/list-of-aircraft-losses-during-2022.html [A-1]

Davies J., “Unmanned Aerial Systems in Nagorno-Karabakh: A Paradigm Shift in Warfare?”, Human Security Centre, November 2020: http://www.hscentre.org/uncategorized/unmanned-aerial-systems-in-nagorno-karabakh-a-paradigm-shift-in-warfare/ [B-1]

Vallée P., “The Role of Unmanned Aerial Vehicles in Current and Future Conflicts”, Les Cahiers de la Revue Défense Nationale, 2023 [A-1]

Watling J., Reynolds N., “Meatgrinder: Russian Tactics in the Second Year of Its Invasion of Ukraine”, Royal United Services Institute, May 2023 [A-1]

Army Recognition, “Russia engages Kub and Lancet kamikaze drones in Ukraine”, June 2022: https://www.armyrecognition.com/defense_news_june_2022_global_security_army_industry/russia_engages_kub_and_lancet_kamikaze_drones_in_ukraine.html [B-1]

Kramer A., “In retreat at the front, Russia strikes deep into Ukraine”, The New York Times, October 2022: https://www.nytimes.com/2022/10/05/world/europe/ukraine-russia-war.html [B-1]

ISPI Commentary, “Assessing Russian Use of Iranian Drones in Ukraine: Facts and Implications”, October 2022 [A-1]

Alberque W., Barrie D., Gwadera Z., Wright T., “Russia’s War in Ukraine: Ballistic and Cruise Trajectories”, International Institute for Strategic Studies, September 2023 [A-1]

Albright D., Burkhard S., Fargasso S., “Highlights of Institute Assessment of Alabuga Drone Documents, Supplied by Dalton Bennett at the Washington Post”, Institute for Science and International Security, August 2023: https://isis-online.org/uploads/isis-reports/documents/Highlights_of_Institute_Assessment_of_Alabuga_Drone_Documents_Final_August_17.pdf [B-1]

Albright D., Burkhard S., “Electronics in the Shahed-136 Kamikaze Drone”, Institute for Science and International Security, November 2023: https://isis-online.org/uploads/isis-reports/documents/Electronics_in_the_Shahed-136_Kamikaze_Drone_November_14_2023_FINAL.pdf [B-1]

Army Recognition, “Shahed-136”: https://armyrecognition.com/iran_unmanned_ground_aerial_vehicles_systems/shahed-136_loitering_munition_kamikaze-suicide_drone_iran_data.html [B-1]

Altman H., “Russia’s Shahed-136 Drones Now Feature Tungsten Shrapnel”, The Warzone, September 2023: https://www.twz.com/russias-shahed-136-drones-now-feature-tungsten-shrapnel [B-1]

Sparacino M., “Operativi i droni-kamikaze Shahed/Geran prodotti in Russia” [“Russian-produced Shahed/Geran kamikaze drones now operational”], Analisi Difesa, August 2023: https://www.analisidifesa.it/2023/08/operativi-i-droni-kamikaze-shahed-geran-prodotti-in-russia/ [B-1]

Wilkins D., “The 2022 Russo-Ukrainian War: Current and Future Employment of Unmanned Platforms Supporting Infantry Operations”, US Army Maneuver Center of Excellence, 2023 [B-1]

Hindustan Times, “Russia’s Iskander Missiles & Geran-2 Drones Strike Ukraine Air Base Housing MiG-29 Jets”, 2023: https://www.youtube.com/watch?v=oUsjlqOGKkg [B-1]

Hawkinson K., “Ukraine's air force shot down 34 Iran-made Shahed drones overnight, officials say”, Business Insider, September 2023: https://www.businessinsider.com/ukraine-air-force-shot-down-iran-made-shahed-136-drones-2023-9?r=US&IR=T [B-1]

Arhirova H., “Russian drone debris downed power lines near a Ukraine nuclear plant. A new winter barrage is likely”, AP News, October 2023: https://apnews.com/article/russia-ukraine-war-drones-nuclear-plant-e0c746327fa85378e9a1d05243726cd3 [B-1]

Terror in the Details, “Western-made Components in Russia’s Shahed-136 Attacks”, 2023: https://stories.iphronline.org/terror-in-the-details/index.html [B-1]

Circurel A., “Russia Begins Using Iranian Drones Against Ukraine”, The Jewish Institute for National Security of America, September 2022 [B-1]

Boulanin V., Davidson N., “Limits of autonomy in weapon systems, identifying practical elements of human control,” SIPRI & ICRC, June 2020 [A-1]

Boulanin V., “Implementing Article 36 weapon reviews in the light of increasing autonomy in weapon systems”, SIPRI Insights on Peace and Security, No. 2015/01, November 2015 [A-1]

ICRC, “ICRC position on autonomous weapon systems”, May 2021: https://www.icrc.org/en/document/icrc-position-autonomous-weapon-systems [A-1]

Turek A., Moyes R., “Sensor-Based Targeting Systems: An Option for Regulation”, Article 36, November 2021: https://article36.org/wp-content/uploads/2022/01/Sensor-based-targeting.pdf [B-1]

Cummings M. L., “Artificial Intelligence and the Future of Warfare”, Chatham House, January 2017: https://www.chathamhouse.org/sites/default/files/publications/research/2017-01-26-artificial-intelligence-future-warfare-cummings-final.pdf [A-1]

ICRC, “Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I)”, June 1977: https://ihl-databases.icrc.org/en/ihl-treaties/api-1977 [A-1]

Schmitt M., Thurnher J., “Out of the loop: autonomous weapon systems and the law of armed conflict”, Harvard National Security Journal, 2013: https://centaur.reading.ac.uk/89863/ [B-1]

Digwatch, “GGE on lethal autonomous weapons systems”, 2023: https://dig.watch/processes/gge-laws [B-1]

IISS, “Europe comes full circle on loitering munitions”, February 2024: https://www.iiss.org/online-analysis/military-balance/2024/01/europe-comes-full-circle-on-loitering-munitions/ [A-1]

Goldstein L., Waechter N., “Chinese Strategists Evaluate the Use of 'Kamikaze' Drones in the Russia-Ukraine War”, RAND, November 2023: https://www.rand.org/pubs/commentary/2023/11/chinese-strategists-evaluate-the-use-of-kamikaze-drones.html [B-1]

Rogoway T., “Drone Warfare’s Terrifying AI-Enabled Next Step Is Imminent”, The Warzone, February 2024: https://www.twz.com/news-features/drone-warfares-terrifying-ai-enabled-next-step-is-imminent?fbclid=IwAR1yFMw41Xpx0Pof6Vc6cjtPuuJnp2KLPbFiRPSVU18HJaBr3UiZOu3YxYE [B-1]

Albon C., “Pentagon updates autonomous weapons policy to account for AI advances”, Defense News, January 2023: https://www.defensenews.com/artificial-intelligence/2023/01/25/pentagon-updates-autonomous-weapons-policy-to-account-for-ai-advances/ [B-1]

Flournoy M. A., “AI Is Already at War How Artificial Intelligence Will Transform the Military”, Foreign Affairs, December 2023: https://www.foreignaffairs.com/united-states/ai-already-war-flournoy [B-1]
