{"id":1601603,"date":"2024-04-10T00:02:58","date_gmt":"2024-04-10T00:02:58","guid":{"rendered":"https:\/\/dissidentvoice.org\/?p=149646"},"modified":"2024-04-10T00:02:58","modified_gmt":"2024-04-10T00:02:58","slug":"death-by-algorithm-israels-ai-war-in-gaza","status":"publish","type":"post","link":"https:\/\/radiofree.asia\/2024\/04\/10\/death-by-algorithm-israels-ai-war-in-gaza\/","title":{"rendered":"Death by Algorithm: Israel\u2019s AI War in Gaza"},"content":{"rendered":"

Remorseless killing at the initiation of artificial intelligence has been the subject of nail-biting concern for various members of the computer-digital cosmos.\u00a0 Be wary of such machines in war and their potential to displace human will and agency.\u00a0 For all that, the advent of AI-driven, automated systems in war has already become a cold-blooded reality, deployed conventionally, and with utmost lethality, by human operators.<\/p>\n

The teasing illusion here is the idea that autonomous systems will become so algorithmically attuned and trained as to render human agency redundant in a functional sense.\u00a0 Provided the targeting is trained, informed, and surgical, a utopia of precision will dawn in modern warfare.\u00a0 Civilian death tolls will be reduced; the mortality of combatants and undesirables will, conversely, increase with dramatic effect.<\/p>\n

The staining case study that has put paid to this idea is the pulverising campaign being waged by Israel in Gaza.\u00a0 A report<\/a> in the magazine +972<\/em> notes that the Israel Defense Forces has indulgently availed itself of AI to identify targets and dispatch them accordingly.\u00a0 The process, however, has been far from accurate or forensically educated.\u00a0 As Brianna Rosen of Just Security<\/em> accurately posits<\/a>, \u201cRather than limiting harm to civilians, Israel\u2019s use of AI bolsters its ability to identify, locate, and expand target sets which likely are not fully vetted to inflict maximum damage.\u201d<\/p>\n

The investigation opens by recalling the bombastically titled<\/a> The Human-Machine Team: How to Create Human and Artificial Intelligence That Will Revolutionize Our World<\/em>, a 2021 publication available in English authored by one \u201cBrigadier General Y.S.\u201d, the current commander of the Israeli intelligence unit 8200.<\/p>\n

The author advances the case for a system capable of rapidly generating thousands of potential \u201ctargets\u201d in the exigencies of conflict.\u00a0 The sinister and morally arid goal of such a machine would be to resolve a \u201chuman bottleneck for both locating new targets and decision-making to approve the targets.\u201d\u00a0 Doing so not only dispenses with the human need to vet, check and verify the viability of a target, but also with the need to seek human approval for its termination.<\/p>\n

The joint investigation by +972<\/em> and Local Call<\/em> identifies the advanced stage of development of such a system, known to the Israeli forces as Lavender.\u00a0 In terms of its murderous purpose, this AI creation goes further than such lethal predecessors as \u201cHabsora\u201d (\u201cThe Gospel\u201d), which identifies purportedly relevant military buildings and structures used by militants.\u00a0 Even that form of identification did little to keep the death rate moderate, generating what a former intelligence officer described<\/a> as a \u201cmass assassination factory.\u201d<\/p>\n

Six Israeli intelligence officers, all having served during the current war in Gaza, reveal how Lavender \u201cplayed a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.\u201d\u00a0 Using the AI machine effectively subsumed the human element while lending the system\u2019s targeting results a fictitious human credibility.<\/p>\n

Within the first weeks of the war, the IDF placed extensive, even exclusive reliance on Lavender, with as many as 37,000 Palestinians being identified as potential Hamas and Palestinian Islamic Jihad militants for possible airstrikes.\u00a0 This reliance signalled a shift from the previous \u201chuman target\u201d doctrine used by the IDF regarding senior military operatives.\u00a0 In such cases, killing the individual in their private residence would only happen exceptionally, and only to the most senior identified individuals, all to keep in awkward step with principles of proportionality in international law.\u00a0 The commencement of \u201cOperation Swords of Iron\u201d in response to the Hamas attacks of October 7 led to the adoption of a policy by which all Hamas operatives in its military wing irrespective<\/em> of rank would be designated as human targets.<\/p>\n

Officers were given expansive latitude to accept the kill lists without demur or scrutiny, with as little as 20 seconds being devoted to each target before bombing authorisation was given.\u00a0 Permission was also given despite awareness that the system errs in \u201capproximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.\u201d<\/p>\n

The Lavender system was also supplemented by using the emetically named \u201cWhere\u2019s Daddy?\u201d, another automated platform which tracked the targeted individuals to their family residences, which would then be flattened.\u00a0 The result was mass slaughter, with \u201cthousands of Palestinians \u2013 most of them women and children or people not involved in the fighting\u201d killed by Israeli airstrikes in the initial stages of the conflict.\u00a0 As one of the interviewed intelligence officers stated with grim candour, killing Hamas operatives when in a military facility or while engaged in military activity was a matter of little interest.\u00a0 \u201cOn the contrary, the IDF bombed them in homes without hesitation, as a first option. It\u2019s much easier to bomb a family\u2019s home.\u00a0 The system is built to look for them in these situations.\u201d<\/p>\n

The use of the system entailed resort to gruesome and ultimately murderous calculi.\u00a0 Two of the sources interviewed claimed that the IDF \u201calso decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians.\u201d\u00a0 Were the targets Hamas officials of a certain seniority, the deaths of up to 100 civilians were also authorised.<\/p>\n

In what is becoming its default position in the face of such revelations, the IDF continues to state, as reported<\/a> in the Times of Israel<\/em>, that appropriate conventions are being observed in the business of killing Palestinians.\u00a0 It \u201cdoes not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist\u201d.\u00a0 The process, the claim goes, is far more discerning, involving the use of a \u201cdatabase whose purpose is to cross-reference intelligence sources… on the military operatives of terrorist organizations\u201d.<\/p>\n

The UN Secretary General, Ant\u00f3nio Guterres, stated<\/a> how \u201cdeeply troubled\u201d he was by reports that Israel\u2019s bombing campaign had used \u201cartificial intelligence as a tool in the identification of targets, particularly in densely populated residential areas, resulting in a high level of civilian casualties\u201d.\u00a0 It might be far better to see these matters as cases of willing and reckless misidentification, with a conscious acceptance on the part of IDF military personnel that enormous civilian casualties are simply a matter of course.\u00a0 To that end, we are no longer talking about a form of advanced, scientific war waged proportionately and with precision, but a technologically advanced form of mass murder.<\/p>The post Death by Algorithm: Israel\u2019s AI War in Gaza<\/a> first appeared on Dissident Voice<\/a>.\n

This post was originally published on Dissident Voice<\/a>. <\/p>","protected":false},"excerpt":{"rendered":"

Remorseless killing at the initiation of artificial intelligence has been the subject of nail-biting concern for various members of the computer-digital cosmos.\u00a0 Be wary of such machines in war and their potential to displace human will and agency.\u00a0 For all that, the advent of AI-driven, automated systems in war has already become a cold-blooded reality, deployed [\u2026]<\/p>\n

The post Death by Algorithm: Israel\u2019s AI War in Gaza<\/a> first appeared on Dissident Voice<\/a>.<\/p>\n","protected":false},"author":30,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2576,287,1659,82,72338,20184,473],"tags":[],"_links":{"self":[{"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/posts\/1601603"}],"collection":[{"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/users\/30"}],"replies":[{"embeddable":true,"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/comments?post=1601603"}],"version-history":[{"count":1,"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/posts\/1601603\/revisions"}],"predecessor-version":[{"id":1601604,"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/posts\/1601603\/revisions\/1601604"}],"wp:attachment":[{"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/media?parent=1601603"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/categories?post=1601603"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/radiofree.asia\/wp-json\/wp\/v2\/tags?post=1601603"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}