Nonilex, to random
@Nonilex@masto.ai

US to review Israel's assurances it is not violating international law
Israel has been accused of violating international law in Gaza. Now it has to prove its innocence to the US, at the risk of losing critical arms supplies. The State Department received Israel’s required written assurances, ahead of a Sunday deadline, that its use of US-supplied defense equipment does not violate international or US law ….

https://www.washingtonpost.com/national-security/2024/03/20/israel-gaza-weapons-international-law/

Nonilex OP,
@Nonilex@masto.ai

Each [of the 7 countries] was given an initial 45 days for a senior official to provide “credible & reliable written assurances” that it will use defense materiel in accordance with international humanitarian law & other laws.
The memorandum imposes specific requirements on recipients, including the protection of civilians & the treatment of detainees.


estelle, to random
@estelle@techhub.social

The terrible human toll in Gaza has many causes.
A chilling investigation by +972 Magazine highlights one of them: efficiency.

  1. An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”

  2. An AI outputs “100 targets a day”, like a factory delivering murder:

"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"

  1. "The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."

🧶

estelle OP,
@estelle@techhub.social

It was easier to locate the individuals in their private houses.

“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Yuval Abraham reports: https://www.972mag.com/lavender-ai-israeli-army-gaza/

(to follow) 🧶#longThread @palestine @israel @ethics @military @idf @terrorism

#AIWar #technoCriticism #ethics #efficiency #innovation #Targeting #industrialization #intelligence #casualties #dev #history #data #EDA #planning #OSINT #military #army #C4I #ES2 #IDI #IDF #genocide #Lotem #Habsora #Lavender #dataDon #dataGovernance #tech #techCulture #engineering #AI #generativeAI #fix #externalities #productivity #bias #AIWarfare #AIEthics #AIRisks #collateralDamage #dehumanisation #airStrikes #bombing #counterTerrorism #israel #Gaza #Hamas #warCrimes #JewishSupremacy

18+ estelle OP,
@estelle@techhub.social

The current commander of Israel’s elite intelligence unit 8200 wrote a book, released in English in 2021. In it, he describes human personnel as a “bottleneck” that limits the army’s capacity during a military operation. The commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”

So they built a machine that uses AI to mark people as targets. The army then decided to designate all operatives of Hamas’s military wing as human targets, regardless of their rank or military importance.
Senior officer B.: “They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy.”

https://www.972mag.com/lavender-ai-israeli-army-gaza/
