Palestinians are under mass #surveillance. When a Palestinian's rating rises above a certain threshold, Israeli intelligence officers add them to the kill list.
#Lavender has learned to flag non-combatants as threats, because its training data included the profiles of civil defense workers such as firefighters and emergency responders.
The implications of this decision are dramatic: it means many people with a civilian communication profile are included as potential targets.
But #Lavender and Where’s Daddy? are only two parts of a sprawling network of weaponized #surveillance and repression that dominates Palestinian lives.
Israeli forces use #spyware to hack into phones, maintain a web of cameras at military checkpoints and in Palestinian residential areas, and collect photos and other biometric data. The information they capture about Palestinians living under apartheid feeds the databases used to train new genocidal technologies.
"The #Israeli #Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join."
It is infuriating and certainly not funny.
Some citizens say that the army periodically has to go "mow the lawn". It reflects an awful colonial mentality.
Now that society has gone a step further and decided on #ethnicCleansing.
A cynical Frenchman chose an image of eradication: #Lavender.
What kind of developers build software that selects people to be killed? Can you imagine working on such a team? Do they work with user stories on Kanban boards?
It’s horrifying that we have come so close to that seemingly far-off future in which AI kills actual humans.
"Israeli PM Benjamin Netanyahu’s office says it will allow “temporary” aid deliveries via a crossing in the northern #Gaza Strip “to ensure the continuation of the fighting and to achieve the goals of the war”."
"For every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander."
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
The terrible human toll in Gaza has many causes.
A chilling investigation by +972 highlights the obsession with efficiency:
An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”
An AI outputs "100 targets a day", like a factory that delivers murder:
"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"
"The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."
Here is a follow-up to Yuval Abraham's investigation:
"The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties" https://www.972mag.com/lavender-ai-israeli-army-gaza/
It was easier to locate the individuals in their private houses.
The current commander of the Israeli intelligence #Unit8200 wrote a book released in English in 2021. In it, he describes human personnel as a “bottleneck” that limits the army’s capacity during a military operation; the commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”
So the #Lavender machine was invented to mark people using AI. The army then decided to designate all operatives of Hamas’ military wing as human targets, regardless of their rank or military importance.
Senior officer B.: “They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy.”
The sources said that the approval to automatically adopt #Lavender’s kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel “manually” checked the accuracy of a random sample of several hundred targets selected by the #AI system. When that sample found that Lavender’s results had reached 90 percent accuracy in identifying an individual’s affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, if Lavender decided an individual was a militant in Hamas, the sources were essentially asked to treat that as an order.
“Still, I found them more ethical than the targets that we bombed just for ‘deterrence’ — highrises that are evacuated and toppled just to cause destruction.”
“The #protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used #Lavender.
“It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”