appassionato , to palestine group
@appassionato@mastodon.social avatar

Palestinians are under mass surveillance. When a Palestinian’s rating rises above a certain threshold, Israeli intelligence officers add them to the kill list.

https://stopkiller.ai/

@palestine


appassionato OP ,
@appassionato@mastodon.social avatar

@palestine

Lavender has learned to flag non-combatants as threats, because its training data included the profiles of civil defense workers like firefighters and emergency responders.

The significance of this decision is dramatic: it means you’re including many people with a civilian communication profile as potential targets.

https://stopkiller.ai/

@palestine


appassionato OP ,
@appassionato@mastodon.social avatar

@palestine

But Lavender and Where’s Daddy? are only two parts of a sprawling network of weaponized surveillance and repression that dominates Palestinian lives.

They hack into phones, maintain a web of cameras at military checkpoints and in Palestinian residential areas, and collect photos and other biometric data. The information they capture about Palestinians living under apartheid feeds the databases they use to train new genocidal technologies.

https://stopkiller.ai/

plink , to israel group
@plink@mastodon.online avatar

from
Rights Advocates Demand Probe Into Reports That Israel Uses WhatsApp Data to Target Palestinians

"The system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join."




@palestine @israel

https://www.commondreams.org/news/meta-ai-whatsapp

  • TheWeeOwl , to random
    @TheWeeOwl@mastodon.art avatar

    The lavender in my garden is starting to grow and smell lovely, no sign of flowering starting, but it will be on the way.
    Meanwhile I have this artwork of it in full bloom and keeping the bees busy...
    https://theweeowlart.etsy.com/listing/1699644837

    TheWeeOwl , to random
    @TheWeeOwl@mastodon.art avatar

    Lavender is a classic bee plant and a fantastic plant for every garden for many reasons. When in flower it looks and smells wonderful. It's a great source of nectar for butterflies and bees.
    My painting is available here...
    https://theweeowlart.etsy.com/listing/1699644837

  • eric , to palestine group
    @eric@social.coop avatar

    #Lavender is traditionally used in France to reduce the moth population.

    Only a small proportion of French Jews emigrate to #IsraelPalestine. These binationals are subject to compulsory military service.

    This army prepared and launched the first #AIWar in 2021: https://techhub.social/@estelle/111510965384428730

    A development team has designed a more efficient product, which a Frenchman has suggested calling Lavander: https://techhub.social/@estelle/112220409975979758 @palestine

    #humour #innovation #techBros #ethics #Gospel #tech #AIEthics #AI

    estelle ,
    @estelle@techhub.social avatar

    @argumento @eric @palestine

    It is infuriating and certainly not funny.
    Some citizens say that the army has to go periodically "mow the lawn". It reflects an awful colonial mentality.

    Now the society has escalated another step. A cynical Frenchman chose an eradication image: Lavender.

    Zionists have their own images. Sorry.

    adachika192 ,
    @adachika192@hcommons.social avatar

    @estelle @argumento @eric @palestine

    So, the AI system was named by a French-Israeli soldier with the connotation of ‘repelling moths’? Have I got that right?

    robert , to random
    @robert@flownative.social avatar

    What kind of developers are they who build software that selects people to be killed? Can you imagine working on such a team? Are they working with user stories on Kanban boards?

    It’s horrifying that we have gotten so close to that seemingly far away future where AI kills actual humans.

    https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes

    SubtleBlade , to random
    @SubtleBlade@mastodon.scot avatar

    Top Israeli spy chief exposes his true identity in online security lapse

    Exclusive: unmasked as head of Unit 8200 and architect of AI strategy after book written under pen name reveals his Google account
    https://www.theguardian.com/world/2024/apr/05/top-israeli-spy-chief-exposes-his-true-identity-in-online-security-lapse

    sebastian , to palestine group
    @sebastian@mastodon.cc avatar

    An honest view on aid as part of the military campaign:

    "Israeli PM Benjamin Netanyahu’s office says it will allow “temporary” aid deliveries via a crossing in the northern Strip “to ensure the continuation of the fighting and to achieve the goals of the war”."

    https://www.aljazeera.com/news/liveblog/2024/4/5/israels-war-on-gaza-live-biden-presses-israel-for-immediate-ceasefire

    @palestine

    appassionato , to palestine group
    @appassionato@mastodon.social avatar

    Collateral damage

    "For every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander."

    https://www.972mag.com/lavender-ai-israeli-army-gaza/

    @palestine

    appassionato , to palestine group
    @appassionato@mastodon.social avatar

    Lavender

    “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

    They called it "Where's Daddy."

    https://www.972mag.com/lavender-ai-israeli-army-gaza/

    @palestine

    Miro_Collas , to palestine group
    @Miro_Collas@masto.ai avatar

    Israel’s war on Gaza updates: 6 months of war sees children deprived of aid | Israel War on Gaza News | Al Jazeera
    https://www.aljazeera.com/news/liveblog/2024/4/3/israels-war-on-gaza-live-condemnation-of-israel-over-aid-worker-killings

    • Israeli forces bring in bulldozers to Jenin during raid
    • Hamas: Netanyahu is not interested in negotiations
    • Israeli army issues response to AI ‘Lavender’ system claims
    • Islamic Jihad attacks Israeli territory from Gaza


    @palestine

    estelle , to random
    @estelle@techhub.social avatar

    The terrible human toll in Gaza has many causes.
    A chilling investigation by +972 highlights efficiency:

    1. An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”

    2. An AI outputs "100 targets a day". Like a factory with murder delivery:

    "According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"

    3. "The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."

    🧶

    18+ estelle OP ,
    @estelle@techhub.social avatar

    Here is a follow-up to Yuval Abraham's investigation:

    "The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties"
    https://www.972mag.com/lavender-ai-israeli-army-gaza/

    @israel @ethics @military @idf

    estelle OP ,
    @estelle@techhub.social avatar

    It was easier to locate the individuals in their private houses.

    “We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

    Yuval Abraham reports: https://www.972mag.com/lavender-ai-israeli-army-gaza/

    (to follow) 🧶#longThread @palestine @israel @ethics @military @idf @terrorism

    #AIWar #technoCriticism #ethics #efficiency #innovation #Targeting #industrialization #intelligence #casualties #dev #history #data #EDA #planning #OSINT #military #army #C4I #ES2 #IDI #IDF #genocide #Lotem #Habsora #Lavender #dataDon #dataGovernance #tech #techCulture #engineering #AI #generativeAI #fix #externalities #productivity #bias #AIWarfare #AIEthics #AIRisks #collateralDamage #dehumanisation #airStrikes #bombing #counterTerrorism #israel #Gaza #Hamas #warCrimes #JewishSupremacy

    18+ estelle OP ,
    @estelle@techhub.social avatar

    The current commander of Israeli intelligence wrote a book, released in English in 2021, in which he describes human personnel as a “bottleneck” that limits the army’s capacity during a military operation. The commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”

    So people invented a machine that uses AI to mark persons as targets. Then the army decided to designate all operatives of Hamas’ military wing as human targets, regardless of their rank or military importance.
    Senior officer B.: “They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy.”

    https://www.972mag.com/lavender-ai-israeli-army-gaza/

    18+ estelle OP ,
    @estelle@techhub.social avatar

    The sources said that the approval to automatically adopt Lavender’s kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel “manually” checked the accuracy of a random sample of several hundred targets selected by the system. When that sample found that Lavender’s results had reached 90 percent accuracy in identifying an individual’s affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, if Lavender decided an individual was a militant in Hamas, the sources were essentially asked to treat that as an order.

    “Still, I found them more ethical than the targets that we bombed just for ‘deterrence’ — highrises that are evacuated and toppled just to cause destruction.”

    Yuval Abraham: https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel

    estelle OP ,
    @estelle@techhub.social avatar

    “The idea was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used Lavender.

    “It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

    Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”

    https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data
