The putrefied odor of Lavender in Gaza – 04/06/2024 – Marcelo Leite

It has been two weeks since this column deplored the drone technology that sanitizes the killing of Palestinians by the State of Israel. It is imperative to return to the topic now, because the stench blowing in from the Gaza Strip has become more than we can bear.

Before proceeding, it is worth restating the obvious: the massacre perpetrated by Hamas terrorists on October 7 deserves unqualified repudiation. No insurgency or anti-colonial struggle can justify the furious murder of more than a thousand civilians.

If that does not hold for 1,200 innocent people, it holds even less for the Israeli forces that, in the exercise of their supposed right of self-defense, have slaughtered another 32,000 inhabitants of a devastated territory in six months. Mostly children and women.

Targeting hospitals, ambulances and humanitarian aid convoys is a war crime. Deploying hunger as a weapon is a form of terror. Period.

An Israeli attack killed seven members of the World Central Kitchen (WCK) organization, run by Spanish chef José Andrés. Israel’s president, Isaac Herzog, apologized, and the army dismissed a colonel and a major, but it would take much more for the military power to redeem itself: a ceasefire, for starters.

The scandal that warrants returning to this grim subject in a space, in principle, reserved for science lies in another indefensible technological weapon of the vengeful combatants: artificial intelligence. The tool said to be capable of liberating humanity is being used to free soldiers from responsibility.

Bethan McKernan and Harry Davies report in the British daily The Guardian that Israeli forces used an artificial intelligence-based database to select 37,000 suspected Hamas members as potential bombing targets. The system is called Lavender (no comment).

The reporting was originally obtained by journalist Yuval Abraham and provided to the Guardian for publication in English. In Israel, it was published in +972 Magazine and in the Local Call newsletter.

I do not doubt that the WCK workers died because of a gross algorithmic error, the kind humans make only out of malice or negligence. It is all but certain that hundreds, perhaps thousands, of non-combatant Palestinians have been slaughtered in such mechanical slips.

Computer circuits do not accommodate such moral failures, nor can they be blamed for them. The frivolous use of artificial intelligence in war, however, was certainly decided and authorized by the authorities. Can they be tried for war crimes in this unprecedented borderline situation?

“This is unparalleled in my memory,” said an intelligence officer who used Lavender. “Everyone there lost someone on October 7, including me. The machine did this [the killing] coldly. And that made it easier.”

Another Lavender operator said the system saves a great deal of time. He spent about 20 seconds deciding whether to eliminate each target, and did so dozens of times a day.

Two of the sources said that in the first weeks of the war there was permission to kill 15 to 20 civilians in air strikes on low-ranking militants. The attacks were carried out with “dumb” bombs, which brought down entire houses and killed everyone inside.

Artificial intelligence, dumb bombs and numbed soldiers: an explosive combination for whatever vestiges of humanity still remained in the genocidal extermination of children, women and other innocent Palestinians.

