Israel, Project Lavender and Google's Project Nimbus
AI is being used to kill, and the Israeli military has taken technology way too far with the help of Americans and U.S. AI systems.
Hello Everyone,
Reader discretion is advised. In this article I explore insider information on the AI systems the Israeli military has used in its massacre in Gaza, where a majority of the dead have been innocents, children and women, killed without dignity I might add, while others remain starving in subhuman conditions.
As some of you know I’m very concerned with how AI is being integrated into the Military and National Security apparatus of several countries including in Israel. This coincides with research into more dangerous and automated drone technology and AI with more control over targeting enemies on the ground.
Google’s Project Nimbus
Project Nimbus is a cloud computing project of the Israeli government and its military. The Israeli Finance Ministry announced in April 2021 that the contract is to provide "the government, the defense establishment, and others with an all-encompassing cloud solution." It appears to include Google-made targeting technology reminiscent of Google's Project Maven.
This is deeply troubling, with more information coming from TIME's exclusive on the worker activists inside Google and their concerns. When Eddie Hatfield was fired, I felt transported back in time to the protests at Google around Project Maven.
A new report published by +972 Magazine and Local Call indicates that Israel has allegedly used an AI-powered database to select suspected Hamas and other militant targets in the besieged Gaza Strip. According to the report, the tool, trained by Israeli military data scientists, sifted through a huge trove of surveillance data and other information to generate targets for assassination. This sounds more like the capabilities of Palantir than Google per se. But it’s likely both supply Israel with military tech. Otherwise Google would be more transparent about it.
Lavender AI
Israel’s alleged use of the AI program called Lavender has emerged after Israeli-Palestinian publication +972 Magazine and Hebrew-language outlet Local Call carried out a joint investigation and cited six Israeli intelligence officials involved in the use and development of it.
AI today is being used for War Crimes
The Israeli military’s reported use of an untested and undisclosed artificial intelligence-powered database to identify targets for its bombing campaign in Gaza has alarmed human rights and technology experts who said it could amount to “war crimes”.
According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. The revelations are very disturbing, in that they likely used Palantir and perhaps also Google’s tech to build this apparatus.
There is no evidence Google or Amazon’s or Palantir’s technology has been used in killings of civilians. But it’s not as if the Israeli military built this themselves.
From my understanding there’s limited human supervision in how this has been working. It is becoming clear the Israeli army has “deployed untested AI systems … to help make decisions about the life and death of civilians”.
Gaza is likely a Testing Ground for American Military Tech
American cloud and software, and possibly German weapons. If you consider the killing of the children of Gaza a genocide, as many of us global onlookers do, then all of these parties are complicit. The civilian casualties have been notably high and Gaza’s infrastructure all but decimated. The U.S. has stood idly by in a great show of cowardice.
With geopolitical uncertainty, it’s clear companies are testing out their software and military tech in Ukraine and Israel. Google’s No Tech for Apartheid group numbers around 40 people, and they wrote a Medium article recently. But this is clearly far bigger than Google’s internal conflicts. Project Nimbus may have led to Lavender AI’s capabilities.
In December 2023, the Israeli military claimed that the AI system named "the Gospel" had helped it rapidly identify enemy combatants and equipment while reducing civilian casualties. Clearly the Israeli military is serving as something of an American testing ground.
Project Nimbus is a $1.2bn contract to provide cloud services for the Israeli military and government. This technology allows for further surveillance of and unlawful data collection on Palestinians, and facilitates expansion of Israel’s illegal settlements on Palestinian land. It likely relies on advanced facial recognition systems of the kind used in China. This is surveillance capitalism being used on a population without the ability to defend itself.
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. A 20-second review by a human was all the “human-in-the-loop” oversight needed before the system could kill without remorse.
The recent killing of foreign aid workers by the Israeli military may also have been facilitated by the Lavender AI system. On April 2nd, 2024, in a bizarre event, seven staff members of the nonprofit World Central Kitchen were killed in an Israeli airstrike in Gaza overnight, the organization said Tuesday morning. The seven aid workers killed came from Australia, Poland, the U.K., and the Palestinian territories, and one was a dual citizen of the U.S. and Canada.
Israel conducted relentless waves of airstrikes on the territory, flattening homes and whole neighborhoods. At present count, according to the Gaza Health Ministry, more than 33,000 Palestinians, the majority being women and children, have been killed in the territory. It’s not clear how much of this was done via AI targeting, with the help of companies like Palantir, Google, Amazon and so forth. I suppose the truth will come out in time.
“Where’s Daddy”
Google prohibits using its tech for “immediate harm,” but Israel is harnessing its facial recognition to set up a dragnet of Palestinians. Google’s track record with military tech isn’t very reassuring, and its lack of transparency with its own workers and staff is especially chilling, an indication of its involvement, or at least of the use of its machine learning technology.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity.
In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.” Yet a human wasn’t in the loop in any meaningful way. This means the Israeli military has been very reliant on its AI systems for the killings and bombings.
Now, in a wider climate of growing international indignation at the collateral damage of Israel’s war in Gaza, many workers at Google are connecting the dots and as activists get fired, it’s causing more concern. Google and Palantir’s involvement in Israeli tech isn’t clear, but the contracts exist.
Additional automated systems, including one called “Where’s Daddy?”, also revealed by +972 Magazine for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences. It’s heartbreaking to hear that AI has been used in this manner, and Israel should face some form of legal accountability. These are among the worst AI-facilitated crimes against humanity that we know of, certainly of 2024.
Given that Israel is a close ally of the U.S., I’m not sure how much rule of law we will actually witness. Given that the U.S. national defense sector and White House now work closely with the cloud giants on national security issues, they may even have orchestrated these test runs in close cooperation with Israel.
This is how proxy wars operate, they test real technology for the potential global war we might be facing in the next decade. The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.
When Google DeepMind was merged with Google Brain, what do you suppose happened? When it was acquired by Google in 2014, DeepMind reportedly signed an agreement that its technology would never be used for military or surveillance purposes. But a series of governance changes ended with DeepMind being bound by the same AI principles that apply to Google at large. Google’s AI Principles are very probably not taken very seriously by Google’s C-suite.
The Lavender machine joins another AI system, “The Gospel,” about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military’s own publications. The Israeli Military had help, a lot of help to build all of these AI military systems that have led to a lot of casualties.
Project Nimbus
Lavender AI
Where’s Daddy
The Gospel
This likely won’t help Google’s already tarnished reputation, all for a $1.2 billion cloud contract. Of course we know this is a big part of Palantir’s business. Google DeepMind produces frontier AI models that are deployed via Google Cloud’s Vertex AI platform and can then be sold to public-sector and other clients. One of those clients is Israel.
The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes. So, amid increasing geopolitical tension, there’s a new licence for military automation and drone swarm technologies in the works. American tech makers seem almost patriotic in their development of these tools. The Biden Administration is using “national security” as a justification for all sorts of questionable things, from tariffs to policy decisions. Israel, for its part, in its brutal retribution has trapped itself in a forever war that might further harm its state.
AI is being used for Automated Kill Zones (by Israel)
Iran is a significant drone manufacturer. Israel’s crimes in Gaza won’t easily be forgotten in the Middle East, much less in the rest of the world. During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.
A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.
Lavender marks people
The Gospel marks buildings
The Value of Civilians?
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties.
It was cheaper to demolish entire buildings and wipe out entire families of Palestinians. AI helped them decide.
Limited Human-in-the-Loop Mechanics
If an AI does it, do you feel any less guilty of murder, I wonder? One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male.
While I had heard and understood that Israel was using AI to target, and was likely using American software integrations in these tools, I didn’t realize just how automated its chosen system had become.
If the Biden Administration allowed this to occur, there’s no limit to what the U.S. might do in a real war zone during a global conflict.
A Clear Disregard for Human Life
Israel was so bent on revenge that ethics and human standards went out the window. In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants.
Video Summary of Lavender & “Where’s Daddy”
Read the article by Greg Reese.
It seems Russian companies and Israeli companies have been treated very differently in these global wars. The U.S. seems to have far too much power over how these AI weapons might be developed and tested in proxy wars and by its allies.
The Israeli army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.
If America is sharing its tech with “allies”, even those who commit massacres, I’m sure China and Russia are as well.
Lavender was developed by the Israel Defense Forces’ elite intelligence division, Unit 8200, which is comparable to the US’s National Security Agency or GCHQ in the UK. But how was it developed and with the help of whose technology?
Israel is fairly tech savvy, but Palantir is the natural choice. We know via Bloomberg that Palantir provides “war related services” to Israel and its military. Palantir Technologies Inc., the data-analysis firm that provides militaries with artificial intelligence models, agreed to a strategic partnership with the Israeli Defense Ministry to supply technology for the country’s war effort, it was announced in January 2024.
This means that Israel’s AI systems have likely become even more sophisticated in the last few months, even as most of the AI-augmented killings have already occurred. If these aren’t automated military systems, what are?
After October 7, 2023 — when Hamas-led militants launched a deadly assault on southern Israeli communities, killing around 1,200 people and abducting 240 — the army, the sources said, took a dramatically different approach. Under “Operation Iron Swords,” the Israeli army decided to designate all operatives of Hamas’ military wing as human targets, regardless of their rank or military importance.
Palantir’s CEO boasted to Bloomberg that there was great demand for their tech in Israel. Billionaire Eric Schmidt is fast-tracking his AI drone startup in Estonia. Ukraine has realized that mass-producing drones is hugely important in a war where it now lacks bodies. America is literally preparing and testing for war, in case an invasion of Taiwan kicks things off in the next few years, an event some estimates put at a 35% to 55% chance of happening.
The U.S. obviously knew everything there was to know about these AI military systems active in Gaza, which makes Amazon, Google, and Palantir’s leadership truly complicit in what’s to come. Even OpenAI is working with the Pentagon. So this is what it has come to? Even killing innocents to improve AI weapons.
Technologists like to speculate about AGI; they don’t seem to speculate much about the automation of the military.
The sources said that the approval to automatically adopt Lavender’s kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel “manually” checked the accuracy of a random sample of several hundred targets selected by the AI system. Even a 10% error rate was considered “good enough” by the Israeli military.
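The spot-check the sources describe, manually verifying a random sample of machine-generated targets and accepting the list if the observed error rate seems tolerable, is a standard (and here chillingly thin) form of model validation. Below is a minimal sketch in Python of that sampling procedure, using entirely hypothetical data and function names invented for illustration; it reflects nothing about the actual system.

```python
# Illustrative only: computing an error rate from a manually reviewed
# random sample of machine-generated outputs. All data is hypothetical.
import random

def sample_error_rate(targets, manual_check, sample_size, seed=0):
    """Review a random sample of targets and return the observed error rate."""
    rng = random.Random(seed)
    sample = rng.sample(targets, min(sample_size, len(targets)))
    errors = sum(1 for t in sample if not manual_check(t))
    return errors / len(sample)

# Hypothetical list in which 10% of the machine's picks are wrong.
targets = [{"id": i, "actually_valid": i % 10 != 0} for i in range(1000)]
rate = sample_error_rate(targets, lambda t: t["actually_valid"], 300)
print(f"observed error rate: {rate:.1%}")
```

The grim point is how little such a check guarantees: a spot-check of several hundred items says nothing about any individual decision on the full list of tens of thousands.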
What’s clear is that if a global conflict occurred, war might take on the nefarious symptoms of these technological advances.
In Lavender we Trust
According to the exposé, from that moment, sources said that if Lavender decided an individual was a militant in Hamas, they were essentially asked to treat that as an order, with no requirement to independently check why the machine made that choice or to examine the raw intelligence data on which it was based.
What if this was your house?
What if the attacker had bombs and AI and no mercy? What if you had nowhere to run? What a slaughter by AI we have witnessed.
Surveillance Capitalism and Predictive Analytics used for Death Machines - Lavender fields of Ash
The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ.
This does sound like something Google’s algorithms might be good at.
According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant. So Israel used a terrorist social credit scoring system, much like China would. Backed by American tech, how surprising is that?
Lavender learns to identify characteristics of known Hamas and PIJ operatives, whose information was fed to the machine as training data, and then to locate these same characteristics — also called “features” — among the general population, the sources explained.
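To make the reported mechanics concrete, here is a purely illustrative Python sketch of how a feature-based rating system of this general kind works in the abstract: a model weights learned "features" and maps each person to a 1–100 score. Every feature name, weight, and number below is invented for illustration; the actual Lavender system's internals remain undisclosed.

```python
# Purely illustrative: abstract feature-weighted scoring of the kind the
# reporting describes. All feature names and weights are invented.
import math

# Hypothetical weights such a model might learn from labeled training data.
WEIGHTS = {
    "feature_a": 2.0,   # e.g. presence in some monitored communication graph
    "feature_b": 1.5,   # e.g. contact frequency with flagged numbers
    "feature_c": 0.5,   # e.g. pattern-of-life similarity to known operatives
}
BIAS = -3.0

def score(person_features: dict) -> int:
    """Map a feature dict to a 1-100 rating via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * person_features.get(k, 0.0) for k in WEIGHTS)
    probability = 1.0 / (1.0 + math.exp(-z))
    return max(1, round(probability * 100))

print(score({"feature_a": 1, "feature_b": 1, "feature_c": 1}))  # a high rating
print(score({}))                                                # a low rating
```

The danger the sources raise follows directly from this structure: anyone in the surveilled population who happens to share "features" with the training examples gets a higher score, whether or not they are actually a militant.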
Meanwhile, press freedom groups are condemning a new Israeli law that would ban Al Jazeera's right to operate in Israel and Gaza. The Israel-Gaza war has taken a severe toll on journalists. As of April 9, 2024, CPJ’s preliminary investigations showed at least 95 journalists and media workers were among the more than 34,000 killed so far. Human rights don’t appear to be very high on the Israeli agenda these days. War makes people do terrible things, and it’s worse when they have AI’s capabilities.
Now, all that’s left of Gaza in many regions appears to be lavender fields of ash, death and destruction. I don’t have the heart to continue.
How much of Gaza is Destroyed?
How much of Gaza is in ruins? The war has damaged or destroyed approximately 62 percent of all homes in Gaza, some 290,820 housing units, leaving more than a million people without homes as of April 2024.
Is +972 Magazine a Trustworthy Source?
+972 Magazine is an independent, online, nonprofit magazine run by a group of Palestinian and Israeli journalists. It is a left-wing news and opinion outlet, established in August 2010 by a group of four Israeli writers in Tel Aviv, specializing in analysis directly from the ground in Israel-Palestine.
The magazine states that it is committed to human rights, democracy, and freedom of information, and that it actively opposes the Israeli occupation. However, +972 Magazine does not represent any outside organization, political party, or agenda.
If you think their work offers a valuable vantage point and want to learn more, subscribe to their Newsletter. They have 116k Twitter followers. @972mag
Reactions and Reflections
Some of the reactions to the piece by Yuval Abraham were intense, first-hand accounts.
I personally would never have come across +972 Magazine without this piece. Without the sources, this story would never have emerged.
I just don’t know how you recover from this if you are Gaza.