theAnalysis.news

Gaza: AI Targeting a Cover for Genocide



Shir Hever discusses investigative work by the Israeli/Palestinian magazine +972, which exposed the use of AI targeting to justify the Israeli bombing of apartment buildings and hospitals.


    Paul Jay

    Hi, I’m Paul Jay. Welcome to theAnalysis.news. In a few minutes, Shir Hever will join us again to discuss the current situation and the Israeli onslaught against the Palestinian people in Gaza and the West Bank, and specifically the use of artificial intelligence to target Hamas leaders and militants without much care for how many civilians get killed. In fact, some people argue killing civilians might even be part of the objective. I’ll be back in just a minute.

    The magazine +972, an Israeli/Palestinian news magazine published in Israel with, as I say, Palestinian journalists, very recently published an article about the use of artificial intelligence to target Hamas leaders and militants in Gaza. It was not only about the use of AI, but about the willingness, even the policy, to not care much about how many civilians are killed in the course of targeting Hamas leaders, and a great reliance on AI to target an apartment building, a school, or a hospital, with maybe 20 seconds of checking whether the AI is actually telling you anything that makes any sense.

    Now joining us to discuss the article and what this means, and also the significance in terms of the South African case of accusing Israel of genocide is Shir Hever. Shir was born in Israel. He now works for BDS in Germany and is, by training, a political economist. Thanks for joining us, Shir.

    Shir Hever

    Thanks for having me, Paul.

    Paul Jay

    So tell us what’s in the article, and then we can talk about what this means.

    Shir Hever

    Yeah, this is an article by Yuval Abraham, who is a Jewish Israeli working for +972 Magazine, a very serious investigative journalist who has received confidential information from anonymous sources. These anonymous sources are high-ranking officers in Israeli intelligence. He has put together a picture of how the target acquisition mechanism works for this particular attack on the Gaza Strip, which amounts to the crime of genocide. I guess we’ll get to the point of why it is relevant to talk about genocide in this context.

    What he found out from the testimonies of these officers is that the normal Israeli mechanism of acquiring targets uses military intelligence units that identify so-called desirable targets, meaning some high-ranking militant or asset belonging to Hamas that they want to destroy or kill, and then makes a certain calculation of what would be the collateral factor for that attack. The collateral factor means how many civilians are going to be killed for the sake of killing one person that you want to assassinate. Normally, they would have a collateral factor limited to about five. So, five civilians are okay to kill in order to kill one person who is suspected, not convicted in a court, of being a member of Hamas.

    This ratio has been used in previous wars. This time, they are using a different formula. First of all, the officers are saying that they are horrified to discover that the ratio has risen to almost 100, meaning that an entire apartment building can be destroyed. Sometimes, the objective goal of destroying this building is not to kill some high-ranking Hamas officer but to cause panic, dismay, and suffering among the civilian population as a way to pressure Hamas in a very tried, tested, and always failed method of colonial violence.

    Paul Jay

    When you say officer, to clarify a bit, the journalist from +972 states that he has talked to intelligence and military officials, or soldiers, who are involved in the targeting. He’s talking to people with direct knowledge of what’s happening. I should add that he was interviewed on CNN, which, to their credit, gave him a fair amount of time to explain what was happening. He is quite a credible journalist.

    Shir Hever

    Yes. He also gave an interview to Democracy Now. His position is very moral and very ethical. He’s focusing on the needless killing of civilians, and he’s horrified by it, and for very good reason. 

    I think we should pay attention to something in this article that hasn’t received enough attention, in my opinion, which is the fact that this is the first time in history that artificial intelligence has been weaponized and used in war. You have entire units of intelligence officers who used to produce about five to six targets per day, mainly the five or six Hamas officers or militants that they wanted to kill, and who established a certain collateral factor for each target. Now, there is an artificial intelligence tool that is generating more than 100 targets per day. This is what is enabling the Israeli military to carpet bomb the Gaza Strip.

    This is, first of all, unprecedented. But we also have to understand how this technology works. Artificial intelligence is sometimes seen as some kind of black box, something that we’re not able to understand. I do think that for our very safety, for our very understanding of what is happening to modern warfare, we need to know. We need to know how this works.

    The way that artificial intelligence allegedly works, according to what the Israeli military claims while trying to promote this as a product, is that the artificial intelligence uses facial recognition software to go over thousands of pictures and videos from drones and surveillance cameras in order to produce an analysis of every centimeter of the Gaza Strip. Then they can say: here we have identified a certain target, and we’ve also identified everyone around that target, as a way to know how many civilians are there and what the collateral factor is. A very important part of this is that they also claim to be able to assess how many possible Israeli hostages are in the area. It’s very important that they can say, with facial recognition, we will be able to avoid accidentally bombing Israeli hostages in the Gaza Strip. This is the product they’re selling.

    Now, what we’re really seeing on the ground is something completely different. What we see is, in fact, an artificial intelligence model very similar to ChatGPT. It is a language model in the sense that it has some kind of conversation with the officer: the artificial intelligence gathers these pictures and creates a target, but then it begins a process of teaching itself. That’s the whole idea of machine learning in artificial intelligence, that it teaches itself to see what kind of target would be more convincing for the soldier to squeeze the trigger. There’s a soldier sitting at a cannon or guiding a fighter plane to bomb a certain area, and the soldier receives the target from the artificial intelligence and has to make a decision, yes or no. Those are the 20 seconds you talked about. Sometimes, it’s less than 20 seconds, according to the testimonies of those soldiers.

    Basically, artificial intelligence teaches itself how to condense the information in the most abbreviated form and in the most convincing form so that the soldiers don’t bother reading everything and squeeze the trigger right away. In a way, the manipulation here is on the Israeli soldiers themselves. They are the weapons that are being utilized by artificial intelligence to kill more people in Gaza. This is really a very dangerous development.

    Paul Jay

    If I’m understanding the technology correctly, AI does not have X-ray vision. They are going based on some photographs and some radar, but it’s probabilities. What they’re really feeding whoever’s actually going to fire, the soldier, is that there’s a probability that so-and-so is in this building. It’s not that there’s necessarily some direct evidence, maybe sometimes, but if you’re generating so many targets, it’s mostly probability. Based on just probability, they’re willingly killing hundreds of people in these attacks, each one of them.

    Also, anyone that’s worked with AI just on a text basis, and I’ve done quite a bit, it’s amazing how accurate it is most of the time, but it’s also amazing how often it makes shit up that’s completely, utterly wrong. In fact, they have a term for it in the AI world called hallucinating. AI tends, once in a while, to out and out hallucinate stuff that has nothing to do with reality, and it’s being relied on for targeting.

    Shir Hever

    It does have one thing to do with reality, and that’s the whole point, because the way that artificial intelligence has been programmed is to learn by interfacing with the user. If the AI comes to the conclusion that telling you something you want to hear, something you want to see, will create positive feedback, then the AI is more likely to go down that path. If it’s an uncomfortable truth that the AI is supposed to tell you, you notice that if you talk to ChatGPT, it will try to avoid giving you an uncomfortable truth. If you tell ChatGPT you’re looking for a certain book on a certain topic, and that book doesn’t exist, you’re not going to get that answer. ChatGPT will invent a book to give you what you want to hear, even if that book was never written.

    Now, this is exactly how it works with the bombing of Gaza because, as you say, AI doesn’t have X-ray vision. Theoretically, the soldiers can vet all of the pictures that the AI is using to create a target. But as Yuval Abraham, the author of this article, says, the soldiers, because they lack patience and don’t want to go through the very tedious process of vetting each and every picture, end up just checking the gender of the main target. If it’s a woman, they don’t shoot because they don’t believe it’s a Hamas fighter. If it’s a man, they shoot without checking anything further.

    Now, this is something that teaches the AI to always show a picture of a man. That is why, if there is a man somewhere in the radius of the explosion, that’s what the AI will focus on, and that’s how the soldiers can be convinced.

    Now, I want to tie this to the issue of genocide because there’s a lot of debate in the legal world about why South Africa chose the crime of genocide, which is such a serious crime, as the focus of their lawsuit at the International Court of Justice. From a moral and ethical point of view, I think, of course, they were correct because this is what it is. From a legal point of view, from a strategic point of view, you could say this is a crime that’s difficult to prove. I do think that one of the most important issues about proving genocide is that societies that cross this red line, from waging war to committing genocide, have to go through a process of getting their own soldiers to cross that red line. That is one of the most difficult things.

    In Rwanda, the Hutu consistently called the Tutsi cockroaches in order to dehumanize them and to get the soldiers to not see them as human beings, because that’s the only way you can get soldiers to kill civilians indiscriminately. The Nazis, of course, had very elaborate mechanisms of dehumanizing Jews and dehumanizing Sinti and Roma as a way to get the soldiers to obey orders and commit genocide. It’s very difficult. It’s much easier to convince your soldiers to defend your homeland in battle than to get them to go around killing civilians.

    This is really the reason that Israel needs artificial intelligence, because from the point of view of the soldiers, they are getting a target and making an educated decision based on data that they’re getting from the AI. But if you look at the ground in Gaza, as the reporters there, who are dying every day but nevertheless continue to report, are recording, this is just indiscriminate carpet bombing, because you have these hundreds of soldiers, each one of them thinking that he’s unique and just got the best target, squeezing the trigger again and again and again. I’m hearing reports from the Israeli artillery units. They have these M-107 cannons, which have a rate of fire that allows them to shoot 500 155-millimeter shells per 24 hours. And that’s what they’re doing.

    Paul Jay

    Well, maybe the real point of AI is to give a fig leaf of justification for carpet bombing. In other words, instead of just calling it carpet bombing, we’re claiming we’re targeting, and this is now just collateral damage when we kill civilians when the reality is the objective is to kill a lot of civilians and make Gaza completely utterly unlivable. But you can say, oh, no, there was a Hamas leader in this building. Well, how do you know? Well, AI told us. It’s actually a fig leaf. Sorry, go ahead.

    Shir Hever

    For whom is this fig leaf intended? When the Israeli team has to defend themselves at the International Court of Justice, AI doesn’t help them. They cannot go to the International Court of Justice and say, “Our AI told us that this is a Hamas leader.” As these intelligence officers told Yuval Abraham many times, this so-called Hamas leader happens to be a guy with a gun, and that is enough. That’s all they can show. That certainly doesn’t justify demolishing an entire apartment building with the people inside it.

    Paul Jay

    But that is what they’re going to say. What other defense do they have?

    Shir Hever

    Yeah, but this is not going to help them. It’s not going to work. But for the soldiers, it does work. So, the fig leaf is a manipulation. Absolutely, they are lying. They are using AI in order to manipulate people, but they’re manipulating their own soldiers. 

    This is the first war in Israel’s history in which the soldiers are completely banned from contacting their own families and friends back home. It has now been 101 days, or 102-103, depending on when you broadcast this. During this time, the soldiers cannot call their girlfriends, cannot call their parents, and cannot tell them what’s happening in Gaza. Even more importantly, they cannot hear what people back home, who can watch the news and follow the situation, are hearing. It’s not just to prevent the public from knowing what’s happening in Gaza; even more importantly, it’s to prevent the soldiers from knowing that the whole world is watching and calling what they’re doing an act of genocide. Some soldiers who received a little bit of leave, and not many receive leave to go and be with their family for a weekend or something, exhibit serious signs of PTSD because the reality back home clashes completely with what they saw on the ground in Gaza.

    Paul Jay

    I think there are two other parts to this, which I don’t think get discussed enough. Netanyahu and the Israeli propagandists try to compare what they’re doing to what the British and the Americans did in Germany. The firebombing of Hamburg or the American firebombing of Japanese cities, and then eventually the atomic bomb, which also had a fig leaf of a military target. We know without question that both of these things were done to try to break the morale of either the German people or the Japanese people, which means the civilians were the targets. That is a war crime. So if Israel wants to compare what they’re doing to that, then they’re comparing war crime to war crime. This doesn’t let them off the hook.

    Then there’s another, even more damning, piece, which is that the British and Americans vis-à-vis Germany, or the Americans with Japan, were at war with another state. Maybe you can make some argument that the populations of those states, in some perverted way, were targets. Gaza is not a state. Gaza is under occupation. My understanding of international law is that you cannot attack the population of a place that’s under your occupation.

    The attack against the Israelis on October 7 was a terrorist and murderous attack; you can use any adjectives you want, and I condemn it. But it was not an act of a state. To attack the population of Gaza when you’re the occupying power has no basis in international law, as far as I understand it.

    Shir Hever

    Yeah, well, I’m not the best person to talk about international law. To my understanding, it’s also very much illegal to tell the population of Gaza that they have 24 hours to leave the northern part of Gaza and that everyone who stays behind will be killed. That’s also something that neither the United States nor Great Britain nor any of those examples did as part of their fighting, whether against Germany or Japan or anywhere else.

    I do think that it is interesting, especially the use of the atomic bomb on Hiroshima and Nagasaki. That’s a very good example because the American administration at the time knew that they could not just send a pilot with an atomic bomb to destroy a civilian city. They had to lie to the pilot. They had to lie to the mathematicians, [John] von Neumann and [Oskar] Morgenstern, who later established the RAND Corporation, and told them: we need you to develop a model using game theory in order to find which targets would be least defended by air defense systems in Japan. They came up with the cities of Hiroshima and Nagasaki, but without being told the real purpose of this experiment, of this mathematical exercise. I think the American administration understood you cannot expect people to commit atrocities on their own. You have to lie to them and manipulate them. So that’s a very interesting example here.

    When you say Hamas is not a state, a lot of Israelis would say this is just a technical issue because if Hamas is so strong and if we are so afraid and we have to defend ourselves and all that, then we should fight with all our force to survive. This sort of argument, which they tried to use at the International Court of Justice, is based on a fantasy, on a hallucination, as if Hamas can be defeated by killing a lot of children and a lot of unarmed civilians, and that will somehow weaken Hamas. Yet, it doesn’t. In fact, look at the rate of casualties in the Israeli army for 100 days. They’re being killed every day in Gaza by Hamas fighters, not by innocent, defenseless civilians. They’re being killed by Hamas fighters who keep controlling all these tunnels and have access to enough weapons, petrol, and everything they need in order to continue their fighting against the Israeli military.

    The Israeli army failed to rescue any of the hostages. They failed to target or assassinate even one of the leaders of Hamas in Gaza. They only assassinated one in Lebanon. But in Gaza, all of this bombing has achieved nothing except killing a lot of civilians. So, that is also a hallucination. That is also a lie. There’s no number of thousands of families who will be trapped under the rubble, dying slowly, and prevented from being rescued by the Israeli military that will make Israel win this war.

    Paul Jay

    Were the statements of the South African attorneys at the court shown on Israeli television? And if yes, would they have any effect on people? Because what was said there was quite eloquent and powerful.

    Shir Hever

    It was not. Sadly, it was not. The Israeli defense was shown. The responses were shown, but not the actual accusations. You hear so many cries of indignation from the Israeli public, from journalists, from politicians calling it a blood libel. How can you say that the state of Israel is killing children, when they are killing a lot of children? How can you do this to the Jewish people, when it was, in fact, almost the second sentence uttered by the legal team of Israel at the International Court of Justice that the whole Convention for the Prevention of Genocide was made for Jews and belongs to Jews, and therefore Israel is above the law and cannot be targeted by this convention? This is the sort of argument you hear on Israeli media.

    I am a little bit taken aback. I’m listening carefully to what I hear on Israeli media, and I’m following very closely. They are actually admitting each and every element in the accusation of the South African legal team. They’re saying, yes, there were calls for genocide, and yes, there was targeting of civilians and use of starvation as a weapon. What you can’t actually see in the Israeli media is anyone putting all these things together. If you put them together, you’d say, well, did Myanmar commit genocide against the Rohingya? The Israelis would say, well, absolutely: one, two, three, this is how genocide is defined. But while they recognize the one, two, and three that Israel is committing, they are not able to reach the same conclusion.

    Paul Jay

    The AI systems we have been talking about, are they Israeli manufactured and designed?

    Shir Hever

    I don’t believe so. In fact, there was an index of which countries have the most advanced artificial intelligence technologies in the world, and Israel ranked very low, even below the United Arab Emirates, because the Israeli so-called high-tech miracle has really developed as a weapon of oppression against Palestinians. Artificial intelligence is a different technology. They just don’t have it.

    Paul Jay

    Do we know where they’re getting it from?

    Shir Hever

    This is a very big question. I had some suspicions that the Nimbus Project, which is a project by Amazon and Google providing cloud services to the Israeli military, might also be providing artificial intelligence services. I haven’t found proof of this yet, so I’m not making the accusation at this point.

    Then, the company Palantir, which you may know, is owned by Peter Thiel, a big Trump supporter. It is a company that is already well known for its technology of surveillance and oppression. They announced officially that they had signed a contract with the Israeli military to provide artificial intelligence. They’re not saying that it’s artificial intelligence for the purpose of acquiring targets, but I think that they are right now the prime suspects as to who is providing this technology.

    Paul Jay

    Okay. All right, thanks very much, Shir. We’ll pick this up again soon.

    Shir Hever

    Thank you, Paul.

    Paul Jay

    Thank you for joining us on theAnalysis.news. Don’t forget, if you come over to the website, get on the email list if you’re not on it. If you’re on YouTube, subscribe, and you can always hit the donate button if you’re so moved. Thanks again.

{gformRedirect();}}jQuery(document).trigger('gform_post_render', [10, current_page]);gform.utils.trigger({ event: 'gform/postRender', native: false, data: { formId: 10, currentPage: current_page } });} );} );

    Dr. Shir Hever studies the economic aspects of the Israeli occupation of the Palestinian territory. He is the manager of the Alliance for Justice between Israelis and Palestinians (BIP) and the military embargo coordinator for the Boycott National Committee (BNC).

theAnalysis.news theme music written by Slim Williams for Paul Jay’s documentary film “Never-Endum-Referendum”.
    Never-Endum-Referendum
    Artist Website
    Paul Jay’s Documentaries

theAnalysis.news by Paul Jay

4.8 • 117 ratings

