TruthArchive.ai - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
In 2021, Israeli intelligence developed an AI program called Lavender to target individuals in war. The system designated 37,000 Palestinians as targets, resulting in civilian casualties. The IDF used mass surveillance to assess the likelihood of each person being a militant and targeted them accordingly. The Lavender system tracked individuals with patterns similar to Hamas operatives, leading to the deaths of many innocent civilians. This AI targeting system has similarities to US surveillance systems.

Video Saved From X

reSee.it Video Transcript AI Summary
First Speaker argues that Microsoft provided services and access to data, including Palestinian data, which allowed Israel to set up systems to mass target and mass kill Palestinians. They mention an application called "Where is Daddy?" that allows the army to randomly track people and reach them when they are with their families in order to inflict the most harm, describing it as brutal. They state agreement with this view and emphasize the importance of understanding that this represents the end of humanity and the civilization people have pretended to belong to. They claim Israel has the most sophisticated military in the region and has known exactly what it is doing for two years. They assert that many soldiers are breaking down and suicide rates are increasing among young Israelis who have served in the army, noting they are older than teenagers and have been turned by indoctrination into willing executioners of a genocide. They call for intervention by people who love Israel to save what remains of Israel. First Speaker contends that the biggest harm is being done by those outside of Israel who defend the regime. They describe the regime as having imposed a military dictatorship for decades on Palestinians in the West Bank and Jerusalem, and until 2005 in Gaza, and claim this regime also extends to some Israelis who are part of the system. They argue that brutality toward others undermines one's own humanity. Second Speaker agrees and seeks clarification, asking if there is an app, possibly by an American company, called "Where's Daddy" that allows the Israeli government to murder men in front of their children. They reference the prior statements and want confirmation of that claim. First Speaker responds that Israel has developed not just a system but an automated system to decide targets through a computing system, and that the data has been provided by technology companies. They reiterate that this is part of a broader system of targeting.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: So what do you think about the use of artificial intelligence or Lavender by the IDF in identifying Hamas targets? And secondly, do you agree with Elon Musk about that the population decline is a risk for humanity? Speaker 1: Look, I again, I'm not I'm not, you know, with without without going into all the you know, I I'm I'm not on top of all the details of what's going on in Israel because my my bias is to defer to Israel. It's it's not for us to to second guess every everything. And I I believe that broadly the IDF gets to decide what it wants to do and that they're broadly in the right. And that's that's sort of the the perspective I come back to and

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker states that the army acts in a more moral fashion than the world. When asked how many Palestinians he has killed, he replies he doesn't count. He finds the removal of Gaza and the topic in general to be funny. He believes most people are racist. He states that only America can help Israel, needing their support, protection in the UN Security Council, and assistance in the Hague. He thanks President Biden and the people in Congress for their support. He believes when Israel wins, the entire civilized world wins. He mentions people are looking for a baby.

Video Saved From X

reSee.it Video Transcript AI Summary
Hamas invaded Israel on October 7th. Speaker 1 admits to not being well-informed about the situation and feels unqualified to comment. They express uncertainty about the accuracy of the information they have seen.

Video Saved From X

reSee.it Video Transcript AI Summary
We are not targeting anyone else in Gaza but civilians. Hamas is a terrorist organization. We are the victims, not the aggressors. There is no moral equivalence.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker emphasizes the need for independent answers regarding the failures of US and Israeli intelligence and defense. They support Israel's right to defend itself but caution against an emotional response and the risk of getting involved in a broader regional conflict. They criticize the Republicans for their unhelpful emotional response and stress the importance of cool-headedness in times of crisis. Another speaker highlights the severity of the situation by comparing it to a hypothetical attack on the US. The first speaker believes that Israel should make its own decision and offers compassion, support, and lessons from past US mistakes. They suggest offering limited munitions to Israel for self-defense without escalating into a broader regional war.

Video Saved From X

reSee.it Video Transcript AI Summary
Natalie asks about the AI piece, expressing cynicism that there may be a push for a “war bot” to circumvent consumer AI limits that block starting wars with WMDs, and wonders if there is a benevolent reason. Matthew responds that it’s worse than that: Hegseth described a platform to run on military desktops worldwide—secure, like ChatGPT or Claude but for the Pentagon and military services—that “doesn’t allow information to get out.” The core issue, he says, is who controls the AI, and two key questions about the future of war with AI: who ultimately owns these AI platforms, and who informs them—who gives them the algorithm, the programming, and essentially the orders on how to answer questions. He notes increasing concerns about reliability of information, including how ChatGPT handles questions about trustworthy news sources. He mentions that ChatGPT defers to institutional structures rather than historical accuracy. The risk, he says, is that military AI programs may not provide honest, candid, objective information to military personnel, but rather information based on narratives the Pentagon or manufacturers want. A common belief is that technology makes war more precise and reduces civilian harm, but Matthew contends this is a myth. He explains that precision-guided munitions were not about preventing civilian casualties but about increasing efficiency—“the purpose was to make the weapons more efficient, so we had to drop less bombs to, say, blow up a bridge.” He cites the small diameter bomb as evidence that the aim is not to limit civilian casualties but to allow more bombs to be delivered from aircraft. He highlights real-world examples of AI in warfare, referencing Israeli systems in Gaza. He explains that three AI programs—Lavender, Gospel, and Where’s Daddy?—play roles in targeting and timing strikes. Lavender scans the Internet and databases to identify targets (e.g., labeling someone as a Hamas supporter based on past online activity), and Where’s Daddy? coordinates that information to ensure bombs hit resistance fighters “when they are with their families,” not away from them. He notes reporting from Israeli media and +972 Magazine about these programs and urges viewers to examine that reporting; Tucker Carlson’s coverage is mentioned as an example. Matthew argues this demonstrates the dystopian potential of AI in war and cautions against assuming American AI would be more benevolent. He mentions commentators’ attempts to justify or excuse such actions, including a remark attributed to Mike Huckabee that “Israel did not attack Qatar. They just sent a missile into their country aimed at one person,” noting the nearby injuries or deaths. He ends with a reminder of Orwell’s reflections on war and the idea that those who cheer for war may be less enthusiastic if they experience its costs, suggesting a broader aim to make the costs of war felt among ruling elites who benefit from it.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the assassination and the legality of it. They mention that while there may be a legal justification for it, the way it was carried out raises suspicions and makes Israel look like Hamas. The speaker suggests that a better approach would be for Israeli forces to go in as uniformed police or troops, surround the area, and try to arrest the targets. If they resist, then force can be used. The speaker believes that the current method of assassination, pretending to be hospital employees, is tactically unwise and only strengthens the criticism against the IDF's actions in Gaza.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 asserts that Bezalel Smotrich and Ben-Gvir are “literally talking about exterminating the entire population of Gaza.” Speaker 1 counters that they are not talking about extermination. Speaker 0 insists the statements are brazen, up front, and what they actually want to do. Speaker 0 adds that Hamas is involved in a separate context. Speaker 0 says, “The West Bank had nothing to do with what happened on October 7, but they're annexing that land anyway. They're raining terror on innocent people, innocent Palestinians.” Speaker 0 concedes, “I am willing to admit, because it's the truth, that what Hamas did on October 7 was a fucking atrocity,” specifically mentioning killing innocent people. Speaker 1 challenges acknowledgement of atrocities against civilians in Gaza. Speaker 0 asks about a hospital being tapped; Speaker 1 responds that it’s an old terrorist trick and they do it “all the time.” Speaker 0 asks whether the IDF's action was wrong. Speaker 1 concedes, “I'm sure they have committed what we would call war crimes, as every army does in every war.” Speaker 0 notes, “Including our own.” Speaker 1 agrees, giving the Civil War example: Sherman burned Atlanta, arguing that despite brutality, the North were the good guys fighting slavery, and also noting Israel is fighting to survive and is the front line in the Western world. Speaker 0 disputes this, saying many of the problems in the Middle East come from an expansionist policy and that if Israel wasn’t trying to continue expanding, they would not be dealing with the enemies they’re dealing with. Speaker 1 disagrees that they ever were expanding, arguing they “were attacked” and have “never been trying to expand.” Speaker 0 claims Israel is trying to annex the West Bank, southern Lebanon, and Syria, and argues they have succeeded in doing so.
Speaker 1 says these are lands where they were attacked from when Israel became a country in 1947; he claims Israel said, “we will accept half a loaf,” and asserts they had as much right to that land as anybody, with a historical presence dating to around 1000 BC and the lineage of King David. Speaker 0 dismisses this lineage-based argument as irrelevant to the present. Speaker 1 counters that it’s relevant, and asserts that the notion of wiping out innocent people merely because one’s ancestors lived there centuries ago is not acceptable. The conversation ends in a dispute over who the colonizers are: Speaker 1 argues the Israelis are not colonizers, while Speaker 0 asserts that Israel is annexing land, which, in their view, amounts to colonization.

Video Saved From X

reSee.it Video Transcript AI Summary
Former Israeli tank commander Ori Givati, from Breaking the Silence, discusses the killing of innocent people in Gaza and the lack of trust in IDF investigations. He explains how military policies in Gaza allow for the targeting of civilian homes and set a predetermined number of innocent civilians who can be killed to destroy a target. Givati highlights the use of an AI system to select targets, emphasizing the problematic nature of Israeli military tactics in Gaza.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker was asked about the IDF's use of AI, specifically Lavender, to identify Hamas targets. The speaker stated they are not on top of all the details of what's going on in Israel and their bias is to defer to Israel. They believe it's not for others to second guess everything and that broadly the IDF gets to decide what it wants to do and that they're broadly in the right. That is the perspective they come back to.

Video Saved From X

reSee.it Video Transcript AI Summary
The speakers are discussing the permissibility of collateral damage in war and whether civilians can be considered collateral damage. They mention examples of targeting refugee camps, hospitals, and mosques, with one speaker claiming that Israel targeted a hospital. The other speaker challenges this claim and asks for evidence. They also question the credibility of the evidence presented by Israel. The conversation becomes heated as they debate the validity of the evidence.

Video Saved From X

reSee.it Video Transcript AI Summary
A speaker claimed few people get wealthy, and another speaker alleged Al Qaeda killed their family in Palestine using AI and technology. The first speaker stated the primary source of death in Palestine is that Hamas has realized there are millions of useful idiots. Another speaker accused them of using AI and technology to kill Palestinians, not just terrorists. The first speaker responded that if the speaker's argument was strong, they would allow them to talk. The second speaker thanked anyone else who supports using technology and AI to kill Palestinians.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 raises two questions: first, what is the view on the use of artificial intelligence or Lavender by the IDF in identifying Hamas targets; second, whether they agree with Elon Musk about population decline being a risk to humanity. Speaker 1 responds: “Look, I again, I'm not I'm not, you know, with without without going into all the deep you know, I I'm I'm not on top of all the details of what's going on in Israel because my my bias is to defer to Israel. It's it's not for us to to second guess every everything. And I I believe that broadly the IDF gets to decide what it wants to do and that they're broadly in the right and that's that's sort of the the perspective I come back to.”

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 1 and Speaker 0 discuss the implications of AI in military use. They consider whether consumer AI is being bypassed with a secure, military-specific platform that would be sealed—essentially one-way in and no information out—for the Pentagon and military services. The key questions raised are: who controls the AI, who informs its algorithms, and who gives it its orders on how to answer questions, highlighting concerns about privatization and outsourcing of war. Speaker 1 argues that the future of war with AI hinges on two issues: ownership of AI platforms and the sources of their programming. They note that AI can deflect or defer to institutional structures rather than empirical accuracy, raising concerns about the reliability of information provided to military personnel. They also reference the myth that advancing technology automatically reduces civilian harm, citing that precision-guided munitions were designed for efficiency, not necessarily to prevent civilian casualties, noting that the intent was to reduce the number of bombs needed to achieve targets. The conversation shifts to the concept of precision in weapons. Speaker 1 points out that laser- and GPS-guided bombs were not primarily invented to minimize civilian casualties but to increase efficiency. They mention the small diameter bomb as an example, explaining that its use increases the number of bombs that can be deployed rather than primarily limiting collateral damage. The discussion then moves to real-world AI systems used in conflict zones. Speaker 1 cites Israeli programs—Lavender, Gospel, and Where’s Daddy?—as examples of nefarious and insidious AI in war. Lavender supposedly scans the Internet and other databases to identify targets, for example flagging someone as a Hamas supporter based on years of activity. Where’s Daddy? allegedly guides Israeli drones to strike fighters when they are with their families, not away from them. 
This reporting is linked to coverage from Israeli media and +972 Magazine, and Speaker 2 references Tucker Carlson’s coverage of these issues. Speaker 2 amplifies the point by noting the emotional impact of such capabilities, arguing that targeting men when they are with their children is particularly disturbing. They also discuss broader political reactions, including a remark attributed to Ambassador Huckabee about Israel not attacking Qatar but “sending a missile there” that injured nearby people. Speaker 1 concludes by invoking Orwell’s reflection on the Spanish Civil War, suggesting that those who cheer for war may be confronted by the consequences when modern aircraft enable distant bombing. They emphasize the need to make the costs of war felt by the ruling classes who benefit from it, not just the people on the ground.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 describes how, in 2021, the commander of Israeli intelligence called for designing a machine to resolve the human bottleneck in locating and approving targets in war. A recent investigation by +972 Magazine and Local Call reveals that the Israeli army developed an AI-based system called Lavender to designate targets and direct airstrikes. During the initial weeks of its operation, the system designated about 37,000 Palestinians as targets and directed airstrikes on their homes. The system reportedly had an error rate of about 10%, and there was no requirement to verify the machine’s data. The Israeli army systematically attacked targeted individuals at night in their homes while their whole family was present. An automated component, known as “Where’s Daddy?”, tracked targeted individuals and carried out bombings when they entered their family residences. The result, according to the report, was that thousands of women and children were killed by Israeli airstrikes. Israeli intelligence officers allegedly stated that the IDF bombed homes as a first option, and in several cases entire families were murdered when the actual target was not inside. In one instance, four buildings were destroyed along with everyone inside because a single target was in one of them. For targets marked as low level by Lavender, cheaper bombs were used, destroying entire buildings and killing mostly civilians and entire families. It was alleged that the IDF did not want to waste expensive bombs on “unimportant people,” and it was decided that for every low-level Hamas operative Lavender marked, it was permissible to kill up to 15 or 20 civilians; for a senior Hamas official, more than 100 civilians could be killed. Most AI targets were never tracked before the war. Lavender analyzed information collected on the 2.3 million residents of the Gaza Strip through mass surveillance, assessing the likelihood of each person being a militant and giving a rating from 1 to 100.
If the rating was high enough, the person and their entire family were killed. Lavender flagged individuals with patterns similar to Hamas operatives, including police, civil defense workers, relatives, and residents with similar names or nicknames. The report notes that this kind of tracking system has existed in the US for years. Speaker 1 presents a counterpoint: a “fine gentleman of the secret service” claims to provide a list of every threat made about the president since February 3 and profiles of every threat maker, implying that targets could be identified through broad data collection including emails, chats, and SMS messages. The passage suggests a tool akin to a Google search but including private communications. Speaker 0 adds that although some claim Israel controls the US, Joe Biden says Israel serves US interests. Speaker 2 asserts, “There’s no apology to be made. None. It is the best $3 billion investment we make,” and claims that without Israel the United States would have to invent an Israel to protect its regional interests. Speaker 0 closes the report for Infowars, credited to Greg Reese.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker was asked why not blame Hamas for the atrocities. They explained their mission was to gather information, not assign blame. The speaker acknowledged the frustration of the people of Israel and emphasized the need for the government to provide access for further investigation.

Video Saved From X

reSee.it Video Transcript AI Summary
Israel is constantly under attack, relying on intelligence to avoid mistakes. A former Israeli military intelligence member discusses the recent conflict with Hamas in May 2021. Hamas aims to destroy Israel, launching missiles into Israeli cities. Israel defends itself using technology like the Iron Dome, but faces violent attacks from Palestinians. The speaker emphasizes that strength does not equate to aggression, urging a nuanced understanding of the conflict.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 and Speaker 1 discuss the situation in Gaza. Speaker 0 argues that Israel is defending itself after a massacre, while Speaker 1 highlights the civilian casualties and calls for a temporary ceasefire. Speaker 0 questions why France considers the numbers provided by a terrorist organization reliable. Speaker 1 mentions alternative military strategies to minimize civilian casualties, but Speaker 0 dismisses the idea, stating that Israel knows how to conduct its military operations. The conversation becomes heated as Speaker 0 accuses Speaker 1 of treating Israel like a child and disregarding its military expertise. Speaker 1 clarifies that the information comes from American sources. The discussion ends with Speaker 0 questioning why Israel would give advice to the French military when they don't fund it.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker states that they have not seen any evidence to suggest a need for a different approach in helping Israel defend itself. When asked if any formal assessment has been conducted to determine if Israel is following the rules of war, the speaker admits to being unaware of any such assessment by the United States government. The question of how they can ensure that the weapons and resources provided by the U.S. adhere to international law is raised, to which the speaker reiterates that they have not seen anything to suggest a change in their approach to assisting Israel's defense.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the difference between targeting Hamas and intentionally harming civilians. They claim that the Israeli actions are not solely focused on Hamas, but rather involve purposely killing a large number of civilians. They argue that evidence from Israeli leaders and assessments supports the idea that this is a campaign to punish and ethnically cleanse Gaza and the West Bank by getting rid of Palestinians.

Video Saved From X

reSee.it Video Transcript AI Summary
- Shortly before the attack, the government allegedly ordered the removal of all military presence from the area, giving Hamas a “free pass” to enter and begin their operation. In the following videos, former Israeli Defense Force (IDF) members warn that something very concerning is happening in Israel. - Afat Fenningzon reports, dated 10/07/2023, that Israeli defense forces around Gaza were instead positioned around the West Bank due to security concerns, leaving the Gaza envelope unoccupied. He says about 60 to 80% of that area was left without IDF forces. He notes that a year earlier there was a Gaza operation to prepare for such events, and that ongoing training for these scenarios exists. This raises questions about Israeli intelligence: two years ago there were successful deployments of underground barriers with sensors to alert on terrorist breaches, yet there was no response when the border fence was breached. He emphasizes that Israel has a highly advanced military and questions how there could be no indication of what was coming, given that a cat moving near the fence would trigger a response from forces. He asks, “What happened to the strongest army in the world? How come border crossings were wide open?” He describes the chain of events as very unusual and not typical for the Israeli defense system. He calls the current government highly corrupt and asserts the previous one was no better, stating his goal is to expose evil forces. He characterizes the surprise attack as seemingly a planned operation on all fronts and says that, if he were a conspiracy theorist, he would call it the work of the deep state. He suggests the people of Israel and the people of Palestine have been sold to “higher powers,” acknowledging how difficult the reality is to fathom.
- Speaker 2 questions how the strongest army and the most sophisticated intelligence in the world could allow a few hundred Hamas fighters to enter Israel and carry out the attack while meeting no Israeli resistance in the area. He asserts it is not logical and implies there is more behind it, suggesting Israel sacrificed its own people and civilians on the Gaza border, removed protection and the army, and allowed Hamas to carry out their actions. He reiterates that Israel has the most sophisticated intelligence and a strong army, yet such an incursion occurred, implying hidden mechanisms or plans at work.

Video Saved From X

reSee.it Video Transcript AI Summary
- The speakers claim that American financial institutions and tech companies are deeply involved in the Gaza killings. They name banks, pension funds, Amazon, Google, and Microsoft as having provided services and access to Palestinian data that enabled Israel to set up systems to mass target and kill Palestinians. - They describe an application called Where is Daddy, asserting it allows the army to randomly track people and reach them even when they are with their families, facilitating harm. - The discussion characterizes Israel as possessing the most sophisticated military in the region, knowing precisely what it is doing for two years, and notes that many Israeli soldiers are breaking down, with rising suicidality among young soldiers who have served. - They argue that soldiers have been indoctrinated into becoming executioners of genocide, and that intervention is necessary to prevent further brutality. - The speakers contend that much of this action is driven by people outside Israel who defend the regime, which they describe as having imposed a military dictatorship on Palestinians in the West Bank, Jerusalem, and Gaza (the latter until 2005), and also affecting Israelis who are part of the system. They state that brutalizing others compromises humanity. - Speaker 0 presses for clarification about the existence of the Where is Daddy app, asking if it was a dream or a claim already stated. - Speaker 1 clarifies that Israel has developed an automated system to determine targets through computing, with data supplied by tech companies. He mentions Palantir as one company that publicly supports Israel. He references a public debate in which a Polish person protests that he is killing families, and the response is “you are killing civilians in Gaza,” to which the other person replies that the targets are “most probably terrorists.”

Mark Changizi

On the armchair military strategists for Israel. Moment 441
reSee.it Podcast Summary
Mark Changizi discusses the challenges of evaluating Israel's response to Hamas' actions, emphasizing that while one can condemn the atrocities committed by Hamas, assessing Israel's military strategy to minimize civilian casualties is complex and requires detailed knowledge. He criticizes those who judge Israel's actions without understanding the situation.