TruthArchive.ai - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Israel's war on Hamas has caused immense devastation and loss of life in Gaza. The United States, despite international outcry, has bypassed Congress to approve an emergency weapons sale to Israel. Secretary Blinken claims an emergency exists, but it is unclear whom Israel needs to defend itself from. The situation in Gaza is dire, with thousands killed and millions displaced. The Israeli government's actions, the speaker argues, seem to indicate a desire for the genocide to occur more quickly.

Video Saved From X

reSee.it Video Transcript AI Summary
In 2021, Israeli intelligence developed an AI program called Lavender to target individuals in war. The system designated 37,000 Palestinians as targets, resulting in civilian casualties. The IDF used mass surveillance to assess the likelihood of each person being a militant and targeted them accordingly. The Lavender system tracked individuals with patterns similar to Hamas, leading to the deaths of many innocent civilians. This AI targeting system has similarities to the US surveillance system.

Video Saved From X

reSee.it Video Transcript AI Summary
The ongoing situation in Gaza is revealing a different side of Israel, one characterized by violence and systemic oppression. This Israel is accused of deliberately starving Palestinians and committing atrocities against civilians, including children. Reports highlight the use of AI to target families and the mocking behavior of soldiers towards victims. Medical facilities have been severely damaged, and healthcare workers face targeted attacks. Journalists are being killed, and humanitarian efforts are obstructed. Citizens of Israel are depicted as indifferent to the suffering in Gaza, engaging in activities that celebrate the destruction. This portrayal challenges the narrative of Israel as a democratic state and calls for a reevaluation of support for its actions, emphasizing the need for a shift towards human values for peace in the region.

Video Saved From X

reSee.it Video Transcript AI Summary
Al Jazeera has collated evidence revealing shocking actions by the Israeli army, including the destruction of the town of Khuza'a. Soldiers filmed themselves destroying Palestinian homes and rifling through women's underwear. Some Israelis express support for erasing Gaza, and an Israeli song mocking Palestinians who have lost their homes has gone mainstream. Israel uses AI for targeting in Gaza, tracking people via phones and social media to create kill lists and prioritizing those at home using software called "Where's Daddy?". Despite extensive online videos, there is a lack of footage showing dead Hamas fighters. All 36 hospitals in Gaza have been attacked. A messenger sent by Israelis into Nasser Hospital was killed after delivering their message. The 202nd Paratroopers Battalion posted a video showing the killing of unarmed individuals. Deaths by snipers, including of children, are common. Journalists are targeted: over 10% of journalists in Gaza have been killed. Detainees are abused, with reports of beatings and sexual assault. Western media is criticized for double standards and biased coverage, particularly regarding human shields: the speaker says there is no evidence Hamas uses human shields, while Israelis have been documented doing so. The West enables Israeli behavior, and RAF Akrotiri is suspected of providing targeting information to Israel. Western politicians condemn Iranian attacks on Israel while excusing Israeli actions. The speaker urges support for independent media.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker A: The moral concern is that if you can remove the human element, you can use AI or autonomous targeting on individuals, and that could absolve us of the moral conundrum by making it seem like a mistake or that humans weren’t involved because it was AI or a company like Palantir. This worry is top of mind after the Min Minab girls school strike, and whether AI machine-assisted targeting played any role. Speaker B: In some ongoing wars, targeting decisions have been made by machines with no human sign-off. There are examples where the end-stage decision is simply identify and kill, with input data fed in but no human vetting at the final moment. This is a profound change and highly distressing. The analogy is like pager attacks where bombs are triggered with little certainty about who is affected, which many would label an act of terror. There is knowledge of both the use of autonomous weapons and mass surveillance as problematic points that have affected contracting and debates with a major AI company and the administration. Speaker A: In the specific case of the bombing of the girls’ school attached to the Iranian military base, today’s inquiries suggested that AI is involved, but a human pressed play in this particular instance. The key question becomes where the targeting coordinates came from and who supplied them to the United States military. Signals intelligence from Iran is often translated by Israel, a partner in this venture, and there are competing aims: Israel seeks total destruction of Iran, while the United States appears to want to disengage. There is speculation, not confirmation, about attempts to target Iran’s leaders or their officers’ families, which would have far-reaching consequences. The possibility of actions that cross a diplomatic line is a concern, especially given different endgames between the partners. 
Speaker C: If Israel is trying to push the United States to withdraw from the region, then the technology born and used in Israel—Palantir's Maven software linked to Dataminr for tracking and social-media cross-checking—could lead to targeting in the U.S. itself. The greatest fear is that social media data could be used to identify whom to track or target, raising the question of the next worst-case scenario in a context where war accelerates social change and can harden attitudes toward brutality and the silencing of dissent. War tends to make populations more tolerant of atrocities and less tolerant of opposing views, and the endgame could include governance by technology to suppress opposition rather than improve citizens' lives. Speaker B: War changes societies faster than anything else, and it can produce a range of effects, from shifts in national attitudes to the justification of harsh measures during conflict. The discussion notes the risk of rule by technology and the possibility that the public could become disillusioned or undermined if their political system fails to address their concerns. The conversation also touches on the broader implications for democratic norms and the potential for technology-driven control. (Note: The transcript contains an advertising segment about a probiotic product, which has been omitted from this summary as promotional content.)

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker states that artificial intelligence is being used to create mass assassinations, blurring the lines between assassination and warfare. They claim that many targets in Gaza are bombed as a result of AI targeting. The speaker emphasizes the connection between AI and surveillance, asserting that AI needs information to generate targets, ideas, or propaganda. Surveillance data from telephones and the internet is key to training the algorithms used to conduct these mass assassinations.

Video Saved From X

reSee.it Video Transcript AI Summary
I emerged from prison to find that artificial intelligence is now used for mass assassinations, blurring the lines between assassination and warfare. Many targets in Gaza are bombed due to AI targeting. The link between artificial intelligence and surveillance is crucial, as AI relies on data from phones and the internet to identify targets and generate propaganda. Surveillance data is essential for training these algorithms to carry out such operations.

Video Saved From X

reSee.it Video Transcript AI Summary
Over the past year, a meticulously documented crime has unfolded, focusing on the town of Khuza'a, which the Israeli army allegedly emptied and destroyed, recording and posting the destruction. Soldiers allegedly filmed themselves destroying homes and going through women's underwear. One individual stated they would "erase Gaza" if given the chance, a sentiment allegedly shared by many Israelis. Israelis allegedly mocked Palestinians who no longer have homes in song. AI is allegedly used for targeting in Gaza, using phones and social media to track individuals and assign points, with enough points leading to being killed, often at home, using software called "Where's Daddy?". Despite numerous videos, there is allegedly a lack of footage showing dead Hamas fighters. All 36 hospitals in Gaza have allegedly been attacked. A man was allegedly used as a human shield and killed in front of his mother. Soldiers allegedly act with impunity, posting videos of unarmed individuals being shot by snipers, including a child. Detainees are allegedly abused, with reports of beatings, a Star of David carved into one man's back, and a dog being used to rape someone. 53 individuals have allegedly been killed in Israeli detention. Western media's double standards are criticized, with claims that the lives of one group of people are treated as worth more than those of another.

Video Saved From X

reSee.it Video Transcript AI Summary
Natalie asks about the AI piece, expressing cynicism that there may be a push for a "war bot" to circumvent consumer AI limits that block starting wars with WMDs, and wonders if there is a benevolent reason. Matthew responds that it's worse than that: Hegseth described a platform to run on military desktops worldwide—secure, like ChatGPT or Claude but for the Pentagon and military services—that "doesn't allow information to get out." The core issue, he says, is who controls the AI, and he poses two key questions about the future of war with AI: who ultimately owns these AI platforms, and who informs them—who supplies their algorithms and programming and, essentially, their orders on how to answer questions. He notes increasing concerns about the reliability of information, including how ChatGPT handles questions about trustworthy news sources, mentioning that ChatGPT defers to institutional structures rather than historical accuracy. The risk, he says, is that military AI programs may not provide honest, candid, objective information to military personnel, but rather information based on narratives the Pentagon or the manufacturers want. A common belief is that technology makes war more precise and reduces civilian harm, but Matthew contends this is a myth. He explains that precision-guided munitions were not about preventing civilian casualties but about increasing efficiency—"the purpose was to make the weapons more efficient, so we had to drop less bombs to, say, blow up a bridge." He cites the small diameter bomb as evidence that the aim is not to limit civilian casualties but to allow more bombs to be delivered from aircraft. He highlights real-world examples of AI in warfare, referencing Israeli systems in Gaza. He explains that three AI programs—Lavender, Gospel, and Where's Daddy?—play roles in targeting and timing strikes. Lavender scans the Internet and databases to identify targets (e.g., labeling someone as a Hamas supporter based on past online activity), and Where's Daddy?
coordinates that information to ensure bombs hit resistance fighters "when they are with their families," not away from them. He notes reporting from Israeli media and +972 Magazine about these programs and urges viewers to examine that reporting; Tucker Carlson's coverage is mentioned as an example. Matthew argues this demonstrates the dystopian potential of AI in war and cautions against assuming American AI would be more benevolent. He mentions commentators' attempts to justify or excuse actions, including a remark attributed to Mike Huckabee that "Israel did not attack Qatar. They just sent a missile into their country aimed at one person," noting that people nearby were injured or killed. He ends with a reminder of Orwell's reflections on war and the idea that those who cheer for war may be less enthusiastic if they experience its costs, suggesting a broader aim to make the costs of war felt among the ruling elites who benefit from it.

Video Saved From X

reSee.it Video Transcript AI Summary
AI can be used to oppress people, as highlighted in an exposé by +972 Magazine. The article discusses how Israel employed AI to identify suspects, but this technology resulted in the deaths of many civilians who were not the intended targets.

Video Saved From X

reSee.it Video Transcript AI Summary
Israel is significantly impacting the Gaza Strip, with ongoing attacks leading to accusations of war crimes. Human rights groups allege the use of white phosphorus against civilians, which is prohibited under international law. This weapon's deployment in urban areas raises serious concerns about indiscriminate harm to civilians. The destruction in Gaza is immense, challenging claims of self-defense and suggesting a campaign of extermination. Targeting hospitals and medical personnel further violates international protections, requiring justification from Israel. The staggering death toll reflects a troubling disregard for Palestinian lives. Awareness of these actions compels individuals to speak out against the situation in Palestine.

Video Saved From X

reSee.it Video Transcript AI Summary
Former Israeli tank commander Ori Givati, of Breaking the Silence, discusses the killing of innocent people in Gaza and the lack of trust in IDF investigations. He explains how military policies in Gaza allow for the targeting of civilian homes and set a predetermined number of innocent civilians who can be killed to destroy a target. Givati highlights the use of an AI system to select targets, emphasizing the problematic nature of Israeli military tactics in Gaza.

Video Saved From X

reSee.it Video Transcript AI Summary
An AI system marked 37,000 Palestinians in Gaza as suspected militants based on various factors. Despite knowing it made errors in 10% of cases, the IDF used this system to target individuals in their homes with unguided missiles, resulting in civilian casualties.

Video Saved From X

reSee.it Video Transcript AI Summary
An AI system called Lavender marked 37,000 Palestinians in Gaza as suspected militants based on small signs like phone usage. The Israeli military used this information to target and bomb these individuals, even though the system made errors in 10% of cases. This led to civilian casualties when unguided missiles were used on family homes, killing up to 20 civilians per suspected militant.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 1 and Speaker 0 discuss the implications of AI in military use. They consider whether consumer AI is being bypassed with a secure, military-specific platform that would be sealed—essentially one-way in and no information out—for the Pentagon and military services. The key questions raised are: who controls the AI, who informs its algorithms, and who gives it its orders on how to answer questions, highlighting concerns about privatization and outsourcing of war. Speaker 1 argues that the future of war with AI hinges on two issues: ownership of AI platforms and the sources of their programming. They note that AI can deflect or defer to institutional structures rather than empirical accuracy, raising concerns about the reliability of information provided to military personnel. They also reference the myth that advancing technology automatically reduces civilian harm, citing that precision-guided munitions were designed for efficiency, not necessarily to prevent civilian casualties, noting that the intent was to reduce the number of bombs needed to achieve targets. The conversation shifts to the concept of precision in weapons. Speaker 1 points out that laser- and GPS-guided bombs were not primarily invented to minimize civilian casualties but to increase efficiency. They mention the small diameter bomb as an example, explaining that its use increases the number of bombs that can be deployed rather than primarily limiting collateral damage. The discussion then moves to real-world AI systems used in conflict zones. Speaker 1 cites Israeli programs—Lavender, Gospel, and Where’s Daddy?—as examples of nefarious and insidious AI in war. Lavender supposedly scans the Internet and other databases to identify targets, for example flagging someone as a Hamas supporter based on years of activity. Where’s Daddy? allegedly guides Israeli drones to strike fighters when they are with their families, not away from them. 
This reporting is linked to coverage from Israeli media and +972 Magazine, and Speaker 2 references Tucker Carlson's coverage of these issues. Speaker 2 amplifies the point by noting the emotional impact of such capabilities, arguing that targeting men when they are with their children is particularly disturbing. They also discuss broader political reactions, including a remark attributed to Ambassador Huckabee about Israel not attacking Qatar but "sending a missile there" that injured people nearby. Speaker 1 concludes by invoking Orwell's reflection on the Spanish Civil War, suggesting that those who cheer for war may be confronted by the consequences when modern aircraft enable distant bombing. They emphasize the need to make the costs of war felt by the ruling classes who benefit from it, not just by the people on the ground.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 describes a 2021 statement by the commander of Israeli intelligence about designing a machine to resolve the human bottleneck in locating and approving targets in war. A recent investigation by +972 Magazine and Local Call reveals that the Israeli army developed an AI-based system called Lavender to designate targets and direct airstrikes. During the initial weeks of its operation, the system designated about 37,000 Palestinians as targets and directed airstrikes on their homes. The system reportedly had an error rate of about 10%, and there was no requirement to verify the machine's data. The Israeli army systematically attacked targeted individuals at night in their homes while their whole family was present. An automated component, known as "Where's Daddy?", tracked targeted individuals and triggered bombings when they entered their family residences. The result, according to the report, was that thousands of women and children were killed by Israeli airstrikes. Israeli intelligence officers allegedly stated that the IDF bombed homes as a first option, and in several cases entire families were murdered when the actual target was not inside. In one instance, four buildings were destroyed along with everyone inside because a single target was in one of them. For targets marked as low level by Lavender, cheaper bombs were used, destroying entire buildings and killing mostly civilians and entire families. It was alleged that the IDF did not want to waste expensive bombs on "unimportant people," and that it was decided that for every low-level Hamas operative Lavender marked, it was permissible to kill up to 15 or 20 civilians; for a senior Hamas official, more than 100 civilians could be killed. Most AI targets were never tracked before the war. Lavender analyzed information collected on the 2.3 million residents of the Gaza Strip through mass surveillance, assessing the likelihood of each person being a militant and giving a rating from 1 to 100.
If the rating was high enough, the person and their entire family were killed. Lavender flagged individuals with patterns similar to Hamas, including police, civil defense, relatives, and residents with similar names or nicknames. The report notes that this kind of tracking system has existed in the US for years. Speaker 1 presents a counterpoint: a “fine gentleman of the secret service” claims to provide a list of every threat made about the president since February 3 and profiles of every threat maker, implying that targets could be identified through broad data collection including emails, chats, SMS. The passage suggests a tool akin to a Google search but including private communications. Speaker 0 adds that although some claim Israel controls the US, Joe Biden says Israel serves US interests. Speaker 2: A speaker asserts, “There’s no apology to be made. None. It is the best $3,000,000,000 investment we make,” and claims that without Israel the United States would have to invent an Israel to protect its regional interests. Speaker 0 closes reporting for Infowars, credited to Greg Reese.

Video Saved From X

reSee.it Video Transcript AI Summary
Israeli political and military leaders have made statements that could be interpreted as genocidal towards Gaza. There is a link between these statements and the actions on the ground, with Israeli military actions likely constituting war crimes due to the disproportionate number of civilian casualties. The IDF spokesperson himself admitted that 2 out of 3 people killed are civilians, meaning around 12,000 civilians, mostly women and children, have been killed. This indicates evidence of a dangerous situation that could potentially lead to genocide in Gaza.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker addresses the use of artificial intelligence or Lavender by the IDF in identifying Hamas targets. They state they are not on top of all the details of what’s happening in Israel and that their bias is to defer to Israel. They say it’s not for us to second guess everything. They conclude that broadly the IDF gets to decide what it wants to do and that they’re broadly in the right.

Video Saved From X

reSee.it Video Transcript AI Summary
In Gaza, civilians face numerous challenges and dangers due to the actions of Hamas. The terrorist organization indoctrinates children in training camps and diverts humanitarian resources for rocket production. They also strategically position themselves in civilian areas, such as homes, schools, and mosques, making these places legitimate military targets. Hamas uses civilians as pawns to achieve their goals and initiated the current conflict with Israel, putting the civilians they are responsible for at risk. It is important to acknowledge that Hamas, as a genocidal terrorist organization, bears full responsibility for all the consequences that arise from their actions.

Video Saved From X

reSee.it Video Transcript AI Summary
The segment centers on a US-led Civil-Military Coordination Center in southern Israel, established in October 2025 to monitor the Gaza ceasefire. It showcases a map of the Strip, footage of trucks, and a Dataminr report. Dataminr is a private US tech company that uses artificial intelligence to mine social media in real time and issue warnings of critical situations, highlighting the growing relationship between private AI firms and militaries and signaling a structural shift in how warfare is conducted, who controls it, who profits, and how accountability works. Heidy Khlaaf, chief AI scientist at the AI Now Institute, explains that militaries rely too heavily on commercial technologies and are not investing in their own traceable, explainable models, instead using a "black box." Gaza provides the first confirmation that commercial AI models are being directly used in warfare, justified by speed at the cost of accuracy. The report asserts that Israel's war in Gaza was not driven solely by soldiers but also by data prediction, location tracking, drone feeds, and AI models built by private tech firms. Palantir is described as a key player, with reports claiming it supplied AI tools to help identify and accelerate the targeting of individuals in Gaza, though Palantir has denied these claims. Amazon and Google are said to have provided Israel with the cloud infrastructure needed for military AI systems; both companies maintain their services are commercial, not military. These tools are said to have shifted the war from human intelligence to a data industry. While defense contracting is not new, earlier conflicts such as the 2003 US invasion of Iraq relied more on informants and interrogations; AI then involved a human in the loop, with clearer military applications. Now the line between commercial and military use of AI is blurred, and corporations play a larger role.
A key question raised is what it means when a private AI company, rather than the state, controls the infrastructure the military depends on. Khlaaf notes that militaries are ceding control and state obligations to faulty technology developed by private companies with different incentives, which can allow AI to be used to evade accountability for mass civilian casualties caused by model inaccuracy. The analysis concludes that war is no longer just a battlefield; it is also about who builds and controls the software governing mass civilian data.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: Welcome back to Jake GTV news. Did you see ICE shooting American citizens? Speaker 1: I thought they were supposed to get rid of the illegals, though. Speaker 0: Me too. Let's go to Ching Chong on the murder scene. Speaker 1: Chloe and Michael, good morning. We're here in Minneapolis where ICE agents trained by Israel are causing chaos. We go to John for more. Speaker 0: Thanks, Ching Chong. Thought it was only Libtards who opposed this, but they are literally murdering Americans. Back to you in the studio. Speaker 2: Stand back. Speaker 1: Please don't hurt me, sir Ed. I'm here to get rid of the illegals, grandma. Speaker 0: Wow. Thanks, John. Check this out here. It's from the protest. Here we see an agent assault a woman for simply being at the protest. Speaker 3: Then Alex steps in to help her Speaker 0: get back on her feet, and Speaker 4: the agents pepper spray him and proceed to assault him. Speaker 0: They then proceed to remove his legally owned firearm and shoot him in the back roughly 10 times, not even kidding. Holy shit. Speaker 1: Please tell me they're gonna jail. Speaker 0: Nope. They're on administrative leave while the FBI pretends to care. Dude, what? Let's see what Trump's team has to say. Speaker 5: Very, very unfortunate incident. I don't like that he had a gun. I don't like the fact that he was carrying a gun. Speaker 6: You know, you can't have guns. You can't walk in with guns. You just can't. And you can't listen. You can't walk in with guns. You can't do that, but it's it's a very unfortunate incident. Speaker 7: Do you Speaker 1: agree with Trump, Steen? Speaker 6: Oh, hell yeah. Guns are bad now. Didn't you get the memo? Speaker 1: What about the second amendment? Speaker 6: It's all four d chess, honey. Trust the plan. Speaker 1: Sup, bro? How do you feel about ICE? Speaker 0: This country needs more Indians than blacks. Check your privilege. Speaker 1: Dude, when did everybody get so retarded? 
Was it the vaccines or something? We go to the investigation team to learn more. Speaker 8: Thanks, Ching Chung. So basically, we uncovered that not only is ICE Embassy located in Tel Aviv, but they're using the same technology they used to genocide the Palestinians. Speaker 0: It's a freaking Jewish spyware by Paragon Solutions called Graphite, and check this out. Tell me why Alex Pretty was googled a month prior to the shooting and, again, five minutes before his death. Make of that what you will. Back to you guys. Wow. Wasn't the Homeland Security's own Twitter page being run from Israel? Speaker 1: Yeah. Same with ICE's embassy, Tel Aviv to be exact. Speaker 0: Freaking Jews, man. Speaker 9: Shut it down. He was an unhinged lefty who thought our Chobus Goy Trumpstein was a dictator. He kicked the taillight the week prior, so he deserved to be gunned down like a dog. Speaker 1: Air that. Jeez, Producer Berg, chill. Speaker 0: Gosh, he's so Talmudic. Speaker 1: Right. Always victim. Speaker 0: Anyways, here's their emotional justification for cold blooded murder. Speaker 1: That was a pretty good leg kick. Speaker 0: Right? Let's get Shapiro Steen's take on this whole thing. Speaker 10: Just because we didn't arrest anyone for the Epstein files, genocide, or our poisonous mRNA doesn't mean we won't also get away with murdering Boyum. After all, he kicked a taillight. Speaker 0: Yeah. I guess you're right, Shapiro Steen. Israel is our greatest ally. Speaker 1: You're not getting a raise. Speaker 0: Discount on your only freaks? Speaker 1: Not a chance. Ching chong, take it away. Gosh, dude. You're such a weak little simp. She's a literal succubus. Speaker 0: Anyways, let's take a tour with the IDF, I mean ice. Whoops. What was your training like? We were supposed to be trained for this? Speaker 0: Yeah. We've got an antiseptic on the next block. Get ready to murder. Stop resisting. Did you see me shoot that senior citizen? Yeah. 
Definitely not an immigrant, he sure had it coming. Let's see what Diego's up to. Speaker 2: I will tell you this, brother. What? You know? I will tell you this. You raise your voice? I raise your voice. Speaker 1: Wow. Isn't that like against the law? Speaker 0: You'd think so but they'll end up getting paid administrative leave and mental health support. Speaker 1: Seriously? Speaker 0: Dead ass. If I Speaker 11: raise my voice, you'll erase Speaker 2: my Exactly. Yeah. Yeah. Speaker 11: Are you serious? You said, if I raise my voice, you'll erase my voice? Speaker 1: Yes. Mhmm. Mhmm. Ice. You guys are saving this country. Speaker 0: Didn't they kill that American woman last week? Renee Good or something? Speaker 1: That non chosen person? She was lesbian leftist Karen. Who cares? Speaker 0: Whatever you say, Daisy. No. Speaker 7: No. Shit. Shit. Oh my fucking god. What the fuck? What What the the fuck? Fuck? Speaker 0: You might be wondering, why Minneapolis? Tim Waltz ushered in a defund the police initiative, which created a perfect opportunity for Trump's team to bring about the first AI surveillance state. You know what they say, create the problem, usher in the solution. Tom, back to you. Exactly. Speaker 0: So Peter Thiel, a close advisor to J. D. Vance, founded Palantir, the company that built the AI surveillance system used to target sand people. That same technology was sold to ICE and rebranded as Immigration OS, creating a satanic surveillance network to monitor Americans. Speaker 9: Shut it down, Tom. That's not for the normies to understand. Keep it up and I'll turn you into a lampshade like I did with Jackie. Back to the Goyslop or you're canceled. Speaker 12: Goyslop Junior's Goyslop Filet is back, and it's got more seed oils than ever. Speaker 0: I hate myself. Goyslop Junior. Speaker 7: Go on. Speaker 6: Enjoy cancer. Speaker 1: Gosh, that looks good. 
Speaker 0: Producer Verk said if we stop talking about Palantir, Goyslap Junior will cater to the Super Bowl party. Speaker 1: Alright. Speaker 0: Zipped. Let's just have Eric Warsaw break it down for us. Speaker 12: Palantir. The same company that is run by the hardline Zionist Alex Karp who works closely with Israeli military, will now be in charge of America's civilian data collection. We built Foundry, which was just was used to distribute the COVID vaccine and saved millions of lives globally. Palantir is here to disrupt and make our the institutions we partner with the very best in the world, and when it's necessary to scare enemies and on occasion kill them. Speaker 12: And also, the target selections for the US military, police forces, and even target selections for ICE officers. Speaker 1: That's right, Eric. We're giving our data to the Israeli Jew whose AI targeted over fifty percent of the civilian deaths in Gaza. Here he is. Speaker 7: Your AI and your technology from Palestine to kill Palestinians. Speaker 13: Mostly terrorists. Speaker 1: And by terrorists, he means anyone who opposes their families being genocided, including women and children. This guy. Speaker 9: Shut it the heck down. Say goodbye to your Goyslav junior catering. Remember what happened to Charlie? You're next. Run the freaking commercials. Speaker 0: Want to express yourself? Well, now you can. I always wonder how dumb this going sometimes can be. Speaker 7: TikTok, Speaker 0: Now owned by the Jews at BlackRock. Speaker 7: We're watching that. Speaker 0: Wow. I thought China owning our data was bad. Now you can't even say Zionist without getting flagged. Speaker 1: Straight up. It's like, give it back to China at this point. Speaker 0: Anything's better than Jews at this point. Speaker 1: Right? It's like take a freaking joke, let alone facts. Speaker 0: That's based. We go to John for some breaking news. Thanks, guys. Couldn't have said it better. 
And this just in, we're taking over Greenland because it was promised to us by Lucifer himself. So take it away, Satan. Speaker 14: By the way, what are we doing with Greenland? We gotta do something with Greenland. Where's my advance team? Go to Greenland. They must have some satellite needs or something that we could do there. But we are coloring the world blue. Speaker 0: So satanic. Speaker 1: Right? Isn't Greenland the central hub for the undersea data cables connecting North America, Europe, and Asia? Speaker 0: Bingo. Speaker 0: Ching Chong joins us live from Greenland. Speaker 1: We're here in Greenland, and not only is it located on a gold mine of rare earth minerals, but its freezing temperatures are the perfect natural coolant for the AI supercomputers needed to power the new world order that will enslave humanity. Eric Morsaw, break it down for us. Speaker 12: If you thought George Orwell's 1984 was a bad surveillance state, wait until you see what Israel's Palantir can do with AI technology or America. It's gonna make the movie The Matrix look mild. Speaker 1: Thanks, Eric. But to truly understand the endgame, you need to understand their ultimate prize, Jerusalem's Golden Dome. The satanic cabal believes controlling this one holy site lets them hijack God's story for billions and install the Antichrist. Let's hear what Trump's theme has to say about it. Speaker 5: We will have all everything we want. We're getting everything we want at no cost. Speaker 10: So the so the Golden Dome will be on Greenland? Speaker 5: A piece of it, yes. And it's a very important part because it's everything comes over Greenland. If the bad guys start shooting, it comes over Greenland. Speaker 1: So what he means by that is the satanic cabal is taking a piece of God's throne and putting it on their AI brain in Greenland to legitimize the antichrist. Speaker 6: Is that some sort of question? Speaker 1: How does that make you feel? Speaker 6: Get the out of our country. 
Speaker 10: So what are we talking about? An acquisition of Greenland? Are you going to pay for it? Speaker 5: I mean We're talking about it's really being negotiated now, the details of it, but essentially it's total access. It's there's no end. Speaker 0: We're making Iran great again, Venezuela, and now Greenland. How exciting. Speaker 1: Why can't we just fix this country? Speaker 0: Because Israel is our greatest ally. Speaker 1: Right, Shapiro Steen? Speaker 0: Well. I'm so sick of pretending we're Israel first. Speaker 10: I heard that. Just because you stupid goyim think you can expose our satanic agenda doesn't mean you won't fall for our next tie up. Dennis, shut this episode down or you're all fired. Speaker 0: Thanks, Shapiro Steen. Suck on this. Anyways, if you're still not following Jake GTV, you're either brainwashed or legally retarded. Speaker 15: I think I figured out where our data's going. Just let me hack into Homeland Security real quick, and we're in. Speaker 0: And time to get rid of their lice For antiseptic purposes, of course. Did you hear we gave Jake GTV a strike on his YouTube? Speaker 9: Oh, someone's hacked into our system. Another pizza cost. Speaker 1: Look who it is, my base fucking noticer. If you wanna stop wondering what's going on and know, check out my new book on jakegtv.com. Otherwise, just hit the like, comment, and subscribe, and I'll see you on the next one. Speaker 9: Did you hit him with a YouTube strike? Speaker 0: Sir, we did, but he's not stopping. Speaker 9: Shadow ban his accounts. We must shut him down before the red Speaker 7: heifer Speaker 0: is sacrificed.

Video Saved From X

reSee.it Video Transcript AI Summary
Israel has a history of bombing Gaza, claiming mistakes after killing civilians. Recent incidents include airstrikes killing Palestinians in tents, foreign aid workers, and a six-year-old girl. Israel often attributes these killings to errors or misidentifications, sparking global outrage. Despite advanced technology, Israel's military actions have resulted in numerous civilian casualties, leading to accusations of intentional harm and potential genocide.

Video Saved From X

reSee.it Video Transcript AI Summary
- The speakers claim that American financial institutions and tech companies are deeply involved in the Gaza killings. They name banks, pension funds, Amazon, Google, and Microsoft as having provided services and access to Palestinian data that enabled Israel to set up systems to mass target and kill Palestinians.
- They describe an application called Where is Daddy, asserting it allows the army to randomly track people and reach them even when they are with their families, facilitating harm.
- The discussion characterizes Israel as possessing the most sophisticated military in the region, knowing precisely what it is doing for two years, and notes that many Israeli soldiers are breaking down, with rising suicidality among young soldiers who have served.
- They argue that soldiers have been indoctrinated into becoming executioners of genocide, and that intervention is necessary to prevent further brutality.
- The speakers contend that much of this action is driven by people outside Israel who defend the regime, which they describe as having imposed a military dictatorship on Palestinians in the West Bank, Jerusalem, and Gaza (the latter until 2005), and as also affecting Israelis who are part of the system. They state that brutalizing others compromises humanity.
- Speaker 0 presses for clarification about the existence of the Where is Daddy app, asking if it was a dream or a claim already stated.
- Speaker 1 clarifies that Israel has developed an automated system to determine targets through computing, with data supplied by tech companies. He mentions Palantir as one company that publicly supports Israel. He references a public debate in which a Polish person protests that he is killing families, and the response is “you are killing civilians in Gaza,” to which the other person replies that the targets are “most probably terrorists.”

Video Saved From X

reSee.it Video Transcript AI Summary
- The discussion opens with claims that President Trump says “we’ve won the war against Iran,” but Israel allegedly wants the war to destroy Iran’s entire government structure, requiring boots on the ground for regime change. It’s argued that air strikes cannot achieve regime change and that Israel’s relatively small army would need U.S. ground forces, given Iran’s larger conventional force, to accomplish its objectives.
- Senator Richard Blumenthal is cited as warning about American lives potentially being at risk from deploying ground troops in Iran, following a private White House briefing.
- The new National Defense Authorization Act is described as renewing the involuntary draft; by year’s end, an involuntary draft could take place in the United States, pending full congressional approval. Dan McAdams of the Ron Paul Institute is described as expressing strong concern, arguing that the draft rests on the presumption that “the government owns you.”
- The conversation contrasts Trump’s public desire to end the war quickly with Netanyahu’s government, which reportedly envisions a much larger military objective in the region, including a demilitarized zone in southern Lebanon akin to Gaza, and a broader aim to remove Hezbollah. The implication is that the United States and Israel may not share the same endgame.
- Tucker Carlson is introduced as a guest to discuss these issues and offer predictions about consequences for the American people, including energy disruption, economic impacts, and shifts in U.S. influence in the Persian Gulf.
- Carlson responds that he would not credit himself with prescience, but notes predictable consequences: disruption to global energy supplies, effects on the U.S. economy, potential loss of U.S. bases in the Gulf, and a shrinking American empire. He suggests that the war’s true goal may be to weaken the United States and force its withdrawal from the Middle East; he questions whether diplomacy remains viable given the current trajectory.
- Carlson discusses Iran’s new supreme leader Khomeini’s communique, highlighting threats to shut Hormuz “forever,” vows to avenge martyrs, and calls for all U.S. bases in the region to be closed. He notes that Tehran asserts it will target American bases while claiming it is not an enemy of surrounding countries, though bombs affect neighbors as well.
- The exchange notes Trump’s remarks about possibly using nuclear weapons, and Carlson explains Iran’s internal factions, suggesting some seek negotiated settlements while others push for sustained conflict. Carlson emphasizes that Israel’s leadership may be pushing escalation in ways that diverge from U.S. interests and warns about the dangers of a joint operation with Israel, which would blur U.S. sovereignty in war decisions.
- The use of the term Amalek is explored: Carlson’s guest explains Amalek from the Old Testament as enemies of the Jewish people, with a biblical command to annihilate Amalek, including women and children, which the guest notes Christianity rejects; Netanyahu has used the term repeatedly in the conflict context, which Carlson characterizes as alarming and barbaric.
- The guests debate how much influence is exerted in the White House, with Carlson noting limited direct advocacy for war among principal policymakers and attributing decisive pressure largely to Netanyahu’s threats. They question why Israel, a client state of the U.S., is allowed to dictate war steps, especially given the strategic importance of Hormuz and American assets in the region.
- They discuss the ethical drift in U.S. policy, likening it to adopting the ethics of the Israeli government, and criticize the idea of targeting family members or civilians as a military strategy. They contrast Western civilization’s emphasis on individual moral responsibility with perceived tribal rationales.
- The conversation touches on the potential rise of AI-assisted targeting or autonomous weapons: Carlson’s guest confirms that in some conflicts, targeting decisions have been made by machines with no human sign-off, though in the discussed case a human did press play on the attack. The coordinates and data sources for strikes are scrutinized, with suspicion cast on whether Israel supplied SIGINT or coordinates.
- The guests warn about the broader societal impact of war on civil liberties, mentioning increasing surveillance and the risk that technology could be used to suppress dissent or control the population. They discuss how war accelerates social change and potentially normalizes drastic actions or internal coercion.
- The media’s role in selling the war is criticized as “propaganda,” with examples of government messaging and pop culture campaigns (including a White House-supported video-game-like portrayal of U.S. military power). They debate whether propaganda can be effective without a clear, articulated rationale for war and without public buy-in.
- They question the behavior of mainstream outlets and “access journalism,” arguing that reporters often avoid tough questions about how the war ends, the timetable, and the off-ramps, instead reinforcing government narratives.
- In closing, Carlson and his co-hosts reflect on the political division surrounding the war, the erosion of trust in media, and the possibility of rebuilding a coalition of ordinary Americans who want effective governance without perpetual conflict or degradation of civil liberties. Carlson emphasizes a longing for a politics centered on improving lives rather than escalating war.
- The segment ends with Carlson’s continued critique of media dynamics, the moral implications of the war, and a call for more transparent discussion about the true aims and consequences of extended military engagement in the region.

Breaking Points

REVEALED: Israel's AI Robot Killing Machine
reSee.it Podcast Summary
Antony Loewenstein, an independent journalist and author of "The Palestine Laboratory," discusses his new documentary series with Al Jazeera English. The documentary highlights Israel's significant role in the global arms industry, showcasing its advanced military technology at events like the Farra air show. Israel, despite its small population, ranks as the ninth largest arms producer, with over $13 billion in annual exports. Loewenstein emphasizes the impact of artificial intelligence in targeting civilians in Gaza, leading to increased casualties. He notes that many countries admire Israel's military strategies and surveillance technologies, which are being adopted globally. The documentary aims to raise awareness of these issues and their implications for human rights and global security. The first part is available on YouTube, with the second part set to release on February 6.