TruthArchive.ai - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Israel uses a system called Lavender to decide who to kill, assigning scores to Palestinians and drone striking those above a certain threshold. Palantir creates these "murder lists" by scraping data from Facebook, satellite imagery, and other surveillance sources, compiling personal information to assign weighted scores and identify targets. Palantir, founded by individuals with ties to the Israeli government and the CIA, built this surveillance platform in Israel to target Palestinians. Palantir also maintains an "enemies list" of 1 to 2 million US citizens for the CIA and federal law enforcement, classifying them as potential political dissidents. This database uses surveillance and AI to identify Americans deemed threats to the government, including those with anti-government views or potential involvement in domestic extremism.

Video Saved From X

reSee.it Video Transcript AI Summary
In 2021, Israeli intelligence developed an AI program called Lavender to target individuals in war. The system designated 37,000 Palestinians as targets, resulting in civilian casualties. The IDF used mass surveillance to assess the likelihood of each person being a militant and targeted them accordingly. The Lavender system tracked individuals with patterns similar to Hamas, leading to the deaths of many innocent civilians. This AI targeting system has similarities to the US surveillance system.

Video Saved From X

reSee.it Video Transcript AI Summary
Israel uses a system like Lavender to decide who to kill, assigning scores to Palestinians and drone striking those above a certain threshold. Palantir creates these "murder lists" by scraping data from Facebook, satellite imagery, and other sources to build databases with personal information, geolocation, bank information, healthcare information, and relationships. The company assigns weighted scores using algorithms and AI to advise the military on drone strikes. Palantir, founded by individuals with ties to the Israeli government and the CIA, allegedly built this platform in Israel to target Palestinians. Palantir also maintains an "enemies list" of one to two million US citizens for the CIA and federal law enforcement, classifying them as potential political dissidents. This database uses surveillance, AI, and weighted scores to identify Americans deemed a threat to the government, including those with anti-government views or those who might be a concern in a martial law or civil war scenario.

Video Saved From X

reSee.it Video Transcript AI Summary
Al Jazeera has collated evidence revealing shocking actions by the Israeli army, including the destruction of Khuza'a. Soldiers filmed themselves destroying Palestinian homes and rifling through women's underwear. Some Israelis express support for erasing Gaza. An Israeli song mocking Palestinians who have lost their homes is mainstream. Israel uses AI for targeting in Gaza, tracking people via phones and social media to create kill lists, prioritizing strikes on those at home using software called "Where's Daddy?". Despite extensive online videos, there is a lack of footage showing dead Hamas fighters. All 36 hospitals in Gaza have been attacked. A messenger sent by the Israelis into Nasser Hospital was killed after delivering their message. Members of the 202nd Paratroopers Battalion posted a video showing the killing of unarmed individuals. Deaths by snipers, including of children, are common. Journalists are targeted: over 10% of journalists in Gaza have been killed. Detainees are abused, with reports of beatings and sexual assault. Western media is criticized for double standards and biased coverage, particularly regarding human shields. There is no evidence Hamas uses human shields, while Israelis have been documented doing so. The West enables Israeli behavior. RAF Akrotiri is suspected of providing targeting information to Israel. Western politicians condemn Iranian attacks on Israel while excusing Israeli actions. The speaker urges support for independent media.

Video Saved From X

reSee.it Video Transcript AI Summary
First Speaker argues that Microsoft provided services and access to data, including Palestinian data, which allowed Israel to set up systems to mass target and mass kill Palestinians. They mention an application called "Where is Daddy?" that allows the army to track people and reach them when they are with their families in order to inflict the most harm, describing it as brutal. They emphasize the importance of understanding that this represents the end of humanity and of the civilization people have pretended to belong to. They claim Israel has the most sophisticated military in the region and has known exactly what it is doing for two years. They assert that many soldiers are breaking down and suicide rates are increasing among young Israelis who have served in the army, noting they are little older than teenagers and have been turned by indoctrination into willing executioners of a genocide. They call for intervention by people who love Israel to save what remains of it. First Speaker contends that the biggest harm is being done by those outside of Israel who defend the regime. They describe the regime as having imposed a military dictatorship for decades on Palestinians in the West Bank and Jerusalem, and until 2005 in Gaza, and claim this regime also extends to some Israelis who are part of the system. They argue that brutality toward others undermines one's own humanity. Second Speaker agrees and seeks clarification, asking whether there is an app, possibly by an American company, called "Where's Daddy" that allows the Israeli government to murder men in front of their children. They reference the prior statements and want confirmation of that claim. First Speaker responds that Israel has developed not just a system but an automated system that decides targets through computing, with the data provided by technology companies. They reiterate that this is part of a broader system of targeting.

Video Saved From X

reSee.it Video Transcript AI Summary
Israel uses a system called Lavender to decide who to kill, assigning scores to Palestinians and drone striking those above a certain threshold. Palantir creates these "murder lists" by scraping data from Facebook, satellite imagery, and other surveillance sources, compiling personal information to assign weighted scores and identify targets. Palantir, founded by individuals with ties to the Israeli government and the CIA, also maintains an "enemies list" of 1 to 2 million US citizens for the CIA and federal law enforcement. This list classifies Americans as potential political dissidents based on surveillance data and AI, assessing their threat level to the government, extremist views, and potential for anti-government activity in scenarios like martial law or civil war.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: So what do you think about the use of artificial intelligence or Lavender by the IDF in identifying Hamas targets? And secondly, do you agree with Elon Musk about that the population decline is a risk for humanity? Speaker 1: Look, I again, I'm not I'm not, you know, with without without going into all the you know, I I'm I'm not on top of all the details of what's going on in Israel because my my bias is to defer to Israel. It's it's not for us to to second guess every everything. And I I believe that broadly the IDF gets to decide what it wants to do and that they're broadly in the right. And that's that's sort of the the perspective I come back to and

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker states that artificial intelligence is being used to create mass assassinations, blurring the lines between assassination and warfare. They claim that many targets in Gaza are bombed as a result of AI targeting. The speaker emphasizes the connection between AI and surveillance, asserting that AI needs information to generate targets, ideas, or propaganda. Surveillance data from telephones and the internet is key to training the algorithms used to conduct these mass assassinations.

Video Saved From X

reSee.it Video Transcript AI Summary
I emerged from prison to find that artificial intelligence is now used for mass assassinations, blurring the lines between assassination and warfare. Many targets in Gaza are bombed due to AI targeting. The link between artificial intelligence and surveillance is crucial, as AI relies on data from phones and the internet to identify targets and generate propaganda. Surveillance data is essential for training these algorithms to carry out such operations.

Video Saved From X

reSee.it Video Transcript AI Summary
Natalie asks about the AI piece, expressing cynicism that there may be a push for a "war bot" to circumvent consumer AI limits that block starting wars with WMDs, and wonders if there is a benevolent reason. Matthew responds that it's worse than that: Hegseth described a platform to run on military desktops worldwide—secure, like ChatGPT or Claude but for the Pentagon and military services—that "doesn't allow information to get out." The core issue, he says, is who controls the AI, and he raises two key questions about the future of war with AI: who ultimately owns these AI platforms, and who informs them—who gives them the algorithms, the programming, and essentially the orders on how to answer questions. He notes increasing concerns about the reliability of information, including how ChatGPT handles questions about trustworthy news sources, mentioning that ChatGPT defers to institutional structures rather than historical accuracy. The risk, he says, is that military AI programs may not provide honest, candid, objective information to military personnel, but rather information based on narratives the Pentagon or the manufacturers want. A common belief is that technology makes war more precise and reduces civilian harm, but Matthew contends this is a myth. He explains that precision-guided munitions were not about preventing civilian casualties but about increasing efficiency—"the purpose was to make the weapons more efficient, so we had to drop less bombs to, say, blow up a bridge." He cites the small diameter bomb as evidence that the aim is not to limit civilian casualties but to allow more bombs to be delivered from each aircraft. He highlights real-world examples of AI in warfare, referencing Israeli systems in Gaza. He explains that three AI programs—Lavender, Gospel, and Where's Daddy?—play roles in targeting and timing strikes. Lavender scans the Internet and databases to identify targets (e.g., labeling someone as a Hamas supporter based on past online activity), and Where's Daddy? coordinates that information to ensure bombs hit resistance fighters "when they are with their families," not away from them. He notes reporting from Israeli media and +972 Magazine about these programs and urges viewers to examine that reporting; Tucker Carlson's coverage is mentioned as an example. Matthew argues this demonstrates the dystopian potential of AI in war and cautions against assuming American AI would be more benevolent. He mentions commentators' attempts to justify or excuse these actions, including a remark attributed to Mike Huckabee that "Israel did not attack Qatar. They just sent a missile into their country aimed at one person," noting the nearby injuries and deaths. He ends with a reminder of Orwell's reflections on war and the idea that those who cheer for war might be less enthusiastic if they experienced its costs, suggesting a broader aim to make the costs of war felt among the ruling elites who benefit from it.

Video Saved From X

reSee.it Video Transcript AI Summary
Where's daddy? And according to this reporting in September, again based on what Israeli military officials told these journalists, it's easier to kill somebody when they're home than it is when they're out in the street. So this AI will monitor the location of your cell phone, your smartphone. And when it arrives at the location that's been determined to be your residence, then there will be a ping to the guy who can target your house and bring it down. I guess whoever came up with this name, where's daddy, thought that many of them would be fathers. And that's why it's called where's daddy. When they reach their homes daddy's home, and then the entire house and everybody in it could be blown up.

Video Saved From X

reSee.it Video Transcript AI Summary
Former Israeli tank commander Ori Givati, of Breaking the Silence, discusses the killing of innocent people in Gaza and the lack of trust in IDF investigations. He explains how military policies in Gaza allow for the targeting of civilian homes and set a predetermined number of innocent civilians that may be killed to destroy a target. Givati highlights the use of an AI system to select targets, emphasizing the problematic nature of Israeli military tactics in Gaza.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker was asked about the IDF's use of AI, specifically Lavender, to identify Hamas targets. The speaker stated they are not on top of all the details of what's going on in Israel and their bias is to defer to Israel. They believe it's not for others to second guess everything and that broadly the IDF gets to decide what it wants to do and that they're broadly in the right. That is the perspective they come back to.

Video Saved From X

reSee.it Video Transcript AI Summary
An AI system marked 37,000 Palestinians in Gaza as suspected militants based on various factors. Despite knowing it made errors in 10% of cases, the IDF used this system to target individuals in their homes with unguided missiles, resulting in civilian casualties.

Video Saved From X

reSee.it Video Transcript AI Summary
Patrick Sarval is introduced as an author and expert on conspiracies, system architecture, geopolitics, and software systems. Ab Gieterink asks who Patrick Sarval is and what his expertise entails. Sarval describes himself as an IT architect, often a freelance contractor working with various control and cybernetics-oriented systems, with earlier experience including a Bitcoin startup in 2011, photography work for events, and involvement in topics around conspiracy thinking. He notes his books, including Complotcatalogus and Spiegelpaleis, and mentions Seprouter and Niburu in relation to conspiratorial topics. Gieterink references a prior interview about Complotcatalogus and another of Sarval’s books, and sets the stage to discuss Palantir, surveillance, and the internet. The conversation then shifts to explaining Palantir and its significance. Sarval emphasizes Palantir as a key element in a broader trend rather than focusing solely on the company itself. He uses science-fiction analogies to describe how data processing and artificial intelligence are evolving. In particular, he introduces the concept of a “brein” (brain) or “legion” that integrates disparate data streams, builds an ontology, and enables predictive analytics and tactical decision-making. Palantir is described as the intelligence brain that aggregates data from multiple sources to produce meaningful insights. Sarval explains that a rudimentary prototype of such a system operates under the name Lavender in Gaza, where metadata from sources like Meta (Facebook, WhatsApp, Instagram), cell towers, satellites, and other sensors are fed into Palantir. The system performs threat analysis, ranks threats from high to low, and then a military operator—still human—must approve the action, with about 20–25 seconds to decide whether to fire a weapon. 
The claim is that Palantir-like software functions as the brain behind this process, orchestrating data integration, ontology creation, data fusion, digital twins, profiling, predictions, and tactical dissemination. The discussion covers how Palantir integrates data from medical records, parking fines, phone data, WhatsApp contacts, and more, then applies an overarching data model and digital twin to simulate and project outcomes. This enables targeted marketing alongside military uses, illustrating the broad reach of the platform. Sarval notes there are two divisions within Palantir: Gotham (military) and Foundry (business), which he mentions to illustrate the dual-use nature of the technology. He warns that the system is designed to close feedback loops, allowing it to learn and refine its outputs over time, similar to how a thermostat adjusts heating based on sensor inputs. A central concern is the risk to the rule of law and human agency. The discussion highlights the potential erosion of the presumption of innocence and due process when decisions increasingly rely on predictive models and AI. The panel considers the possibility that in a high-stress battlefield scenario, soldiers or commanders might defer to the Palantir-presented "world view," making it harder to refuse an order. There is also concern about the shift toward autonomous weapons and the removal of human oversight in critical decisions, raising fears about the ethics and accountability of such systems. The conversation moves to the political and ideological backdrop surrounding Palantir's leadership. Peter Thiel, Elon Musk, and a close circle with ties to PayPal and other tech-industry figures are discussed. Sarval characterizes Palantir's leadership as ideologically defined, with statements about Zionism and a political worldview influencing how the technology is developed and deployed.
The dialogue touches on perceived connections to broader geopolitical influence, including the role of influence campaigns, media shaping, and the involvement of powerful networks in technology development and national security. As the discussion progresses, the speakers explore the implications of advanced AI and the “new generative AI” era. They consider the nature of AI and the potential for it to act not just as a data processor but as a decision-maker with emergent properties that challenge human control. The concept of pre-crime—predicting and acting on potential future threats before they materialize—is discussed as a troubling possibility, especially when a machine’s probability-based judgments guide life-and-death actions. Towards the end, the conversation contemplates what a fully dominated surveillance state might look like, including cognitive warfare and personalized influence through media, ads, and social networks. The dialogue returns to questions about how far Palantir and similar systems have penetrated international security programs, with speculation about Gaza, NATO adoption, and commercial uses beyond military applications. The speakers acknowledge the possibility of multiple trajectories and emphasize the need for checks and balances, transparency, and critical reflection on the power such systems confer upon a relatively small group of technologists and influencers. They conclude with a nod to the transformative and potentially dystopian future of AI-enabled surveillance and decision-making, cautioning against unbridled expansion and urging vigilance.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 raises two questions: first, what is the view on the use of artificial intelligence or Lavender by the IDF in identifying Hamas targets; second, whether they agree with Elon Musk about population decline being a risk to humanity. Speaker 1 responds: “Look, I again, I'm not I'm not, you know, with without without going into all the deep you know, I I'm I'm not on top of all the details of what's going on in Israel because my my bias is to defer to Israel. It's it's not for us to to second guess every everything. And I I believe that broadly the IDF gets to decide what it wants to do and that they're broadly in the right and that's that's sort of the the perspective I come back to.”

Video Saved From X

reSee.it Video Transcript AI Summary
An AI system called Lavender marked 37,000 Palestinians in Gaza as suspected militants based on small signs like phone usage. The Israeli military used this information to target and bomb these individuals, even though the system made errors in 10% of cases. This led to civilian casualties when unguided missiles were used on family homes, killing up to 20 civilians per suspected militant.

Video Saved From X

reSee.it Video Transcript AI Summary
Lavender is software reportedly developed by Palantir for the Israeli military (IDF). Following the events of October 7th, when the IDF entered Gaza, it utilized Lavender for targeting during bombings. This software structure allows IDF personnel to evade legal accountability for actions that may violate international law. It also alleviates moral responsibility, facilitating the process of warfare. A notable incident involved Peter Thiel, who became visibly distressed when questioned about the implications of this technology, realizing that while he might avoid legal repercussions, public opinion could still hold him accountable. This situation exemplifies how advancements in automation and weaponry are reshaping modern warfare in ways previously unimaginable.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 1 and Speaker 0 discuss the implications of AI in military use. They consider whether consumer AI is being bypassed with a secure, military-specific platform that would be sealed—essentially one-way in and no information out—for the Pentagon and military services. The key questions raised are: who controls the AI, who informs its algorithms, and who gives it its orders on how to answer questions, highlighting concerns about privatization and outsourcing of war. Speaker 1 argues that the future of war with AI hinges on two issues: ownership of AI platforms and the sources of their programming. They note that AI can deflect or defer to institutional structures rather than empirical accuracy, raising concerns about the reliability of information provided to military personnel. They also reference the myth that advancing technology automatically reduces civilian harm, citing that precision-guided munitions were designed for efficiency, not necessarily to prevent civilian casualties, noting that the intent was to reduce the number of bombs needed to achieve targets. The conversation shifts to the concept of precision in weapons. Speaker 1 points out that laser- and GPS-guided bombs were not primarily invented to minimize civilian casualties but to increase efficiency. They mention the small diameter bomb as an example, explaining that its use increases the number of bombs that can be deployed rather than primarily limiting collateral damage. The discussion then moves to real-world AI systems used in conflict zones. Speaker 1 cites Israeli programs—Lavender, Gospel, and Where’s Daddy?—as examples of nefarious and insidious AI in war. Lavender supposedly scans the Internet and other databases to identify targets, for example flagging someone as a Hamas supporter based on years of activity. Where’s Daddy? allegedly guides Israeli drones to strike fighters when they are with their families, not away from them. 
This reporting is linked to coverage from Israeli media and +972 Magazine, and Speaker 2 references Tucker Carlson's coverage of these issues. Speaker 2 amplifies the point by noting the emotional impact of such capabilities, arguing that targeting men when they are with their children is particularly disturbing. They also discuss broader political reactions, including a remark attributed to Ambassador Huckabee about Israel not attacking Qatar but "sending a missile there" that injured nearby people. Speaker 1 concludes by invoking Orwell's reflection on the Spanish Civil War, suggesting that those who cheer for war may be confronted by the consequences when modern aircraft enable distant bombing. They emphasize the need to make the costs of war felt by the ruling classes who benefit from it, not just the people on the ground.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 describes a 2021 proposal by the commander of Israeli intelligence to design a machine that would resolve the human bottleneck in locating and approving targets in war. A recent investigation by +972 Magazine and Local Call reveals that the Israeli army developed an AI-based system called Lavender to designate targets and direct airstrikes. During the initial weeks of the Lavender operation, the system designated about 37,000 Palestinians as targets and directed airstrikes on their homes. The system reportedly had an error rate of about 10%, and there was no requirement to verify the machine's data. The Israeli army systematically attacked targeted individuals at night in their homes while their whole family was present. An automated component, known as "Where's Daddy?", tracked targeted individuals and carried out bombings when they entered their family residences. The result, according to the report, was that thousands of women and children were killed by Israeli airstrikes. Israeli intelligence officers allegedly stated that the IDF bombed homes as a first option, and in several cases entire families were murdered when the actual target was not inside. In one instance, four buildings were destroyed along with everyone inside because a single target was in one of them. For targets marked as low level by Lavender, cheaper bombs were used, destroying entire buildings and killing mostly civilians and entire families. It was alleged that the IDF did not want to waste expensive bombs on "unimportant people," and it was decided that for every low-level Hamas operative Lavender marked, it was permissible to kill up to 15 or 20 civilians; for a senior Hamas official, more than 100 civilians could be killed. Most AI targets were never tracked before the war. Lavender analyzed information collected on the 2.3 million residents of the Gaza Strip through mass surveillance, assessing the likelihood of each person being a militant and assigning a rating from 1 to 100.
If the rating was high enough, the person and their entire family were killed. Lavender flagged individuals with patterns similar to those of Hamas members, including police officers, civil defense workers, relatives, and residents with similar names or nicknames. The report notes that this kind of tracking system has existed in the US for years. Speaker 1 presents a counterpoint: a "fine gentleman of the secret service" claims to provide a list of every threat made about the president since February 3 and profiles of every threat maker, implying that targets could be identified through broad data collection including emails, chats, and SMS. The passage suggests a tool akin to a Google search but including private communications. Speaker 0 adds that although some claim Israel controls the US, Joe Biden says Israel serves US interests. Speaker 2 asserts, "There's no apology to be made. None. It is the best $3,000,000,000 investment we make," and claims that without Israel the United States would have to invent an Israel to protect its regional interests. Speaker 0 closes reporting for Infowars, credited to Greg Reese.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker addresses the use of artificial intelligence or Lavender by the IDF in identifying Hamas targets. They state they are not on top of all the details of what’s happening in Israel and that their bias is to defer to Israel. They say it’s not for us to second guess everything. They conclude that broadly the IDF gets to decide what it wants to do and that they’re broadly in the right.

Video Saved From X

reSee.it Video Transcript AI Summary
The segment centers on a US-led Civil-Military Coordination Center in southern Israel, established in October 2025 to monitor the Gaza ceasefire. It showcases a map of the Strip, footage of trucks, and a Dataminr report. Dataminr is a private US tech company that uses artificial intelligence to mine social media in real time and issue warnings of critical situations, highlighting the growing relationship between private AI firms and militaries and signaling a structural shift in how warfare is conducted, who controls it, who profits, and how accountability works. Heidy Khlaaf, chief AI scientist at the AI Now Institute, explains that militaries rely too heavily on commercial technologies and are not investing in their own traceable, explainable models, instead using a "black box." Gaza provides the first confirmation that commercial AI models are being directly used in warfare, justified by speed at the cost of accuracy. The report asserts that Israel's war in Gaza was not driven solely by soldiers but also by data prediction, location tracking, drone feeds, and AI models built by private tech firms. Palantir is described as a key player, with reports claiming it supplied AI tools to help identify and accelerate the targeting of individuals in Gaza, though Palantir has denied these claims. Amazon and Google are said to have provided Israel with the cloud infrastructure needed for military AI systems; both companies maintain their services are commercial, not military. These tools are said to have shifted the war from human intelligence to a data industry. While defense contracting is not new, earlier conflicts such as the 2003 US invasion of Iraq relied more on informants and interrogations; AI then involved a human in the loop, with clearer military applications. Now the line between commercial and military use of AI is blurred, and corporations play a larger role.
A key question raised is what it means when a private AI company, rather than the state, controls the infrastructure the military depends on. Khlaaf notes that militaries are ceding control and state obligations to faulty technology developed by private companies with different incentives, which can lead to AI being used to evade accountability for mass civilian casualties caused by model inaccuracy. The analysis concludes that war is no longer just a battlefield—it is also about who builds and controls the software governing mass civilian data.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: Welcome back to Jake GTV news. Did you see ICE shooting American citizens? Speaker 1: I thought they were supposed to get rid of the illegals, though. Speaker 0: Me too. Let's go to Ching Chong on the murder scene. Speaker 1: Chloe and Michael, good morning. We're here in Minneapolis where ICE agents trained by Israel are causing chaos. We go to John for more. Speaker 0: Thanks, Ching Chong. Thought it was only Libtards who opposed this, but they are literally murdering Americans. Back to you in the studio. Speaker 2: Stand back. Speaker 1: Please don't hurt me, sir Ed. I'm here to get rid of the illegals, grandma. Speaker 0: Wow. Thanks, John. Check this out here. It's from the protest. Here we see an agent assault a woman for simply being at the protest. Speaker 3: Then Alex steps in to help her Speaker 0: get back on her feet, and Speaker 4: the agents pepper spray him and proceed to assault him. Speaker 0: They then proceed to remove his legally owned firearm and shoot him in the back roughly 10 times, not even kidding. Holy shit. Speaker 1: Please tell me they're gonna jail. Speaker 0: Nope. They're on administrative leave while the FBI pretends to care. Dude, what? Let's see what Trump's team has to say. Speaker 5: Very, very unfortunate incident. I don't like that he had a gun. I don't like the fact that he was carrying a gun. Speaker 6: You know, you can't have guns. You can't walk in with guns. You just can't. And you can't listen. You can't walk in with guns. You can't do that, but it's it's a very unfortunate incident. Speaker 7: Do you Speaker 1: agree with Trump, Steen? Speaker 6: Oh, hell yeah. Guns are bad now. Didn't you get the memo? Speaker 1: What about the second amendment? Speaker 6: It's all four d chess, honey. Trust the plan. Speaker 1: Sup, bro? How do you feel about ICE? Speaker 0: This country needs more Indians than blacks. Check your privilege. Speaker 1: Dude, when did everybody get so retarded? 
Was it the vaccines or something? We go to the investigation team to learn more. Speaker 8: Thanks, Ching Chung. So basically, we uncovered that not only is ICE Embassy located in Tel Aviv, but they're using the same technology they used to genocide the Palestinians. Speaker 0: It's a freaking Jewish spyware by Paragon Solutions called Graphite, and check this out. Tell me why Alex Pretty was googled a month prior to the shooting and, again, five minutes before his death. Make of that what you will. Back to you guys. Wow. Wasn't the Homeland Security's own Twitter page being run from Israel? Speaker 1: Yeah. Same with ICE's embassy, Tel Aviv to be exact. Speaker 0: Freaking Jews, man. Speaker 9: Shut it down. He was an unhinged lefty who thought our Chobus Goy Trumpstein was a dictator. He kicked the taillight the week prior, so he deserved to be gunned down like a dog. Speaker 1: Air that. Jeez, Producer Berg, chill. Speaker 0: Gosh, he's so Talmudic. Speaker 1: Right. Always victim. Speaker 0: Anyways, here's their emotional justification for cold blooded murder. Speaker 1: That was a pretty good leg kick. Speaker 0: Right? Let's get Shapiro Steen's take on this whole thing. Speaker 10: Just because we didn't arrest anyone for the Epstein files, genocide, or our poisonous mRNA doesn't mean we won't also get away with murdering Boyum. After all, he kicked a taillight. Speaker 0: Yeah. I guess you're right, Shapiro Steen. Israel is our greatest ally. Speaker 1: You're not getting a raise. Speaker 0: Discount on your only freaks? Speaker 1: Not a chance. Ching chong, take it away. Gosh, dude. You're such a weak little simp. She's a literal succubus. Speaker 0: Anyways, let's take a tour with the IDF, I mean ice. Whoops. What was your training like? We were supposed to be trained for this? Speaker 0: Yeah. We've got an antiseptic on the next block. Get ready to murder. Stop resisting. Did you see me shoot that senior citizen? Yeah. 
Definitely not an immigrant, he sure had it coming. Let's see what Diego's up to. Speaker 2: I will tell you this, brother. What? You know? I will tell you this. You raise your voice? I raise your voice. Speaker 1: Wow. Isn't that like against the law? Speaker 0: You'd think so but they'll end up getting paid administrative leave and mental health support. Speaker 1: Seriously? Speaker 0: Dead ass. If I Speaker 11: raise my voice, you'll erase Speaker 2: my Exactly. Yeah. Yeah. Speaker 11: Are you serious? You said, if I raise my voice, you'll erase my voice? Speaker 1: Yes. Mhmm. Mhmm. Ice. You guys are saving this country. Speaker 0: Didn't they kill that American woman last week? Renee Good or something? Speaker 1: That non chosen person? She was lesbian leftist Karen. Who cares? Speaker 0: Whatever you say, Daisy. No. Speaker 7: No. Shit. Shit. Oh my fucking god. What the fuck? What What the the fuck? Fuck? Speaker 0: You might be wondering, why Minneapolis? Tim Waltz ushered in a defund the police initiative, which created a perfect opportunity for Trump's team to bring about the first AI surveillance state. You know what they say, create the problem, usher in the solution. Tom, back to you. Exactly. Speaker 0: So Peter Thiel, a close advisor to J. D. Vance, founded Palantir, the company that built the AI surveillance system used to target sand people. That same technology was sold to ICE and rebranded as Immigration OS, creating a satanic surveillance network to monitor Americans. Speaker 9: Shut it down, Tom. That's not for the normies to understand. Keep it up and I'll turn you into a lampshade like I did with Jackie. Back to the Goyslop or you're canceled. Speaker 12: Goyslop Junior's Goyslop Filet is back, and it's got more seed oils than ever. Speaker 0: I hate myself. Goyslop Junior. Speaker 7: Go on. Speaker 6: Enjoy cancer. Speaker 1: Gosh, that looks good. 
Speaker 0: Producer Verk said if we stop talking about Palantir, Goyslap Junior will cater to the Super Bowl party. Speaker 1: Alright. Speaker 0: Zipped. Let's just have Eric Warsaw break it down for us. Speaker 12: Palantir. The same company that is run by the hardline Zionist Alex Karp who works closely with Israeli military, will now be in charge of America's civilian data collection. We built Foundry, which was just was used to distribute the COVID vaccine and saved millions of lives globally. Palantir is here to disrupt and make our the institutions we partner with the very best in the world, and when it's necessary to scare enemies and on occasion kill them. Speaker 12: And also, the target selections for the US military, police forces, and even target selections for ICE officers. Speaker 1: That's right, Eric. We're giving our data to the Israeli Jew whose AI targeted over fifty percent of the civilian deaths in Gaza. Here he is. Speaker 7: Your AI and your technology from Palestine to kill Palestinians. Speaker 13: Mostly terrorists. Speaker 1: And by terrorists, he means anyone who opposes their families being genocided, including women and children. This guy. Speaker 9: Shut it the heck down. Say goodbye to your Goyslav junior catering. Remember what happened to Charlie? You're next. Run the freaking commercials. Speaker 0: Want to express yourself? Well, now you can. I always wonder how dumb this going sometimes can be. Speaker 7: TikTok, Speaker 0: Now owned by the Jews at BlackRock. Speaker 7: We're watching that. Speaker 0: Wow. I thought China owning our data was bad. Now you can't even say Zionist without getting flagged. Speaker 1: Straight up. It's like, give it back to China at this point. Speaker 0: Anything's better than Jews at this point. Speaker 1: Right? It's like take a freaking joke, let alone facts. Speaker 0: That's based. We go to John for some breaking news. Thanks, guys. Couldn't have said it better. 
And this just in, we're taking over Greenland because it was promised to us by Lucifer himself. So take it away, Satan. Speaker 14: By the way, what are we doing with Greenland? We gotta do something with Greenland. Where's my advance team? Go to Greenland. They must have some satellite needs or something that we could do there. But we are coloring the world blue. Speaker 0: So satanic. Speaker 1: Right? Isn't Greenland the central hub for the undersea data cables connecting North America, Europe, and Asia? Speaker 0: Bingo. Speaker 0: Ching Chong joins us live from Greenland. Speaker 1: We're here in Greenland, and not only is it located on a gold mine of rare earth minerals, but its freezing temperatures are the perfect natural coolant for the AI supercomputers needed to power the new world order that will enslave humanity. Eric Morsaw, break it down for us. Speaker 12: If you thought George Orwell's 1984 was a bad surveillance state, wait until you see what Israel's Palantir can do with AI technology or America. It's gonna make the movie The Matrix look mild. Speaker 1: Thanks, Eric. But to truly understand the endgame, you need to understand their ultimate prize, Jerusalem's Golden Dome. The satanic cabal believes controlling this one holy site lets them hijack God's story for billions and install the Antichrist. Let's hear what Trump's theme has to say about it. Speaker 5: We will have all everything we want. We're getting everything we want at no cost. Speaker 10: So the so the Golden Dome will be on Greenland? Speaker 5: A piece of it, yes. And it's a very important part because it's everything comes over Greenland. If the bad guys start shooting, it comes over Greenland. Speaker 1: So what he means by that is the satanic cabal is taking a piece of God's throne and putting it on their AI brain in Greenland to legitimize the antichrist. Speaker 6: Is that some sort of question? Speaker 1: How does that make you feel? Speaker 6: Get the out of our country. 
Speaker 10: So what are we talking about? An acquisition of Greenland? Are you going to pay for it? Speaker 5: I mean We're talking about it's really being negotiated now, the details of it, but essentially it's total access. It's there's no end. Speaker 0: We're making Iran great again, Venezuela, and now Greenland. How exciting. Speaker 1: Why can't we just fix this country? Speaker 0: Because Israel is our greatest ally. Speaker 1: Right, Shapiro Steen? Speaker 0: Well. I'm so sick of pretending we're Israel first. Speaker 10: I heard that. Just because you stupid goyim think you can expose our satanic agenda doesn't mean you won't fall for our next tie up. Dennis, shut this episode down or you're all fired. Speaker 0: Thanks, Shapiro Steen. Suck on this. Anyways, if you're still not following Jake GTV, you're either brainwashed or legally retarded. Speaker 15: I think I figured out where our data's going. Just let me hack into Homeland Security real quick, and we're in. Speaker 0: And time to get rid of their lice For antiseptic purposes, of course. Did you hear we gave Jake GTV a strike on his YouTube? Speaker 9: Oh, someone's hacked into our system. Another pizza cost. Speaker 1: Look who it is, my base fucking noticer. If you wanna stop wondering what's going on and know, check out my new book on jakegtv.com. Otherwise, just hit the like, comment, and subscribe, and I'll see you on the next one. Speaker 9: Did you hit him with a YouTube strike? Speaker 0: Sir, we did, but he's not stopping. Speaker 9: Shadow ban his accounts. We must shut him down before the red Speaker 7: heifer Speaker 0: is sacrificed.

Video Saved From X

reSee.it Video Transcript AI Summary
Israel is using artificial intelligence (AI) to target and assassinate individuals in Gaza, even if it means killing Palestinian civilians. The Israeli military has a division called the targets division, which uses AI algorithms and automated software to accelerate target creation. The goal is to create a shock effect and continue the war, as previous operations have run out of targets. The AI tools have created 12,000 targets in this war alone, twice as many as in the entire 2014 war. The military has loosened restrictions on harming civilians, knowingly striking targets that may result in civilian casualties. This war policy, aided by AI, has led to civilian devastation and potential war crimes.

Video Saved From X

reSee.it Video Transcript AI Summary
- The speakers claim that American financial institutions and tech companies are deeply involved in the Gaza killings. They name banks, pension funds, Amazon, Google, and Microsoft as having provided services and access to Palestinian data that enabled Israel to set up systems to mass target and kill Palestinians. - They describe an application called Where is Daddy, asserting it allows the army to randomly track people and reach them even when they are with their families, facilitating harm. - The discussion characterizes Israel as possessing the most sophisticated military in the region, knowing precisely what it is doing for two years, and notes that many Israeli soldiers are breaking down, with rising suicidality among young soldiers who have served. - They argue that soldiers have been indoctrinated into becoming executioners of genocide, and that intervention is necessary to prevent further brutality. - The speakers contend that much of this action is driven by people outside Israel who defend the regime, which they describe as having imposed a military dictatorship on Palestinians in the West Bank, Jerusalem, and Gaza (the latter until 2005), and also affecting Israelis who are part of the system. They state that brutalizing others compromises humanity. - Speaker 0 presses for clarification about the existence of the Where is Daddy app, asking if it was a dream or a claim already stated. - Speaker 1 clarifies that Israel has developed an automated system to determine targets through computing, with data supplied by tech companies. He mentions Palantir as one company that publicly supports Israel. He references a public debate in which a Polish person protests that he is killing families, and the response is “you are killing civilians in Gaza,” to which the other person replies that the targets are “most probably terrorists.”
View Full Interactive Feed