reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 outlines how manipulation operates and four simple ways to protect yourself, noting that deception is pervasive, and says he will also discuss the "Purim war" surrounding Trump. A time-saving tip is to use the word "So" or "That's all you have to say," letting Mark Levin fill in the rest, with "Nazi" repeated in response. The speaker emphasizes game theory: treat others as they treat you, including groups like Zionists, who censor those they deem antisemitic. People should be excluded from power if they meddle in others' lives. He gives examples about racism and hiring, mentioning Amish people and Coca-Cola, and suggests social backlash from critics. He asserts that Monsanto's founders owned slaves (describing Sephardi Jews as slave traders) and makes a broader point about which founders are remembered as slave owners while Jewish slave owners go unmentioned. He references Intuition Machine and vows to complement it regarding manipulation. Identity and perception are discussed: you have an identity you believe in, formed from background, family, and nation, and you ground your views in what you directly know through feeling, hearing, and seeing; physical causation and genuine human interaction round out the three grounding pillars. Reasoning often relies on hearsay (information passed through others), which can create a grounding gap; as data moves through many steps, each step can be manipulated by those aiming to distort thinking. The four manipulation methods are described as follows:
- Filtering: presenting only part of the picture (e.g., reporting one war side's crimes, or climate data showing warming globally but not locally) and using imagery that frames dictators or enemies in a particular way, with scenes crafted to provoke a specific response.
- Presence of actors: conversations that seem honest but involve actors (examples cited include Ben Shapiro and Greta Thunberg), implying that what you hear may be staged; Greta's honesty is acknowledged, but interactions with her may be manipulated.
- Slogans and identity tactics: slogans like MAGA tie to policy implications and identity, enabling manipulation by aligning beliefs with a brand; also, fallacies and the de-emphasizing of evidence through various tricks.
- Other tactics: ad hominem attacks, false authorities, poisoning the well, weaponizing identity (e.g., American identity or the Patriot Act), social-proof coercion (being excluded from family events without vaccination), filter bubbles, paid demonstrators, and slow-escalation tactics (foot in the door to gradual war).
To protect yourself, he advises checking whether data are genuine and complete, identifying red flags, and distinguishing real causation from correlation. He suggests asking whether the data were constructed, whether data are missing, and whether the actor is genuine or merely performing. He stresses staying close to direct experience and engaging with people you disagree with to test dogma. He also raises several contemporary geopolitical topics and individuals to illustrate manipulation and political dynamics, including the Purim war narrative, Trump's alliances and criticisms, and various military developments in the Middle East, Europe, and the U.S. In conclusion, the defense is to assess data authenticity, identify red herrings, determine whether a scene is theater or genuine, and consider who is speaking and whether they are an actor. The talk ends with a note about posting a cat video on Substack or X.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 and Speaker 1 discuss government disinformation offices and transparency concerns.
- CISA's office of mis-, dis-, and malinformation (MDM) operated as a DHS unit focused on domestic threat actors, with archived details at cisa.gov/mdm. The office existed for two years, from 2021 to 2023, before being shut down and renamed after the foundation published a series of reports.
- The Disinformation Governance Board was formed around April 2022. CISA's Countering Foreign Influence Task Force, originally aimed at stopping Russian influence and repurposed to "stop Trump in the 2020 election," changed its name to the office of mis-, dis-, and malinformation and shifted its focus from foreign influence to 80% domestic, 20% foreign, one month before the 2020 election.
- Speaker 1 argues that the problems in the information environment are largely domestic, suggesting that an 80/20 focus on foreign versus domestic issues should be flipped.
- A June 2022 Hawley Senate committee link is highlighted, leading to a 31-page PDF that, as of now, represents the sum total of internal documents related to the office of mis-, dis-, and malinformation. The speaker questions why there is more transparency about the DHS MDM office from a whistleblower three years ago than from ten months of the current executive's time in power.
- The speaker calls for comprehensive publication of internal files: every email, text, and piece of correspondence from DHS MDM personnel, placed in a WikiLeaks/JFK-style publicly accessible database for forensic reconstruction of DHS actions during those years, to name and shame responsible individuals and prevent repetition.
- The video also references George Soros State Department cables published by WikiLeaks (from 2010), noting extensive transparency about the Open Society Foundations' relationship with the State Department fifteen years ago, compared with today. The claim is that the Open Society Foundations' activities through the State Department, USAID, and the CIA were weaponized to influence domestic politics while remaining secret, with zero disclosures to this day.
- The speaker questions why cooperative agreements between USAID and the Open Society Foundations, Omidyar Network, or Gates Foundation have never been made public, nor quarterly or annual milestone reports, network details, or the actual scope of funded activities. USAID grant descriptions on usaspending.gov are often opaque or misleading compared with the true activities funded.
- The speaker urges transparency across DHS, USAID, the State Department, the CIA, ODNI, and related entities, asking for open files and accountability. They stress the need to open these records now to inform the public and prevent recurrence, especially as midterm political considerations loom.

Video Saved From X

reSee.it Video Transcript AI Summary
In this session, the speaker discusses how disinformation is not just about lies, but also about distorting and manipulating the truth. They introduce the 4 D's model: dismiss, distort, distract, and dismay. The audience is given cards to identify these tactics in quotes from different organizations. They discuss examples of dismiss, distort, and distract, and someone adds a fifth D, divide. The session focuses on the various ways people twist stories and attack those who present uncomfortable evidence.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker states that the "Russian story" would be called a covert influence campaign if they were doing it. The speaker also claims they would be the last to say they've never tried a covert influence campaign.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker describes demonizing a target and then using the "wrap-up smear" tactic in politics. This involves smearing someone with falsehoods, getting the smear reported in the press, and then using that reporting as validation. The tactic is described as self-evident.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 and Speaker 1 discuss a network of alleged influence surrounding Tim Ballard, Glenn Beck, and broader geopolitical insinuations, tying activism and media narratives to covert operations and manipulation. Speaker 0 recalls meeting Tim Ballard during a period when he was pursuing controversial legal matters, noting that Glenn Beck helped him build Operation Underground Railroad and was Ballard's close ally in breaking stories on child trafficking. When Ballard contemplated a run for political office (Senate or Congress) and was poised to win after the release of Sound of Freedom, Speaker 0 says the attacks against him began. He claims that Glenn Beck subsequently "threw him under the bus," and cites his own video response to Ballard's reaction, arguing that Beck's loyalty had changed because Beck was "pledging allegiance to Israel," implying he was bought, paid for, and controlled by intelligence agencies. The point, according to Speaker 0, is that Beck was not Ballard's friend, and he shows Ballard a video to illustrate this shift. Speaker 1 adds a specific counter-narrative about the Sound of Freedom story. He asserts that the child trafficking ring Tim Ballard exposed in South America, depicted in the film, was actually "run by Israelis," and that its head escaped to Portugal, where a judge released him, after which no traceable location remains. Speaker 1 insists that this is the real story behind Sound of Freedom, asserts that the truth is not told to audiences, and urges listeners to research it independently. He reiterates the theme that "it's always them" and that "it always comes back to them." Speaker 1 then shifts to a broader media warning about Twitter, stating that it is not a free speech platform but "a military application," a propaganda operation that is highly artificial, synthetic, and manipulated.
He clarifies that he uses Twitter but urges users to recognize that not everything on the platform is as it seems. He warns that big accounts may be part of campaigns, with paid boosts, manipulated algorithms, bots, and inauthentic accounts. The advice is to be aware of the battlefield on which users engage: not to abandon the platform, but to be more discerning. He urges users to develop a wary eye by examining profiles, feeds, retweets, boosts, networks, and who is using the same messaging. Speaker 0 closes by reiterating the pattern of attention, influence, and alleged manipulation that ties these figures and narratives together, suggesting a recurring causal link between entertainment media, political ambition, and covert agendas.

Video Saved From X

reSee.it Video Transcript AI Summary
The speakers discuss the prevalence of biased and false news on social media, with some media outlets publishing these stories without fact-checking. They emphasize that this is extremely dangerous to our democracy, repeating this statement multiple times.

Video Saved From X

reSee.it Video Transcript AI Summary
To destabilize a country, one must inundate its public square with misinformation and doubt, eroding trust in leaders, media, institutions, and even fellow citizens. When people no longer believe in the concept of truth, the game is won.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 argues that it is all election interference, claiming they love to talk about disinformation and democracy, and that it's all disinformation. They say those people are great at cheating on elections and great at misinformation, disinformation. They claim these people are weaponizing the DOJ and the FBI, our election systems, and attacking free speech, and they're going into the states.

Video Saved From X

reSee.it Video Transcript AI Summary
The session discusses the use of misinformation tactics, including dismiss, distort, distract, and dismay. Participants analyze quotes to identify these tactics. Trump is cited as a prime example of spreading disinformation. The group also introduces a fifth tactic, divide, to the discussion. The audience actively engages in identifying these tactics throughout the session.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a strategy to manipulate public opinion by creating confusion and mistrust. They mention flooding a country's public square with raw sewage, raising questions, spreading dirt, and promoting conspiracy theories. The goal is to make citizens lose trust in their leaders, the mainstream media, political institutions, and even each other. Once trust is lost, the game is won.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker alleges that it is “all election interference” and that they are “great at cheating on elections, and they're great at misinformation, disinformation” (described as similar, but not the same). The speaker further claims that “they're weaponizing the DOJ and the FBI, our election systems, and attacking free speech, and they're also going into the states.”

Video Saved From X

reSee.it Video Transcript AI Summary
Our job is to control what people think by undermining the messaging.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a political tactic called the "wrap-up smear." It involves demonizing someone by spreading falsehoods about them, then pointing to media reports of those false claims to validate the smear. It is called the "wrap-up smear" because the press's report on the smear is then merchandised. The speaker emphasizes that this tactic is a diversionary, self-fulfilling problem in politics.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker lays out how manipulation works and how to protect yourself, framing four simple ways people try to deceive you and pointing to pervasive uses in current events and media. The discussion also touches on a chaotic overview of Trump-era conflict and related political narratives. Key framework for manipulation:
- Identity and grounding: You have an identity and background you believe in, and you use your intelligence to form models of the world based on three pillars: direct perception (what you feel, hear, see), physical causation (objects moving, events happening), and genuine human interaction. As you move away from these pillars, data can be manipulated at each step, creating a grounding gap where outside actors can distort your thinking.
- Four ways to manipulate (presented as four distinct methods):
1) Filtering: selecting or omitting information so the picture you see is incomplete or distorted, for example presenting only one side of a war's crimes, or reporting selectively on issues like global warming. Correlations can appear without full context, and entangled or constructed scenes can mislead you.
2) Constructed scenes and misdirection: an image tied to a dictator, or a positive scenario, designed to push you toward a certain interpretation, not because of genuine causation but because the scene was created to influence thought.
3) Actors and inauthentic conversations: you may think you are having an honest exchange, but the interlocutor is someone else or an actor (examples cited include Ben Shapiro and Greta Thunberg in some contexts), suggesting that some discussions are not genuine expressions of belief but performances designed to manipulate views.
4) Propaganda tools combined with the above: slogans and branding (like MAGA) tie to identity and imply broader policy directions; fallacies and deceptive reasoning (ad hominem, false authorities, poisoning the well) prevent evidence from changing beliefs; and social proof and identity coercion (pressure within groups, "you must be for/against this to belong") can hijack thinking.
- Consequences and signals of manipulation: "grounding gaps" appear when data is distant from direct perception and when intermediate steps are introduced between evidence and belief. Correlation is not causation; evaluate intent and construction (Was something created to fool you? Is it authentic? Are you seeing the complete data?).
- Tactics used in campaigns and discourse: overwhelming audiences with slogans, fear, and constructed narratives; making it hard to check the underlying data; deploying filter bubbles to isolate information; employing "foot in the door" to escalate commitments; and using paid demonstrations or orchestrated events to shape perception.
- Suggested defenses: ensure data authenticity and completeness, check for red herrings and missing information, distinguish genuine encounters from acted portrayals, and seek direct, grounded understanding of events rather than secondhand interpretations. Seek out genuine interactions with people you disagree with to test the strength of your conclusions.
The speaker weaves in numerous political anecdotes and personal commentary about contemporary figures and events (Trump, Iran, Israel, Europe, media personalities, and various political actors) to illustrate how manipulation operates in real-world contexts, while urging vigilance against data filtering, constructed scenarios, and identity-driven persuasion.
The overall message centers on recognizing grounding gaps, interrogating data provenance, and prioritizing direct observation and authentic dialogue to protect one's reasoning from manipulation.

Video Saved From X

reSee.it Video Transcript AI Summary
Throughout history, leaders and generals have used distraction, deception, untruths, and mixtures of truth in military campaigns. According to Speaker 1, the government is capable of disinformation campaigns, psychological operations, and information warfare. Speaker 1 claims to have participated in information warfare campaigns against Al Qaeda and ISIS, involving deception, lies, misinformation, and disinformation to sway audiences. Speaker 1 believes QAnon is a well-executed PSYOP (psychological operation) directed against the American people.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0, speaking in March 2024, argues for "deflating" the system. The core claim is that there exists a fake controlled opposition: "illiterion" puppets posing as opponents on each side, while in reality both sides serve the same agenda of totalitarian control and the controlling "illiterion" masters. The purpose of deflating, on this view, is to prevent the fake opposition from being bribed or blackmailed, which would otherwise keep control of the narrative and shape public perception. The speaker contends that in these large-scale systems there is no real democratic choice and never will be, and that the proposed solution is to deflate the parasitic system. The transcript then references David Icke and a claim about Donald Trump: "David Icke, Trump doubles down on support for COVID fake vaccines and boosters despite outcry from conservatives." The speaker questions Trump supporters, stating that "He was a fraud all along as I have said since 2016 and he has been leading you to glorious failure for the masters that own him. No politician is going to get us out of this. We have to do it." This positions Trump's stance on vaccines as illustrating a broader pattern of manipulation by a so-called masters' system, implying that political leaders are not the solution and that collective action is necessary outside the conventional political framework. The transcript also includes a claim attributed to Catherine Austin Fitts: "Trump put $10 billion into a program to depopulate The US." This assertion is presented as a sourced claim, accompanied by a prompt to like and follow and a source referenced as tumia.org. The overall narrative ties these points together to argue that both mainstream politics and alleged hidden forces operate to maintain control, and that true change requires deflating the parasitic system rather than relying on political figures or conventional democratic processes.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a political tactic called the "wrap-up smear": demonizing someone with false information, then using press reports of the smear to validate it. The speaker describes this as merchandising, using the press's report on the smear to promote it further. They emphasize that this tactic is a diversionary, self-fulfilling problem in politics.

Video Saved From X

reSee.it Video Transcript AI Summary
To weaken democratic institutions, it's not essential for people to believe disinformation. Overwhelming the public sphere with disinformation, raising questions, spreading dirt, and planting conspiracy theories can be enough to erode trust. Once citizens distrust leaders, mainstream media, political institutions, each other, and the possibility of truth, the goal is achieved.

Video Saved From X

reSee.it Video Transcript AI Summary
Jews are accused of using deceptive tactics to achieve their political goals. They allegedly manipulate situations by creating false narratives and dialectics. Instead of openly advocating for war or the genocide of Palestinians, they manufacture fake terrorist attacks and portray themselves as victims. This supposedly prompts the United States to fight their wars. The speaker claims that Jews tricked Americans with events like 9/11 and manipulated the situation to their advantage.

Video Saved From X

reSee.it Video Transcript AI Summary
In this video, the speaker discusses the concept of doublespeak, which is language designed to mislead or evade responsibility. They explain the different types of doublespeak, such as euphemisms and jargon, and provide examples of how it is used in various contexts. The speaker also mentions the importance of using the right words to evoke a desired response from the listener, as demonstrated by political strategist Frank Luntz. They highlight instances of doublespeak in graphs and advertisements, as well as the rebranding of words to create a more positive perception. The video concludes by emphasizing the need to question and clarify information when faced with doublespeak.

Video Saved From X

reSee.it Video Transcript AI Summary
During the debate, Speaker 0 accuses Speaker 1 of lying about a Russian plan and claims that there is overwhelming evidence of Russian engagement. Speaker 1 denies these allegations, stating that intelligence agencies and former heads of the CIA have called it garbage. Speaker 0 also accuses the FBI of cheating by telling Facebook and Twitter what to do. Speaker 2 believes that the objective is to stop Donald Trump and what he represents in the political process. Speaker 0 concludes by accusing Joe Biden of lying about a major scandal, calling it cheating and election interference on an unprecedented scale.

Video Saved From X

reSee.it Video Transcript AI Summary
Information laundering occurs when lies are made to sound credible by being stated in Congress or a mainstream outlet. Examples of information laundering include Rudy Giuliani sharing bad intel from Ukraine and TikTok influencers claiming COVID can cause pain. Disinformation should not be supported with wallets, voices, or votes.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: Cognitive control runs deeper than simply changing what you think; it shapes the very process of how you think. Are your thoughts really your own? We'll break down techniques that sneak past your critical thinking to lead you to a conclusion, often without you realizing it. We'll start with weaponized language, then show how reality itself can be distorted and simplified, and finish with methods that control someone's entire environment.

We begin with weaponizing words. Words are the building blocks of thought, and these techniques create emotional shortcuts before logical analysis can wake up. Loaded language uses words packed with emotional baggage to evoke a reaction without evidence. Example contrasts: neutral terms versus loaded ones (public servant vs. bureaucrat; estate tax vs. death tax). Paltering is lying by telling the truth: carefully choosing only true statements to create a misleading picture (e.g., "I did not have textual relations with that chatbot" to imply nothing happened). Obfuscation uses jargon to bury a simple truth under complexity. Rationalization uses emotion-then-logic to defend a decision as if it were purely rational.

Section two moves to distorting and simplifying reality. Oversimplification reduces real, messy problems to slogans or black-and-white choices. Out-of-context quotes can make a statement appear the opposite of what was meant. A limited hangout admits to a small part of a story to appear transparent while hiding the rest. Pensée unique ("single thought") aims to render opposing viewpoints immoral or unthinkable, narrowing acceptable debate until only one thought remains.

The final section covers controlling the environment. Love bombing lavishes praise to secure acceptance, then isolates the person from their prior life to foster dependence. Operant conditioning, via rewards and punishments on social platforms, shapes behavior; milieu control creates an information bubble that blocks opposing views, discourages critical thinking, and uses its own language to isolate a population. The core takeaway: recognizing these techniques is the first and best defense, since awareness reduces their power. The toolkit promises to help you spot propaganda in ads, politics, online groups, and everyday arguments.

Speaker 1: Division is a deliberate strategy, not a bug in the system. Chapter one of the playbook focuses on twisting reality to control beliefs. Disinformation is the intentional spread of lies to spark outrage and distrust before facts can be checked, aiming to make you doubt truth itself. FUD (fear, uncertainty, doubt) paralyzes you; the firehose of falsehood overwhelms with a high volume of junk information across platforms, with no commitment to truth. Euphemism softens harsh realities (civilian deaths becomes collateral damage). The playbook hijacks emotions, demonizes opponents, and sometimes creates manufactured bliss to obscure problems. The long game demoralizes a population to render voting and institutions meaningless, and the endgame is to lock down power by breaking unity among people: pitting departments against each other, issuing nonnegotiable diktats, and launching coordinated harassment campaigns (flak) to deter dissent. The objective is poisoning reality to provoke confusion, manipulate emotions, and induce powerlessness. The antidote is naming and recognizing the tactics (disinformation, FUD, demonization, etc.) to regain control of the conversation and build more honest, constructive discourse. The information battlefield uses framing, the half-truth, gaslighting, foot-in-the-door tactics, guilt by association, labeling, and latitudes of acceptance to rig debates before they start.
The Gish gallop overwhelms with rapid claims; data overload creates a wall of complexity; glittering generalities rely on vague, emotionally charged terms to persuade without substance. Chapter two and beyond emphasize that recognizing the rules of the game lets you slow down, name the tactic, and guide conversations back to facts. The playbook’s architecture: control reality, trigger emotions, build the crowd, and anoint a hero to lead. Understanding these plays is not to promote cynicism, but to enable clearer thinking and more honest dialogue.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker references a saying: "Tell a lie big enough, loud enough, and long enough, sooner or later people believe it." The speaker attributes the quote to Hitler. The speaker anticipates that the other person thought they were going to say Fauci. The speaker concludes by saying "same difference" and that they are aligned on that point.