reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 outlines how manipulation operates and four simple ways to protect yourself, noting that deception is pervasive, and says he will also discuss the "Purim war" surrounding Trump. A time-saving tip: just say "So" or "That's all you have to say," let Mark Levin fill in the rest, and respond with "Nazi" on repeat. The speaker emphasizes game theory: treat others as they treat you, including groups like Zionists who censor those they deem antisemitic. People who meddle in others' lives should be excluded from power. He gives examples about racism and hiring, mentioning Amish people and Coca-Cola, and suggests social backlash from critics. He asserts that Monsanto has a history of slave ownership (Sephardi Jews as slave traders) and makes a broader point about who is reminded of slave-owning founders while Jewish slave owners go unmentioned. He references Intuition Machine and vows to complement it regarding manipulation.

Identity and perception: you have an identity you believe in, formed from background, family, and nation, and you ground your views on what you directly know through feeling, hearing, and seeing; physical causation and genuine human interaction round out three grounding pillars. Reasoning often relies on hearsay, information passed through others, which creates a grounding gap; as data moves through many steps, each step can be manipulated by those aiming to distort thinking.

The four manipulation methods:
- Filtering: presenting only part of the picture (e.g., reporting one war side's crimes, or climate data showing global warming but not local trends) and using imagery that frames dictators or enemies in a particular way, with crafted scenes designed to provoke a specific response.
- Presence of actors: conversations that seem honest but involve actors such as Ben Shapiro or Greta Thunberg, implying that what you hear may be staged; Greta's honesty is acknowledged, but interactions with her may be manipulated.
- Slogans and identity tactics: slogans like MAGA tie identity to policy implications, enabling manipulation by aligning beliefs with a brand; also fallacies and de-emphasizing evidence through various tricks.
- Other tactics: ad hominem attacks, false authorities, poisoning the well, weaponizing identity (e.g., American identity or the Patriot Act), social-proof coercion (being excluded from family events without vaccination), filter bubbles, paid demonstrators, and slow-escalation tactics (foot in the door leading to gradual war).

To protect yourself, he advises checking whether data are genuine and complete, identifying red flags, and distinguishing real causation from correlation. Ask whether the data were constructed, whether data are missing, and whether the actor is genuine or merely performing. He stresses staying close to direct experience and engaging with people you disagree with to test dogma. He also draws on contemporary geopolitical topics and individuals to illustrate these dynamics, including the Purim war narrative, Trump's alliances and critics, and military developments in the Middle East, Europe, and the U.S. In closing, the defense is to assess data authenticity, identify red herrings, determine whether a scene is theater or genuine, and consider who is speaking and whether they are an actor. The talk ends with a note about posting a cat video on Substack or X.
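The game-theory rule the speaker invokes ("treat others as they treat you") is the classic tit-for-tat strategy from iterated games. A minimal sketch, purely illustrative (the video gives no code; function and move names are assumptions):

```python
def tit_for_tat(opponent_history):
    """Cooperate first; afterwards, mirror the opponent's last move."""
    if not opponent_history:
        return "cooperate"
    return opponent_history[-1]

def play(rounds, opponent_moves):
    """Play tit-for-tat against a fixed sequence of opponent moves."""
    my_moves = []
    for i in range(rounds):
        # At round i, tit-for-tat has seen the opponent's first i moves.
        my_moves.append(tit_for_tat(opponent_moves[:i]))
    return my_moves

# Against an opponent who defects once then cooperates, tit-for-tat
# retaliates exactly once and then returns to cooperation.
print(play(4, ["defect", "cooperate", "cooperate", "cooperate"]))
# → ['cooperate', 'defect', 'cooperate', 'cooperate']
```

The point of the rule, as the talk uses it, is reciprocity: cooperation is the default, and defection is answered in kind rather than tolerated.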

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a tactic called the "wrap up smear." This tactic involves smearing someone with false information, then publicizing it and having it reported in the press. By doing so, the smear gains validation and credibility. The speaker emphasizes that this tactic is self-evident and a common strategy.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the tactics used by industries to spread disinformation when scientific findings go against their interests. They mention examples from various industries, such as coal, gas, energy, agriculture, and pharmaceuticals. The speaker believes that the pharmaceutical industry is the most skilled and devastating practitioner of these tactics. They describe tactics like "the blitz," where doctors and scientists who support inconvenient science are harassed. The speaker shares their personal experience of being targeted by the pharmaceutical industry and the censorship they faced on platforms like YouTube. They express their commitment to documenting and exposing these tactics in a book.

Video Saved From X

reSee.it Video Transcript AI Summary
Welcome to Cybersecurity 101. Today, we're discussing countering disinformation on social media. With the abundance of fake and dishonest information online, it's important to know how to identify it. In recent times, there has been a surge in false information about COVID-19. While some misinformation stems from ignorance, there are deliberate attempts to mislead, harm, or manipulate. This intentional spread of false information is known as disinformation. It can undermine trust in public health, leading to lower vaccine acceptance and adherence to safety protocols. Additionally, disinformation can divide communities, resulting in increased infections and deaths. In this lesson, we'll explore how social media is used to influence and provide strategies to identify and counter disinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
To undermine democratic institutions, it's not necessary for people to believe the information. The key is to flood the public space with misinformation, doubts, and conspiracy theories. This creates confusion and erodes trust in leaders, media, institutions, and even among citizens themselves. When people no longer know what to believe or trust, the damage is done.

Video Saved From X

reSee.it Video Transcript AI Summary
In this session, the speaker discusses how disinformation is not just about lies, but also about distorting and manipulating the truth. They introduce the 4 D's model: dismiss, distort, distract, and dismay. The audience is given cards to identify these tactics in quotes from different organizations. They discuss examples of dismiss, distort, and distract, and someone adds a fifth D, divide. The session focuses on the various ways people twist stories and attack those who present uncomfortable evidence.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker describes demonizing opponents and then using the "wrap up smear" tactic in politics: smearing someone with falsehoods, getting the smear reported in the press, and then using that coverage as validation. The tactic is described as self-evident.

Video Saved From X

reSee.it Video Transcript AI Summary
This video highlights a fake quote attributed to Donald Trump, claiming Republicans are easily fooled. The speaker urges viewers to report the person spreading this disinformation. They emphasize the danger of manipulating elections with false information and urge people to fact-check before believing everything they see or hear. The video stresses the importance of holding individuals accountable for spreading lies and misinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
The transcript centers on claims that the BBC manipulated coverage of a Trump speech in 2021, delivered just hours before the January 6 Capitol riot. It alleges that the BBC's Panorama segment heavily doctored Trump's words, splicing together two quotes taken about fifty-four minutes apart to imply that he encouraged an insurrection.

It presents the following clip as the BBC's version: "We're gonna walk down to the Capitol, and I'll be there with you. And we fight. We fight like hell." It then notes that this is not what Trump actually said at that moment, showing the actual wording: "We're gonna walk down to the Capitol, and we're gonna cheer on our brave senators and congressmen and women." The narrative claims that it wasn't until nearly an hour later that Trump said the second part of the BBC's version: "We're gonna walk down to the Capitol. And we fight. We fight like hell."

The account characterizes the BBC as a "holier than thou" public service broadcaster, questioning its credibility in light of the alleged manipulation, and points to the irony that the BBC runs its own fact-checking service, BBC Verify, described as countering disinformation. Throughout, the speaker emphasizes that by mixing two separate moments from Trump's remarks, the BBC's portrayal appears designed to suggest that Trump called for an insurrection, even though the actual words differ significantly and the statements were not part of a single, continuous message. The allegedly altered lines and their precise ordering are presented verbatim to illustrate the supposed manipulation.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a tactic called the "wrap up smear" in politics. This tactic involves smearing someone with false information, then publicizing it and using the press to validate the smear. It is a diversionary tactic used to demonize individuals or groups. The speaker believes this tactic is self-evident and worth considering.

Video Saved From X

reSee.it Video Transcript AI Summary
Let's discuss the recent propaganda tactics associated with Trump. This approach isn't new; it mirrors strategies used by autocrats like Hitler. Trump has effectively convinced a significant portion of the population that the system is rigged, fostering distrust in public institutions and the media. This tactic, which involves repeating a big lie, has historically led to disastrous societal outcomes. Despite legal consequences for figures like Rudy Giuliani, many still believe the false narratives. The overarching goal has been to undermine trust in our institutions, and Trump has succeeded in this regard, particularly through his promotion of "fake news."

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a strategy to manipulate public opinion by creating confusion and mistrust. They mention flooding a country's public square with raw sewage, raising questions, spreading dirt, and promoting conspiracy theories. The goal is to make citizens lose trust in their leaders, the mainstream media, political institutions, and even each other. Once trust is lost, the game is won.

Video Saved From X

reSee.it Video Transcript AI Summary
I spent years researching and watching lengthy videos to understand the influence of organizations like the Atlantic Council, which is heavily funded by U.S. government agencies, including the CIA and the Pentagon. This group trains journalists to identify and censor disinformation, particularly targeting populist narratives like those of Donald Trump and Brexit. They promote a framework called the "four D's" of disinformation: dismiss, distort, distract, and dismay. This framework allows them to label factually true information as disinformation if it undermines government narratives. The Atlantic Council's connections to high-ranking CIA officials and its role in shaping media narratives illustrate a troubling intersection of government and media, aiming to control public discourse and influence political outcomes.

Video Saved From X

reSee.it Video Transcript AI Summary
We have over 70 million voters supporting a man who has made extreme statements, including threats of violence against political opponents and intentions to undermine constitutional democracy. Despite clear quotes, many Trump supporters deny or downplay his words. This reflects a broader issue of disinformation and a troubling disregard for truth that has developed over the past decade. There is a significant lack of understanding of fundamental democratic principles, such as checks and balances and the rule of law. The challenge now is how to engage with those who seem disconnected from these civic basics and have been influenced by misinformation during this time.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a tactic called the "wrap up smear" in politics. This tactic involves demonizing someone by spreading falsehoods about them. The goal is to then use these false claims to validate the smear by pointing to media reports. This tactic is referred to as the "wrap up smear" because it involves merchandising the press's report on the smear. The speaker emphasizes that this tactic is a diversionary and self-fulfilling problem in politics.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker lays out how manipulation works and how to protect yourself, framing four simple ways people try to deceive you and pointing to pervasive uses in current events and media. The discussion also touches on a chaotic overview of the Trump-era conflict and related political narratives.

Key framework for manipulation:
- Identity and grounding: You have an identity and background you believe in, and you use your intelligence to form models of the world based on three pillars: direct perception (what you feel, hear, see), physical causation (objects moving, events happening), and genuine human interaction. As you move away from these pillars, data can be manipulated at each step, creating a grounding gap where outside actors can distort your thinking.

Four ways to manipulate (presented as four distinct methods):
1) Filtering: selecting or omitting information so the image you see is incomplete or distorted; for example, presenting one side of a war's crimes, or covering issues like global warming with selective reporting. Correlations can appear without full context, and constructed scenes can mislead you.
2) Constructed scenes and misdirection: an image tied to a dictator or a positive scenario is designed to push you toward a certain interpretation, not because of genuine causation but because the scene was created to influence thought.
3) Actors and inauthentic conversations: you may think you're having an honest exchange, but the interlocutor is someone else or an actor (examples cited include Ben Shapiro or Greta Thunberg in some contexts), suggesting that some discussions are performances to manipulate views rather than genuine expressions of belief.
4) Combining the above with propaganda tools: slogans and branding (like MAGA) tie to identity and imply broader policy directions; fallacies and deceptive reasoning (ad hominem, false authorities, poisoning the well) prevent evidence from changing beliefs; social proof and identity coercion (pressure within groups, "you must be for/against this to belong") can hijack thinking.

Consequences and signals of manipulation: grounding gaps appear when data is distant from direct perception and when intermediate steps are introduced between evidence and belief. Correlation is not causation; evaluate intent and construction (Was something created to fool you? Is it authentic? Are you seeing the complete data?).

Tactics used in campaigns and discourse: overwhelming audiences with slogans, fear, and constructed narratives; making the underlying data hard to check; deploying filter bubbles to isolate information; employing "foot in the door" to escalate commitments; and staging paid demonstrations or orchestrated events to shape perception.

Defensive approach: ensure data authenticity and completeness, check for red herrings and missing information, distinguish genuine encounters from acted portrayals, and seek direct, grounded understanding of events rather than secondhand interpretations. Seek out genuine interactions with people you disagree with to test the strength of your conclusions.

The speaker weaves in numerous political anecdotes and personal commentary about contemporary figures and events (Trump, Iran, Israel, Europe, media personalities, and various political actors) to illustrate how manipulation operates in real-world contexts, urging vigilance against data filtering, constructed scenarios, and identity-driven persuasion. The overall message: recognize grounding gaps, interrogate data provenance, and prioritize direct observation and authentic dialogue to protect your reasoning from manipulation.

Video Saved From X

reSee.it Video Transcript AI Summary
I discovered a clip from a nine-hour conference I watched in 2019, which reveals how the Atlantic Council, funded by U.S. taxpayer dollars, trains journalists from major media outlets to censor information that undermines government narratives. This organization, known as NATO's think tank, has seven former CIA directors on its board and receives annual funding from various military and intelligence agencies. They promote a framework called the "four D's" of disinformation: dismiss, distort, distract, and dismay. This training aims to suppress populist sentiments, particularly during the 2020 election cycle, by labeling factually true information as disinformation if it contradicts preferred narratives. The Atlantic Council's collaboration with Burisma, signed just before Trump's inauguration, highlights the intertwining of corporate interests and government actions in shaping public discourse.

Video Saved From X

reSee.it Video Transcript AI Summary
Throughout history, leaders and generals have used distraction, deception, untruths, and mixtures of truth in military campaigns. Speaker 1 says the government is capable of disinformation campaigns, psychological operations, and information warfare, and claims to have participated in information-warfare campaigns against Al Qaeda and ISIS involving deception, lies, misinformation, and disinformation to sway the target audience. Speaker 1 believes QAnon is a well-executed psyop (psychological operation) directed against the American people.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a tactic called the "wrap up smear" in politics. The tactic involves demonizing someone with false information, then using the press to validate the smear by reporting it. The speaker refers to this as "merchandising": using the press's report on the smear to further promote it. They emphasize that the tactic is a diversionary and self-fulfilling problem in politics.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 states that labeling Donald Trump's plan as Project 2025 is not rhetoric, and claiming Trump started an insurrection is a fact. Speaker 1 argues that both examples are rhetoric and factually incorrect. Trump has stated he has nothing to do with Project 2025 and has never been charged with insurrection. Speaker 1 accuses Speaker 0 of spreading misinformation and expresses shame.

Video Saved From X

reSee.it Video Transcript AI Summary
To weaken democratic institutions, it's not essential for people to believe disinformation. Overwhelming the public sphere with disinformation, raising questions, spreading dirt, and planting conspiracy theories can be enough to erode trust. Once citizens distrust leaders, mainstream media, political institutions, each other, and the possibility of truth, the goal is achieved.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker describes a deliberate strategy to corrode public trust by raising questions, spreading dirt, and planting conspiracy theories, causing citizens to doubt the credibility of leaders, mainstream media, political institutions, and even each other and the concept of truth. The aim is to overwhelm citizens with suspicion until any sense of shared reality dissolves, letting whoever orchestrates the tactic prevail. As the quoted passage puts it: you just have to flood a country's public square with enough raw sewage, raise enough questions, spread enough dirt, and plant enough conspiracy theorizing that citizens no longer know what to believe. Once they lose trust in their leaders, the mainstream media, political institutions, each other, and the possibility of truth, "the game's won." This is presented as a win for the manipulators.

Video Saved From X

reSee.it Video Transcript AI Summary
Information laundering occurs when lies are made to sound credible by being stated in Congress or a mainstream outlet. Examples of information laundering include Rudy Giuliani sharing bad intel from Ukraine and TikTok influencers claiming COVID can cause pain. Disinformation should not be supported with wallets, voices, or votes.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: Cognitive control runs deeper than simply changing what you think; it shapes the very process of how you think. Are your thoughts really your own? The talk breaks down techniques that sneak past your critical thinking to lead you to a conclusion, often without you realizing it: first weaponized language, then how reality itself can be distorted and simplified, and finally methods that control someone's entire environment.

Weaponizing words: words are the building blocks of thought, and these techniques create emotional shortcuts before logical analysis can wake up.
- Loaded language uses words packed with emotional baggage to evoke a reaction without evidence; contrast neutral terms with loaded ones (public servant vs. bureaucrat; estate tax vs. death tax).
- Paltering is lying by telling the truth: carefully choosing only true statements to create a misleading picture (e.g., "I did not have textual relations with that chatbot" to imply nothing happened).
- Obfuscation uses jargon to bury a simple truth under complexity.
- Rationalization uses emotion-then-logic to defend a decision as if it were purely rational.

Distorting and simplifying reality:
- Oversimplification reduces real, messy problems to slogans or black-and-white choices.
- Out-of-context quotes can make a statement appear to mean the opposite of what was intended.
- A limited hangout admits a small part of a story to appear transparent while hiding the rest.
- Pensée unique ("single thought") aims to render opposing viewpoints immoral or unthinkable, narrowing acceptable debate until only one thought remains.

Controlling the environment:
- Love bombing lavishes praise to secure acceptance, then isolates the person from their prior life to foster dependence.
- Operant conditioning (rewards and punishments on social platforms) shapes behavior.
- Milieu control creates an information bubble that blocks opposing views, discourages critical thinking, and uses its own language to isolate a population.

The core takeaway: recognizing these techniques is the first and best defense; awareness reduces their power. The toolkit promises to help you spot propaganda in ads, politics, online groups, and everyday arguments.

Speaker 1: Division is a deliberate strategy, not a bug in the system. Chapter one of the playbook focuses on twisting reality to control beliefs. Disinformation is the intentional spread of lies to spark outrage and distrust before facts can be checked, aiming to make you doubt truth itself. FUD (fear, uncertainty, doubt) paralyzes you; the "firehose of falsehood" overwhelms with a high volume of junk information across platforms, with no commitment to truth. Euphemism softens harsh realities ("civilian deaths" becomes "collateral damage"). The playbook hijacks emotions, demonizes opponents, and sometimes manufactures bliss to obscure problems. The long game demoralizes a population until voting and institutions feel meaningless; the endgame is to lock down power by breaking unity among people: pitting departments against each other, issuing nonnegotiable diktats, and launching coordinated harassment campaigns (flak) to deter dissent. The objective is poisoning reality to provoke confusion, manipulate emotions, and induce powerlessness. The antidote is naming and recognizing tactics (disinformation, FUD, demonization, and so on) to regain control of the conversation and build more honest, constructive discourse.

The information battlefield also uses framing, half-truths, gaslighting, foot-in-the-door tactics, guilt by association, labeling, and latitudes of acceptance to rig debates before they start. The Gish gallop overwhelms with rapid-fire claims; data overload creates a wall of complexity; glittering generalities rely on vague, emotionally charged terms to persuade without substance. Later chapters emphasize that recognizing the rules of the game lets you slow down, name the tactic, and guide conversations back to facts. The playbook's architecture: control reality, trigger emotions, build the crowd, and anoint a hero to lead. Understanding these plays is not meant to promote cynicism, but to enable clearer thinking and more honest dialogue.

Mark Changizi

How do we handle DISinformation? Moment 154
reSee.it Podcast Summary
Disinformation involves intentional lying, which is harder to maintain than misinformation; reputation networks should identify liars, not centralized fact checkers.