reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
The transcript presents a cascade of allegations and observations surrounding the COVID-19 outbreak and related operations:
- It is claimed that "every time there is something that comes out that is in fact false information that is starting to actually hamper our ability to address the pandemic," and that for two months "we have all these modern technologies" were in place to respond after false information emerged.
- A suggestion is made that the outbreak was simulated or anticipated two months before it began, with dialogue implying the virus had already circulated for two months and that such foresight was connected to a simulated scenario tied to the Wuhan military games in October 2019.
- The speakers allege that information about a novel coronavirus with a furin cleavage site insertion leaked during the Wuhan games, and that "they" knew this and prepared a cover-up over the next two months, launching a tabletop exercise with media, intelligence agencies, the Gates Foundation, the World Economic Forum, and others.
- They describe a pandemic "tabletop exercise," Event 201, conducted with pharmaceutical executives, the deputy director of the CIA, Avril Haines (who later became the director of national intelligence), and others. They claim the exercise was run "the week of the Wuhan games" and that it simulated a global spread beginning with a coronavirus outbreak and evolving into a pandemic.
- The dialogue asserts that the exercise was "hosted at Johns Hopkins, funded by Bill Gates," and references a scenario in which pigs in Brazil, not pangolins in China, are the initial hosts, with the simulation detailing widespread illness, hospitalizations, and international travel turning local epidemics into a global pandemic.
- They allege that the Central Intelligence Agency, in 2015, under then-deputy director Avril Haines, approached Ralph Baric to discuss gain-of-function research on coronaviruses, and that Baric was in contact with the Wuhan Institute of Virology's Shi Zhengli (the "Bat Lady") about a possible project on coronavirus evolution in humans.
- The speakers question Haines's qualifications, noting she is described as a physicist and "research engineer," and question how she came to run the CIA and later the entire intelligence community, including participating in a coronavirus response simulation.
- They cite that day's reports about social media platforms detecting and removing accounts spreading pandemic-related misinformation, and argue that the deputy director of the CIA, who later became DNI, led a pre-pandemic censorship conference about pressuring social media to ban conspiracy theories that the virus originated from a lab or was linked to U.S. military projects.
- The overall narrative ties together claims of advance knowledge, pre-pandemic simulations (Event 201), connections between Johns Hopkins, the Gates Foundation, NATO, and corporate media, and the involvement of Avril Haines in both pre-pandemic discussions and post-pandemic censorship efforts.

Video Saved From X

reSee.it Video Transcript AI Summary
Renee DiResta, a speaker at the 4th Annual Cybersecurity Summit, discusses the power of partnerships in combating misinformation. She highlights the need for collaboration between government agencies, research institutions, and civil society organizations to address the spread of false and misleading information. DiResta emphasizes the importance of situational awareness, context, and resilience in countering harmful narratives. She suggests the establishment of a Center of Excellence within the federal government to coordinate efforts and promote effective communication. While acknowledging the need to respect civil liberties and prioritize free expression, DiResta emphasizes the urgency of addressing the current challenges posed by misinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
Welcome to Cybersecurity 101. Today, we're discussing countering disinformation on social media. With the abundance of fake and dishonest information online, it's important to know how to identify it. In recent times, there has been a surge in false information about COVID-19. While some misinformation stems from ignorance, there are deliberate attempts to mislead, harm, or manipulate. This intentional spread of false information is known as disinformation. It can undermine trust in public health, leading to lower vaccine acceptance and adherence to safety protocols. Additionally, disinformation can divide communities, resulting in increased infections and deaths. In this lesson, we'll explore how social media is used to influence and provide strategies to identify and counter disinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 and Speaker 1 discuss government disinformation offices and transparency concerns:
- CISA's office of mis-, dis-, and malinformation (MDM) operated as a DHS unit focused on domestic threat actors, with archived details at cisa.gov/mdm. The office existed for two years, from 2021 to 2023, before being shut down and renamed after the foundation published a series of reports.
- The Disinformation Governance Board was formed around April 2022. CISA's Countering Foreign Influence Task Force, originally aimed at stopping Russian influence and repurposed to "stop Trump in the 2020 election," changed its name to the office of mis-, dis-, and malinformation and shifted its focus from foreign influence to 80% domestic, 20% foreign, one month before the 2020 election.
- Speaker 1 argues that the problems in the information environment are largely domestic, suggesting that an 80/20 focus on foreign versus domestic issues should be flipped.
- A June 2022 Hawley Senate committee link is highlighted, leading to a 31-page PDF that, as of the recording, represents the sum total of internal documents related to the office of mis-, dis-, and malinformation. The speaker questions why there is more transparency about the DHS MDM office from a whistleblower three years ago than from ten months of current executive power.
- The speaker calls for comprehensive publication of internal files: every email, text, and piece of correspondence from DHS MDM personnel, placed in a WikiLeaks- or JFK-files-style publicly accessible database to allow forensic reconstruction of DHS actions during those years, to name and shame responsible individuals, and to prevent repetition.
- The video also references George Soros State Department cables published by WikiLeaks in 2010, noting the extensive transparency about the Open Society Foundations' relationship with the State Department fifteen years ago compared with today. The claim is that Open Society Foundations' activities through the State Department, USAID, and the CIA were weaponized to influence domestic politics while remaining secret, with zero disclosures to this day.
- The speaker questions why cooperative agreements between USAID and the Open Society Foundations, the Omidyar Network, or the Gates Foundation have never been made public, nor have quarterly or annual milestone reports, network details, or the actual scope of funded activities. USAID grant descriptions on usaspending.gov are often opaque or misleading compared with the true activities funded.
- The speaker urges transparency across DHS, USAID, the State Department, the CIA, ODNI, and related entities, asking for open files and accountability. They stress the need to open these records now to inform the public and prevent recurrence, especially as midterm political considerations loom.

Video Saved From X

reSee.it Video Transcript AI Summary
Stanford University, the University of Washington, Graphika, and the Atlantic Council were used as a front by the Department of Homeland Security (DHS) to manipulate social media during the 2020 election. The goal was to censor posts containing misinformation about mail-in ballots and other election-related topics. Because DHS lacked the legal authority to censor directly, it set up the Election Integrity Partnership (EIP) to fill the gaps. These outside organizations received federal funding and worked closely with DHS to ban or throttle millions of posts and accounts. The entire operation was orchestrated to rig the election. The question now is whether there will be political accountability for these actions.

Video Saved From X

reSee.it Video Transcript AI Summary
Our current focus is on prebunking to prevent misinformation rather than debunking it after the fact. Preemptively exposing people to weakened doses of fake news can build cognitive antibodies against manipulation. This approach is likened to a psychological vaccine. The initiative involves partnerships with Cambridge University, the Department of Homeland Security, and Google's Jigsaw unit.

Video Saved From X

reSee.it Video Transcript AI Summary
Renee DiResta, a speaker at the 4th Annual Cybersecurity Summit, discusses the power of partnerships in combating misinformation and disinformation. She highlights the need for collaboration between government agencies, research institutions, and civil society organizations to address the spread of false and misleading narratives. DiResta emphasizes the importance of situational awareness, effective communication, and the promotion of reliable information while respecting civil liberties and prioritizing free expression. She suggests the establishment of a Center of Excellence within the federal government to coordinate efforts and facilitate ongoing research and analysis. The goal is to counter harmful misinformation without infringing on individuals' rights to free speech.

Video Saved From X

reSee.it Video Transcript AI Summary
We developed real-world interventions, like the game Go Viral, to help people identify fake news about COVID-19. We collaborated with organizations, governments, and social media companies, including the UK Cabinet Office, the World Health Organization, and the United Nations' Verified campaign. Through our game Bad News, users experience a simulated social media feed and learn how misinformation spreads. Our research shows that people who go through our interventions become better at recognizing fake news, gain confidence in discerning fact from fiction, and share less fake news with others.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the establishment of the Election Integrity Partnership (EIP) at the request of the Department of Homeland Security (DHS). They cite an email from Graham Brookie of the Atlantic Council confirming the setup of the partnership. The speaker also highlights the connection between the Atlantic Council and the National Endowment for Democracy, which they link to the CIA. They point out the involvement of various organizations with ties to intelligence agencies, such as Stanford, UW, and Graphika. The speaker asserts that the EIP was not a secret operation and was formed because of CISA's funding and legal limitations. They express personal satisfaction in discovering and sharing this information.

Video Saved From X

reSee.it Video Transcript AI Summary
CISA lacked the capability and resources to address election disinformation. To bridge this gap, a project involving four institutions was quickly formed. The project collaborated with government partners such as CISA (DHS) and local and state governments; civil society groups including the NAACP, MITRE, Common Cause, and the Healthy Elections Project; and major platforms such as Facebook, Twitter, YouTube, TikTok, Reddit, and Nextdoor. Agreements for data access were made with some platforms, while analysts had to work individually with others.

Video Saved From X

reSee.it Video Transcript AI Summary
Renee DiResta, a key player in the censorship industrial complex, discusses the power of partnerships in combating misinformation. She highlights the collaboration between government agencies, research organizations, and social media platforms to censor disinformation. DiResta emphasizes the need to create a social norm that supports government censorship and justifies it as a means to prevent harm and protect national security. She proposes the establishment of a Center of Excellence within the federal government to coordinate efforts, deploy experts, and promote resilience products. DiResta acknowledges the importance of respecting civil liberties and free expression while prioritizing effective communication and situational awareness.

Video Saved From X

reSee.it Video Transcript AI Summary
Renee DiResta, a speaker at the 4th Annual Cybersecurity Summit, discusses the power of partnerships in combating misinformation and disinformation. She highlights the need for collaboration between government agencies, research institutions, and civil society organizations to address the spread of false and misleading narratives. DiResta emphasizes the importance of situational awareness, effective communication, and the promotion of reliable information while respecting civil liberties and prioritizing free expression. She suggests the establishment of a Center of Excellence within the federal government to coordinate efforts and facilitate ongoing research and analysis. The goal is to mitigate the impact of harmful misinformation and protect democratic institutions and public health.

Video Saved From X

reSee.it Video Transcript AI Summary
Many people overlook their options in dealing with misinformation on social media. Early detection is key to tracking and countering harmful narratives. Legal action can be taken against profit-driven disinformation networks. Fact-checking alone may not change beliefs, so building counter narratives is crucial. Our organization helps detect, assess, and mitigate the impact of misinformation to prevent future issues. The recent events at the US Capitol highlight the real-world consequences of online disinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
We actively addressed disinformation and misinformation during the pandemic and the US election by collaborating with the editing community. This model will be used in future elections globally. We aim to identify threats early by working with governments and other platforms to understand the landscape.

Video Saved From X

reSee.it Video Transcript AI Summary
CISA lacked funding and legal authorizations to understand election disinformation. To bridge this gap, four institutions collaborated to fill the void left by the government. The cooperation between government and tech platforms proved effective, resulting in numerous papers discussing takedowns. However, two challenges remain: how to sustain this collaboration and the lack of federal preparation in identifying and analyzing election misinformation and disinformation. The absence of a clear federal lead and limitations within the IC and FBI hindered progress. CISA provided support but lacked real capability due to unclear legal authorities, including concerns regarding the First Amendment.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses how the US Department of Defense censored Americans during the 2020 election cycle. They explain that a group within the Atlantic Council and the foreign policy establishment pushed for a permanent domestic censorship office within the government to counter misinformation and disinformation. This office was eventually established within the Department of Homeland Security (DHS) through an obscure cybersecurity agency called CISA. The speaker details how this agency, with the combined powers of the CIA and FBI, classified online misinformation as a cybersecurity attack on democracy. They further explain how Stanford University, the University of Washington, Graphika, and the Atlantic Council, all Pentagon-associated institutions, were involved in a coordinated mass censorship campaign to pre-censor any disputes about the legitimacy of mail-in ballots. The campaign involved pressuring tech companies to adopt new terms-of-service rules banning such speech. The speaker suggests that this censorship operation was orchestrated to ensure the perceived legitimacy of a Biden victory in the event of a "red mirage, blue shift" scenario. They also mention the connection between this operation and the impeachment of Trump in late 2019.

Video Saved From X

reSee.it Video Transcript AI Summary
Before the 2020 election, a group involving DHS, NATO, and the DNC planned a mass censorship campaign on social media to prevent disputing mail-in ballot legitimacy. They partnered with Stanford, University of Washington, Graphika, and the Atlantic Council, all linked to the Pentagon. Using threats and pressure, they forced tech companies to ban content questioning mail-in ballots. This was done to ensure public acceptance of a potential Biden victory due to mail-in ballots. The group aimed to control the narrative and prevent election crisis.

Video Saved From X

reSee.it Video Transcript AI Summary
The panel discussion focuses on how major platforms like Google, Twitter, and Facebook are addressing false and misleading narratives surrounding COVID-19. The speakers discuss their policies and strategies for moderating and mitigating misinformation. They highlight the importance of providing authoritative information, removing harmful content, and addressing borderline content that could lead to vaccine hesitancy. The panelists also acknowledge the challenges of handling misinformation during a rapidly evolving crisis and emphasize the need for flexibility and adaptability in their approaches. They mention the use of AI systems and human review to sift through vast amounts of data and the importance of partnerships with health authorities and fact-checking organizations.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the intelligence community's efforts to share information with social media platforms to address inauthentic content. They clarify that the Office of the Director of National Intelligence would only participate in approved election security briefings with private companies like Twitter, YouTube, and Microsoft, and with state election officials. These briefings focus on discussing threats and have nothing to do with content moderation or with labeling the Biden laptop story as Russian disinformation. The speaker notes that there were weekly meetings between the FBI, DHS, and Twitter, but only one reference to their office, and they hope that reference was part of the approved process for election security briefings.

Video Saved From X

reSee.it Video Transcript AI Summary
CISA lacked the funding and legal authority to address election disinformation. A project involving four institutions was created to fill this gap. Cooperation between government and tech platforms has been effective, producing 60-70 papers on takedowns. Challenges include maintaining this collaboration. The federal government was unprepared to handle election misinformation and disinformation because of limitations in the intelligence community and the FBI. CISA lacked real capability and faced unclear legal authorities, including First Amendment concerns.

Video Saved From X

reSee.it Video Transcript AI Summary
I founded the Alethea Group in 2019 after working on a Senate campaign. We developed strategies to combat disinformation and influence operations. This issue is a national security concern, not just a political one. The Alethea Group consists of diverse experts aiming to tackle this challenge. My background in government consulting and policy helped shape my approach. Disinformation targets voter turnout and candidate choice. Governments, especially the US, have the resources to combat disinformation effectively. We need to shift our approach to disinformation and address it legislatively.

Video Saved From X

reSee.it Video Transcript AI Summary
The Biden administration plans to bring together democracies in a transatlantic summit to address threats to democracy. The European Union also wants to establish a transatlantic digital marketplace and work together against disinformation. Mainstream platforms like Twitter and Facebook have started labeling misinformation, and there is hope that nefarious movements will decline. The EU's Democracy Action Plan, which includes introducing costs for spreading disinformation, is seen as a game changer. The Digital Services Act and Digital Markets Act proposed by the European Commission are steps in the right direction. Collaboration between governments, civil society, and industry is crucial, as disinformation is a growing threat that requires a collective response.

Video Saved From X

reSee.it Video Transcript AI Summary
Before the 2020 election, a coordinated censorship campaign was launched. It involved the Department of Homeland Security, NATO, and the DNC, leveraging institutions like Stanford University, the University of Washington, Graphika, and the Atlantic Council, many with ties to the Pentagon. These groups, many staffed by former intelligence officials, worked together to suppress discussion questioning the legitimacy of mail-in ballots. They used a multi-step plan to pressure social media companies into adopting new policies banning content that undermined public confidence in the election process, leveraging threats of government action and media allies. Millions of posts across multiple platforms were censored or suppressed. The goal was to prevent questions about the election outcome, anticipating a potential crisis if initial results appeared to favor Trump before shifting to Biden.

Video Saved From X

reSee.it Video Transcript AI Summary
Stanford researcher Renee DiResta and journalist Matt Taibbi discuss the revelation that Congress has access to people's emails through subpoena power. They also confirm that the Election Integrity Partnership (EIP) was a Department of Homeland Security (DHS) operation from the beginning, as previously established. Twitter's Lisa Roman reveals that CISA received grants to build a web portal for reporting election-related misinformation. The committee report highlights gaps in legal authorities and coordination for addressing election misinformation. DiResta explains how DHS has taken on a role similar to a domestic CIA for information control. The federal government lacked preparedness and coordination in identifying and analyzing election misinformation, as the intelligence community and the FBI operate under specific limitations.

The Joe Rogan Experience

Joe Rogan Experience #1263 - Renée DiResta
Guests: Renée DiResta
reSee.it Podcast Summary
Renée DiResta began her research into online misinformation in 2015, initially focusing on anti-vaccine activity in California. She observed how small groups could amplify messages on social media, both through legitimate means and through coordinated efforts to manipulate algorithms. This led her to explore how terrorist organizations like ISIS used similar tactics to spread propaganda. By late 2015, as discussions about ISIS intensified, attention shifted to Russian interference on social media, particularly following Adrian Chen's exposé on the Internet Research Agency (IRA).

DiResta explained that the consolidation of social media platforms made it easier for propagandists to target specific audiences. The IRA created fake accounts that mimicked real people, often referred to as "sock puppets," to influence American discourse. By 2016, during the presidential campaign, these accounts were actively engaging in divisive conversations, often amplifying existing tensions.

The IRA's strategy involved building communities around various identities, such as LGBT or African American groups, to foster in-group dynamics and subtly influence opinions. They created pages that appeared authentic and relatable, often using humor and cultural references to engage users. This long-term strategy aimed to normalize certain narratives and create divisions within American society. DiResta noted that the IRA's operations were sophisticated, employing tactics akin to those of a marketing agency, but with a focus on manipulation and disinformation. They targeted specific demographics and tailored their content to resonate with those audiences, often using memes and culturally relevant language.

The conversation also touched on the challenges of moderating content on social media platforms. DiResta highlighted the difficulty of balancing free speech with the need to combat harassment and misinformation. She emphasized that the algorithms used by these platforms often exacerbate polarization, as they prioritize sensational content that generates engagement.

As technology evolves, including advancements in deepfakes and AI-generated content, DiResta expressed concern about the potential for misinformation to escalate into real-world consequences. She pointed out that the ease of creating convincing fake identities and narratives could lead to significant societal disruptions.

In conclusion, DiResta underscored the importance of understanding the mechanisms behind online disinformation and the need for accountability from social media platforms. She advocated for a multi-stakeholder approach to address these challenges, recognizing that the landscape of online communication is rapidly changing and requires ongoing vigilance and adaptation.