reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the use of sock puppets on Twitter and Facebook, as well as defensive and offensive tactics employed by anti-disinformation operatives. They mention techniques like doxxing and deception, and the use of merchandise sites to gather information. The speaker also talks about checking potentially malicious content sites, takedowns, and ensuring machine security.

Video Saved From X

reSee.it Video Transcript AI Summary
Recently, internal files from the Cyber Threat Intelligence League were released, revealing that US and UK military contractors were involved in censoring and using psychological operations and disinformation tactics against the American people. While some argue that social media platforms have the right to remove content that violates their terms of service, it is important to note that the government cannot encourage or promote actions that infringe upon freedom of speech. The whistleblower behind these files claims that the leader of the CTIL was present at the Obama White House in 2017, receiving instructions to counter disinformation and prevent a repeat of the events in 2016.

Video Saved From X

reSee.it Video Transcript AI Summary
In 2022, as Director of Information Security at The Intercept, the speaker wrote articles critical of Elon Musk's takeover of Twitter, including his purging of leftist accounts and reinstatement of neo-Nazis and anti-vaxxers. Subsequently, Musk permanently suspended the speaker's account, then reinstated it after a poll, but demanded deletion of a tweet. Instead, the speaker quit Twitter for a year. The speaker now works with a collective that makes open-source security and privacy software, including Syd.social, an app to delete data from X and migrate tweets to Bluesky. The speaker is also involved in Tesla Takedown, a nonviolent movement aiming to devalue Tesla stock and force Musk to sell shares to cover his Twitter debt. The goal is to trigger a Tesla stock "death spiral."

Video Saved From X

reSee.it Video Transcript AI Summary
FBI Special Agent Elvis Chan informed Twitter and other social media platforms about a potential hack and leak operation before the 2020 presidential election, despite having no evidence. The government had possession of Hunter Biden's laptop for a year and pre-bunked the story that eventually came out. The Aspen Institute held a tabletop exercise to prepare for a story about Hunter Biden and Burisma, with journalists and big tech executives present. The exercise aimed to discount and censor the story. Fifty-one former intelligence officials signed a letter claiming the laptop story was Russian disinformation, with the intention of helping Joe Biden win the election. This coordinated effort to suppress information is concerning and undermines democracy.

Video Saved From X

reSee.it Video Transcript AI Summary
Michael Shellenberger's CTIL files reveal a trove of documents exposing the involvement of governments in censorship. The documents describe the activities of the Cyber Threat Intelligence League (CTIL), an anti-disinformation group that worked closely with the US Department of Homeland Security (DHS) and military contractors. The whistleblower's documents reveal the genesis of modern digital censorship programs, partnerships with intelligence agencies and civil society organizations, and the use of offensive techniques like sock puppet accounts. The documents also show that CTIL aimed to become part of the federal government and had connections with FBI and CISA employees. The documents provide a comprehensive picture of the birth of the censorship industrial complex.

Video Saved From X

reSee.it Video Transcript AI Summary
In a congressional hearing, Michael Shellenberger and Matt Taibbi revealed details about the Cyber Threat Intelligence League (CTIL), a powerful organization involved in censoring free speech. They presented evidence of a censorship industrial complex, involving government agencies, contractors, and big tech platforms, that censors ordinary Americans and elected officials for holding disfavored views. The CTIL files exposed US and UK military contractors working to censor and conduct psychological operations against the American people. They also discussed the violation of the First Amendment by government agencies and the need for defunding and dismantling these organizations involved in censorship. The hearing highlighted the dangers of government-sponsored censorship and the importance of protecting free speech.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: When I first met Tim Ballard, he was in this wild legal fight, and Glenn Beck helped him build Underground Railroad. They were best friends. Whenever Sam or Tim needed to break a story about child trafficking, Glenn Beck was "his fucking dude." Then Tim was considering running for Senate or Congress, and with the momentum from Sound of Freedom, he seemed like a shoo-in, and he was set to upset some politician. After those attacks began, Glenn Beck "threw him under the bus," and Tim told me, "I can't believe that Glenn would fucking do that to me." That exact video I showed him—Tim's friend pledging allegiance to Israel, "he's bought and paid for," "not your friend," "controlled by our intelligence agencies," "Israel's bitch." Tim watched that one video and said, "holy fuck."
Speaker 1: Ryan, you might know this—the child ring Tim Ballard busted up in South America, depicted in Sound of Freedom, was Israeli-run. It was run by Israelis. The head of that ring escaped to Portugal, where a judge basically let him go, and nobody knows where that guy ended up. That's the real story of Sound of Freedom: an Israeli-run sex-trafficking ring. You're not told that. Do research and find out about it. That's who was running the ring. So there's a lot of interconnection—it's always them, man. It always comes back to them. It seems to always come back to them. It's like 6,000,000 to one odds.
Speaker 0: Every single time. Every single time. It's strange how that happens. But you wanna wrap it up, Sam?
Speaker 1: Yeah. Let's wrap it up. Listen, everybody. Twitter is not a free speech platform. It is not an open superhighway of information. It is a military application. It is a propaganda operation. It is highly botted, highly artificial, highly synthetic and manipulated. I'm not saying don't use it; I use it every day. We absolutely must use it as best we can, but I need everybody to be aware that not everything is as it seems on this platform. You cannot take this platform at face value. Many of the big accounts you see mainstreamed through your feed aren't to be taken at face value. They're running campaigns, being paid and boosted, with the algorithm manipulated by bots and inauthentic accounts. You must be aware of the battlefield you're engaging on. And I'm not saying you should leave. On the contrary, I want you here, battling. But it's not what it seems. There's a lot of smoke and mirrors, shadows, espionage, and spy games on this platform, and you need to be savvy. Don't develop mistrust of everybody, but develop a wary eye. Look at people's Twitter profiles, scroll through their feeds, see who they're retweeting, who they're boosting, who they're following, who their networks are, who's using the same message.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker testified about the censorship industrial complex, revealing that it is worse than previously thought. Internal files from the Cyber Threat Intelligence League showed military contractors working to censor and use psychological operations against Americans. While some argue that social media platforms have the right to censor content, the First Amendment prohibits the government from abridging freedom of speech. The whistleblower claims that the leader of the CTIL was present at the Obama White House in 2017 when instructed to create a counter-disinformation project. The Department of Homeland Security's Cybersecurity and Infrastructure Security Agency (CISA) played a central role in censorship, with other government agencies supporting it. The speaker calls for defunding and dismantling these organizations, as well as implementing oversight to prevent future censorship. They also suggest making liability protections contingent on transparent moderation and public reporting of censorship requests.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a coordinated disinformation campaign involving the Election Integrity Partnership (EIP), The New York Times, and the Department of Homeland Security. The EIP provided a blog post to The New York Times, which then published a defamatory article. The EIP later cited the article in its own report. The Department of Homeland Security was revealed to be involved in the campaign. The speaker highlights the censorship and silencing of right-wing voices on social media platforms, as well as the impact on public access to information. Lawsuits were filed against The New York Times and EIP, but were dismissed. The speaker suggests that this campaign will be used in the 2024 election.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 notes: "Have them write the information."
Speaker 1 points out that two people were sitting together: George Gao, director general of the Chinese CDC, and the deputy director of the CIA, who later became head of the entire intelligence community, at an event during the Wuhan military games two months before the Wuhan outbreak. They remark on how conveniently the two were seated near each other given how closely they would coordinate two months later. In this segment, social media is mentioned 19 times.
Speaker 2 comments that social media is now the primary channel for news, that interruptions to platforms could curb misinformation but also limit access to legitimate sources, and that health ministries worldwide are trying to combat misinformation and disinformation.
Speaker 1 describes the tabletop exercise: the deputy director of the CIA becomes head of the ODNI as soon as Biden takes office and is dealing with social media issues. The speaker notes that George Gao attended the exercise, asking why the simulation—which was about an animal-borne coronavirus outbreak in Latin America—had the China CDC head at the table and the U.S. ODNI head present, while the outbreak was said to start in Brazil, and there were no Latin American health officials present. The president of the UPS Foundation is mentioned as the only Latin-named figure. The speaker questions why the Brazil CDC director isn't in the exercise if it's simulating a Latin American outbreak and points to the arrangement as contradictory to the premise.
Speaker 3 repeats that experts agree new disinformation campaigns are generated daily, describing the problem as huge and potentially undermining pandemic response and governance.
Speaker 1 emphasizes disinformation keeping us from ending the pandemic, noting the Wuhan games are ongoing in Wuhan, and describing rumors that the US military engineered the virus and that USAID funded the work, with a web of claims about public health, vaccines, and pharmaceutical company misdeeds. The speaker asserts that Pfizer, Moderna, and Gates Foundation funding are involved, including claims that Moderna patented the coronavirus vaccine before the outbreak and that Moderna is a Pentagon arm with no prior successful vaccine.
Speaker 2 warns that unrest from false rumors and divisive messaging is rising and undermining response efforts as trust declines.
Speaker 1 mentions the "China CDC, in charge of the Wuhan lab," and notes that healthcare workers, if poorly trained, might give wrong information or say "I don't know," which erodes public trust.
Speaker 0 recalls a Sierra Leone radio interview about whether Ebola was man-made, highlighting the importance of the TOT (tabletop exercise) and ensuring that nobody suspects a man-made origin.
Speaker 4 proposes steps to prevent spreading misinformation on social media by collaborating with telecommunications companies to control information access and ensuring a trusted source floods the zone with messaging, including trained influential community leaders and health workers to disseminate the desired messaging.
Speaker 1 questions the idea of flooding the zone with messaging and notes the need for a rapid response to disinformation, while acknowledging that there are intelligence sources identifying foreign disinformation campaigns as part of a larger effort to address the pandemic.

Video Saved From X

reSee.it Video Transcript AI Summary
Many people overlook their options in dealing with misinformation on social media. Early detection is key to tracking and countering harmful narratives. Legal action can be taken against profit-driven disinformation networks. Fact-checking alone may not change beliefs, so building counter-narratives is crucial. Our organization helps detect, assess, and mitigate the impact of misinformation to prevent future issues. The recent events at the US Capitol highlight the real-world consequences of online disinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
We learn about DCLeaks and its connection to APT 28, a Russian military intelligence hacking group. The morning the Hunter Biden story broke in the New York Post, it was confusing. We didn't know what to believe, but it seemed like a possible hacking campaign by APT 28. Despite that, I didn't feel comfortable removing the content from Twitter.

Video Saved From X

reSee.it Video Transcript AI Summary
Before the 2020 election, a group involving DHS, NATO, and DNC planned a mass censorship campaign on social media with 4 Pentagon-linked institutions. They aimed to prevent questioning of mail-in ballot legitimacy. The group coerced tech companies to censor content through threats and pressure, resulting in millions of posts being banned or limited. The campaign was set up months before the election to avoid a crisis if the election results were disputed. The group's actions were based on the belief that a Biden victory would rely on mail-in ballots.

Video Saved From X

reSee.it Video Transcript AI Summary
To combat alleged misinformation, the censorship industrial complex used counterterrorism and intelligence tactics, including psychological operations, to shape domestic opinion. The speaker, a counterterrorism and counterespionage expert, was asked in 2008 to apply these same skill sets to the UFO community.

Video Saved From X

reSee.it Video Transcript AI Summary
The discussion touches on the concept of a "Ministry of Truth" and the efforts surrounding it. One speaker shares their background, emphasizing a focus on Internet censorship. They began as a corporate lawyer, then worked in the Trump White House as a speechwriter and advisor on technology issues. They later led the cyber division at the State Department, managing the relationship between government and major tech companies like Google and Facebook. This role involved facilitating communication and lobbying efforts between these companies and the government.

Video Saved From X

reSee.it Video Transcript AI Summary
The video discusses the CTI League, a group of volunteer cybersecurity experts, and their efforts to combat cybercrime and misinformation. The leaders of the CTI League aimed to build support for censorship and government involvement in cybersecurity. They promoted the concept of cognitive security and advocated for government censorship and counter-misinformation. The leaders had military backgrounds and sought to bring military tactics to social media platforms. They believed that misinformation could be treated as a cybersecurity problem. The report they published called for government, military, and intelligence involvement in censorship. They also suggested using information sharing and analysis centers to promote confidence in government. The leaders viewed disinformation as a political tool to change belief sets and internal narratives. They compared their proposed censorship model to that of the Chinese government.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses how the US Department of Defense censored Americans during the 2020 election cycle. They explain that a group within the Atlantic Council and the foreign policy establishment pushed for a permanent domestic censorship government office to counter misinformation and disinformation. This office was eventually established within the Department of Homeland Security (DHS) through an obscure cybersecurity agency called CISA. The speaker details how this agency, with the combined powers of the CIA and FBI, classified online misinformation as a cybersecurity attack on democracy. They further explain how Stanford University, the University of Washington, Graphika, and the Atlantic Council, all Pentagon-associated institutions, were involved in a coordinated mass censorship campaign to pre-censor any disputes about the legitimacy of mail-in ballots. This campaign involved pressuring tech companies to adopt new terms-of-service bans covering such speech. The speaker suggests that this censorship operation was orchestrated to ensure the perceived legitimacy of a Biden victory in the case of a "red mirage, blue shift" event. They also mention the connection between this operation and the impeachment of Trump in late 2019.

Video Saved From X

reSee.it Video Transcript AI Summary
Before the 2020 election, a group involving DHS, NATO, and the DNC planned a mass censorship campaign on social media to prevent disputing mail-in ballot legitimacy. They partnered with Stanford, University of Washington, Graphika, and the Atlantic Council, all linked to the Pentagon. Using threats and pressure, they forced tech companies to ban content questioning mail-in ballots. This was done to ensure public acceptance of a potential Biden victory due to mail-in ballots. The group aimed to control the narrative and prevent election crisis.

Video Saved From X

reSee.it Video Transcript AI Summary
Twitter censored the speaker's account in 2021 for sharing COVID vaccine-related information. Internal emails reveal that a Twitter employee named Michael Vincent Coe flagged a tweet for violating COVID misinformation policies. Coe, who has a business administration degree, dismissed the claims without providing evidence. Another Twitter employee, Joseph Guay, also flagged a tweet related to DARPA, questioning their involvement in funding vaccine research. Guay acknowledged that the article linked in the tweet discussed the topic accurately, but deemed the speaker's context as harmful and false. Both employees left Twitter around the same time. The speaker's lawyers are considering legal action.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker asks about the communication between government agencies and social media platforms. They mention email traffic and censorship activities that were not public. The speaker also discusses how the CDC had a partnership with Twitter, allowing them privileged access to flag misinformation. They mention the Virality Project, which is a collaboration between private entities and the government to surveil and censor social media. The speaker shares their personal experience of having their tweets censored and expresses concern about the violation of the First Amendment. They mention a court case that supports the idea that liking, commenting, and sharing are protected by the First Amendment. The speaker finds it appalling that the executive branch violated the First Amendment.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the intelligence community's efforts to share information with social media platforms to address inauthentic content. They clarify that the Office of the Director of National Intelligence (ODNI) would only participate in approved election security briefings with private companies like Twitter, YouTube, and Microsoft, and with state election officials. These briefings focused on discussing threats and had nothing to do with content moderation or with characterizing the Biden laptop as Russian disinformation. The speaker mentions that there were weekly meetings between the FBI, DHS, and Twitter, but only one reference to their office. They hope that this reference was part of the approved process for election security briefings.

Video Saved From X

reSee.it Video Transcript AI Summary
Many people are afraid to come forward about important issues because they fear the consequences, like what happened to Snowden. The speaker has been speaking out for three years and wonders why others are so afraid. They believe that those who are willing to die for their country should also be willing to speak up. The speaker addresses the audience and those watching online, urging them to come forward and help expose the truth. They mention disinformation sites like Metabunk and Contrail Science, run by someone named Mick West, who tries to discredit those who question persistent contrails. The speaker warns against sharing articles without verifying their credibility, as there are people paid to spread disinformation and make others look noncredible.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker works with the German Marshall Fund, which tracks Russian activities. The speaker directs the audience to hamilton68.com, a site created to monitor Russian trolls and bot armies. The goal is to provide the public with information to help them distinguish between legitimate speech and speech originating outside the country intended to create chaos. The speaker acknowledges the difficulty the country will face in discerning the origins and intent of different types of speech.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the spread of vaccine and election disinformation on social media platforms like Facebook. They emphasize the need for transparency in algorithms and engagement to hold platforms accountable. The discussion also touches on misinformation surrounding Donald Trump, Hunter Biden, and COVID-19. The speaker highlights the importance of self-policing by groups like lawyers and state medical boards to combat false information. Additionally, they mention the need for investigations into profiteering off the pandemic.

The Joe Rogan Experience

Joe Rogan Experience #1263 - Renée DiResta
Guests: Renée DiResta
reSee.it Podcast Summary
Renée DiResta began her research into online misinformation in 2015, initially focusing on anti-vaccine activity in California. She observed how small groups could amplify messages on social media, both through legitimate means and coordinated efforts to manipulate algorithms. This led her to explore how terrorist organizations like ISIS used similar tactics to spread propaganda. By late 2015, as discussions about ISIS intensified, attention shifted to Russian interference in social media, particularly following Adrian Chen's exposé on the Internet Research Agency (IRA). DiResta explained that the consolidation of social media platforms made it easier for propagandists to target specific audiences. The IRA created fake accounts that mimicked real people, often referred to as "sock puppets," to influence American discourse. By 2016, during the presidential campaign, these accounts were actively engaging in divisive conversations, often amplifying existing tensions. The IRA's strategy involved building communities around various identities, such as LGBT or African American groups, to foster in-group dynamics and subtly influence opinions. They created pages that appeared authentic and relatable, often using humor and cultural references to engage users. This long-term strategy aimed to normalize certain narratives and create divisions within American society. DiResta noted that the IRA's operations were sophisticated, employing tactics akin to those of a marketing agency, but with a focus on manipulation and disinformation. They targeted specific demographics and tailored their content to resonate with those audiences, often using memes and culturally relevant language. The conversation also touched on the challenges of moderating content on social media platforms. DiResta highlighted the difficulty of balancing free speech with the need to combat harassment and misinformation. 
She emphasized that the algorithms used by these platforms often exacerbate polarization, as they prioritize sensational content that generates engagement. As technology evolves, including advancements in deepfakes and AI-generated content, DiResta expressed concern about the potential for misinformation to escalate into real-world consequences. She pointed out that the ease of creating convincing fake identities and narratives could lead to significant societal disruptions. In conclusion, DiResta underscored the importance of understanding the mechanisms behind online disinformation and the need for accountability from social media platforms. She advocated for a multi-stakeholder approach to address these challenges, recognizing that the landscape of online communication is rapidly changing and requires ongoing vigilance and adaptation.