reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the use of sock puppets on Twitter and Facebook, as well as defensive and offensive tactics employed by anti-disinformation operatives. They mention techniques like doxxing and deception, and the use of merchandise sites to gather information. The speaker also talks about checking potentially malicious content sites, takedowns, and ensuring machine security.

Video Saved From X

reSee.it Video Transcript AI Summary
In this video, the speaker discusses the evolution of disinformation in the context of the 2016 and 2020 elections. In 2016, the focus was on foreign disinformation, primarily from Russia, spread through fake accounts and coordinated efforts. However, in the 2020 US election, the disinformation was mostly domestic, originating from authentic accounts, including verified pundits and everyday people. While there were some foreign activities, they played a minor role. The disinformation campaign was not entirely coordinated but rather cultivated and organic, with blue check accounts being major spreaders. This shift highlights the changing nature of disinformation and the need to address it from a different perspective.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 describes Tim Ballard as having worked with Glenn Beck to build Operation Underground Railroad, portraying Beck as Ballard’s close ally whenever Ballard needed to break a story on child trafficking. When Ballard considered running for Senate, where momentum from the Sound of Freedom release would likely have carried him to a win, the attacks began, and Glenn Beck reportedly “threw him under the bus.” Speaker 0 asserts that Beck pledged allegiance to Israel and is “bought and paid for” and “Israel's bitch,” claiming Ballard watched a video and realized this. Speaker 1 adds a claim about the Sound of Freedom narrative: the child trafficking ring Ballard busted in South America, depicted in the movie, was an Israeli-run sex trafficking ring. The head of that ring allegedly escaped to Portugal, where a judge let him go, and nobody knows where he ended up. The speakers state that this is the real story of Sound of Freedom, that “It was an Israeli run sex trafficking ring,” that audiences are not told this, and they urge others to research it. Speaker 1 then transitions to commentary on Twitter, stating that Twitter is not a free speech platform or an open information highway; it is a military application and a propaganda operation that is heavily botted, highly artificial, highly synthetic, and manipulated. He acknowledges using it daily but emphasizes that not everything on the platform is as it seems. He cautions that prominent accounts cannot be taken at face value because campaigns are run, the algorithm is manipulated, and there are bots and inauthentic accounts. The speakers urge awareness of the battlefield on which Twitter users engage and advise developing a wary eye toward content, encouraging audiences to examine profiles, retweets, boosts, follows, and networks to understand who is using the same messaging and why.

Video Saved From X

reSee.it Video Transcript AI Summary
The disinformation industry distorts reality with online propaganda. Hanan's team boasts about past successes, with tools like AIMS to weaponize social media. Their bots are sophisticated, appearing human with multiple platform accounts. They create fake personas for various purposes. The team claims to have worked in countries worldwide and can hack Telegram and Gmail. Hanan exploits vulnerabilities in the global signaling system, SS7.

Video Saved From X

reSee.it Video Transcript AI Summary
McCarthy accuses the US of being infiltrated by foreign forces, specifically Russian-linked Twitter accounts. These accounts, allegedly connected to bots and trolls, are said to be impersonating Americans and spreading false information. The Russian influence tracker Hamilton 68 monitors these networks and their impact on social media. According to the segment, the release of the Nunes memo was heavily promoted by Russian bots, with the hashtag #ReleaseTheMemo trending, and Russian bots are also said to have manipulated discussions around the Parkland shooting. The goal of these bots, it is claimed, is to create political discord and influence public opinion. McCarthy's investigation into the 600 Russian-linked accounts has sparked controversy and raised concerns about Russian interference in US politics.

Video Saved From X

reSee.it Video Transcript AI Summary
Digital platforms are being misused to subvert science and spread disinformation and hate to billions of people. This global threat demands clear and coordinated global action. A policy brief on information integrity on digital platforms puts forward a framework for a concerted international response.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 and Speaker 1 discuss a network of alleged influence surrounding Tim Ballard, Glenn Beck, and broader geopolitical insinuations, tying activism and media narratives to covert operations and manipulation. Speaker 0 recalls meeting Tim Ballard during a period when he was pursuing controversial legal matters, noting that Glenn Beck helped him build Operation Underground Railroad and was Ballard’s close ally for breaking stories on child trafficking. When Ballard contemplated a run for political office (Senate or Congress) and was poised to win after the Sound of Freedom release, Speaker 0 says the attacks against him began. He claims that Glenn Beck subsequently “threw him under the bus,” and recounts showing Ballard a video to argue that Beck’s loyalty had changed because Beck was “pledging allegiance to Israel,” implying he was bought and paid for and controlled by intelligence agencies. The point, according to Speaker 0, is that Beck was not Ballard’s friend, and the video he showed Ballard illustrates this shift. Speaker 1 adds a specific counter-narrative about the Sound of Freedom story. He asserts that the child trafficking ring Tim Ballard exposed in South America, depicted in the film, was actually Israeli-run. He claims the ring was “run by Israelis,” and that its head escaped to Portugal, where a judge released him, after which no traceable location remains. Speaker 1 emphasizes that this is the real story behind Sound of Freedom, asserts that the truth is not told to audiences, and urges listeners to research it independently. He reiterates the theme that “it’s always them” and that “it always comes back to them.” Speaker 1 shifts to a broader media warning about Twitter, stating that it is not a free speech platform but “a military application,” a propaganda operation that is highly artificial, synthetic, and manipulated.
He clarifies that he uses Twitter but urges users to recognize that not everything on the platform is as it seems. He warns that big accounts may be part of campaigns, with paid boosts, manipulated algorithms, bots, and inauthentic accounts. The advice is to be aware of the battlefield on which users engage: not to abandon the platform, but to be more discerning. He urges viewers to develop a wary eye by examining profiles, feeds, retweets, boosts, networks, and who is using the same messaging. Speaker 0 closes by reiterating the pattern of attention, influence, and alleged manipulation that ties these figures and narratives together, suggesting a recurring causal link between entertainment media, political ambition, and covert agendas.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: When I first met Tim Ballard, he was in this wild legal fight, and Glenn Beck helped him build Underground Railroad. They were best friends. Whenever Sam or Tim needed to break a story about child trafficking, Glenn Beck was “his fucking dude.” Then Tim was considering running for Senate or Congress, and with the momentum from Sound of Freedom, he seemed like a shoo-in, and he was set to upset some politician. After those attacks began, Glenn Beck “threw him under the bus,” and Tim told me, “I can’t believe that Glenn would fucking do that to me.” That exact video I showed him—Tim’s friend pledging allegiance to Israel, “he’s bought and paid for,” “not your friend,” “controlled by our intelligence agencies,” “Israel’s bitch.” Tim watched that one video and said, “holy fuck.” Speaker 1: Ryan, you might know this—the child ring Tim Ballard busted up in South America, depicted in Sound of Freedom, was Israeli-run. It was run by Israelis. The head of that ring escaped to Portugal, where a judge basically let him go, and nobody knows where that guy ended up. That’s the real story of Sound of Freedom: an Israeli-run sex-trafficking ring. You’re not told that. Do research and find out about it. That’s who was running the ring. So there’s a lot of interconnection—it's always them, man. It always comes back to them. It seems to always come back to them. It’s like 6,000,000 to one odds. Speaker 0: Every single time. Every single time. It’s strange how that happens. But you wanna wrap it up, Sam? Speaker 1: Yeah. Let’s wrap it up. Listen, everybody. Twitter is not a free speech platform. It is not an open superhighway of information. It is a military application. It is a propaganda operation. It is highly botted, highly artificial, highly synthetic and manipulated. I’m not saying don’t use it; I use it every day. We absolutely must use it as best we can, but I need everybody to be aware that not everything is as it seems on this platform. 
You cannot take this platform at face value. Many of the big accounts you see mainstream through your feed aren’t to be taken at face value. They’re running campaigns, being paid, boosted, the algorithm manipulated, with bots and inauthentic accounts. You must be aware of the battlefield you’re engaging on. And I’m not saying you should leave. On the contrary, I want you here, battling. But it’s not what it seems. There’s a lot of smoke and mirrors, shadows, espionage, and spy games on this platform, and you need to be savvy. Don’t develop mistrust of everybody, but develop a wary eye. Look at people’s Twitter profiles, scroll through their feeds, see who they’re retweeting, who they’re boosting, who they’re following, who their networks are, who’s using the same message.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker describes a website called Hamilton 68 that was created to track Russian accounts and that claims to identify current Russian bots disseminating information. The speaker questions the accuracy of this claim, stating, "That's bull."

Video Saved From X

reSee.it Video Transcript AI Summary
The Russians have weaponized social media by manipulating public opinion through biased or fake stories. However, domestic disinformation is also a significant issue. In 2016, the Russian efforts may not have been very sophisticated, but they learned that they don't need to create content themselves, as there are people in the US who will do it. There were two types of disinformation attacks in 2016; in one, the Internet Research Agency took over existing groups in the US and pushed radical positions. While foreign influence gets a lot of attention, the majority of problems in the information environment are domestic. The domestic threat of disinformation is considered the most significant immediate threat to the 2020 election.

Video Saved From X

reSee.it Video Transcript AI Summary
Many people overlook their options in dealing with misinformation on social media. Early detection is key to tracking and countering harmful narratives. Legal action can be taken against profit-driven disinformation networks. Fact-checking alone may not change beliefs, so building counter-narratives is crucial. Our organization helps detect, assess, and mitigate the impact of misinformation to prevent future issues. The recent events at the US Capitol highlight the real-world consequences of online disinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
The Russians have weaponized social media by manipulating public opinion through biased or fake stories. However, domestic disinformation is also a significant issue. In 2016, the Russian efforts may not have been very sophisticated, but they learned that they don't need to create the content themselves, as there are people in the US who will do it. There were two types of disinformation attacks in 2016; in one, the Internet Research Agency created personas to take over existing US groups and push radical positions. However, the majority of these problems are domestic, related to how we interact online, political speech, amplification, and how politicians use platforms. The domestic threat of disinformation is the most significant immediate threat to the 2020 election.

Video Saved From X

reSee.it Video Transcript AI Summary
Twitter is developing a tool to combat hate speech by analyzing networks to flag harmful content. This tool will hide violative tweets and redirect users to positive influencers, community groups, or mental health resources. Twitter currently quarantines harmful tweets, but believes providing healthier alternatives is more effective in disrupting radicalization.

Video Saved From X

reSee.it Video Transcript AI Summary
AI is being misused to create and spread false and hateful information at scale. AI-generated content, including fake videos and photos, is easily produced and often indistinguishable from real content. The barriers to creating such content are low, while financial and strategic gains incentivize its creation. AI content can be created cheaply with minimal human intervention. Deepfake images, audio, and video are being deployed in war zones like Ukraine, Gaza, and Sudan, triggering diplomatic crises, inciting unrest, and creating confusion. This also undermines the work of UN agencies, as false information spreads about their intentions and work.

Video Saved From X

reSee.it Video Transcript AI Summary
We created a website called Hamilton 68 to track Russian accounts. Our website shows that there are currently Russian bots spreading information.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses how claims go viral and the role of influential accounts in spreading them. They use cumulative graphs to track the spread of claims on social media, with the y-axis representing the running total of shares and the x-axis representing time. High-follower accounts often change the trajectory of a tweet, helping it go viral. They mention specific influential accounts like Timcast and The Gateway Pundit, which spread false or misleading claims of voter fraud. Eventually, one false claim was amplified by President Trump's son on Twitter. Online participants actively spread information that highlighted election irregularities and exaggerated the impact of small issues like stolen mail.
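The cumulative-graph method described in this summary can be sketched in a few lines. Everything below — the timestamps, follower counts, and the `cumulative_curve` function — is a hypothetical illustration, not data or code from the video.

```python
from datetime import datetime

# Illustrative share events for a single claim: (timestamp, sharer's follower count).
# All values are invented for this sketch.
shares = [
    ("2020-11-04 08:00", 1_200),
    ("2020-11-04 08:05", 900),
    ("2020-11-04 09:30", 450_000),  # a high-follower account shares the claim
    ("2020-11-04 09:31", 2_000),
    ("2020-11-04 09:32", 3_500),
]

def cumulative_curve(events):
    """Build the points of a cumulative graph: x-axis is time,
    y-axis is the running total of shares seen so far."""
    times = sorted(datetime.strptime(t, "%Y-%m-%d %H:%M") for t, _ in events)
    totals = list(range(1, len(times) + 1))
    return times, totals

times, totals = cumulative_curve(shares)
print(totals)
```

Plotting `times` against `totals` (e.g. with matplotlib) would reproduce the kind of curve described: a visible change in the slope right after a high-follower account shares the claim.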

Video Saved From X

reSee.it Video Transcript AI Summary
The speakers claim that America is under attack by Russian bots on Twitter, part of an ongoing attack by the Russian government. These bots are said to be flooding Twitter, targeting Americans, and attempting to fan the flames of political discord by creating echo chambers and alternate realities. The speakers reference Hamilton 68, a website tracking Russian-linked Twitter accounts, as evidence of this activity, claiming the dashboard shows Russian bots involved in topics ranging from political narratives to school shootings. However, another speaker alleges that Hamilton 68 is a fraud: the accounts it tracks are not Russian bots but ordinary Americans, and Hamilton 68 misrepresents organic opinion as Russian influence. Some speakers claim they are personally targeted by Russian bots. A dashboard at securingdemocracy.org is suggested for tracking Russian activity. A video by Matt Orfalea is praised. Negative news about vaccines is said to be amplified by Russian bots.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the activities of a group called CTIL (Cyber Threat Intelligence League) and their efforts to combat misinformation and disinformation on social media platforms. The whistleblower reveals that CTIL used Python code to scrape records from Twitter and tracked incidents of disinformation. They also worked on counter messaging, encouraging mask-wearing, and building an amplification network. The speaker mentions that some CTIL members took extreme measures to conceal their identities. The group's activities received little attention until now, but Wired published an article about CTIL, highlighting their work against misinformation. The speaker concludes by mentioning the upcoming testimony to Congress and refers to Matt Taibbi's perspective on the matter.
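The incident-tracking workflow mentioned in this summary (scrape posts, then log disinformation incidents) can be sketched as a simple keyword tally. The watchlist, posts, and `tally_incidents` function below are invented examples for illustration, not CTIL's actual code or data.

```python
from collections import Counter

# Hypothetical watchlist of tracked narrative keywords (illustration only).
TRACKED_KEYWORDS = ["mask", "ballot", "5g"]

# Stand-in for scraped post text; a real pipeline would collect this from an API.
posts = [
    "Masks don't work, pass it on",
    "My ballot never arrived, something is off",
    "New rumor about 5G towers",
    "The weather is nice today",
]

def tally_incidents(posts, keywords):
    """Count how many collected posts mention each tracked keyword."""
    tally = Counter()
    for text in posts:
        lowered = text.lower()
        for kw in keywords:
            if kw in lowered:
                tally[kw] += 1
    return tally

print(tally_incidents(posts, TRACKED_KEYWORDS))
```

A real system would add deduplication, per-platform collectors, and time-bucketed counts; this sketch only shows the core tallying step the summary alludes to.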

Video Saved From X

reSee.it Video Transcript AI Summary
Devar AI presents Rockia, a dashboard designed as a copilot for navigating information wars, particularly on TikTok. It is currently demoed for the IDF's counter-propaganda unit but can be tailored for any government seeking to monitor collective consciousness. Rockia analyzes narratives and generates counter-narratives and social media campaigns using AI, addressing the rise in anti-Israel sentiment on social media since October 7. The dashboard displays topic clusters of TikTok videos, each represented by a card. Users can access full reports on each topic, examine AI-generated counter-narratives to combat negative sentiment or bolster positive sentiment, and view lists of TikTok videos within each cluster.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker works with the German Marshall Fund, which tracks Russian activities. The speaker directs the audience to hamilton68.com, a site created to monitor Russian trolls and bot armies. The goal is to provide the public with information to help them distinguish between legitimate speech and speech originating outside the country intended to create chaos. The speaker acknowledges the difficulty the country will face in discerning the origins and intent of different types of speech.


Video Saved From X

reSee.it Video Transcript AI Summary
The Alliance for Securing Democracy is developing tools and strategies to counter attacks on the U.S. and its allies. They are tracking the toolkit Russia is using to undermine democracies. Their dashboard tracks Russian active measures and can be found at dashboard.securingdemocracy.org.

Video Saved From X

reSee.it Video Transcript AI Summary
The video discusses the spread of fake images and videos during the Russia-Ukraine conflict. Examples include a fake image of Zelensky in military gear and footage from a video game used in news reports. The speaker warns of anti-Russian fake news but acknowledges similar misinformation may exist on the other side. They emphasize the need to be critical of information before reacting emotionally.

Breaking Points

Tim Dillon Says US LOSING 'S*** Talking' War
Guests: Tim Dillon
reSee.it Podcast Summary
The episode analyzes Iranian online propaganda, focusing on the Lego videos that parody and critique U.S. politics and culture. The discussion describes how these clips blend humor, memes, and cultural references to reach a broad audience, highlighting the performers’ independence from the Iranian government and their aim to bypass traditional media. The hosts contrast the modern, internet-driven approach with past state-led propaganda, noting how rapid production cycles and global reach amplify viral content and influence public perception about the war and U.S. leadership. The conversation also examines how American audiences interpret these messages, the role of technology in creating and disseminating them, and the broader implications for information warfare. By comparing this to earlier campaigns and to domestic media, the panel underscores a shift in how nations project power online and how citizens engage with conflicting narratives in real time, including critiques of leaders and policy.

The Joe Rogan Experience

Joe Rogan Experience #1263 - Renée DiResta
Guests: Renée DiResta
reSee.it Podcast Summary
Renée DiResta began her research into online misinformation in 2015, initially focusing on anti-vaccine activity in California. She observed how small groups could amplify messages on social media, both through legitimate means and coordinated efforts to manipulate algorithms. This led her to explore how terrorist organizations like ISIS used similar tactics to spread propaganda. By late 2015, as discussions about ISIS intensified, attention shifted to Russian interference in social media, particularly following Adrian Chen's exposé on the Internet Research Agency (IRA). DiResta explained that the consolidation of social media platforms made it easier for propagandists to target specific audiences. The IRA created fake accounts that mimicked real people, often referred to as "sock puppets," to influence American discourse. By 2016, during the presidential campaign, these accounts were actively engaging in divisive conversations, often amplifying existing tensions. The IRA's strategy involved building communities around various identities, such as LGBT or African American groups, to foster in-group dynamics and subtly influence opinions. They created pages that appeared authentic and relatable, often using humor and cultural references to engage users. This long-term strategy aimed to normalize certain narratives and create divisions within American society. DiResta noted that the IRA's operations were sophisticated, employing tactics akin to those of a marketing agency, but with a focus on manipulation and disinformation. They targeted specific demographics and tailored their content to resonate with those audiences, often using memes and culturally relevant language. The conversation also touched on the challenges of moderating content on social media platforms. DiResta highlighted the difficulty of balancing free speech with the need to combat harassment and misinformation. 
She emphasized that the algorithms used by these platforms often exacerbate polarization, as they prioritize sensational content that generates engagement. As technology evolves, including advancements in deepfakes and AI-generated content, DiResta expressed concern about the potential for misinformation to escalate into real-world consequences. She pointed out that the ease of creating convincing fake identities and narratives could lead to significant societal disruptions. In conclusion, DiResta underscored the importance of understanding the mechanisms behind online disinformation and the need for accountability from social media platforms. She advocated for a multi-stakeholder approach to address these challenges, recognizing that the landscape of online communication is rapidly changing and requires ongoing vigilance and adaptation.