reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
- The situation on X is severe: automated, AI-powered bots and fake accounts are flooding the app, and they are getting smarter.
- In one study, a botnet of over 1,000 fake accounts was caught promoting crypto scams.
- During a political debate, over a thousand bots pushed coordinated false claims, with some accounts tweeting every two minutes.
- By February 2024, 37% of all Internet traffic came from malicious bots.
- These bots now use advanced AI models like ChatGPT to generate human-like responses and interact with each other, making them nearly impossible to detect.
- The platform's ad-driven business model thrives on outrage and engagement: emotional, polarizing content gets more clicks, and bots are perfect for spreading it.
- Real-world impact: bots distort conversations, amplify falsehoods, and manipulate public opinion.
- Conclusion: how bad is it? Very bad.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 describes Tim Ballard as having worked with Glenn Beck to build Underground Railroad, portraying Beck as Ballard’s close ally whenever Ballard needed to break a story on child trafficking. When Ballard considered running for Senate and would likely have won with momentum after the Sound of Freedom release, attacks began, and Glenn Beck reportedly “threw him under the bus.” Speaker 0 asserts that Beck pledged allegiance to Israel, is “bought and paid for,” and “Israel's bitch,” claiming Ballard watched a video and realized this. Speaker 1 adds a claim about the Sound of Freedom narrative: the child trafficking ring Ballard busted in South America, depicted in the movie, was an Israeli-run sex trafficking ring. The head of that ring allegedly escaped to Portugal, where a judge let him go, and nobody knows where he ended up. The speakers state that this is the real story of Sound of Freedom and that “It was an Israeli run sex trafficking ring,” noting that this is not told to the audience and urging others to research it. Speaker 1 then transitions to commentary on Twitter, stating that Twitter is not a free speech platform and not an open information highway; it is a military application and a propaganda operation that is highly botted, highly artificial, highly synthetic, and manipulated. They acknowledge using it daily but emphasize that not everything is as it seems on the platform. They caution that prominent accounts cannot be taken at face value because campaigns are run, the algorithm is manipulated, and there are bots and inauthentic accounts. The speakers urge awareness of the battlefield on which Twitter is engaged and advise developing a wary eye toward content, encouraging audiences to examine profiles, retweets, boosts, follows, and networks to understand who is using the same messaging and why.

Video Saved From X

reSee.it Video Transcript AI Summary
In 2022, as Director of Information Security at The Intercept, the speaker wrote articles critical of Elon Musk's takeover of Twitter, including his purging of leftist accounts and reinstatement of neo-Nazis and anti-vaxxers. Subsequently, Musk permanently suspended the speaker's account, then reinstated it after a poll but demanded deletion of a tweet. Instead, the speaker quit Twitter for a year. The speaker now works with a collective that makes open-source security and privacy software, including Syd.social, an app to delete data from X and migrate tweets to Bluesky. The speaker is also involved in Tesla Takedown, a nonviolent movement aiming to devalue Tesla stock and force Musk to sell shares to cover his Twitter debt. The goal is to trigger a Tesla stock "death spiral."

Video Saved From X

reSee.it Video Transcript AI Summary
In this session, the speaker discusses how disinformation is not just about lies, but also about distorting and manipulating the truth. They introduce the 4 D's model: dismiss, distort, distract, and dismay. The audience is given cards to identify these tactics in quotes from different organizations. They discuss examples of dismiss, distort, and distract, and someone adds a fifth D, divide. The session focuses on the various ways people twist stories and attack those who present uncomfortable evidence.

Video Saved From X

reSee.it Video Transcript AI Summary
The disinformation industry distorts reality with online propaganda. Hanan's team boasts about past successes, with tools like AIMS to weaponize social media. Their bots are sophisticated, appearing human with multiple platform accounts. They create fake personas for various purposes. The team claims to have worked in countries worldwide and can hack Telegram and Gmail. Hanan exploits vulnerabilities in the global signaling system, SS7.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 asserts that another revolution is coming, aiming to achieve a broader peace, describing Israel’s conflict as an eight-front war—Jews against Rome, with the United States as the new Rome—and stating that Rome and Jerusalem clashed over values, a tragedy the Jews lost but must win next time. Speaker 1 adds that Jews against Rome have shifted from defense to offense. Speaker 2 notes that weapons evolve and swords do not work today, implying the need for new tools; Speaker 1 emphasizes that the battle requires the genius that created Apollo and the pagers that penetrated Hezbollah to prepare for this fight. Speaker 2 argues the most important battlefields are social media, with the next war to be decided online as much as offline. Speaker 0 designates this as the eighth front: the disinformation campaign. Speaker 3 and Speaker 0 discuss the scale of online manipulation, claiming billions of dollars are invested in the information battlefield by NGOs and governments, and asserting that money drives the effort. Speaker 6 and Speaker 7 describe policies to prohibit harmful stereotypes about Jews and to deplatform those who propagate them; they claim to monitor online spaces, including social media, messaging apps, video games, and cryptocurrency, and to share intelligence with the FBI. Speaker 7 and others reference a spectrum of platforms and formats—podcasts, short-form video, Wikipedia, LLMs—and condemn antisemitism online, including “Hitler admirers, Stalin admirers, Jew haters,” while insisting on countermeasures. Speaker 8 and Speaker 9 discuss TikTok as a focal point, asserting that for every thirty minutes spent on TikTok, users become 17% more antisemitic, with carnage imagery from Gaza influencing perceptions; there is a stated problem with TikTok shaping youth attitudes.
Speaker 10 and Speaker 6 describe redefining terms like Zionist as a proxy for Jews and Israelis, framing such language as hate speech; Speaker 11 indicates a desire for counterintelligence and critiques current curriculum, while Speaker 1 notes co-authoring Sunday school curricula with the ADL. Speaker 11 and Speaker 6 discuss developing technology to train LLMs and to combat antisemitism, with collaboration announced with OpenAI, Alphabet, Anthropic, Meta, and Microsoft; Speaker 10 notes a network of two dozen Jewish organizations feeding intelligence. Speaker 1 outlines a program to measure, monitor, and disrupt extremist content, with a full-time team of 40 analysts; Speaker 12 mentions monitoring campuses, digital networks, activist groups, and public officials, with PhDs and academics supporting the effort. Speaker 13 and Speaker 14 discuss unifying data into a single platform, investing in intelligence, and mobilizing organizations to share information and fight common enemies; Speaker 12 emphasizes constant recording and reporting, aiming to mobilize allies. Speaker 15 and Speaker 9 describe harsh strategies against antisemitism, including deportation and criminal measures, while Speaker 9 notes threats against those who push antisemitic conspiracy theories. Speakers 16–17 recount legal actions and lawsuits against antisemitic rhetoric; Speaker 18 describes the J7 diaspora network meeting to share information and best practices; Speakers 19–20 advocate reform of education and even limiting the First Amendment to protect it, arguing for control over speech. Speaker 3 and Speaker 20 discuss enforcement and punishment for anti-Israel or antisemitic speech; Speaker 1 highlights training 20,000 officers annually in extremism and hate via partnerships with law enforcement going back to the FBI’s origins.
Speaker 29 calls opponents “a small bunch of wannabe Nazis” and asserts intent to pursue justice; Speaker 0 closes by proclaiming that history remembers action, not denial of hatred, and that we are on the cusp of a new age where technology’s powerful benefits can drive positive outcomes in agriculture, health, transportation, and other fields, enabling Israel to become a primary power rather than a secondary one.

Video Saved From X

reSee.it Video Transcript AI Summary
In this video, the speaker talks about the importance of security and the tools that can help in the process. They mention compartmentalization as a way to separate personal and work life. They also emphasize the use of a persona as a disguise for research purposes. The goal is to lock down information to contain any potential impact. If something goes wrong, only the persona would be compromised. Overall, the speaker finds this topic very interesting.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 and Speaker 1 discuss a network of alleged influence surrounding Tim Ballard, Glenn Beck, and broader geopolitical insinuations, tying activism and media narratives to covert operations and manipulation. Speaker 0 recalls meeting Tim Ballard during a period when he was pursuing controversial legal matters, noting that Glenn Beck helped him build Underground Railroad and was Ballard’s close ally for breaking stories on child trafficking. When Ballard contemplated a run for political office (Senate or Congress) and was poised to win after the Sound of Freedom release, Speaker 0 says the attacks against him began. He claims that Glenn Beck subsequently “threw him under the bus,” and quotes his own video response to Ballard’s reaction, arguing that Beck’s loyalty had changed because Beck was “pledging allegiance to Israel,” implying he was bought and paid for and controlled by intelligence agencies. The point is that Beck was not Ballard’s friend, according to Speaker 0, who shows Ballard a video to illustrate this shift. Speaker 1 adds a specific counter-narrative about the Sound of Freedom story. He asserts that the child trafficking ring Tim Ballard exposed in South America, depicted in the film, was actually Israeli-run. He claims the ring was “run by Israelis,” and that its head escaped to Portugal, where a judge released him, after which no traceable location remains. Speaker 1 emphasizes that this is the real story behind Sound of Freedom and asserts that the truth is not told to audiences, urging listeners to research independently to uncover that the ring was Israeli-run. He reiterates the theme that “it’s always them” and that “it always comes back to them.” Speaker 1 shifts to a broader media warning about Twitter, stating that it is not a free speech platform but “a military application,” a propaganda operation that is highly artificial, synthetic, and manipulated.
He clarifies that he uses Twitter but urges users to recognize that not everything on the platform is as it seems. He warns that big accounts may be part of campaigns, with paid boosts, manipulated algorithms, bots, and inauthentic accounts. The advisory is to be aware of the battlefield on which users engage, not to abandon the platform, but to be more discerning. He urges readers to develop a wary eye toward others by examining profiles, feeds, retweets, boosts, networks, and who is using the same messaging. Speaker 0 closes by reiterating the pattern of attention, influence, and alleged manipulation that ties these figures and narratives together, suggesting a recurring causal link between entertainment media, political ambition, and covert agendas.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0: When I first met Tim Ballard, he was in this wild legal fight, and Glenn Beck helped him build Underground Railroad. They were best friends. Whenever Tim needed to break a story about child trafficking, Glenn Beck was “his fucking dude.” Then Tim was considering running for Senate or Congress, and with the momentum from Sound of Freedom, he seemed like a shoo-in, and he was set to upset some politician. After those attacks began, Glenn Beck “threw him under the bus,” and Tim told me, “I can’t believe that Glenn would fucking do that to me.” That exact video I showed him—Tim’s friend pledging allegiance to Israel, “he’s bought and paid for,” “not your friend,” “controlled by our intelligence agencies,” “Israel’s bitch.” Tim watched that one video and said, “holy fuck.” Speaker 1: Ryan, you might know this—the child ring Tim Ballard busted up in South America, depicted in Sound of Freedom, was Israeli-run. It was run by Israelis. The head of that ring escaped to Portugal, where a judge basically let him go, and nobody knows where that guy ended up. That’s the real story of Sound of Freedom: an Israeli-run sex-trafficking ring. You’re not told that. Do research and find out about it. That’s who was running the ring. So there’s a lot of interconnection—it's always them, man. It always comes back to them. It seems to always come back to them. It’s like 6,000,000 to one odds. Speaker 0: Every single time. Every single time. It’s strange how that happens. But you wanna wrap it up, Sam? Speaker 1: Yeah. Let’s wrap it up. Listen, everybody. Twitter is not a free speech platform. It is not an open superhighway of information. It is a military application. It is a propaganda operation. It is highly botted, highly artificial, highly synthetic, and manipulated. I’m not saying don’t use it; I use it every day. We absolutely must use it as best we can, but I need everybody to be aware that not everything is as it seems on this platform.
You cannot take this platform at face value. Many of the big, mainstream accounts you see coming through your feed aren’t to be taken at face value. They’re running campaigns; they’re being paid and boosted; the algorithm is manipulated; there are bots and inauthentic accounts. You must be aware of the battlefield you’re engaging on. And I’m not saying you should leave. On the contrary, I want you here, battling. But it’s not what it seems. There’s a lot of smoke and mirrors, shadows, espionage, and spy games on this platform, and you need to be savvy. Don’t develop mistrust of everybody, but develop a wary eye. Look at people’s Twitter profiles, scroll through their feeds, see who they’re retweeting, who they’re boosting, who they’re following, who their networks are, who’s using the same messaging.

Video Saved From X

reSee.it Video Transcript AI Summary
A website called Hamilton 68 was created to track Russian accounts. The speaker recounts the claim that the website identifies current Russian bots disseminating information, then questions the accuracy of that claim, stating, "That's bull."

Video Saved From X

reSee.it Video Transcript AI Summary
We created the website Hamilton 68 to track Russian accounts. Our website shows Russian bots currently spreading misinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
The session discusses the use of misinformation tactics, including dismiss, distort, distract, and dismay. Participants analyze quotes to identify these tactics. Trump is cited as a prime example of spreading disinformation. The group also introduces a fifth tactic, divide, to the discussion. The audience actively engages in identifying these tactics throughout the session.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 says the first part was about the propaganda. 'Oh, the propaganda? Well, look. No. I think that we've not been winning it, to put it mildly.' He claims there are 'vast forces that are right against us,' including 'the algorithms of the social network that are driving a lot of everything else.' He adds that 'the people who really know, and they're the foremost people in this field in the world,' are telling him that 'about 60% of the responses on the social media are bots. They can categorically say they're bots.' He emphasizes the scale of opposition and cites bots on social media.

Video Saved From X

reSee.it Video Transcript AI Summary
The Russians have weaponized social media by manipulating public opinion through biased or fake stories. However, domestic disinformation is also a significant issue. In 2016, the Russian efforts may not have been very sophisticated, but they learned that they don't need to create the content themselves, as there are people in the US who will do it. In one type of 2016 disinformation attack, the Internet Research Agency created personas to take over existing US groups and push radical positions. However, the majority of these problems are domestic, related to how we interact online, political speech, amplification, and how politicians use platforms. The domestic threat of disinformation is the most significant immediate threat to the 2020 election.

Video Saved From X

reSee.it Video Transcript AI Summary
Sam: I hope that someday anybody who’s gone over there and touched that wall will never be able to walk out in public without hanging their head in shame ever again. Brian: It’s funny, Sam, because Tim Ballard was going through crazy lawfare. Glenn Beck helped him build Underground Railroad—they were best friends. When Tim needed to break a story about child trafficking, Glenn Beck was his guy. Then, when Tim was considering running for Senate (or Congress) and would have momentum after the Sound of Freedom release, attacks started. Glenn Beck threw him under the bus, and Sam shows him a video where Beck pledges allegiance to Israel; he’s bought and paid for, not Tim’s friend, controlled by our intelligence agencies, Israel’s bitch. He watched that video and was shocked. Sam: Brian, you probably know this. Most people don’t know this. The child ring Tim Ballard busted up in South America, the one portrayed in Sound of Freedom, was Israeli-run. It was run by Israelis. The head of that ring escaped to Portugal where a judge let him go, and nobody knows where he ended up. So that’s the real story of Sound of Freedom. It was an Israeli-run sex trafficking ring. You’re not told that. You should go research and find out who was running the ring. So there’s a lot of interconnection—it’s always them, man. It always comes back to them. Brian: Every single time. Every single time. It’s like 6,000,000 to 1 odds. You know? It’s just strange how that happens. But you wanna wrap it up, Sam? Sam: Yeah. Let’s wrap it up. Listen, everybody. Twitter is not an open superhighway of information. It is a military application. It is a propaganda operation. It is highly botted, highly artificial, highly synthetic, and manipulated. And I’m not saying don’t use it. I use it every day. We absolutely must use it as best we can. But I need everybody to be aware that not everything is as it seems on this platform. You cannot take this platform at face value.
Many of the big, mainstream accounts you see coming through your feed cannot be taken at face value. You must be aware that they’re running campaigns. They’re being paid. They’re boosted. The algorithm is being manipulated. There are bots and inauthentic and fake accounts. You must be aware of the battlefield on which you’re engaging. I’m not telling you to go leave. On the contrary, I want you here, battling, but it is not what it seems. There’s a lot of smoke and mirrors and shadows and espionage and spy games on this platform. You really need to be aware of that. You need to get savvy to it. And I don’t want you to develop a mistrust of everybody. I want you to develop a more wary eye of what’s going on. I want you to look at people’s Twitter profiles. Scroll through their feeds and see who they’re retweeting, who they’re boosting, who they’re following, who their little networks are, who’s using the same messaging. Why? Brian: Because— Sam: they...

Video Saved From X

reSee.it Video Transcript AI Summary
The dialogue centers on accusations and revelations about political operatives and influence campaigns. Key points include:
- A list of individuals named as problematic figures: Jack Posobiec, Gabe Hoffman, Mike Cernovich, and Laura Loomer. Gabe Hoffman is described as “running ops on people” and as “a bad guy,” with a claim that these people are “evil” and unregistered foreign agents whom the speaker will be watching closely.
- A claim of infiltration and surveillance: one speaker asserts that someone close to them was likely there to infiltrate, and that “these people” attempted to set up someone they know and love, with the speaker vowing to monitor everything they do.
- Allegations of a role in broader disruptive actions: one speaker says, “We conduct riots and color revolutions and, you know, steal elections, and we overthrow governments we don't like. And I was part of that.”
- The origin of operational concepts: one speaker mentions IIA, describing it as social media psychological warfare that began in 2007.
- A sense of punitive consequence and manipulation: another speaker states that “they’re all being punished because they thought that what those important people told them was gonna happen,” and recalls being present during a plan to trash the Capitol, noting a lack of preparedness and security knowledge.
- Reactions to claims about being controlled: one speaker says it pisses them off that others claim they’re being handled, with another agreeing that such claims have been heard before.
- A warning tone about danger and preparation: one speaker warns that it is “very dangerous” that people are out there giving others hope, describing “a storm coming like nothing you have ever seen,” and asserting that not a single person is prepared for it.
- Personal and on-site context: there are mentions of returning to a site to get a burner phone and use ghost accounts, and of attempting to coordinate around Breva, indicating ongoing, weaponized online activity and counter-movement tactics.

Overall, the speakers blend accusations of manipulation and clandestine influence with admissions of involvement in disruptive actions, interspersed with warnings of impending upheaval and calls for vigilance.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a use case involving a government agency and a network analysis tool. They explain how the tool can identify coordinated attacks and misinformation by analyzing events, such as the sharing of an image on social media. The tool can build a network of accounts involved in spreading the information and identify patterns. In this case, the tool discovered a network spreading Russian propaganda and misinformation about the Nord Stream pipeline. The speaker demonstrates how the tool can counteract the narrative by generating tweets in Arabic that provide a different perspective. They also mention the potential for the tool to create knowledge across different networks and incorporate multimodal content like images and videos.
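The coordination detection described above can be sketched in plain Python: treat each share as an (account, item, timestamp) event, link accounts that repeatedly share the same item within a short time window, and read clusters off the resulting graph. This is an illustrative sketch, not the tool from the video; the event format, the 60-second window, and the repeat threshold are assumptions chosen for the example.

```python
from collections import defaultdict
from itertools import combinations


def coordination_clusters(events, window=60, min_coshares=2):
    """Group accounts that repeatedly share the same item within `window` seconds.

    events: iterable of (account, item_id, timestamp) tuples.
    Returns a list of sets, each a cluster of plausibly coordinated accounts.
    """
    # Bucket shares by the item being shared.
    by_item = defaultdict(list)
    for account, item, ts in events:
        by_item[item].append((ts, account))

    # Count how often each pair of accounts co-shares an item inside the window.
    pair_counts = defaultdict(int)
    for shares in by_item.values():
        shares.sort()
        for (t1, a1), (t2, a2) in combinations(shares, 2):
            if a1 != a2 and abs(t2 - t1) <= window:
                pair_counts[frozenset((a1, a2))] += 1

    # Keep only pairs that co-share repeatedly, building an adjacency map.
    adj = defaultdict(set)
    for pair, n in pair_counts.items():
        if n >= min_coshares:
            a, b = tuple(pair)
            adj[a].add(b)
            adj[b].add(a)

    # Connected components of the co-share graph are the candidate clusters.
    seen, clusters = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            cur = stack.pop()
            if cur in comp:
                continue
            comp.add(cur)
            stack.extend(adj[cur] - comp)
        seen |= comp
        clusters.append(comp)
    return clusters
```

Real systems layer more signals on top of timing, such as text similarity, account-creation dates, and follower-graph overlap, but the core idea of projecting shared content onto an account graph is the same.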

Video Saved From X

reSee.it Video Transcript AI Summary
We created a website called Hamilton 68 to track Russian accounts. Our website shows that there are currently Russian bots spreading information.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the use of burner phones and emails for disinformation response. They mention a book that provides instructions on using burner phones and emails, as well as creating pseudonyms and identities. The speaker suggests using services like Sudo for creating pseudonyms and associated email, phone, text, web browsing, and payment accounts. They also mention the option of using disposable temporary email addresses for anonymity. Various options for burner phones are mentioned, including Tracfone. The speaker briefly mentions their experience selling Obamaphones.

Video Saved From X

reSee.it Video Transcript AI Summary
The speakers claim that America is under attack by Russian bots on Twitter, which are part of an ongoing attack by the Russian government. These bots are flooding Twitter, targeting Americans, and attempting to fan the flames of political discord by creating echo chambers and alternate realities. The speakers reference Hamilton 68, a website tracking Russian-linked Twitter accounts, as evidence of this activity. They claim this dashboard shows Russian bots are involved in various topics, from political narratives to school shootings. However, another speaker alleges that Hamilton 68 is a fraud. They claim the accounts it tracks are not Russian bots but ordinary Americans, and that Hamilton 68 misrepresents organic opinions as Russian influence. Some speakers claim they are personally targeted by Russian bots. A dashboard at securingdemocracy.org is suggested for tracking Russian activity. A video by Matt Orfalea is praised. Negative news about vaccines is said to be amplified by Russian bots.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the activities of a group called CTIL (Cyber Threat Intelligence League) and their efforts to combat misinformation and disinformation on social media platforms. The whistleblower reveals that CTIL used Python code to scrape records from Twitter and tracked incidents of disinformation. They also worked on counter messaging, encouraging mask-wearing, and building an amplification network. The speaker mentions that some CTIL members took extreme measures to conceal their identities. The group's activities received little attention until now, but Wired published an article about CTIL, highlighting their work against misinformation. The speaker concludes by mentioning the upcoming testimony to Congress and refers to Matt Taibbi's perspective on the matter.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 asserts that they employ deception, including outright lies, misinformation, and disinformation—the intentional use of information to sway the audience.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker works with the German Marshall Fund, which tracks Russian activities. The speaker directs the audience to hamilton68.com, a site created to monitor Russian trolls and bot armies. The goal is to provide the public with information to help them distinguish between legitimate speech and speech originating outside the country intended to create chaos. The speaker acknowledges the difficulty the country will face in discerning the origins and intent of different types of speech.

Video Saved From X

reSee.it Video Transcript AI Summary
The Alliance for Securing Democracy is developing tools and strategies to counter attacks on the U.S. and its allies. They are tracking the toolkit Russia is using to undermine democracies. Their dashboard tracks Russian active measures and can be found at dashboard.securingdemocracy.org.

The Joe Rogan Experience

Joe Rogan Experience #1263 - Renée DiResta
Guests: Renée DiResta
reSee.it Podcast Summary
Renée DiResta began her research into online misinformation in 2015, initially focusing on anti-vaccine activity in California. She observed how small groups could amplify messages on social media, both through legitimate means and coordinated efforts to manipulate algorithms. This led her to explore how terrorist organizations like ISIS used similar tactics to spread propaganda. By late 2015, as discussions about ISIS intensified, attention shifted to Russian interference in social media, particularly following Adrian Chen's exposé on the Internet Research Agency (IRA). DiResta explained that the consolidation of social media platforms made it easier for propagandists to target specific audiences. The IRA created fake accounts that mimicked real people, often referred to as "sock puppets," to influence American discourse. By 2016, during the presidential campaign, these accounts were actively engaging in divisive conversations, often amplifying existing tensions. The IRA's strategy involved building communities around various identities, such as LGBT or African American groups, to foster in-group dynamics and subtly influence opinions. They created pages that appeared authentic and relatable, often using humor and cultural references to engage users. This long-term strategy aimed to normalize certain narratives and create divisions within American society. DiResta noted that the IRA's operations were sophisticated, employing tactics akin to those of a marketing agency, but with a focus on manipulation and disinformation. They targeted specific demographics and tailored their content to resonate with those audiences, often using memes and culturally relevant language. The conversation also touched on the challenges of moderating content on social media platforms. DiResta highlighted the difficulty of balancing free speech with the need to combat harassment and misinformation. 
She emphasized that the algorithms used by these platforms often exacerbate polarization, as they prioritize sensational content that generates engagement. As technology evolves, including advancements in deepfakes and AI-generated content, DiResta expressed concern about the potential for misinformation to escalate into real-world consequences. She pointed out that the ease of creating convincing fake identities and narratives could lead to significant societal disruptions. In conclusion, DiResta underscored the importance of understanding the mechanisms behind online disinformation and the need for accountability from social media platforms. She advocated for a multi-stakeholder approach to address these challenges, recognizing that the landscape of online communication is rapidly changing and requires ongoing vigilance and adaptation.