TruthArchive.ai - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Free speech should exist, but boundaries are needed when speech incites violence or discourages vaccination. The question is where the US should draw those lines and what rules should be in place. Given the billions of daily online interactions, AI could potentially encode and enforce those rules; a delayed response to harmful content means the harm is already done.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 argues that, although it is hard to hear, it is time to limit the First Amendment in order to protect it. They say we need to control the platforms, meaning all social platforms, and to stack-rank the authenticity of every person who expresses themselves online, then control what people say based on that ranking. In their view, the government should monitor all social media.

Video Saved From X

reSee.it Video Transcript AI Summary
Regulating social media is essential, as Congress has struggled to address the issues posed by rogue corporations. There is a need for better oversight and action from both Congress and the administration to tackle these challenges effectively.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media platforms must apply the same rules consistently. These sites communicate directly with millions of people without sufficient oversight or regulation, and that lack of accountability must change.

Video Saved From X

reSee.it Video Transcript AI Summary
Section 230, which granted internet platforms immunity as passive conduits, should be repealed. This perspective is based on the idea that platforms like Facebook, X, Instagram, and TikTok are not simply pass-throughs. Without moderation and monitoring of content by these platforms, there is a loss of control.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media sites must be held responsible and understand their power. They speak directly to millions of people without oversight or regulation, and this has to stop. The same rule has to apply across platforms; there can't be one rule for Facebook and another for Twitter.

Video Saved From X

reSee.it Video Transcript AI Summary
Section 230, which granted internet platforms immunity as passive conduits, should be repealed. This perspective is based on the idea that platforms like Facebook, X, Instagram, and TikTok are not simply pass-throughs. Without moderation and monitoring of content, there is a total loss of control.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media platforms should be held responsible for their power, as they directly address millions without oversight. The same rules must apply across platforms like Facebook and Twitter, and these sites must be made to understand their reach and influence. The current lack of regulation must end.

Video Saved From X

reSee.it Video Transcript AI Summary
If social media platforms like Facebook, X, Instagram, or TikTok don't moderate and monitor content, we lose total control. This loss of control extends beyond social and psychological effects, leading to real harm.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media sites should be held responsible for their power, as they directly address millions without oversight or regulation, and this must end. There can't be one rule for Facebook and another for Twitter; the same rule must apply to both.

Video Saved From X

reSee.it Video Transcript AI Summary
Section 230, which granted internet platforms immunity as passive conduits, should be repealed. This perspective is based on the idea that platforms like Facebook, X, Instagram, and TikTok are not simply pass-throughs. Without moderation and monitoring, there is a loss of control, leading to social, psychological, and real-world harm.

Video Saved From X

reSee.it Video Transcript AI Summary
The ADL Center for Technology and Society has graded tech platforms on their responsiveness to antisemitism and other forms of hate. Meta, for example, gutted its fact-checking department. Tech platforms have a responsibility to check and remove hateful speech, and Congress, federal regulators, and the states all have a role to play. Tech platforms are not accountable for misinformation because Section 230 of the Communications Decency Act provides them immunity; Congress needs to amend Section 230 to hold them accountable. These platforms are private companies and can deplatform users via their user agreements, and the deplatforming and replatforming of people has been observed on platforms like X and Facebook/Meta. Universities are being held accountable for antisemitism on campus, and accountability is effective in changing behavior.

Video Saved From X

reSee.it Video Transcript AI Summary
Concerns are rising about a tech industrial complex that threatens our country. Americans face overwhelming misinformation, enabling abuses of power. The free press is deteriorating, and social media is neglecting fact-checking, so lies overshadow the truth for profit and power. It is crucial to hold social platforms accountable to safeguard our children, families, and democracy from these abuses.

Video Saved From X

reSee.it Video Transcript AI Summary
If social media platforms do not moderate and monitor content, we lose total control. This loss of control results in real harm, beyond just social and psychological effects.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media algorithms that function like pitcher plants, pulling people into rabbit holes, should be banned, as they abuse the public forum. These rabbit holes lead to echo chambers where artificial insanity thrives; QAnon is a prominent example. Such mechanisms pose a threat to self-government and democracy. Reforms are necessary for both democracy and capitalism, and both sets of reforms are achievable.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media companies should be liable for their algorithms' actions, not users' content. Appealing to freedom of speech is a smokescreen. Companies are responsible for what their algorithms promote, similar to an editor being responsible for front-page content. If an algorithm writes something, the company is definitely liable. Information isn't truth; most of it is junk. Truth is rare, costly, and complicated. Flooding the world with information won't make the truth float up. Institutions are needed to sift through information. Media companies decide where public attention goes and have a responsibility to distinguish reliable from unreliable information. AI further complicates this.

Video Saved From X

reSee.it Video Transcript AI Summary
If social media platforms like Facebook, X, Instagram, or TikTok don't moderate and monitor content, we lose total control. This loss of control extends beyond social and psychological effects to include real harm.

Video Saved From X

reSee.it Video Transcript AI Summary
Section 230, which granted internet platforms immunity as passive conduits, should be repealed. This perspective is based on the belief that platforms like Facebook, X, Instagram, and TikTok are not simply pass-throughs. Without moderation and monitoring of content by these platforms, there is a total loss of control.

Video Saved From X

reSee.it Video Transcript AI Summary
If platforms like Facebook, Twitter, Instagram, or TikTok fail to moderate and monitor content, we risk losing control over the situation. This lack of oversight can lead to significant social and psychological consequences, as well as real harm.

Video Saved From X

reSee.it Video Transcript AI Summary
Facebook and other platforms have the power to manipulate content without explanation or transparency. They can secretly ban candidates or limit their reach while boosting other content. Elon Musk believes this is done in the name of free speech and to benefit people.

Video Saved From X

reSee.it Video Transcript AI Summary
Millions of people are being purged from the Internet as big tech titans have the power to control and censor. It's time to recognize social media companies as public utilities, just like electricity and telephone services. Social media is essential for businesses, nonprofits, and political campaigns. The establishment has been censoring those who question them, using any excuse to consolidate power. We must unite as Americans and demand an Internet bill of rights that protects our freedom of speech in cyberspace. This is the United States, where our right to free speech is not optional.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media sites must be held responsible and understand their power. The speaker claims these sites speak directly to millions of people without oversight or regulation, and that "has to stop." The speaker asserts that the same rules must apply across platforms like Facebook and Twitter. Someone "has lost his privileges" and content "should be taken down."

Video Saved From X

reSee.it Video Transcript AI Summary
Social media sites must be held responsible and understand their power. The speaker claims these platforms directly address millions without oversight or regulation, and this must end. The speaker asserts there can't be different rules for Facebook and Twitter; the same rule must apply to both. Someone has lost their privileges, and content should be taken down.

Tucker Carlson

Tucker Carlson LIVE: The End of Free Speech w/ Michael Shellenberger
Guests: Michael Shellenberger
reSee.it Podcast Summary
Two weeks after Charlie Kirk was assassinated for engaging openly on campuses, this episode uses his life as a blueprint for free speech. Kirk traveled from campus to campus, inviting disagreement and listening as often as he spoke. Carlson argues that sincere Christians and a culture of open dialogue embody a healthier public square. If we want to honor Kirk, we should ask leaders to answer tough questions calmly and directly, on Nord Stream, Ukraine aid, the JFK files, and other mysteries, rather than silence voices through censorship. The discussion turns to Section 230, the 1996 provision that shields platforms from lawsuits over the user content they host. Carlson explains the publisher-platform distinction and notes how social networks now dominate information flows. Republicans and Democrats have both flirted with revoking or reforming 230, often under donor or moral pressure. Some urge treating platforms as regulated utilities; others propose filters that let adults decide what to see while policing illegal material. California is pressed to enact a sweeping hate-speech law that would fine speakers for content deemed violent or coercive based on protected characteristics. Carlson cites online suppression of prominent figures and questions whether such measures reduce harm or shield the powerful from critique. He points to UK arrests for speech, thousands in a year, alongside a sense that censorship enforces political orthodoxy. The ADL and lawmakers like Don Bacon appear as central actors in this frame. Michael Shellenberger joins to discuss what he calls the censorship industrial complex, present from Europe to California and aided by AI and algorithmic tooling. They debate how platforms evolved into de facto utilities, the push to reform 230 to force censorship, and the tension between civil liberties and public safety. The conversation touches on TikTok, Musk's influence at X, and how filters might expand speech rather than shrink it.
They contrast Europe’s regime with American traditions and warn of global trends. The final stretch covers UAPs and Epstein, with Shellenberger urging transparency around the CIA and NSA, drone incursions, and unexplained phenomena. They debate the possibility of non-human intelligence, the role of government secrecy, and the need for disclosure to prevent conspiratorial mistrust. The exchange closes with mutual appreciation and a commitment to continue reporting on free speech, power, and truth.