reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Strategic failures in global communication have allowed disinformation about the pandemic to spread on social media. State-sponsored groups are creating accounts to sow political discord and for financial gain. Violence against healthcare workers and minority populations is increasing, and some countries are imposing limited internet shutdowns to stem the flood of misinformation. Experts agree that identifying every bad actor is a huge challenge, with new disinformation campaigns generated daily. Some believe that controlling access to information is necessary to combat the problem; however, it is not only trolls spreading fake news but also political leaders. It is crucial that accurate public health information reach the public through a variety of outlets. Misinformation is causing unrest, eroding trust, and hindering response efforts. Governments are implementing interventions, including internet shutdowns and penalties for spreading harmful falsehoods, and social media companies are trying to limit misuse of their platforms, but the issue is complex. The public is losing trust both in the misinformation and in the measures meant to control it.

Video Saved From X

reSee.it Video Transcript AI Summary
In this session, the speaker discusses how disinformation is not just about lies, but also about distorting and manipulating the truth. They introduce the 4 D's model: dismiss, distort, distract, and dismay. The audience is given cards to identify these tactics in quotes from different organizations. They discuss examples of dismiss, distort, and distract, and someone adds a fifth D, divide. The session focuses on the various ways people twist stories and attack those who present uncomfortable evidence.

Video Saved From X

reSee.it Video Transcript AI Summary
There is a discussion about the control of information and how false information can be challenged. Social media platforms are urged to take responsibility and partner with scientific and health communities to provide accurate information. The idea of government enforcement against fake news is also mentioned. Shutting down information is seen as impractical, and instead, flooding accurate information and relying on trusted sources are suggested strategies. The video then shifts to a description of a past pandemic, where millions of people died, the global economy suffered, and societal impacts were long-lasting.

Video Saved From X

reSee.it Video Transcript AI Summary
Disinformation on social media platforms poses challenges to democracy, bolstering authoritarians and silencing opposition; countering it is crucial for a thriving democracy. The Countering Disinformation Guide provides nine thematic sections and a comprehensive interventions database to promote information integrity and strengthen societal resilience. Key takeaways include the need for a whole-of-society approach, prioritizing programs that address disinformation and societal cleavages, and using mixed methods such as fact-checking and monitoring. Establishing norms, regulations, and better content moderation is essential, and political parties should be discouraged from engaging in disinformation. Explore the interventions database for organizations, projects, and donors combating disinformation worldwide.

Video Saved From X

reSee.it Video Transcript AI Summary
Digital platforms are being misused to subvert science and spread disinformation and hate to billions of people. This global threat demands clear and coordinated global action. A policy brief on information integrity on digital platforms puts forward a framework for a concerted international response.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media companies are deleting accounts spreading disinformation about the pandemic, including state-sponsored groups. Violence against healthcare workers and minority populations is increasing. Some countries are implementing limited internet shutdowns to manage the overwhelming amount of misinformation. Experts believe that identifying every bad actor is a challenging task, as new disinformation campaigns are generated daily. Controlling and reducing access to information may be necessary to combat the problem. However, it's not just trolls spreading fake news, but also political leaders. It is crucial for news organizations, public health groups, and companies to promote accurate information to protect the public.

Video Saved From X

reSee.it Video Transcript AI Summary
Social media has provided unprecedented access to health information but has also accelerated the spread of misinformation. This has contributed to mistrust in vaccines and other health interventions, fueled stigma and discrimination, and led to violence against health workers and marginalized groups. During the COVID-19 pandemic, falsehoods about masks, vaccines, and lockdowns spread rapidly and were almost as deadly as the virus.

Video Saved From X

reSee.it Video Transcript AI Summary
We developed real-world interventions, like the game Go Viral, to help people identify fake news about COVID-19. We collaborated with organizations, governments, and social media companies, including the UK Cabinet Office, the World Health Organization, and the United Nations' Verified campaign. Through our game Bad News, users experience a simulated social media feed and learn how misinformation spreads. Our research shows that people who go through our interventions become better at recognizing fake news, gain confidence in discerning fact from fiction, and share less fake news with others.

Video Saved From X

reSee.it Video Transcript AI Summary
There are good and bad journalists, but when the public mistrusts us and turns to misleading alternative sources, it's problematic. Without a common set of facts, it's difficult to solve society's big problems. CourseCorrect is using machine learning and AI to identify and combat misinformation, analyzing linguistic patterns, network structure, and temporal behavior to pinpoint the sources of misinformation and its reach. Tailoring corrections to a person's context is crucial for effectiveness. CourseCorrect's experiments have shown that strategically placing correct information in social media networks can reduce the spread of misinformation, and by testing different strategies they can advise journalists on the most effective ways to combat it. A former Facebook public policy director is part of the team, bringing valuable experience coordinating the company's work during elections.
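The idea of strategically placing corrections in a network can be illustrated with a toy simulation. This is not CourseCorrect's actual method (which is not described in detail here); it is a minimal "competing cascades" sketch, where misinformation and a correction spread through a random graph and whichever reaches a node first determines its belief. All names and parameters below (graph size, seed counts) are illustrative assumptions.

```python
import random

def make_graph(n, avg_degree, rng):
    """Random undirected graph as adjacency lists (toy stand-in for a social network)."""
    adj = {i: set() for i in range(n)}
    edges = int(n * avg_degree / 2)
    while edges > 0:
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b and b not in adj[a]:
            adj[a].add(b)
            adj[b].add(a)
            edges -= 1
    return adj

def competing_cascade(adj, misinfo_seeds, correction_seeds):
    """Spread two labels level by level; the first label to reach a node sticks.
    Returns the number of nodes that end up believing the misinformation."""
    label = {}
    frontier = [(s, "misinfo") for s in misinfo_seeds] + \
               [(s, "correction") for s in correction_seeds]
    for s, lab in frontier:
        label.setdefault(s, lab)
    while frontier:
        nxt = []
        for node, lab in frontier:
            if label.get(node) != lab:
                continue
            for nb in adj[node]:
                if nb not in label:
                    label[nb] = lab
                    nxt.append((nb, lab))
        frontier = nxt
    return sum(1 for v in label.values() if v == "misinfo")

rng = random.Random(7)
n = 500
adj = make_graph(n, 6, rng)
misinfo = rng.sample(range(n), 5)
others = [v for v in range(n) if v not in misinfo]
# Strategy A: seed corrections at random accounts.
random_seeds = rng.sample(others, 5)
# Strategy B: seed corrections at the highest-degree hubs.
hubs = sorted(others, key=lambda v: len(adj[v]), reverse=True)[:5]
reach_random = competing_cascade(adj, misinfo, random_seeds)
reach_hubs = competing_cascade(adj, misinfo, hubs)
print("misinfo reach with random seeding:", reach_random)
print("misinfo reach with hub seeding:", reach_hubs)
```

In runs of this toy model, seeding corrections at well-connected hubs usually blocks more of the misinformation cascade than random seeding, which is the intuition behind testing placement strategies before advising journalists.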

Video Saved From X

reSee.it Video Transcript AI Summary
To combat information manipulation, we must focus on prevention rather than cure. Prebunking, like vaccination, is more effective than debunking. By educating people about disinformation and its tactics, we can reduce its impact and build societal resilience.

Video Saved From X

reSee.it Video Transcript AI Summary
Many people overlook their options in dealing with misinformation on social media. Early detection is key to tracking and countering harmful narratives. Legal action can be taken against profit-driven disinformation networks. Fact-checking alone may not change beliefs, so building counter-narratives is crucial. Our organization helps detect, assess, and mitigate the impact of misinformation to prevent future issues. The recent events at the US Capitol highlight the real-world consequences of online disinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
Social engineering is described as a disease affecting society, causing people to take news at face value. The Israel-Palestine situation is cited as an example, having pushed Ukraine out of the news cycle. Social engineering is equated with psychological warfare or brainwashing that has captured much of America. The speaker suggests watching the Netflix documentary "The Social Dilemma" as an introduction to the topic.

Video Saved From X

reSee.it Video Transcript AI Summary
Our current focus on debunking misinformation is often ineffective because once false information is encountered, it becomes difficult to correct. Prebunking, or preemptively educating people about misinformation, is more effective. This approach is like a psychological vaccine, based on the theory of inoculation. Just as a weakened virus dose triggers antibody production, exposing people to fake news examples can help them build cognitive defenses against misinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
The panel discussion focuses on how major platforms like Google, Twitter, and Facebook are addressing false and misleading narratives surrounding COVID-19. The speakers discuss their policies and strategies for moderating and mitigating misinformation. They highlight the importance of providing authoritative information, removing harmful content, and addressing borderline content that could lead to vaccine hesitancy. The panelists also acknowledge the challenges of handling misinformation during a rapidly evolving crisis and emphasize the need for flexibility and adaptability in their approaches. They mention the use of AI systems and human review to sift through vast amounts of data and the importance of partnerships with health authorities and fact-checking organizations.

Video Saved From X

reSee.it Video Transcript AI Summary
This week, an initiative was launched with companies and nonprofits to improve research and understanding of how automated processes curate online experiences. This is important for understanding online mis- and disinformation, a challenge that leaders must address. While it is easy to dismiss disinformation, ignoring it threatens the norms we value. How can wars end if people believe the reasons for them are legal and noble? How can climate change be tackled if people do not believe it exists? How can human rights be upheld when people are subjected to hateful rhetoric? Those who perpetuate disinformation aim to cause chaos, weaken defenses, break apart communities, and collapse countries' collective strength. There is an opportunity to ensure these weapons do not become an established part of warfare. Despite the many battles ahead, there is cause for optimism: for every new weapon, there is a new tool to overcome it. We have the means; we just need the collective will.

Video Saved From X

reSee.it Video Transcript AI Summary
There is misinformation circulating about the origin of the virus, with some people believing it is manmade. This misinformation can lead to violence and even deaths. It is important to train healthcare workers to ensure they have accurate information to share with the public. Telecommunication companies should be involved in providing access to reliable communication channels. Trusted sources, including community leaders and health workers, should flood the zone with information to amplify the message. Constant communication is necessary to fill the vacuum created by disinformation, and it is crucial to respond quickly to false information that hampers efforts to address the pandemic.

Video Saved From X

reSee.it Video Transcript AI Summary
The discussion centers on COVID-19 misinformation and the roles of public figures and disinformation spreaders. Speaker 0 questions whether Dr. Fauci is involved in a plot to kill millions. Speaker 1 says he cannot confirm involvement but asserts Fauci is not an innocent bystander and is aware of his actions; he doesn't have the information to determine the extent of Fauci's involvement. Speaker 2 identifies Dr. Rashid Buttar as one of the top spreaders of COVID-19 disinformation on social media, citing the Center for Countering Digital Hate and noting Buttar once had more than a million followers. The dialogue includes several false or debunked claims attributed to Buttar. Speaker 1 states that "More people are dying from the COVID vaccine than from COVID," a claim Speaker 2 labels as not true, along with Buttar's assertion that "the Red Cross won't accept blood from people who have had the COVID vaccine" and his claim that "most who took COVID vaccines will be dead by 2025." Buttar's broader theory is that COVID was a planned operation, politically motivated as part of a secret global plot to depopulate the earth. Speaker 0 asks if Speaker 1 believes the pandemic was planned; Speaker 1 responds affirmatively but says he has no idea who is behind it. Speaker 2 warns that praising or repeating Buttar's views is dangerous, noting Buttar's use of false or twisted information to sow distrust in vaccines. The conversation touches on whether the COVID vaccine works; Speaker 1 says the vaccine is "very effective at what it was designed for perhaps," but "not preventing death." Speaker 0 challenges this, and Speaker 2 counters that Buttar doubles down on vaccines being more dangerous than the virus, even in the face of data. A numerical claim is raised: "6,340,000,000 doses of this vaccine have been given," with implications if the claim were true.
Speaker 1 says vaccines are designed with their ingredients published and that each vaccine appears to be different, though he concedes he is not a vaccine developer. Speaker 2 notes Buttar has been removed from Facebook and Instagram for disinformation but remains active on Twitter, Telegram, and his own site. Speaker 0 references a September 5 retweet of a photo suggesting AstraZeneca's vaccine was made in 2018; Speaker 1 acknowledges it could have been fake and questions why Buttar would share such content. A combined exchange discusses questioning agencies and the consequences of misinformation, with Speaker 0 accusing Buttar of contributing to a mass misinformation problem and Speaker 1 acknowledging the existence of a large follower base that has received false information. The dialogue closes with a mention of a statement from the North Carolina Medical Board prior to COVID, implying regulatory context or action.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses the impact of social media on the credibility of science during the COVID-19 pandemic. They highlight the danger of amplifying pseudoscientists in official positions, leading to confusion and misinformation. The focus shifts to the issue of public health versus science, emphasizing the need for transparency and honesty in the field.

Video Saved From X

reSee.it Video Transcript AI Summary
As technology advances, we must develop resilience to combat information manipulation. Disinformation spreads when people share it, so it's crucial to understand its influence and the techniques used. Increased awareness reduces susceptibility to manipulation, strengthening our collective resilience.


Video Saved From X

reSee.it Video Transcript AI Summary
Current measures focus on debunking and correcting misinformation, but research shows it is difficult to change people's beliefs once they have been exposed to falsehoods; this is known as the continued influence effect. Prebunking, by contrast, is more effective: it protects people before they encounter fake news. It is like a psychological vaccine, based on inoculation theory. Just as a weakened dose of a virus triggers the production of antibodies, preemptively exposing people to fake news or manipulation techniques helps them develop cognitive antibodies against misinformation.

Video Saved From X

reSee.it Video Transcript AI Summary
It's easy to blame those who believe or spread mis- and disinformation, but governments, internet companies, and social media platforms have a responsibility to prevent the spread of harmful lies and promote access to accurate health information. The WHO is working with partners, companies, and researchers to understand how misinformation and disinformation spread, who is targeted, how they are influenced, and what can be done to counter the problem.

Video Saved From X

reSee.it Video Transcript AI Summary
In this lesson on countering disinformation on social media, we learn that false information about COVID-19 has been circulating since 2020. Some of it is shared unintentionally (misinformation), while some is deliberately created to mislead or harm (disinformation). Disinformation can erode trust in public health, leading to lower vaccine acceptance and weaker adherence to safety protocols. It can also divide communities and cause a rise in infections and deaths. We are shown an example of a post from Susan's uncle, Steve, who compares COVID-19 to the flu, committing fallacies such as mob appeal, weak analogy, suppressed evidence, and appeal to authority. Susan, by contrast, does fact-based research and counters her uncle's opinions with evidence. It is important to protect ourselves from disinformation and prevent its spread.

TED

How we can protect truth in the age of misinformation | Sinan Aral
Guests: Sinan Aral
reSee.it Podcast Summary
On April 23, 2013, a false tweet from the Associated Press about explosions at the White House caused a $140 billion stock market drop. The Internet Research Agency's misinformation during the 2016 election reached 126 million people on Facebook. A study found false news spreads further and faster than true news, driven by novelty and emotional responses. Future challenges include synthetic media from generative adversarial networks. Solutions involve labeling information, economic incentives, regulation, transparency, and ethical considerations in technology. Vigilance is essential to defend truth against misinformation.

Mark Changizi

How do we handle DISinformation? Moment 154
reSee.it Podcast Summary
Disinformation involves intentional lying, which is harder to sustain than unintentional misinformation; liars are best identified by decentralized reputation networks rather than by centralized fact-checkers.