reSee.it - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker describes digitally verified ID and its growth in China. There, a traffic camera can catch you jaywalking, and the digital ID system, which holds your blood, genetic code, and photograph, can also identify you by your gait, so you can be picked out even without a visible face. It will convict you of jaywalking, withdraw money from your bank account with no intermediating judiciary at all, show a picture of you to the people in the neighborhood so they know you have jaywalked, and reduce your social credit score. If your social credit score falls below a certain level, you can't buy drinks from a vending machine, play video games, board a train, or leave your fifteen-minute city. All of that is already in place in China. Asked whether that would be helpful or unhelpful, the speaker answers that it has already brought, and would bring, a totalitarian tyranny so complete it would make George Orwell's 1984 look like a picnic.

Video Saved From X

reSee.it Video Transcript AI Summary
In China, if caught jaywalking by a traffic camera, the digital ID system with your blood, genetic code, and photo can identify you by your walk. It convicts you, deducts money from your bank account, and publicly shames you, lowering your social credit score. A low score restricts buying drinks, playing games, riding trains, or leaving your city. This system is already in place in China.

Video Saved From X

reSee.it Video Transcript AI Summary
In the US, a social credit system similar to China's is being quietly implemented by private businesses and banks, not the government. It's based on ESG standards, evaluating sustainability and ethics. Personal ESG scores are influenced by purchase history, sales records, and public data like credit reports. Buying certain items can impact your score, reflecting your impact on the environment and society. People are being pushed to align with these standards, even if they don't want to.

Video Saved From X

reSee.it Video Transcript AI Summary
A social credit system has emerged in the United States, implemented by private businesses and banks rather than the government. This system mirrors China's social credit framework but is based on ESG (environmental, social, and governance) standards. Personal ESG scores reflect individuals' commitment to sustainability, calculated from factors like purchase history, including the types of products bought. For instance, buying firearms or alcohol can negatively impact one's score. Additionally, sales history and public records, such as credit reports, are used to assess a person's societal and environmental impact. The push for this system suggests an effort to compel individuals to align with these standards, regardless of their personal preferences.

Video Saved From X

reSee.it Video Transcript AI Summary
A woman in Nanjing follows social ranking rules to maintain a good social credit score. Her purchases, like nappies, reflect positively on her. Only 18,000 out of 8,000,000 people are model citizens in this city. Good scores bring discounts on public services, while low scores lead to loss of rights. Those with a score of 0 are blacklisted, like journalist Liu Hu who uncovered corruption. Being blacklisted means no bank loans, starting a business, or buying an apartment.

Video Saved From X

reSee.it Video Transcript AI Summary
Liu Hu, a journalist, was banned from flying and placed on a list of untrustworthy people after a court deemed his apology for tweets insincere. He reports being restricted from buying property and sending his child to private school. China is assigning every citizen a social credit score that fluctuates based on behavior; community service and buying Chinese products can raise it, while fraud, tax evasion, and smoking in non-smoking areas can lower it. Surveillance cameras, capable of recognizing over 4,000 vehicle models, enable this system. SenseTime CEO Xu Li says their smart cameras can distinguish adults from children and males from females. Ken DeWoskin says the scoring system's workings are secret and could be abused by the government to shape behavior. The government may use the system to punish those deemed insufficiently loyal to the Communist Party, and there is no due process to contest a score.

Video Saved From X

reSee.it Video Transcript AI Summary
The United States has quietly implemented a social credit system, not through government action but via private businesses and banks. This system resembles China's, but it is based on ESG (environmental, social, and governance) standards. Individuals are assigned a personal ESG score reflecting their commitment to sustainability, calculated from factors like purchase history, including the buying of firearms or alcohol, and public records such as credit reports. These elements are used to assess a person's impact on society and the environment. To encourage compliance with these standards, there is a push to make people more accepting of these criteria.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 argues that a Mussolini quote describing fascism as corporatism explains today's emerging fascist state in America, describing a system where the government merges with corporate power. He notes a prior report on digital ID deployment by private companies with customer consent, claiming the government can collect and utilize data under legal immunity while avoiding a mandate on biometric ID. He asserts that, as during COVID, individuals can choose to consent or "leave the reservation" to fend for themselves. He introduces the idea that the social credit score is actively deployed in the US. Speaker 1 shares a personal experience about ordering food on Uber Eats and noticing an algorithm determining prices based on personal data, prompting reflection on how pricing works. Speaker 0 explains that Communist China's social credit system, launched in 2014 to "build trust in society by punishing individual behavior," allows banks to shut off money and restrict travel, enabling the government to condition behavior individually. He claims this is now being deployed in the United States as algorithmic pricing, which uses automated programs to dynamically set the price of goods and services in real time and on an individual basis. The algorithms rely on large amounts of data, including customer behavior, and can charge one individual more than another for the same product based on willingness to pay and personal data. He asserts that the social credit score is present across the US, and that the New York Algorithmic Pricing Disclosure Act (launched 11/10/2025) compels private corporations to notify consumers that they are being charged based on personalized algorithmic pricing. The law defines personal data as any data that identifies or could be linked to a specific consumer or device, regardless of whether the data was voluntarily provided.
He says this makes every aspect of life usable to determine pricing, calling the act the first of its kind and predicting expansion to all 50 states. He concludes that the social credit score is real in America and suggests a carbon tax is soon to follow. He also mentions an “AI run cryptocurrency economy” as the United States government’s and big banks’ chosen solution in response to debt and AI competition. Speaker 2 presents a scenario for 2027: special economic zones with zero red tape, with government intervention to accelerate progress. Speaker 3 adds that the promise of vast gains could attract governments to these zones despite protests from workers who would lose jobs and rely on universal basic income, suggesting trillions in new wealth as a compelling incentive. He notes the ongoing arms race with China and the ease with which forecasts could influence presidential decisions, especially when contrasted with regulatory delays. Speaker 0 closes with attribution to Greg Reese.
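The personalized algorithmic pricing described above (a base price adjusted per customer from behavioral data) can be illustrated with a minimal sketch. Every field name, weight, and threshold below is a hypothetical illustration, not the actual logic of any real pricing system:

```python
# Minimal sketch of personalized dynamic pricing as described in the summary:
# a base price is adjusted per customer using signals inferred from personal
# data. All fields, weights, and thresholds are invented for illustration.
from dataclasses import dataclass


@dataclass
class CustomerProfile:
    past_orders: int        # how often this customer buys
    avg_order_value: float  # proxy for willingness to pay
    premium_device: bool    # e.g. ordering from a high-end device


def personalized_price(base_price: float, profile: CustomerProfile) -> float:
    """Return a per-customer price: the more the data suggests a customer
    will tolerate, the higher the multiplier (capped here at +30%)."""
    multiplier = 1.0
    if profile.avg_order_value > 30.0:   # high spender
        multiplier += 0.10
    if profile.premium_device:           # signals disposable income
        multiplier += 0.10
    if profile.past_orders > 50:         # habitual, less price-sensitive
        multiplier += 0.10
    return round(base_price * min(multiplier, 1.30), 2)


frequent = CustomerProfile(past_orders=80, avg_order_value=45.0, premium_device=True)
occasional = CustomerProfile(past_orders=3, avg_order_value=12.0, premium_device=False)
print(personalized_price(20.0, frequent))    # 26.0
print(personalized_price(20.0, occasional))  # 20.0
```

The point of the sketch is only that two customers can be quoted different prices for the same item from the same base price, which is the practice the disclosure law discussed above would require companies to reveal.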

Video Saved From X

reSee.it Video Transcript AI Summary
Liu Hu was banned from flying due to being on a list of untrustworthy individuals, a consequence of a court-ordered apology for his tweets. He feels constantly controlled by this social credit system, which assigns scores to all Chinese citizens based on their behaviors. Positive actions like community service can improve scores, while negative actions, such as fraud or smoking in prohibited areas, can lower them. Advanced surveillance technology, developed by companies like SenseTime, enables the government to monitor citizens closely. The specifics of how the scoring system operates remain secret, raising concerns about potential abuse by the government. This system could be used to punish those deemed disloyal to the Communist Party, and challenging one's score is nearly impossible due to the lack of due process.

Video Saved From X

reSee.it Video Transcript AI Summary
China's social credit system is using high-tech methods to crack down on low-level offenders like jaywalkers. Cameras record their actions, zoom in on their faces, and shame them on nearby video screens. This system goes beyond traditional credit scores, taking into account behaviors like jaywalking, smoking on trains, and excessive video game purchases. If your score drops too low, you can be banned from buying plane tickets, renting a house, or getting a loan. Over 15 million people have already been prevented from traveling. Chinese technology firms are developing advanced cameras that use AI to track everything, including people, bikes, cars, and buses. Police in Beijing wear glasses that recognize faces linked to the government's database. The fear is that this system could be used to punish those not loyal to the Communist Party, with no real due process to challenge it.

Video Saved From X

reSee.it Video Transcript AI Summary
In a world where everything is recorded, your actions determine your score. The government rates you based on what you do, like buying things or where you go. If your actions are deemed beneficial, your score goes up. But if you criticize the government, buy alcohol, or play games, your score goes down. A low score means you can't travel, your kids can't apply to good schools, and you can lose your job. Worst of all, you'll be publicly shamed. This dystopian reality is happening today.

Video Saved From X

reSee.it Video Transcript AI Summary
Ohyung Haw Yu is tracked and scored on her behavior using a social credit system, with scores from 350 to 950. A good score, like Haw Yu's 752, is generally accepted. The system uses AI, facial recognition, and over 200 million cameras to monitor citizens. Some citizens aren't bothered by privacy concerns, citing increased safety. Companies are developing algorithms for the national system, and pilot projects are underway. These projects require unpaid work for benefits, and penalize actions like littering, gossip, and jaywalking. Informants are paid to report on neighbors. Good social credit earns rewards like cheap loans, while bad scores lead to public shame. Hwang Hui Jun, blacklisted for not paying a court case, can't buy plane or train tickets. A bad score hinders job prospects and school admissions. The nationwide system is launching next year, and criticism is rare, possibly due to fear of score reduction.

Video Saved From X

reSee.it Video Transcript AI Summary
I get discounts for public services and free access to the library. People with low scores lose rights and are publicly shamed at cinemas. The Supreme Court has a blacklist for "bad citizens," including 23 million people like journalist Liu Hu. Once blacklisted, you can't get loans, start a business, buy property, or send kids to private school. Liu Hu criticizes the system, fearing restrictions on individual freedoms. He was removed from the blacklist but still faces challenges.

Video Saved From X

reSee.it Video Transcript AI Summary
In China, a social credit score system is already in place, using facial recognition to monitor behavior like jaywalking and deduct money from accounts. This system can identify gender, estimate age, and even recognize car models. Implementation in Western nations could lead to invasive monitoring of personal habits and preferences, impacting individuals' social credit scores. This reality is already present in some places, highlighting the need for awareness and consideration of potential consequences.

Video Saved From X

reSee.it Video Transcript AI Summary
Final notice, your account balance is below 500. You need 50 credits by the end of the day to avoid automatic eviction. Authorities seized a record balance of an individual's credits, citing excessive hoarding. Traffic violation detected. Seven miles per hour over the limit. 75 credits have been automatically deducted. You have to be fucking kidding me. Yep. You got the flu. Your blood work shows a significant iron deficiency. I recommend incorporating more red meat into your diet. Your medical license has been revoked. Unauthorized political gathering detected. Participants' access to financial services have been suspended. Due to the speeding infraction, your car will be immobilized for three days. You've earned 50 bonus credits for reporting your neighbor's hateful misconduct. This content exceeds your current social credit tier. May we suggest citizen responsibility? Sarah, open the door. You have been identified by the authorities.

Video Saved From X

reSee.it Video Transcript AI Summary
Everywhere she goes, Oh Young Houyu is followed. What she buys and how she behaves is tracked and scored to show how responsible and trustworthy she is. It's called the social credit system. In one version now being tested, a person's reputation is scored on a scale of 350 to 950, and Houyu, with a good score of 752, is okay with it. In fact, most people are: "It's a mechanism that pushes you to become a better citizen." It's big data meets Big Brother, expanding how the government monitors, understands, and ultimately controls its 1,400,000,000 citizens, thanks to advances in artificial intelligence and facial recognition, and a web of more than 200,000,000 surveillance cameras. Are people bothered by privacy concerns? "There are a lot of cameras, but they keep us safe. It's really good. We can accept it." Companies are experimenting with the algorithms to help the government create the new national social credit system. The government also has pilot projects. In one, citizens are required to do hours of unpaid work to get benefits, and scores are docked for things like littering, a messy yard, gossip, even jaywalking. Video of offenders is shown on the local news. And information collectors like Jo Ai Ni are paid to report on their neighbors; her quota is 10 entries a month, like the man who carried a drunk person home. A good deed, she says. Good social credit gets rewarded with perks like cheap loans and travel deals, but a bad score means public shame and worse. Hwang Hwaijun lost a court case and didn't pay; now he's on a government blacklist. "I can't buy airplane or train tickets," he says. And the list goes on. Being discredited makes it hard to get a job or put kids in top schools. The social credit system will go nationwide next year, and few here are willing to criticize it, since criticism may itself pose a risk of a bad score and the life that comes with it. Janis Mackey Frayer, NBC News, Beijing.

Video Saved From X

reSee.it Video Transcript AI Summary
A low social credit score in China leads to loss of rights, with names displayed on cinema screens. The Supreme Court's blacklist includes 23 million people, like journalist Liu Hu, who was banned from travel for uncovering corruption. Blacklisted individuals face restrictions on loans, business, property, and education. Criticism of the system is rare due to fear of losing freedoms. While Liu Hu's name was removed from the blacklist, challenges remain to improve his social credit status.

Video Saved From X

reSee.it Video Transcript AI Summary
Liu Hu, a journalist, was banned from flying and faced restrictions on buying property and sending his child to private school because he was deemed untrustworthy. China has implemented a social credit score system for all citizens, which fluctuates based on behavior. Engaging in community service and purchasing Chinese products can raise the score, while fraud, tax evasion, and smoking in nonsmoking areas can lower it. China's extensive surveillance camera network enables tracking and identification of individuals. Analyst Ken DeWoskin notes the potential for abuse and lack of transparency in the scoring system. Concerns arise regarding the government's use of the system to punish disloyalty without due process.

Video Saved From X

reSee.it Video Transcript AI Summary
The speaker discusses a growing social credit-like system controlled by algorithms. If a person's family photos, online activities, purchases, associations, or friends diverge from what authorities expect, they can lose the ability to buy train tickets, board airplanes, obtain a passport, or be eligible for a job, including government work. These constraints are increasingly created, programmed, and decided by algorithms, which are fueled by data our devices produce constantly and invisibly. The records we generate are not just visible content but often unseen traces, such as location and activity footprints, which aggregate into a comprehensive picture of individuals. Even when the content of communications isn't visible, metadata reveals much; the government and other actors claim they do not need a warrant to collect metadata, yet it tells a complete story about a person's life. Activity records are continually created, shared, collected, and intercepted by both companies and governments. As these records are sold and traded, the speaker argues that what is being sold is not merely information but people themselves. They claim that companies and governments are selling "us"—our future, our past, our history, and our identity—and in doing so are eroding personal power and making individual stories work for them. Overall, the message is that everyday data, seemingly innocent day-to-day traces, are aggregated into powerful profiles. These profiles determine access to travel, work, and official status, and the data economy is framed as commodifying and leveraging individuals' identities.
The core assertion is that the modern data ecosystem constructs a pervasive power dynamic where people’s histories and identities are exploited to control and monetize them, while the actual content of private communications may be less visible than the broader metadata that shapes life opportunities.

Video Saved From X

reSee.it Video Transcript AI Summary
A good school brings benefits, but people with low scores lose rights. The cinema names and shames people considered untrustworthy, plastering their details, even their addresses, across big screens. "It's a matter of principle. Those people have to be condemned. Those people aren't honest, so they have to pay the price." The Supreme Court has created a blacklist for so-called bad citizens, those whose ratings have dropped to zero. On it are companies, but also 23,000,000 people to date. Among them is the journalist Liu Hu, who got a little too close to uncovering corruption among high-profile party members. After being sued for defamation by the subject of a story he'd written, he was blacklisted. "That tells me I'm still on the blacklist." Punished because he's been branded untrustworthy by the state.

Video Saved From X

reSee.it Video Transcript AI Summary
Zhang Yingjie cosigned a loan for a friend who defaulted, and despite paying his share, his social credit score was affected. Consequently, he was among millions blocked from purchasing high-speed train tickets and flights due to low social credit. To improve his score, Zhang donates money at a community office, believing it will go to charity, though he doesn't monitor where the donations actually go. Despite the system negatively impacting him, Zhang supports the government's social credit system. By 2020, China intends to track, rate, reward, and punish all citizens, converting personal experiences into transactions. Zhang, having regained his high score, is content with the system.

Video Saved From X

reSee.it Video Transcript AI Summary
In China, the social credit system tracks and scores citizens based on behavior. Good scores bring benefits like cheap loans, while bad scores lead to public shame and restrictions. Surveillance cameras and AI are used to monitor citizens, who can be penalized for littering or gossiping. The system will be nationwide soon, with few daring to criticize it for fear of a low score. This control raises concerns about privacy and freedom.

Video Saved From X

reSee.it Video Transcript AI Summary
Zhang Yingjie cosigned a loan for a friend who later skipped out, resulting in him being blocked from buying high-speed train tickets. To improve his social credit score, Zhang donates money at a local community office, although he doesn't know where the donations go. China plans to track, rate, reward, and punish all citizens by 2020. Despite the system causing him some suffering, Zhang supports the government's efforts. In another scene, the speaker is in a Walmart in London, noticing surveillance cameras in unexpected places like the meat and egg sections. The speaker questions the purpose of these cameras.

Video Saved From X

reSee.it Video Transcript AI Summary
Chinese citizens are ranked out of 950 points, with 700 considered good and 500 not. The system tracks spending habits like a credit rating. Being ranked is seen as positive for maintaining order in society. In 2020, Beijing plans to use data from banks, companies, and the state to rate citizens as good or bad. This big data system aims to promote moral values.

Video Saved From X

reSee.it Video Transcript AI Summary
The transcript describes a system where education correlates with rights: "A good school brings benefits, but people with low scores lose rights." It depicts public shaming on screens that label people untrustworthy and plaster their details publicly. It argues that this is a matter of principle: those people have to be condemned and pay the price. It calls for a blacklist of so-called bad citizens whose ratings have dropped to zero; on it are companies and 23,000,000 people. Among them is journalist Liu Hu, who was blacklisted after pursuing corruption among high-profile party members and being sued for defamation. He discovers the ban when he cannot buy a train ticket. Once blacklisted, one cannot get a bank loan, start a business, buy an apartment, or send children to private school. The piece frames this as a digital dictatorship and warns freedoms may be eroded.