TruthArchive.ai - Related Video Feed

Video Saved From X

reSee.it Video Transcript AI Summary
Since 2013, mobile devices have been the primary focus of surveillance, with smartphones constantly emitting signals to cell towers even when idle. These signals carry unique identifiers such as the IMEI (identifying the handset) and the IMSI (identifying the SIM subscriber), allowing a user's movements to be tracked. Companies store this data for unspecified purposes, raising concerns about privacy and mass surveillance through bulk collection.
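As background to the identifiers mentioned above: an IMEI is a 15-digit handset identifier whose final digit is a Luhn check digit computed over the first 14. The sketch below is a minimal illustration of that checksum (not something shown in the video); the example IMEI 490154203237518 is a widely used specification example.

```python
def luhn_check_digit(payload: str) -> int:
    """Compute the Luhn check digit for a numeric string.
    The 15th digit of an IMEI is the Luhn check digit
    over the first 14 digits."""
    total = 0
    # Double every second digit, starting from the rightmost payload digit.
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def is_valid_imei(imei: str) -> bool:
    """A 15-digit IMEI is well-formed when its last digit matches
    the Luhn check digit of the first 14."""
    return (
        len(imei) == 15
        and imei.isdigit()
        and luhn_check_digit(imei[:14]) == int(imei[14])
    )

print(is_valid_imei("490154203237518"))  # True
```

Note that the check digit only guards against mistyped IMEIs; it plays no role in the tracking described above, where the identifier is read directly from the radio signaling.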

Video Saved From X

reSee.it Video Transcript AI Summary
Above Phone is presented as a privacy-focused alternative to standard smartphones, giving users complete control over their phone and apps. It claims to function without tracking, forced logins, or advertising. The phone is compatible with any cell service and allows private app downloads. The AboveSuite includes a VPN, private email and calendar, private chats, video calls, Internet phone, and a search engine. New users receive a free 45-minute live call for support, along with free email and chat support, guides, and video courses.

Video Saved From X

reSee.it Video Transcript AI Summary
Erik Prince and Tucker Carlson discuss what they describe as pervasive, ongoing phone and device surveillance. They say that a study of devices—including Google Mobile Services on Android and iPhones—shows a spike in data leaving the phone around 3 AM, amounting to about 50 megabytes, effectively the phone “dialing home to the mother ship” and exporting “all of your goings on.” They describe “pillow talk” and other private interactions being transmitted, and claim that even apps like WhatsApp, which is marketed as end-to-end encrypted, ultimately have data that is “sliced and diced and analyzed and used to push … advertising” once it passes through servers. They argue that this surveillance is not limited to phones but extends to other devices in the home, including Amazon’s Alexa and automobiles, which they say now have trackers and can trigger a kill switch, with recording of audio and, in many cases, video. The speakers contend this situation represents a monopoly by a handful of big tech companies that can use the collected data to control markets, dominate, and vertically integrate the economy, potentially shutting down competitors. They connect this to broader concerns about political power, claiming that the data profiles built on individuals enable manipulation of public opinion, messaging, and even election outcomes. They reference banking data, noting that banks like Chase have announced selling customers’ purchasing histories to other companies, as part of what they call a broader data-driven power shift. The discussion expands to warnings about a “technological breakaway civilization” operating illegally and interfaced with private intelligence agencies to manipulate, censor, and steal elections. They argue that AI, capable of trillions of calculations per second, magnifies these risks and increases the ability to take control of civilization.
They reference geopolitical events, such as China’s blockade of Taiwan, and claim that microchips sold internationally have kill switches that could disable critical military systems and infrastructure. They speculate about the capabilities of the NSA, Chinese, Russian, or hacker groups to exploit this vulnerability, describing a world in which the infrastructure is exposed like Swiss cheese to criminals and governments. Throughout, the speakers criticize the idea that technology is neutral, asserting instead that it has been hijacked by corrupt governments and corporations. They contrast these concerns with Google’s founding motto “don’t be evil,” claiming it was contradicted by later documents showing CIA involvement and In-Q-Tel’s role, and they warn that a social-credit, cashless society rollout could be enforced by private devices rather than drones or troops. The segment emphasizes education of Congress, state attorneys general, and the public about these supposed threats. Note: Promotional product endorsements and sponsor requests in the transcript have been omitted from this summary.

Video Saved From X

reSee.it Video Transcript AI Summary
Smartphones are constantly connected to cellular towers, even when the screen is off. They emit radio-frequency signals to communicate with the nearest tower, creating a record of the phone's presence. This data is stored and can be accessed by companies and governments for surveillance purposes. The problem is that users have no control over, or visibility into, what their phones are doing at any given time. Hacking is a common method used to gain access to devices, allowing attackers to control them and collect personal information. Companies like Google and Facebook also collect and store user data, which can be accessed by governments. The lack of transparency and control over data collection poses a threat to privacy and individual power. Trust in technology is limited.

Video Saved From X

reSee.it Video Transcript AI Summary
Good morning. John McAfee here. Let’s talk about privacy. If you think encrypted systems like ProtonMail or Signal offer you privacy, you’re mistaken. Encryption was designed to prevent man-in-the-middle attacks, but that’s no longer the issue. Your smartphone is the primary surveillance tool for governments worldwide. Malware can easily be installed just by visiting certain websites, allowing attackers to monitor your inputs and outputs, rendering encryption ineffective. I use Gmail because it requires a subpoena for information, giving their lawyers 30 days to review it. That’s enough time for me to change my email frequently. Wake up—privacy is a myth, and encryption is outdated technology being falsely marketed as safe. Thank you for listening.

Video Saved From X

reSee.it Video Transcript AI Summary
Your phone is not just a phone. It is the result of research that captures your attention, creating a power imbalance where you are unaware that you are being constantly monitored. They gather maximum information about you, surveilling you 24/7. In return, they know you so well that they can not only predict things about you but also manipulate your behavior. The internet of things will do the same.

Video Saved From X

reSee.it Video Transcript AI Summary
Above Phone offers a privacy-focused alternative to standard smartphones, giving users complete control over their phone and apps without tracking. It works with any cell service and requires no forced logins. The phone includes AboveSuite, featuring a VPN, private email and calendar, private chats, video calls, Internet phone, and a search engine. Users can download apps privately and control their phone with secure hardware. New users receive a free 45-minute live call, along with free email and chat support, guides, and video courses.

Video Saved From X

reSee.it Video Transcript AI Summary
This phone is not a nostalgia product, but a gadget for hacking, independence, and anonymity. It is compact and lightweight, weighing only three ounces.

Video Saved From X

reSee.it Video Transcript AI Summary
Anything you've ever said or done in the vicinity of your phone's camera or microphone; everything you've ever put into your phone: emails, text messages, Snapchat, Twitter, whatever; your search queries on Google, every embarrassing health search, every embarrassing text conversation with a significant other, every nude photograph people may not have taken, any search. They know where you are at all times. They know where you go and when. They know what you buy. They have access to your bank account. AI will literally know everything about you. They can create fake platforms that look real, or rather fake people. And imagine if they were talking to you and they passed the Turing test: you wouldn't know it's AI. It's like total, like, rape of everybody by the system forever. It's not good.

Video Saved From X

reSee.it Video Transcript AI Summary
This Black Friday, elevate your technology with Above's secure, open-source phone and laptop solutions. The Above Phone is compatible with any cell service and popular apps, while the AboveBook offers user-friendly software and reliable hardware. AboveSuite ensures your online privacy with a VPN, email, calendar, video conferencing, encrypted chat, search engine, and Internet phone number, all for $100 a year. These services sync across devices, and each purchase includes a 45-minute free support call, plus access to guides and video courses. Enjoy our Black Friday sale with $100 off all devices, an extra $100 off when buying two or more, and an additional $400 off when purchasing four. Visit abovephone.com/blackfriday to enhance your tech experience this season.

Video Saved From X

reSee.it Video Transcript AI Summary
John McAfee states that encrypted systems like ProtonMail and Signal offer no privacy because smartphones are surveillance devices. Malware can be easily planted via websites like Pornhub, watching inputs before encryption and reading outputs after encryption. Encryption is old technology marketed as safe but is now worthless. McAfee uses Gmail because Google requires a subpoena to release information and their lawyers have 30 days to review it, which is enough time for him to change his email. He changes his email every 15 days. He believes people are being sold a false sense of security with encryption.

Video Saved From X

reSee.it Video Transcript AI Summary
Speaker 0 asserts that there is no security whatsoever and that cybersecurity professionals face this problem daily. They state that while people are watching their phones, their phones are watching them. The operating system is designed to watch and listen to users, to know who their friends are and what is being said in text messages, and to listen in at times. They claim that, although the phone offers many conveniences, it is the world's greatest spy device, designed as a spy device.

Video Saved From X

reSee.it Video Transcript AI Summary
A spyware called Pegasus can bypass phone security, access messages, photos, videos, microphone, camera, GPS, and more without detection. It infects iOS and Android through unknown vulnerabilities. NSO Group, an Israeli company, sells Pegasus to government clients worldwide. Leaked records show widespread abuse of Pegasus for surveillance. This invasion of privacy threatens democracy by enabling oppressive regimes to control populations. The software undermines the notion of phone security and poses a significant threat to personal privacy and freedom.

Video Saved From X

reSee.it Video Transcript AI Summary
A data broker, Near Intelligence, with ties to US defense contractors, tracked the cell phones of visitors to Jeffrey Epstein's island over a three-year period. We found that Near Intelligence left this data exposed online. The maps generated show visitors' movements, potentially leading back to their homes and workplaces. The data reveals visitors came from over 166 locations in the US and abroad. Near Intelligence sources data from advertising exchanges. Before a targeted ad appears, your phone sends data, including location, to ad exchanges. Near Intelligence siphons this data, repackages, analyzes, and sells it. Despite its intended use for advertising, Near Intelligence has provided this data to the US military. Anyone with a phone can be tracked. To protect your privacy, use trusted apps, turn off location services, use ad blockers, and use VPNs that filter out advertising technology.
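The ad-exchange data flow described above can be illustrated with a toy bid request loosely modeled on the OpenRTB convention (field names like `device.ifa` and `device.geo` follow that convention, but the request, the app bundle, and the `siphon` helper are invented for illustration and are not Near Intelligence's actual pipeline): before an ad is even served, the request already carries a device's advertising ID and location, which a broker observing exchange traffic could record.

```python
import json

# A simplified, hypothetical bid request loosely modeled on OpenRTB.
# Real requests carry many more fields; all values here are invented.
bid_request = {
    "id": "req-001",
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID
        "geo": {"lat": 18.300, "lon": -64.825},          # device location
        "os": "Android",
    },
    "app": {"bundle": "com.example.weather"},
}

def siphon(request: dict) -> dict:
    """Illustrates what a location data broker could extract from
    ad-exchange traffic: a pseudonymous ID joined to a location
    and the app the device was seen in."""
    dev = request["device"]
    return {
        "ad_id": dev["ifa"],
        "lat": dev["geo"]["lat"],
        "lon": dev["geo"]["lon"],
        "seen_in_app": request["app"]["bundle"],
    }

# Parse the request as a broker would receive it (JSON over the wire).
record = siphon(json.loads(json.dumps(bid_request)))
print(record["ad_id"])
```

Because the same `ifa` appears in request after request, records like this can be joined over time into a movement history, which is the repackaging-and-selling step the summary describes.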

Video Saved From X

reSee.it Video Transcript AI Summary
A data broker tracked cell phones of visitors to Jeffrey Epstein's island, exposing their data online. Near Intelligence, linked to US defense contractors, meticulously monitored visitors' movements over 3 years. The data revealed locations in the US and other countries. Near Intelligence sources data from advertising exchanges, selling it for targeted ads and possibly to the military. This highlights the potential for mass surveillance through ad tech. While smartphone users can be tracked, steps like using trusted apps, disabling location services, and using VPNs can help protect privacy.

Video Saved From X

reSee.it Video Transcript AI Summary
Google recently auto-installed a component called Android System SafetyCore on Android 9+ devices, sparking alarm because its purpose wasn't clearly explained. The transcript outlines the following points. Google says the component supports sensitive-content warnings and, generally, "performs classification of media to help users detect unwanted content." The transcript then presents contrasting views from self-described experts. The GrapheneOS maintainers published a post on X stating that SafetyCore "doesn't provide client side scanning and is mainly designed to offer on device machine learning models that can be used by other applications to classify content as spam, scam, or malware." The speaker, however, rejects this explanation as "the biggest pack of lies from shills of Big Tech that lay claim to cybersecurity knowledge," asserting that the feature's true purpose is client-side scanning and that any portrayal of it as benign is false. They express frustration with what they describe as widespread misinformation meant to reassure users that they have nothing to worry about, insisting there is a lot to worry about: SafetyCore, in their view, is client-side scanning framed as a feature users always needed. The speaker says they had anticipated such a module "for a long time," suggesting it was inevitable, and ties it to a broader concept they call "see what you see technology," which they say is directly connected to AI, arguing that the module completes a circle by bringing all of big tech into client-side scanning. They close by warning listeners to stay attentive, implying the development will have significant and widespread effects.

Video Saved From X

reSee.it Video Transcript AI Summary
Apple's upcoming upgrade will integrate ChatGPT into every iPhone, enabling the collection and analysis of user data. A side-by-side test revealed that both Google and Apple phones transmit significant data dumps, around 50 megabytes, between 2 and 3 AM nightly, sharing user preferences and daily activities. By age 13, an average American child has had 72 million data points collected on them by big tech, tracked through a unique 32-digit advertising ID. This ID allows companies to monitor device locations for targeted advertising and sales. The stated goal of Unplugged's communication products is to help people connect without surrendering their digital data to tech companies. Some individuals prefer to remain uninformed and compliant, while others seek to protect their privacy.
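The "32-digit advertising ID" mentioned above matches the shape of the mobile advertising identifiers (Android's AAID, Apple's IDFA), which are UUIDs: 32 hexadecimal digits, conventionally displayed in hyphenated groups. A minimal sketch of that format, and of what "resetting" the ID amounts to, assuming only the standard UUID format:

```python
import uuid

def new_advertising_id() -> str:
    """Mobile advertising IDs (Android AAID, Apple IDFA) are UUIDs:
    32 hex digits, conventionally shown in 8-4-4-4-12 groups."""
    return str(uuid.uuid4())

ad_id = new_advertising_id()
hex_digits = ad_id.replace("-", "")
print(len(hex_digits))  # 32, the "32-digit" ID the summary refers to

# "Resetting" the advertising ID simply issues a fresh random UUID,
# breaking the link to the old tracking profile going forward.
assert new_advertising_id() != ad_id
```

Because the ID persists across apps until reset, it is the join key that lets the data points described above be tied to one device over time.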

Lex Fridman Podcast

Pavel Durov: Telegram, Freedom, Censorship, Money, Power & Human Nature | Lex Fridman Podcast #482
reSee.it Podcast Summary
Telegram founder Pavel Durov describes a life devoted to freedom of speech, privacy, and human connection in a world where governments and corporations push to centralize information. He recounts his arrest in France and the prolonged investigation that tested Telegram’s mission, his interactions with Moldova and Romania, and the broader struggle to keep private messages unreadable to authorities. He argues that Telegram must endure pressure rather than compromise user rights, even at great personal cost. Beyond politics, Durov shares a philosophy shaped by early hardship and relentless discipline. Fear and greed, he says, are freedom’s chief enemies; living with mortality, embracing arduous routines, and avoiding intoxicants fuel clarity of mind. He describes a life of 300 push-ups and 300 squats each morning, long daily workouts, and a habit of thinking deeply in quiet moments before the world intrudes. This self-control underwrites his stance against surveillance capitalism and overbearing regulators. Technically, Telegram stays lean by design. The engineering team is about forty people, yet the company out-innovates rivals through automation, distributed data storage, and a focus on speed. Privacy is built in: no employee can read private messages, data is encrypted across geographies, and open-source reproducible builds ensure verifiable security. Telegram runs a self-authored server stack, minimizing external dependencies, while users can opt into end-to-end encrypted secret chats with trade-offs on history and collaboration. Business strategy blends subscriptions, context-based advertising, and ecosystem building. Telegram Premium attracts millions of paid subscribers, while channels and groups provide non-personal ad inventory. Telegram also explores blockchain with TON and a growing open-network ecosystem; gifts, username ownership, and a thriving bot platform monetize creator activity without harvesting user data.
He notes that the company would shut down in a country rather than surrender privacy, reinforcing a principle that freedom and trust trump revenue. On geopolitics and governance, Durov recounts arrests, bans, and investigations across France, Russia, Iran, and Moldova. He describes a 2018 poisoning scare as a rare personal crisis that intensified his resolve to defend privacy. He argues that censorship begets power for authorities while eroding civil liberty, and that a platform should enable diverse voices rather than align with any government. He emphasizes the public’s right to speak, assemble, and access information, even amid conflict, and he calls for competitive, entrepreneurship-friendly policy in Europe.

Possible Podcast

Nick Thompson on our AI future
Guests: Nick Thompson
reSee.it Podcast Summary
Artificial avatars loom as a new form of presence, capable of extending reach, preserving a voice, and even offering a form of longevity. In this Possible episode, Reid Hoffman and Nick Thompson discuss avatars trained on a person’s writings and speeches, and Reid’s own digital twin. Avatars could perform tasks more efficiently and keep conversation alive after death, while the idea of conversing with a preserved voice unsettles some listeners. The vision includes blending attitudes from multiple versions rather than fixing on a single age. Beyond avatars, the conversation turns to the purpose of the Possible podcast: to chart a future that is ambitious yet grounded. Hoffman frames humanity as Homo techne, marking a shift from physical to cognitive powers, with an Entourage of Agents that people will orchestrate in daily life. These agents will be multiple, each serving roles like historian, skeptic, or guide, forming a cabinet of experts to tackle work, learning, and life choices. The speakers acknowledge a real moment in technology and imagine rapid change in five to ten years. They discuss democracy and journalism: deep concerns about the business model of journalism and the risk of misinformation, while recognizing AI can be a defensive tool and a catalyst for collective learning if agents include built-in fact-checking. They describe efforts to enhance empathy through AI, such as Speak Easy and the Pi agent from Inflection, aiming to guide conversations away from hostility and toward common ground before debates. They critique the idea of a single friendly voice and argue for a suite of agents to preserve human agency. On memory, privacy, and data use, they discuss recall features that could remember everything on a device, along with security and ownership concerns. The tradeoffs between utility and surveillance emerge clearly: memory could amplify productivity, but unauthorized access risks catastrophic harm.
Hoffman's perspective emphasizes governance and self-regulation, while Thompson reflects on the phone’s omnipresent data and the balance between convenience and risk. They mention the Earth Species Project translating animal communication and ponder a future where AI translates languages beyond humans, then return to a hopeful note: if conditions align, AI could widen equality and strengthen democracy.

Tucker Carlson

How to Stop the Government From Spying on You, Explained by a Digital Privacy Expert
reSee.it Podcast Summary
Yannick Schrade discusses privacy as a fundamental aspect of freedom, describing encryption as a built‑in asymmetry in the universe that keeps secrets safe even under immense coercion. The conversation centers on making computations private as well as data, proposing architectures that allow multiple parties to compute over encrypted inputs without revealing them. Yannick explains his background, his European experience with data protection laws, and the founding of Arcium to push private, scalable computing. He contrasts end‑to‑end encryption with the broader threat of device and platform compromises, emphasizing that the security of a message is limited by the security of the end devices and the supply chain. The talk then covers practical privacy measures, such as open‑source tools like Signal, hardware trust models, and the idea of distributing trust across many devices to avoid single points of failure. They examine the limitations of current consumer devices, the risk of backdoors, and the need for legal and technical frameworks to prevent blanket surveillance, including objections to backdoors and “client‑side scanning” proposals in the EU and effectively mandatory surveillance regimes. The discussion expands to the tension between private cryptography and state power, noting Snowden’s revelations about backdoored standards and the global cryptography ecosystem where cryptographers and independent researchers help identify weaknesses, even when governments push standardization. They explore the consequences of surveillance for finance, money flows, and the blockchain ecosystem, explaining pseudonymity in Bitcoin and the privacy shortcomings of public ledgers, as well as the potential for private, verifiable computations that preserve data ownership while enabling secure healthcare analytics and national security applications.
The hosts and Yannick debate the inevitability of privacy‑preserving technology, the real risks of centralized control, and the possibility of a more decentralized, verifiable, privacy‑enhanced future. The conversation closes with reflections on who should own and regulate such technologies, the role of investors in privacy‑centric ventures, and a forward-looking optimism about a utopian direction if privacy tech can clearly demonstrate superior utility and safety.

The Megyn Kelly Show

Left Falsely Blames Right For House Fire & Data Privacy Issues, w/ Lowry & Cooke, Erik Prince & Weil
Guests: Lowry, Cooke, Erik Prince, Weil
reSee.it Podcast Summary
An explosive thread of political blame unfurls after a South Carolina circuit judge’s home catches fire. Diane Goodstein had recently blocked the release of voter files to the DOJ amid a Trump-backed effort to curb non-citizen registration. The blaze on a water-framed property injured her husband, Arie, and possibly others; he was airlifted with multiple fractures. Authorities later said there was no evidence the fire was intentionally set. The episode becomes the centerpiece as Dan Goldman accuses Trump-era figures of doxxing judges and stoking violence, a claim debated by the panel. Media and political reactions unfold in real time. Goldman’s tweet linking the fire to 'MAGA' supporters is challenged by Rich Lowry and Charles Cooke, who warn against rushing to conclusions. Neera Tanden retweets commentary tying previous criticism of officials to the blaze, while outlets such as People and Newsweek frame the incident as a Trump-opposition story. The hosts argue there’s a pattern of one-sided coverage and call for restraint, noting killings linked to political violence on both sides while criticizing how left-leaning voices frame events for political gain. Attention shifts to Virginia, where Jay Jones’s text exchanges reveal a willingness to see opponents die for policy ends. The messages include references to shooting and 'two bullets in the brain,' followed by denials that minimize the episode, while a local investigation corroborates past controversial remarks about policing. The panel stresses such a worldview would be disqualifying for a top law officer, and notes that Democratic leaders have not uniformly called for his resignation, contrasting the reaction to similar episodes in other races. The discussion highlights concerns about accountability and the language of political violence. On privacy and power, the interview with Erik Prince and Joe Weil centers on surveillance capitalism and the limits of data collection.
They describe how apps continually transmit location and behavior to data harvesters, arguing the current phone ecosystem leaves citizens exposed to advertising networks and potential government access. Their privacy-focused UP Phone is presented as an alternative with encryption, a data-only SIM, and a hard-wipe function. The discussion emphasizes that while such devices reduce exposure, total privacy remains complicated by telecom infrastructure and legal frameworks.

Coldfusion

Apple vs Facebook - The Great Privacy Fight
reSee.it Podcast Summary
In the early days of the internet, possibilities seemed endless, but corporate monopolies now exploit user data for profit. Apple has introduced features in iOS 14 and 14.5 that enhance user privacy by allowing users to see what data apps collect and to opt out of tracking. This directly challenges Facebook's business model, which relies on targeted advertising. Zuckerberg has expressed concern over potential impacts on small businesses and profitability. Apple's moves could set trends in user privacy, but the long-term effects on the internet remain uncertain.

The Diary of a CEO

Top CIA Security Advisor: Jeffrey Epstein Was A Made Up Person & They Can See Your Messages!
Guests: Gavin de Becker
reSee.it Podcast Summary
The episode features a candid conversation with Gavin de Becker about high‑stakes security work, global power dynamics, and the fragility of privacy in the digital age. Gavin describes the core mission of his company as anti‑assassination, detailing threat assessment, protective coverage, and risk management for some of the world’s most influential figures. He argues that modern smartphones are endlessly vulnerable to state and nonstate actors, explaining that even with frequent software updates, no solution can guarantee confidentiality as long as powerful actors pursue access. The discussion expands beyond personal safety to consider how intelligence and blackmail can shape public behavior, influence decisions, and quietly steer politics and finance. Throughout, the host steers the conversation toward how individuals can navigate a world where information is contested, sources are questioned, and truth is often filtered or redacted. The dialogue weaves in firsthand anecdotes about famous clients and notable incidents, including allegations of intimate leverage used to control public figures, and it interrogates how media coverage—whether about Epstein, Bezos, or other luminaries—can be weaponized to create narratives that endure beyond the facts. The guests touch on the ethics and responsibilities of public life, noting that truth often competes with national security claims, and they discuss why transparency about complex, sensitive events remains controversial. The conversation then broadens to philosophical questions about reality in the age of AI: how technologies can blur lines between genuine experience and simulated content, and why intuition and human connection remain crucial for safety, trust, and meaningful interaction. 
As the hosts and guest explore personal stories—childhood, resilience, and the drive to serve others—they frame a pragmatic set of lessons: listen to intuition, act with integrity, and allow goals to unfold downstream rather than forcing rigid outcomes. The episode closes with reflections on small‑scale governance, subsidiarity, and the enduring value of authentic human contact in a world of rapid technological change.

Generative Now

Scott Belsky: Content Creators, Creativity, and Marketing in the AI Landscape
Guests: Scott Belsky
reSee.it Podcast Summary
Generative AI is not merely a tool for tweaking images or drafting copy; Scott Belsky explains how it reshapes creativity, marketing, and the very economics of content. In a conversation recorded after the Robin Hood AI Summit, he and the host unpack how AI shifts who can create, what counts as originality, and whether the flood of automated output will drown or elevate human ideas. The discussion repeatedly returns to tensions between democratization and rising expectations. Creatives find that novelty often leads to utility, using AI for mood boards, then discovering commercial possibilities. Belsky argues that the real challenge is whether AI democratizes or commoditizes creativity, and how surface area of exploration shapes outcomes. As brands flood social feeds with automatically generated variants, the demand for authentic, emotionally resonant work rises, making the creator's ability to tell a distinctive story more valuable than ever. On platforms and governance, the conversation shifts to regulation, licensing, and the provenance of models. Adobe argues that outputs should carry credentials indicating training data sources, and that brands will prefer models trained on licensed content for commercial work. The company points to Adobe Stock as an example of licensed training, and suggests a future where assets carry verifiable model-origin metadata to enable trust and compliance. Beyond compliance, the dialogue explores personal agents and the next wave of AI helpers. On-device, privacy-preserving agents could manage communications, shopping, and routines while surfacing safer choices and warnings. The vision extends to small businesses benefiting from AI-assisted decision making, allowing a five-person team to reach revenue levels once reserved for larger firms. The optimism rests on human ingenuity unlocking higher-order work as lower-order tasks become automated.

Generative Now

Josh Mohrer: Is the Future of AI Businesses A Solo Pursuit?
Guests: Josh Mohrer
reSee.it Podcast Summary
Wave started as a simple idea: record long meetings, doctor visits, or any conversation and return a concise, accurate summary. Josh Mohrer, who built Uber’s New York operations and later ran Lot18 and partnerships at The Infatuation, built Wave as a solo founder, powered by AI. Based in New York, he emphasizes that the company is essentially one person, with contractors and a small team, and that his background in e-commerce, marketing, and operations shaped how he approached product, growth, and customer support. He recounts how he left Levels Health to re-enter operational work, learned modern tooling such as Retool and React Native, and pivoted toward building an app that could transcribe and summarize audio. He recalls testing with his dad, a doctor, who found the summaries highly accurate and useful, and the early prototype evolved over 18 months into a mobile-first product capable of recording multi-hour sessions in the background. He notes that ChatGPT-era access to coding help accelerated progress but required learning servers and workflows. Though long the sole engineer, he eventually hired one engineer to rebuild the app in Swift for better Apple performance, while he continues to handle support personally to maintain high-signal feedback. Wave’s growth appears to be user-driven: about 7,000 hours of usage per day on weekdays, 2,000 on weekends, and a majority of users applying the tool to work contexts. He frames himself as a cybernetic shopkeeper selling AI, embracing the constraints of solo operation and valuing ownership, cash flow, and the potential for a future sale or larger venture. On the technology front, he argues that AI acts as an amplifier, transforming how engineers write code and how products are integrated.
He discusses the shift from SDK abstractions to direct API calls in an AI-enabled world and shares how he uses AI to power internal tools, support workflows, and even privacy and security considerations, including plans for SOC 2 compliance and data storage on Google Cloud. He remains optimistic about consumer AI adoption while noting that truly agentic personal assistants for everyday life may be farther out than some hype suggests.