Three Labs Just Stole Claude's Brain. Here's What It Broke (And Why You Should Care)
My site: https://natebjones.com
Full Story w/ Prompts: https://natesnewsletter.substack.com/...

What's really happening when three Chinese labs run 16 million automated conversations across 24,000 fake accounts to steal Claude's capabilities? The common story is Cold War espionage, but the reality is more interesting once you recognize this as a Napster problem: the thousand-to-one economics of extraction apply to everyone on earth.

In this video, I share the inside scoop on why distillation changes how you should evaluate every AI tool you're using:
• Why $2 million in API costs can extract capabilities that cost $2 billion to develop
• How distilled models occupy narrower capability manifolds that break on agentic work
• What the "off-manifold probe" reveals that no benchmark captures
• Where the performance shadow between frontier and distilled models is widest

For anyone building real systems on AI, the provenance of a model is not just an ethical question. It's a capability question: where the weights come from determines how the model breaks.

Chapters
00:00 Anthropic Caught Three Labs Stealing Claude's Brain
02:34 Distilled Models Are Systematically Worse in Unmeasured Ways
04:23 The Cold War Framing Is Incomplete
06:56 What Distillation Actually Does to Intelligence
10:27 The Brittleness Problem: Narrower Manifolds
13:21 Why Kimi K2 Breaks on Sustained Agentic Work
16:30 A Framework: Task Scope vs Model Provenance
20:50 The Thousand-to-One Economics of Extraction
24:51 Hydra Networks and Operational Sophistication
27:15 Speed Bumps and the Time Edge
28:40 The Universal Incentive to Distill
31:35 Talent Acquisition Operates on the Same Principle
33:46 What This Means for the Tools You're Using Now
35:23 The Off-Manifold Probe: Testing for Generality

Subscribe for daily AI strategy and news. For deeper playbooks and analysis: https://natesnewsletter.substack.com/