Distributed Training & The "Nation-State" Data Center Problem | Steffen Cruz @ NeurIPS 2025

Are we reaching the limits of centralized compute? In this interview from NeurIPS 2025, Steffen Cruz (Founder, @MacrocosmosAI) argues that the future of AI isn't in data centers the size of small countries—it's in distributed, decentralized training.

Steffen discusses the engineering challenges of training models over the open internet, the inspiration he drew from Bittensor, and why he believes the current "brute force" era of AI scaling is approaching a tipping point. He also shares his predictions for 2026 and explains why specialized, distributed models might soon go toe-to-toe with frontier models.

Timestamps:
0:00 Intro & Meeting Heroes (Richard Sutton)
0:58 What is Macrocosmos? (Distributed Training Explained)
1:36 The #1 Bottleneck: Internet Latency vs. Interconnects
2:25 How Bittensor Sparked the Idea
3:43 Predictions for Distributed Learning in 2026
4:28 Why Researchers Look at Him with "Shock and Horror"
5:34 Physics vs. AI: Why Current AI is "Scrappy" & "Brute Forced"
7:00 The "Nation-State" Data Center Problem

#NeurIPS2025 #ArtificialIntelligence #DistributedComputing #Bittensor #MachineLearning #Macrocosmos #OpenSourceAI