What comes after Transformers? The Transformer architecture behind GPT has dominated AI for nearly a decade, but cracks are showing. Transformer-based models have no continual learning (they are frozen in time, like Groundhog Day), limited context windows, and compute costs that spiral as reasoning gets longer. What would it take for the next generation of frontier models to do long-horizon reasoning? To learn continuously? To generalize from experience? Join leading researchers from NYU Tandon and Pathway as they explore where AI is headed, from algorithmic foundations to emerging architectures challenging the Transformer's dominance.

👤 SPEAKERS
Martín Farach-Colton — Chair, CSE, NYU Tandon | ACM/IEEE/SIAM Fellow
Julian Togelius — Professor, NYU Tandon | IEEE Fellow, Head of AI at Nof1, Senior Program Committee Member, NeurIPS
Adrian Kosowski — CSO & Co-Founder, Pathway | PhD at 20, 100+ papers
Zuzanna Stamirowska (Moderator) — CEO & Co-Founder, Pathway | PhD in Complexity Science

📌 TOPICS COVERED
– Why Transformer-based models hit a wall: memory limits that prevent long-context reasoning, rapid loss of coherence, and a fundamental scaling wall despite massive investment
– Emerging paradigms: sparse activations, Hebbian plasticity, and brain-inspired architectures
– The path to continual learning and interpretable AI
– Live Q&A with frontier researchers from New York University and Pathway

Organized by lead researchers from NYU Tandon and Pathway, in collaboration with Technical Councils at select IITs.

🏷️ TAGS
#PostTransformer #AI #MachineLearning #NYU #Pathway #FutureOfAI #ContinualLearning #airesearch

00:00 Introduction to the Post-Transformer Era
00:17 Background and Key Predictions
01:07 Meet the Experts
05:25 Challenges with Current AI Models
10:12 The Future of AI: Beyond Transformers
19:11 Reasoning and Its Importance in AI
22:51 AI in Games: A Testbed for Reasoning
29:25 Bridging the Gap: Spatial vs. Language Reasoning
34:28 Understanding Transformer Architectures
36:40 The Role of Theoretical Computer Science in AI
40:32 Challenges in AI: Memory and Learning
50:53 Continual Learning and Its Complexities
57:53 Future Directions in AI and Robotics
01:09:34 Concluding Thoughts on AI Progress
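For viewers unfamiliar with one of the listed paradigms: Hebbian plasticity is the "cells that fire together wire together" learning rule. The sketch below is a minimal, illustrative NumPy implementation using Oja's normalized variant of the Hebbian update; it is not taken from the talk, and the function name and learning rate are our own choices.

```python
# Illustrative Hebbian learning sketch (Oja's rule), not from the talk.
import numpy as np

def hebbian_step(w, x, lr=0.1):
    """One Hebbian update for a single linear neuron.

    y = w.x is the post-synaptic activity; pure Hebb would do
    w += lr * y * x, which grows without bound, so Oja's rule
    subtracts a decay term to keep the weight norm bounded.
    """
    y = float(w @ x)
    return w + lr * y * (x - y * w)

# Repeatedly presenting the same input strengthens the weights
# along that input's direction.
w = np.array([0.5, 0.5])
x = np.array([1.0, 0.0])
for _ in range(100):
    w = hebbian_step(w, x)
# w converges toward the input direction with roughly unit norm
```

The contrast with Transformers is the point of the topic bullet: here, the weights themselves change with each experience, rather than being frozen after pretraining.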