NVIDIA just named their most powerful AI chip after a physicist who died in 1988, and the reason why tells you everything about where artificial intelligence is actually headed.

Richard Feynman never owned a computer. Never sent an email. Never saw the internet. Yet in 2025, Jensen Huang stood in front of 25,000 engineers at GTC and put Feynman's name on the chip designed to power the next era of AI reasoning, set to ship in 2028. That doesn't happen by accident.

In this video, we go deep on the untold story most people completely miss: Feynman didn't just inspire this chip from a distance. He physically built the ancestor of everything NVIDIA makes today, soldering circuits, debugging routers, and hand-optimizing neural network code in 1983, four decades before ChatGPT existed.

We cover:
→ Why NVIDIA's chip naming is never random, and what "Feynman" specifically signals
→ The Connection Machine, Thinking Machines, and the parallel computing breakthrough that changed everything
→ Why current AI is the world's greatest pattern-matcher but still can't truly reason
→ What Jensen Huang meant when he said AI now needs 100x more compute just for reasoning
→ The seven words Feynman left on his Caltech chalkboard that a $3 trillion company is now trying to live up to

This isn't a history lesson. It's a roadmap for understanding where AI is going, and why the smartest people in tech keep coming back to a man who never touched a GPU. If you've ever wondered why reasoning AI is the next frontier, this video connects every dot.
0:00 Why NVIDIA named their chip after Feynman
0:40 Jensen Huang's GTC 2025 announcement
1:30 NVIDIA GPU naming history
2:55 Danny Hillis and the 1983 lunch
3:54 Feynman joins Thinking Machines
4:55 Router problem: 7 vs 5 buffers
5:57 The Connection Machine's success
6:29 Feynman and neural networks
8:28 Quantum computing idea: 1981 MIT lecture
10:40 Pattern matching vs real reasoning
12:31 100x more computation for reasoning
14:31 "What I cannot create, I do not understand"

#NVIDIA #ArtificialIntelligence #RichardFeynman