Sign up to our newsletter 👉 https://absolutelyagentic.com/?modal=...

In 2023, StackOverflow was the largest programming knowledge base on Earth, built over 15 years by millions of developers. By 2025, question volume had collapsed by 78%, gutted by the very AI models trained on its content. It's not just StackOverflow: Chegg lost 99% of its stock value, and publishers lost a third of their search traffic. But the real crisis runs deeper. Researchers have discovered that when AI trains on AI-generated content, its outputs degrade rapidly in a process called model collapse. AI may be devouring the very ecosystem it needs to survive, and high-quality human data could be exhausted by 2028. This video explores what happens when the internet starts eating itself.

Chapters
00:00 - Intro
01:58 - The Substrate
04:54 - Model Collapse
08:00 - Ouroboros and the Casualties
12:25 - The Counter Case

Sources to Google
- Ilia Shumailov - AI Models Collapse When Trained on Recursively Generated Data (Nature, 2024)
- Rice University - Model Autophagy Disorder (MAD) Study
- Epoch AI - Will We Run Out of Data? Limits of LLM Scaling (2024)
- Emily Bender - On the Dangers of Stochastic Parrots
- Press Gazette - Publishers Lost a Third of Search Traffic to AI Summaries
- University of Maryland - Reliability of AI-Generated Text Detection Tools
- University of Waterloo - Growth of Synthetic Content on the Web

#AI #DeadInternetTheory #ModelCollapse #AITraining #ArtificialIntelligence #AbsolutelyAgentic
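The recursive-training feedback loop behind model collapse can be illustrated with a toy simulation. This is a minimal sketch, not the Nature paper's experiments: the "model" here is just a Gaussian fit (mean and standard deviation), and each generation is trained only on samples produced by the previous generation's model. Finite-sample estimation error compounds across generations, so the fitted distribution tends to drift and narrow, losing the tails of the original human data.

```python
import random
import statistics

random.seed(42)

def fit(data):
    """'Train' the toy model: estimate (mean, std) from samples."""
    return statistics.mean(data), statistics.stdev(data)

def sample(mean, std, n):
    """Generate synthetic data from the fitted model."""
    return [random.gauss(mean, std) for _ in range(n)]

# Generation 0: stand-in for real, human-made data.
data = sample(0.0, 1.0, 200)

stds = []
for generation in range(30):
    mean, std = fit(data)           # train on the current data
    stds.append(std)
    data = sample(mean, std, 200)   # next generation sees only model output

# Each generation's std is estimated from a finite synthetic sample,
# so the estimates random-walk away from the true value of 1.0 with a
# slight downward bias; over many generations the distribution tends
# to collapse toward a narrow slice of the original.
print(f"std at generation 1:  {stds[0]:.3f}")
print(f"std at generation 30: {stds[-1]:.3f}")
```

The same dynamic, with richer models (generative image and language models), is what the Shumailov and Rice University studies in the sources describe: without fresh human data in the loop, each generation amplifies the previous one's errors.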