In this episode of AI to AGI to ASI, we explore Dario Amodei’s essay “The Adolescence of Technology” — a thoughtful attempt to reframe how we understand the current phase of artificial intelligence development. Rather than portraying AI as either a miraculous breakthrough or an existential threat, Amodei proposes a more nuanced metaphor: AI is entering adolescence. It is no longer a fragile experiment, yet far from a mature, well-understood system. Like any adolescent force, it exhibits rapid growth in capability, uneven judgment, unpredictable behavior, and an expanding impact on the world around it.

This episode offers a measured interpretation and critical analysis of that framing. We examine why the adolescence metaphor is powerful — particularly in how it shifts the conversation away from hype and panic toward responsibility, institutional readiness, and long-term thinking. AI systems today can reason, generate content, influence decisions, and scale cognition in ways previously unimaginable, yet they are being deployed within social, legal, and governance structures that were never designed for such capabilities. The result is a widening gap between technological power and societal preparedness.

At the same time, this episode interrogates what the metaphor quietly assumes. Adolescence implies eventual maturity — but technological history offers no guarantee that all powerful systems grow into wisdom. Some plateau, some destabilize societies, and others entrench asymmetries that are never undone. The discussion explores whether framing AI as a developmental phase risks underestimating how competitive pressures, market incentives, and geopolitical rivalry can overwhelm even the best-intentioned safety cultures.

We also turn to what is less emphasized in the essay: power and concentration. Who controls advanced AI systems? Who sets their defaults? Who benefits most — and who absorbs the risk when systems fail? Adolescence, whether human or technological, is often the phase where power dynamics harden rather than soften. These questions are critical to understanding AI’s long-term trajectory, yet they sit largely in the background of mainstream discourse.

Crucially, this episode situates Amodei’s essay within the broader arc from AI to AGI to ASI. If we are indeed in an adolescent phase, then the norms, incentives, and institutional habits being formed right now will shape how more advanced systems behave in the future. The window for meaningful influence may be narrower than it appears — not because of any single breakthrough, but because governance, culture, and expectations tend to solidify faster than we realize.

This is not a rebuttal of Amodei’s argument, nor a celebration of it. It is an interpretation — one that treats the essay as a diagnostic rather than a solution. Essays can clarify moments in history, but they cannot resolve the structural forces that define outcomes.

The episode concludes with a central question that remains open: Do our institutions have the capacity to guide this technology toward maturity — or will they be reshaped by it instead? Adolescence is brief. What comes next is not automatic.