OpenAI Just Lost Another Scientist—Warns AGI May Be Humanity’s Biggest Mistake

Is OpenAI pushing humanity toward disaster? Another top scientist just quit, warning that AGI (Artificial General Intelligence) could be our biggest mistake. Steven Adler, a former safety researcher at OpenAI, is the latest expert to walk away, saying the company is moving too fast without a clear plan to align AGI with human values. And he's not alone: nearly half of OpenAI's AGI safety team has resigned in just a few months. Their message? OpenAI isn't doing enough to prevent AI from spiraling out of control.

But why is OpenAI racing toward AGI despite these warnings? CEO Sam Altman has made it clear: the goal is to build AGI as fast as possible, no matter the risks. Meanwhile, AI pioneers like Geoffrey Hinton and Elon Musk are sounding the alarm, comparing AI's dangers to nuclear weapons.

The biggest issue? Alignment. Experts fear that OpenAI doesn't know how to control AGI, and might not even be close to figuring it out. Former OpenAI scientist Paul Christiano warns there's a 10-20% chance of AI taking over in ways we won't even recognize. Some, like Daniel Kokotajlo, put the odds even higher, at 70%.

With governments struggling to regulate AI and tech giants locked in an arms race, are we already past the point of no return? Could AGI be the end of human control over intelligence? Should we stop AGI development before it's too late? Or is the future already out of our hands? Let us know your thoughts in the comments.

Searchable questions: Why are OpenAI scientists quitting? Is AGI dangerous? Will AI take over the world? What is OpenAI's AGI plan? Can AI be controlled? Should we stop AGI development? What happens if AGI goes wrong?

#agi #ai #ainews

====================================

🌟 For sponsorship inquiries contact us at: [email protected]