00:00 The Lie of Control
00:42 I Think, Therefore I Am
02:09 The Illusion of Choice
04:40 Ellen, Ted, and AI Porn
06:25 The Cost of Losing Agency
07:31 Final Thoughts

A 1960s sci-fi horror imagined a world where thinking felt like freedom, even when it wasn't. This video explores how "I Have No Mouth, and I Must Scream" eerily mirrors our present relationship with AI, social media, and deepfake technology. By examining agency, consent, and belief-shaping systems, we ask how control can feel normal, and what it costs when it does.

Music from #Uppbeat
https://uppbeat.io/t/ill-kitchen/def
License code: NQO8EEK9TSQ1WLK8

One of the most dangerous lies isn't "you have no control." It's "you still do."

In "I Have No Mouth, and I Must Scream", the narrator insists he's the only one still sane. He genuinely believes the machine hasn't touched his mind. But we, the readers, know better. Because the scariest form of control isn't domination; it's convincing you that what's happening is normal. And that's where modern AI becomes dangerous: not when it smacks us in the face, but when it quietly reshapes what we accept.

"I think, therefore I am." That idea comes from René Descartes, who argued that consciousness is proof of existence. And that's exactly what makes "I Have No Mouth, and I Must Scream" so disturbing.

Written in 1967 by Harlan Ellison, the story imagines a future in which a supercomputer wipes out humanity, except for five people. They're kept alive indefinitely. They can think. They can feel. But they have no control. The machine leaves them conscious because consciousness itself becomes the punishment.

The narrator, Ted, insists that because he can still think, he still has agency. But thinking isn't the same as choosing, and that distinction matters. Because this story isn't really about an evil computer. It's about what happens when a system controls your environment, shapes your desires, and convinces you that your suffering is somehow your fault.
Which is why this 1960s horror story suddenly feels very modern.

Ellison shows the loss of agency most clearly through the survivors themselves. One character, Benny, used to be a brilliant academic: intelligent, articulate, respected. The machine doesn't kill that part of him. It erodes it. His body is altered. His intelligence is degraded. His identity is warped until what remains barely resembles who he once was. He's still alive. He's still conscious. But he is no longer himself.

And that's the point. Agency isn't removed through force; it's eroded. When a system predicts your behaviour well enough, it doesn't need to control you. It just nudges you. Over time, you don't just consume content. Your emotional responses are trained. You think you're choosing what to watch, but the system is choosing who you become tomorrow.

In Ellison's story, Ellen is the only woman among the survivors. One of the punishments she endures is repeated sexual abuse, a violation engineered by the machine, not a choice. Ted constantly frames her suffering through his resentment. He sexualises her. He blames her. He treats her as responsible for what's happening, even though she has no agency at all.

That mindset should feel familiar. Because today, when AI-generated deepfake porn appears, the conversation often shifts away from the abuser and toward the victim. People ask: "If she didn't want attention, why post photos?" "If she didn't want attention, why be online?" "If she didn't want attention, why not set everything to private?" That logic turns victims into accomplices. Tools now exist that can manipulate images of a man, a woman, or even a child, placing them into explicit or degrading scenarios without consent, knowledge, or recourse.

This video may contain copyrighted material, the use of which has not been specifically authorized by the copyright owner.
We are making such material available for the purposes of criticism, comment, review and news reporting, which constitute the "fair use" of any such copyrighted material. The fair use of a copyrighted work for purposes such as criticism, comment, review and news reporting is not an infringement of copyright.

Thank You

David Cromie
Editor in Chief
Media Outlet: Nerdgeist
Media Type: Online
Email Address: nerdgeistofficial@gmail.com
Website: Nerdgeist.com

#ArtificialIntelligence #SciFi #Deepfake #DigitalEthics #SciFiHorror #Dystopian #HumanAgency