🎙 Engineering Ethics Podcast – AI Liability Debate

In this academic listening exercise for first-year engineering students, we examine a critical question: Should engineers be legally liable for unintended harm caused by AI systems they design? This discussion presents structured arguments for liability and against strict liability, along with counter-arguments and rebuttals, using formal academic vocabulary appropriate for university-level study.

📚 Required Reading Resources
Please review the following academic sources before or after listening:
1️⃣ British Columbia Law Institute (2024), Report on Artificial Intelligence and Civil Liability
🔗 https://www.bcli.org/project/artifici...
2️⃣ Sayre & Glover (2024), “Machines Make Mistakes Too: Planning for AI Liability in Contracting”, Case Western Reserve Journal of Law, Technology & the Internet
🔗 https://scholarlycommons.law.case.edu...

🧠 Pre-Listening Critical Thinking Questions
Before listening, consider the following:
- What is the difference between ethical responsibility and legal liability?
- Should engineers be responsible for outcomes they did not intend?
- How should responsibility be distributed when multiple actors contribute to a technological system?
- Can strict liability discourage innovation?
- How predictable must a risk be before it becomes legally foreseeable?

📖 Vocabulary Glossary
- Civil liability – Legal responsibility for harm or damages under civil law.
- Strict liability – Legal responsibility without proof of negligence or fault.
- Negligence – Failure to exercise reasonable care.
- Foreseeability – The ability to reasonably predict potential consequences.
- Upstream actors – Designers, developers, or manufacturers involved before deployment.
- Deployer – The organization or individual who implements or uses a system.
- Distributed responsibility – Shared accountability among multiple contributors.
- Deterrence – Prevention of harmful behavior through legal consequences.
- Fault-based standard – A legal approach requiring proof of wrongdoing or carelessness.
- Emergent behavior – Unexpected system behavior arising from complex interactions.

🎓 Academic Skills Focus
This listening activity helps students develop:
- Academic listening comprehension
- Legal and ethical vocabulary
- Argument analysis
- Critical thinking
- Structured debate skills

For more academic English preparation, research writing support, and university success training, visit:
🌐 https://www.efuas.com – English for University Academic Success