What does it mean to build AI that actually works? Not just technically, but for the people it touches. In this episode of Building AI Boston, we sit down with Cansu Canca, philosopher, Founder and Director of the AI Ethics Lab, and Director of Responsible AI Practice at Northeastern University, to talk about why getting AI ethics right is less about adding a policy at the end and more about how you build from the start.

Cansu came to AI ethics through philosophy, public health, and law: fields where the stakes are high and the time to decide is short. That combination shaped her belief that ethical thinking is not a luxury or a slowdown. It is how you get to better technology.

We dig into:
• Why every AI system already reflects someone's values, whether you planned it or not
• What autonomy, fairness, and harm reduction actually look like when you move from principle to practice
• Her Puzzle-solving in Ethics (PiE) model for working through ethical questions in real time, when there is no perfect answer and no time to wait
• How bias in your model is often just a sign that your model is not doing its job
• Why academia is one of the few places left that can ask hard questions without a product to protect
• What responsible AI governance inside a university looks like when you are the builder, the buyer, and the classroom all at once

Cansu has advised the UN, Interpol, the World Health Organization, and the World Economic Forum. But this conversation stays grounded: in the practical, the urgent, and the question of who is actually in the room when these decisions get made.