In this second episode of _Slop Happens_, Bianca Prins and Claudio Luís Vera welcome *Maranke Wierenga*, advisor on data-driven work and member of the *NEN Commission on AI & Big Data*, where she also chairs the Working Group on Ethics and Fundamental Rights. Together, they explore curious cases where technology fails the very people it is meant to serve. First, they dive into the *COMPAS case in Florida*, where a risk-scoring system used in the criminal justice system showed troubling bias across racial groups. The discussion looks at how "neutral" algorithms can reinforce inequalities already present in society. They then examine a case from the *Netherlands*, where automated parking enforcement unintentionally creates barriers for people with disabilities. What happens when efficient systems overlook real people? And how can we design technology that is fairer and more accountable? Join us for this conversation—and let us know what you think.