Why democratizing AI is absolutely crucial | Karen Palmer | Big Think

Why democratizing AI is absolutely crucial
Watch the newest video from Big Think: https://bigth.ink/NewVideo
Join Big Think Edge for exclusive videos: https://bigth.ink/Edge

----------------------------------------------------------------------------------

Implicit biases are feelings and ideas subconsciously attributed to a group or culture based on learned associations and experiences. Everyone has them, but it can be dangerous when those biases are transferred to a powerful technology like AI. By keeping the development of artificial intelligence private, we risk building systems that are intrinsically biased against certain groups. Governance and regulation are necessary to ensure that artificial intelligence remains as neutral as possible.

----------------------------------------------------------------------------------

KAREN PALMER

Karen Palmer is the Storyteller from the Future. She is an award-winning international artist and TED speaker. She creates immersive film experiences at the intersection of film, AI technology, gaming, immersive storytelling, neuroscience, consciousness, implicit bias, and the parkour philosophy of moving through fear. She is the creator of RIOT, an emotionally responsive film that uses facial recognition and AI technology to navigate through a dangerous riot.

----------------------------------------------------------------------------------

TRANSCRIPT:

One of the key themes in the subtext of the narratives I create is democratizing artificial intelligence and looking at the lack of AI governance and AI regulation, and the consequences and implications of that, which is what my Perception iO project reflects in the user experience.

This is a really big deal, girls and boys out there. This is a really big deal. There was something you may have heard of called email and the internet, which was with the government and the military for decades before it came to us, the people. There are serious consequences if we, the people, don't have access to, or are not involved in, the development of these really powerful forms of technology.

There's something called implicit bias, which basically means everybody has biases picked up from their upbringing, their livelihood, and their experiences in life. And if you're a developer or a designer, you tend to subconsciously program implicit bias into what you're building. The consequences of implicit bias in a technology like AI are basically catastrophic.

So I'm going to give an example. There's a system called COMPAS, which supports judges as they're sentencing a criminal. This is an AI system, and it has been shown to recommend longer sentences for people of color and Black people than it does for white people. There's also a similar system in the UK which has been shown to give longer sentences to working-class people. The artificial intelligence which supports the judges in these decisions is designed by private organizations and corporations. The data in this AI has no regulation governing it; basically, a commercial entity has created it, given it to judges, and it is affecting people's lives, for people of color and Black people for the worse, on a daily basis.

As a Black woman working in storytelling and technology, this type of conversation is very important to bring to the fore. For other developers and academics in this area, that's not a priority; they have other narratives that they want to bring. So to me, democratizing AI and creating regulation and governance is essential, and that's why, when you experience my stories and my immersive experiences, this is their context. Because for lots of people this is just too heavy. They want to watch The Voice, they want to watch X Factor. This is way too heavy shit. But if you're in an immersive experience and you're feeling it, feeling this emotion and seeing the consequences viscerally, maybe it can connect with you in your gut in a different way. So that's why I create experiences that bring these things to people in a way that is accessible to them, in a language and an experience which they understand.

----------------------------------------------------------------------------------

ABOUT BIG THINK:

Smarter Faster™
Big Think is the leading source of expert-driven, actionable, educational content. With thousands of videos featuring experts ranging from Bill Clinton to Bill Nye, we help you get smarter, faster. Subscribe to learn from top minds like these daily. Get actionable lessons from the world's greatest thinkers and doers. Our experts are either disrupting or leading their respective fields. We aim to help you explore the big ideas and core skills that define knowledge in the 21st century, so you can apply them to the questions and challenges in your own life.
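The COMPAS example in the transcript is, at bottom, a claim about measurable disparity: a risk-scoring system whose outcomes differ systematically by group. The talk describes no code or audit procedure, so the following is only a minimal, hypothetical Python sketch of how such a disparity can be surfaced, here by comparing false-positive rates of an imagined risk classifier across two groups. All records, group labels, and the 0.5 threshold are invented for illustration; this is neither Palmer's method nor the actual COMPAS model or audit.

```python
# Hypothetical sketch: checking a risk-score classifier for group disparity.
# All records, group names, and the threshold below are invented for illustration.

from collections import defaultdict

# Each record: (group, predicted_risk_score, actually_reoffended)
records = [
    ("group_a", 0.81, False), ("group_a", 0.62, False), ("group_a", 0.74, True),
    ("group_a", 0.45, False), ("group_a", 0.91, True),  ("group_a", 0.68, False),
    ("group_b", 0.32, False), ("group_b", 0.58, True),  ("group_b", 0.41, False),
    ("group_b", 0.27, False), ("group_b", 0.66, True),  ("group_b", 0.56, False),
]

THRESHOLD = 0.5  # scores above this are treated as "high risk"


def false_positive_rate(rows):
    """Share of people who did NOT reoffend but were still flagged high risk."""
    negatives = [r for r in rows if not r[2]]
    if not negatives:
        return 0.0
    flagged = [r for r in negatives if r[1] > THRESHOLD]
    return len(flagged) / len(negatives)


# Group the records, then compute the false-positive rate per group.
by_group = defaultdict(list)
for rec in records:
    by_group[rec[0]].append(rec)

rates = {group: false_positive_rate(rows) for group, rows in by_group.items()}
for group, rate in sorted(rates.items()):
    print(f"{group}: false positive rate = {rate:.2f}")

# A large gap between groups is the kind of disparity the transcript is
# pointing at: people who would not reoffend being flagged as high risk at
# very different rates depending on which group they belong to.
print(f"disparity (max - min) = {max(rates.values()) - min(rates.values()):.2f}")
```

A real audit would compare several metrics (false-positive rate, false-negative rate, calibration across groups), since they can disagree with one another; this sketch only illustrates the first and simplest of them.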
