All Tech Is Human’s livestream brought together experts from All Tech Is Human and the Center for News, Technology & Innovation (CNTI) to explore the rapidly growing world of AI chatbots and emotionally responsive AI companions. Moderated by David Ryan Polgar, the conversation highlighted how these systems are increasingly shaping the way people seek information, emotional support, and even relational connection. The panel emphasized that while “chatbots” and “companions” are sometimes used interchangeably, the distinction matters: companions are often designed or experienced as emotionally engaging, memory-enabled, and personalized in ways that can blur the line between tool and relationship.

CNTI’s Jay Barchas-Lichtenstein and Prabhat Mishra shared findings from their qualitative research in the U.S. and India, showing that people are not fully replacing traditional news or search sources with chatbots, but rather adding them as “one more tool in the toolbox.” Users valued the speed, clarity, customization, and conversational interface of chatbots, especially for making sense of overwhelming amounts of information. However, the panel raised concerns that users may perceive chatbots as neutral or “averaged” sources and pay less attention to where information originates, which can create risks around trust, bias, and misinformation.

Leah Ferentinos presented All Tech Is Human’s report focused specifically on AI companions: systems designed or used for emotional and social interaction. The discussion underscored both the appeal and the dangers of these tools: users are drawn to nonjudgmental, always-available support, but survey respondents warned about emotional overdependence, manipulation, privacy exploitation, and the erosion of human agency. Particular concern was raised about youth adoption, with the panel stressing that early design and governance decisions are critical before harmful dynamics become entrenched.

Across the conversation, speakers agreed that the societal impact of AI companions is not predetermined but will be shaped by incentives, design choices, accountability structures, and guardrails. Rather than relying solely on digital literacy, the panel called for stronger governance, transparency, privacy-by-default practices, and design standards that prevent emotional substitution or closed ecosystems. The livestream concluded with a shared message: AI companions may offer benefits, but only if developed as tools that strengthen, not replace, human connection and autonomy.

==

All Tech Is Human is a non-profit building a whole-of-ecosystem approach to tackling thorny tech & society issues. We are multistakeholder and multidisciplinary, surfacing important values, tensions, tradeoffs, and best practices as we work towards a better tech future. Learn more at AllTechIsHuman.org