Assistant Professor Yuzhe Yang presents SensorLM, a new family of sensor-language foundation models that bridges the gap between sensor signals and human language. Yuzhe discusses the hierarchical pipeline for automatic sensor-text data annotation, the pre-training methods, and the model's zero-shot and few-shot learning, cross-modal retrieval, and caption generation capabilities. He also explores how SensorLM translates complex sensor signals into meaningful insights, opening the door to a new generation of personalized health and wellness applications.

Yuzhe Yang is a Visiting Faculty Researcher at Google and an Assistant Professor at UCLA. He received his Ph.D. in Computer Science at MIT. His research interests include advancing machine learning and artificial intelligence to drive innovations in healthcare, medicine, and scientific discovery. His research has been published in Nature Medicine, Science Translational Medicine, NeurIPS, ICML, and ICLR, and has been featured in media outlets such as WSJ, Forbes, and BBC. He is a recipient of the Rising Stars in AI, the Rising Stars in Data Science, the AMIA Doctoral Dissertation Award Honorable Mention, and Forbes 30 Under 30.

Learn more: https://research.google/blog/sensorlm...

Read the paper: https://arxiv.org/abs/2506.09108

Join the GDG AI for Science community for talks, events, workshops, collaborations, and more: https://gdg.community.dev/gdg-ai-for-...