Boem, A., Andrioaia, O. C., Stefani, D., Micalizzi, A., & Turchet, L. (2025, October). A Decoupled VR and Real-time Audio System for Distributed Musical Collaboration. In 2025 IEEE 6th International Symposium on the Internet of Sounds (IS2) (pp. 1-10). IEEE. https://doi.org/10.1109/IS264627.2025...

Musical collaboration in virtual reality (VR) faces a fundamental technical challenge: achieving the ultra-low latency required for ensemble synchronization while maintaining the rich spatial interactions characteristic of Shared Virtual Environments. This paper presents a proof-of-concept of a hybrid decoupled architecture that addresses the competing requirements of real-time graphics, interaction, and audio through strategic layer separation. Our system combines dedicated real-time audio hardware for networked music performance (Elk LIVE) with consumer head-mounted displays running a custom social VR application built with the Ubiq framework; the two subsystems are connected through a lightweight control-messaging infrastructure. We evaluated the system by deploying it with 12 musicians across three distributed groups, measuring technical performance metrics and assessing user experience through standardized questionnaires. Results show that the audio subsystem maintained consistent latency with minimal packet loss, while VR-layer performance varied significantly. Participants reported moderate levels of social presence and creativity support, with evidence suggesting that audio consistency enables musical focus even when visual performance degrades. Our findings indicate that decoupled architectures may resolve the tension between musical precision and VR immersion requirements, providing design principles for next-generation Musical Metaverse systems that prioritize temporal consistency over absolute performance optimization.

This work has been supported by the MUSMET project (https://musmet.eu/), funded by the EIC Pathfinder Open scheme of the European Commission (grant agreement no. 101184379). Views and opinions expressed are those of the author(s) only and do not necessarily reflect those of the European Union or the European Innovation Council. Neither the European Union nor the European Innovation Council can be held responsible for them.
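The abstract's key architectural idea is that the two subsystems exchange only lightweight control messages, while audio stays on a dedicated real-time path. As a minimal sketch of what such decoupling can look like, assuming a JSON-over-UDP control channel (the paper does not specify the wire format, and the endpoint address, event names, and parameters below are hypothetical):

```python
import json
import socket

# Illustrative sketch of a "lightweight control messaging" layer linking a
# VR subsystem to a real-time audio subsystem. The wire format, address,
# and parameter names are assumptions, not the authors' actual protocol.

AUDIO_CONTROL_ADDR = ("192.168.1.50", 9000)  # hypothetical audio-engine control endpoint


def send_control(sock: socket.socket, event: str, payload: dict) -> None:
    """Send a small, fire-and-forget control datagram.

    Audio samples never travel this path: they stay on the dedicated
    low-latency link, so a dropped or late control message degrades the
    visuals at worst, never the ensemble timing.
    """
    message = json.dumps({"event": event, "payload": payload}).encode("utf-8")
    sock.sendto(message, AUDIO_CONTROL_ADDR)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Example: a VR avatar turns toward another player, so the VR layer asks
    # the audio layer to adjust that player's spatial mix (names assumed).
    send_control(sock, "mix.pan", {"player": "violin-2", "azimuth_deg": 35.0})
    send_control(sock, "mix.gain", {"player": "violin-2", "gain_db": -3.0})
    sock.close()
```

Keeping the control channel connectionless and stateless illustrates the design principle the abstract names: the audio path never blocks on the VR layer, so temporal consistency is preserved even when the visual subsystem underperforms.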