Multimodal Interaction For Virtual Reality Experience
Abstract: In recent years, Virtual Reality (VR) systems have emerged as a novel medium for artists to express their ideas within an immersive 3D Virtual Environment (VE). For instance, artists benefit from the ample virtual space; at the same time, they can magnify a specific area within the VE to make precise adjustments, giving them flexibility in building an art piece. Artists also benefit from novel input devices that provide feedback, enabling enhanced works; these devices can mimic or improve upon the tools artists already use. A third advantage is that artists can revert or undo errors, which may be difficult or impossible when working with physical materials. Moreover, in the immersive virtual space, artists can concentrate visually on their craft by mitigating external distractions. Because of these advantages, artists have chosen VR as a medium of expression. For example, Gio Nakpil creates intricate 3D sculptures in VR, demonstrating the transformative potential of VR in art. Another example is Collin Leix, who seamlessly integrates classical oil-painting techniques with contemporary technology in VR. These artistic VR tools are not meant to replace existing mediums but to open up new ways of communicating art. However, persistent usability challenges remain, particularly for emerging VR sculpting. For example, existing interaction techniques often fail to capture the materiality of sculpting, requiring unnatural hand use. This talk will present current challenges for multimodal interaction in VR art, recent qualitative findings by the authors and their colleagues, and a way forward intended to motivate researchers to help improve the quality of interaction, allowing artists to concentrate on their craft rather than the technology. We will also describe the design of our VR Sketch and Sculpt application for research experiments.

BIO: Francisco R.
Ortega is an Associate Professor at Colorado State University (CSU) and has been Director of the Natural User Interaction Lab (NUILAB) since Fall 2018. Dr. Ortega earned his Ph.D. in Computer Science (CS) with a focus on Human-Computer Interaction (HCI) and 3D User Interfaces (3DUI) from Florida International University (FIU), where he also held postdoctoral and visiting assistant professor positions between February 2015 and July 2018. His research focuses on (1) multimodal and unimodal interaction (gesture-centric), including gesture elicitation (a form of participatory design); (2) information access effort in augmented reality (e.g., visual cues and automation bias); (3) AR notifications; and (4) stress reduction using virtual reality forest bathing. In his multimodal interaction research, Dr. Ortega focuses on enhancing user interaction through (a) multimodal elicitation, (b) developing interaction techniques, and (c) refining augmented reality visualization techniques. The primary application domains include general environments, immersive analytics, and VR sketching. His research has produced over 90 peer-reviewed publications, including books, journal articles, conference and workshop papers, and magazine articles, in venues such as ACM CHI, ACM VRST, IEEE VR, IEEE TVCG, IEEE ISMAR, ACM PACMHCI, ACM ISS, ACM SUI, IEEE 3DUI, HFES, and Human Factors journals, among others. Dr. Ortega has experience with multiple government-funded projects. For example, he served as a co-Principal Investigator on the DARPA Communicating with Computers project, and he is the principal investigator (PI) of a three-year effort funded by ONR, titled "Perceptual/Cognitive Aspects of Augmented Reality: Experimental Research and a Computational Model." He was recently awarded a new ONR grant titled "Assessing Cognitive Load and Managing Extraneous Load to Optimize Training." The National Science Foundation and other agencies and companies have also funded his work.
This includes a 2023 NSF CAREER award for microgestures and multimodal interaction. Since his tenure-track appointment at CSU in August 2018, Dr. Ortega has brought in over $4.2 million in external funding ($3.7 million as principal investigator). His lab website is https://nuilab.org