Comparison of Audio and Visual Cues to Support Remote Guidance in Immersive Environments [conference paper]

Conference

International Conference on Artificial Reality and Telexistence & Eurographics Symposium on Virtual Environments - December 4, 2020

Authors

Fei Wu (Ph.D. student), Jerald Thomas (Ph.D. student), Shreyas Chinnola, Evan Suma Rosenberg (assistant professor)

Abstract

Collaborative virtual environments enable collocated and remote participants to communicate and share information with each other. For example, immersive technologies can be used to facilitate collaborative guidance during navigation of an unfamiliar environment. However, the design space of 3D user interfaces for supporting collaborative guidance tasks, along with the advantages and disadvantages of different immersive communication modalities for these tasks, is not well understood. In this paper, we investigate three different methods for providing assistance (visual-only, audio-only, and combined audio/visual cues) using an asymmetric collaborative guidance task. We developed a novel experimental design and virtual reality scenario to evaluate task performance during navigation of a complex and dynamic environment while simultaneously avoiding observation by patrolling sentries. Two experiments were conducted: a dyadic study at a large public event and a controlled lab study using a confederate. Combined audio/visual guidance cues were rated easier to use and more effectively facilitated the avoidance of sentries compared with the audio-only condition. The presented work has the potential to inform the design of future experiments and applications that involve communication modalities to support collaborative guidance tasks with immersive technologies.
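
As a rough illustration of the three assistance conditions named in the abstract, the sketch below shows how an experiment controller might route a remote guide's instruction to visual and/or audio channels depending on the assigned condition. The GuidanceModality enum, the deliver_guidance_cue function, and the placeholder cue strings are hypothetical and are not taken from the paper's implementation.

from enum import Enum, auto


class GuidanceModality(Enum):
    # Hypothetical labels for the three cue conditions described in the abstract.
    VISUAL_ONLY = auto()
    AUDIO_ONLY = auto()
    AUDIO_VISUAL = auto()


def deliver_guidance_cue(modality: GuidanceModality, direction: str) -> list[str]:
    # Return the cue channels a guide's instruction would be sent over.
    # `direction` stands in for whatever guidance payload (e.g. "turn left")
    # the remote guide produces; the channel descriptions are placeholders.
    channels = []
    if modality in (GuidanceModality.VISUAL_ONLY, GuidanceModality.AUDIO_VISUAL):
        channels.append(f"visual cue: arrow pointing {direction}")
    if modality in (GuidanceModality.AUDIO_ONLY, GuidanceModality.AUDIO_VISUAL):
        channels.append(f"audio cue: spoken instruction '{direction}'")
    return channels


if __name__ == "__main__":
    for condition in GuidanceModality:
        print(condition.name, "->", deliver_guidance_cue(condition, "left"))

Running the script prints, for each condition, which cue channels an instruction such as "left" would be delivered over, mirroring the visual-only, audio-only, and combined audio/visual manipulation at a purely schematic level.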

Link to full paper

Comparison of Audio and Visual Cues to Support Remote Guidance in Immersive Environments

Keywords

immersive technology, virtual reality