Reactive Alignment of Virtual and Physical Environments Using Redirected Walking [conference paper]

Conference

IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops – March 22-26, 2020

Authors

Jerald Thomas (Ph.D. student), Evan Suma Rosenberg (assistant professor)

Abstract

Virtual reality applications can provide richer and more immersive experiences by incorporating physical interactions such as passive haptic feedback. These interactions require that the coordinate mapping between the physical and virtual environments remains fixed. However, a static relationship is not maintained by many common locomotion techniques, including redirected walking, resulting in a state of misalignment. In this work, we address this limitation by proposing a novel reactive algorithm that uses redirected walking techniques to transition the system from a misaligned state to an aligned state, thereby enabling the user to interact with the physical environment. Traditionally, redirected walking algorithms primarily optimize for avoiding collisions with the boundaries of the physical space, whereas the proposed method leverages redirection techniques to achieve a desired system configuration. Simulation-based experiments demonstrate an effective use of this strategy when combined with redirected walking using artificial potential functions. In the future, reactive environment alignment can enhance the interactivity of virtual reality applications and inform new research directions that combine redirected walking and passive haptics.
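The abstract describes steering the user with artificial potential functions: repulsion away from physical boundaries combined with attraction toward a desired aligned configuration. The sketch below is purely illustrative of that general idea; the function name, force model, and constants are assumptions for exposition, not the paper's implementation.

```python
import math

def apf_steer_direction(user_pos, boundary_points, align_target,
                        k_rep=1.0, k_att=1.0):
    """Illustrative APF step: sum a repulsive force from sampled physical
    boundary points and an attractive force toward the alignment target,
    then return the resulting steering heading in radians.
    (Hypothetical sketch; not the authors' algorithm.)"""
    fx = fy = 0.0
    for bx, by in boundary_points:
        dx, dy = user_pos[0] - bx, user_pos[1] - by
        d = math.hypot(dx, dy)
        if d > 1e-6:
            # Repulsion decays with distance from each boundary sample.
            fx += k_rep * dx / (d ** 3)
            fy += k_rep * dy / (d ** 3)
    # Attraction pulls the user toward the aligned configuration.
    fx += k_att * (align_target[0] - user_pos[0])
    fy += k_att * (align_target[1] - user_pos[1])
    return math.atan2(fy, fx)
```

In a redirected walking controller, a heading like this would typically be realized indirectly, by applying imperceptible rotation and translation gains rather than steering the user directly.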

Link to full paper

Reactive Alignment of Virtual and Physical Environments Using Redirected Walking

Keywords

virtual reality, human computer interaction (HCI)
