Authors: Koorosh Vaziri (Ph.D. student), Maria Bondy (summer REU researcher), Amanda Bui (undergraduate research assistant), Victoria Interrante (professor)
Abstract: We report the results of an experiment that provides new insight into the extent to which, and the conditions under which, scene detail affects the accuracy of spatial perception in VR applications. Using a custom-built video-see-through (VST) HMD, participants judged distances in a real-world outdoor environment under three levels of visual detail reduction: raw camera view, Sobel-filtered camera view, and complete background subtraction, plus a control condition of unmediated real-world viewing. We found no significant difference in distance walked across the three VST conditions, despite significant differences in ratings of visual and experiential realism, suggesting that participants relied solely on the angular declination of the target, independent of scene context.