Geometric Scale Estimation using Multisensory Cues in Virtual Reality

Course: [GCT573] Cognition and Emotion (KAIST)
Role: Project Leader & Developer
Stack: C++, Unity, VR Interaction, Haptic Rendering Pipeline

Abstract

We present a study investigating how multisensory integration influences three-dimensional object scale estimation in virtual reality (VR). Accurate scale perception remains a critical yet underexplored component of spatial cognition in VR, particularly along the vertical axis, which has received far less attention than horizontal depth. Although multisensory feedback has demonstrated clear advantages in real-world settings, the mechanism through which visual and haptic cues interact to support scale estimation in VR is not fully understood. In this study, we examine perceptual accuracy using reference and comparison cubes at three scale levels, placed at systematically varied horizontal depths and vertical heights, under three stimulus conditions: visual-only, haptic-only, and combined visual-haptic feedback. We analyzed the main and interaction effects of these factors on estimation accuracy, as well as tendencies toward over- and underestimation. Results showed that spatial position and sensory condition had a greater influence on scale perception than the object's actual scale, with inconsistent estimates at vertical positions and improved accuracy under multisensory cues at close range. Our findings contribute to the theoretical understanding of cross-modal perception and offer practical implications for designing perceptually robust VR interfaces.