Computers and Graphics (Pergamon), vol. 118, pp. 23-32, 2024 (SCI-Expanded)
Despite remarkable advances in virtual reality (VR) technologies, serious challenges remain in making extended VR sessions with head-mounted displays (HMDs) comfortable. 3D stereo imagery can cause discomfort and eye fatigue when poor stereo camera settings produce extreme disparities and vergence-accommodation conflicts. To circumvent these issues, the default stereoscopic parameters of consumer HMDs produce images with shallow depth. In this work, we propose a methodology that uses the gaze-directed and visual saliency-guided paradigms for automatic stereo camera control in real-time interactive VR, building on the basics of stereo grading. We evaluate the two approaches at different levels of interaction, first through a user study and then through a performance benchmark. The results show that the gaze-directed approach outperforms the saliency-guided approach in the virtual environments (VEs) tested, and that both methods convey a better overall sense of depth than the default HMD setting without hindering visual comfort. Both approaches also lead to a significant overall enhancement of the VR experience in the more interactive VE.
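As a rough illustration of the gaze-directed idea only (not the paper's actual implementation), the sketch below converges the stereo cameras on the currently fixated depth, smooths that depth to avoid jumps on every saccade, and scales the interaxial separation so that on-screen disparities stay within a comfort budget. All function names, the disparity model, and the numeric constants are illustrative assumptions.

```cpp
// Minimal sketch of gaze-directed stereo parameter control (assumed model).
#include <algorithm>
#include <cmath>
#include <cstdio>

struct StereoParams {
    float interaxial;   // camera separation in metres
    float convergence;  // distance of the zero-disparity plane in metres
};

// Exponential smoothing so the convergence plane does not jump with every saccade.
float smoothDepth(float previous, float target, float dt, float timeConstant = 0.25f) {
    float alpha = 1.0f - std::exp(-dt / timeConstant);
    return previous + alpha * (target - previous);
}

// Choose convergence and interaxial so that the screen-space disparity of the
// nearest and farthest visible points stays inside a comfort budget.
StereoParams gazeDirectedStereo(float gazeDepth, float nearDepth, float farDepth,
                                float screenWidth, float maxDisparityFrac,
                                float maxInteraxial = 0.065f) {
    StereoParams p;
    // Converge on the fixated depth: the attended object sits at the screen plane.
    p.convergence = std::clamp(gazeDepth, nearDepth, farDepth);

    // Simple off-axis disparity model: d(z) = interaxial * f * (1/convergence - 1/z).
    // Take the worst case over the visible depth range and solve for the interaxial.
    float focal  = 1.0f;  // normalised focal length (assumption)
    float worst  = std::max(std::fabs(1.0f / p.convergence - 1.0f / nearDepth),
                            std::fabs(1.0f / p.convergence - 1.0f / farDepth));
    float budget = maxDisparityFrac * screenWidth;   // allowed on-screen disparity
    float ia     = (worst > 0.0f) ? budget / (focal * worst) : maxInteraxial;
    p.interaxial = std::min(ia, maxInteraxial);      // never exceed typical eye separation
    return p;
}

int main() {
    float smoothedGaze = 2.0f;   // start converged at 2 m
    float rawGaze      = 0.8f;   // eye tracker reports a fixation at 0.8 m
    smoothedGaze = smoothDepth(smoothedGaze, rawGaze, /*dt=*/0.011f);
    StereoParams p = gazeDirectedStereo(smoothedGaze, 0.5f, 50.0f,
                                        /*screenWidth=*/1.0f, /*maxDisparityFrac=*/0.02f);
    std::printf("interaxial = %.3f m, convergence = %.2f m\n", p.interaxial, p.convergence);
    return 0;
}
```

A saliency-guided variant would replace the eye-tracked gaze depth with the depth of the most salient region predicted from the rendered frame; the parameter update itself could follow the same clamping and smoothing pattern.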