Luidolt, L., Wimmer, M., & Krösl, K. (2020). Gaze-Dependent Simulation of Light Perception in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics, 26(12), 3557–3567. https://doi.org/10.1109/tvcg.2020.3023604
Journal:
IEEE Transactions on Visualization and Computer Graphics
ISSN:
1077-2626
Date (published):
2020
Number of Pages:
11
Peer reviewed:
Yes
Keywords:
Software; virtual reality; perception; Computer Graphics and Computer-Aided Design; user studies; Computer Vision and Pattern Recognition; Signal Processing
Abstract:
The perception of light is inherently different inside a virtual reality (VR) or augmented reality (AR) simulation when compared to the real world. Conventional head-worn displays (HWDs) are not able to display the same high dynamic range of brightness and color as the human eye can perceive in the real world. To mimic the perception of real-world scenes in virtual scenes, it is crucial to reproduce the effects of incident light on the human visual system. In order to advance virtual simulations towards perceptual realism, we present an eye-tracked VR/AR simulation comprising effects for gaze-dependent temporal eye adaptation, perceptual glare, visual acuity reduction, and scotopic color vision. Our simulation is based on medical expert knowledge and medical studies of the healthy human eye. We conducted the first user study comparing the perception of light in a real-world low-light scene to a VR simulation. Our results show that the proposed combination of simulated visual effects is well received by users and also indicate that an individual adaptation is necessary, because the perception of light is highly subjective.
Research Areas:
Visual Computing and Human-Centered Technology: 100%
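As an illustrative aside to the abstract above: the gaze-dependent temporal eye adaptation mentioned there can be sketched as an exponential smoothing of the viewer's adaptation luminance toward the luminance sampled around the current gaze point, paired with an exposure factor for tone mapping. This is a minimal sketch under assumed names, time constants, and key value; it is not the paper's implementation.

```python
import math

def update_adaptation(adapt_lum: float, gaze_lum: float, dt: float,
                      tau_bright: float = 0.1, tau_dark: float = 20.0) -> float:
    """Advance the adaptation luminance by one frame (illustrative sketch).

    adapt_lum : current adaptation luminance (cd/m^2)
    gaze_lum  : average luminance in the gaze region this frame (cd/m^2)
    dt        : frame time in seconds
    tau_*     : assumed time constants; adapting to brighter light is much
                faster than dark adaptation, hence two constants.
    """
    tau = tau_bright if gaze_lum > adapt_lum else tau_dark
    alpha = 1.0 - math.exp(-dt / tau)  # fraction of the gap closed this frame
    return adapt_lum + alpha * (gaze_lum - adapt_lum)

def exposure_scale(adapt_lum: float, key: float = 0.18) -> float:
    """Map adaptation luminance to an exposure factor for tone mapping."""
    return key / max(adapt_lum, 1e-4)  # clamp to avoid division by zero
```

In a renderer, update_adaptation would be called once per frame with the frame time and the gaze-region luminance, and the resulting exposure factor would scale scene radiance before tone mapping.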