Varzandeh, S., Vasylevska, K., Vonach, E., & Kaufmann, H. (2024). Towards Environment- and Task-Independent Locomotion Prediction for Haptic VR. In ICAT-EGVE 2024 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, Tsukuba, Japan. Eurographics Association. https://doi.org/10.2312/EGVE.20241356
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/209501
-
dc.description.abstract
The use of robots presenting physical props has significantly enhanced the haptic experience in virtual reality. Autonomous mobile robots have made haptic interaction in large walkable virtual environments feasible but have brought new challenges. For effective operation, a mobile robot must not only track the user but also predict the user's position several seconds into the future so it can plan and navigate the shared space safely and in time. This paper presents a novel environment- and task-independent concept for locomotion-based prediction of the user's position within a chosen range. Our approach supports the dynamic placement of haptic content with minimal restrictions. We validate it on a real use case by making predictions within a range of 2 m to 4 m, or 2 s to 5 s. We also discuss adapting it to arbitrary space sizes and configurations with minimal real data collection. Finally, we suggest optimal utilization strategies and discuss the limitations of our approach.
en
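The abstract specifies the prediction range (2 m to 4 m, or 2 s to 5 s) but not the predictor itself. For orientation only, below is a minimal Python sketch of what such a locomotion-based prediction interface could look like, using constant-velocity extrapolation as a stand-in baseline; the function name predict_position, its parameters, and the model choice are assumptions for illustration and not the method presented in the paper.

```python
import numpy as np

# Hypothetical sketch, NOT the paper's predictor: extrapolate the user's
# ground-plane position a few seconds ahead (roughly 2-4 m at walking speed)
# from recent tracking samples, using a constant-velocity baseline.

def predict_position(positions, timestamps, horizon_s, max_range_m=4.0):
    """Extrapolate the latest 2D position horizon_s seconds ahead.

    positions  -- (N, 2) array of recent ground-plane positions [m]
    timestamps -- (N,) array of sample times [s], strictly increasing
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    # Least-squares velocity estimate over the sliding window.
    dt = timestamps - timestamps[0]
    vx = np.polyfit(dt, positions[:, 0], 1)[0]
    vy = np.polyfit(dt, positions[:, 1], 1)[0]
    step = np.array([vx, vy]) * horizon_s

    # Clamp to the supported prediction range (assumed 4 m maximum here).
    dist = np.linalg.norm(step)
    if dist > max_range_m:
        step *= max_range_m / dist
    return positions[-1] + step

# Example: user walking roughly along +x at ~1 m/s, predicted 3 s ahead.
ts = np.linspace(0.0, 1.0, 11)
ps = np.stack([1.0 * ts, 0.05 * np.sin(ts)], axis=1)
print(predict_position(ps, ts, horizon_s=3.0))
```

In practice, a constant-velocity baseline degrades quickly over multi-second horizons when the user turns or stops, which is precisely the gap an environment- and task-independent learned predictor, as described in the abstract, aims to close.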
-
dc.description.sponsorship
FWF - Austrian Science Fund (Österr. Wissenschaftsfonds)
-
dc.language.iso
en
-
dc.rights.uri
http://creativecommons.org/licenses/by/4.0/
-
dc.subject
Virtual Reality
en
dc.subject
Interaction Techniques
en
dc.subject
Prediction
en
dc.subject
Haptics
en
dc.title
Towards Environment- and Task-Independent Locomotion Prediction for Haptic VR