AR Monitoring System for User Safety in VR Encountered-Type Haptics

Soroosh Mortezapoor*, TU Wien, Vienna, Austria
Mohammad Ghazanfari†, TU Wien, Vienna, Austria
Emanuel Vonach‡, TU Wien, Vienna, Austria
Jakob Paul Hoffmann§, TU Wien, Vienna, Austria
Hannes Kaufmann¶, TU Wien, Vienna, Austria

*e-mail: soroosh.mortezapoor@tuwien.ac.at
†e-mail: mohammad.ghazanfari@tuwien.ac.at
‡e-mail: emanuel.vonach@tuwien.ac.at
§e-mail: e1505330@student.tuwien.ac.at
¶e-mail: hannes.kaufmann@tuwien.ac.at

Figure 1: (a) In a non-AR setup, the physical workspace and two displays have to be observed simultaneously by the overseer responsible for safety monitoring. (b) Using the AR monitoring system, the overseer is equipped with a Microsoft HoloLens 2 and a wireless E-stop, and (c) uses the AR view for safety monitoring.

ABSTRACT
This paper presents an Augmented Reality (AR) monitoring system for a human overseer of a mobile robotic Encountered-Type Haptic Device (ETHD) in walkable Virtual Reality (VR). By integrating the Robot Operating System (ROS), Unity, and the Microsoft HoloLens 2, the system combines critical data, navigation, and behavior plans of the robotic ETHD system into a unified AR interface. This approach reduces the overseer's workload while enabling uninterrupted monitoring of the physical workspace to enhance safety. Preliminary results demonstrate its effectiveness and potential in improving focus and decision-making, leading to a safer workspace.

Index Terms: Augmented Reality, AR, Virtual Reality, VR, Robotics, ROS, Encountered-Type Haptic Device, ETHD, Safety.

Index Terms: Human-centered computing—Human computer interaction (HCI)—Interaction paradigms—Mixed / augmented reality; Human-centered computing—Human computer interaction (HCI)—Interaction devices—Haptic devices;

1 INTRODUCTION AND BACKGROUND
Mobile ETHDs can provide a convincing VR experience, using robotic systems to deliver real-time haptic feedback by aligning physical objects with virtual counterparts [1, 4, 14, 18]. This allows users to touch virtual objects as if they were real in large-scale VR (Figure 1 (a)). However, safety is a significant concern in these systems, where immersed users are in close proximity to moving robots they cannot see or anticipate. This makes them particularly vulnerable to collisions and other hazards.

While autonomous robots employ safety measures like collision avoidance, additional safeguards are essential for immersed users. Industry norms suggest limiting actuator speeds in close Human-Robot Collaboration (HRC) [7], but in ETHDs this can increase delays [2]. Other strategies include covering robots with soft bumpers and building safe end-effectors with soft edges to prevent injury in harsh collisions [5]. Integrating visual feedback to increase user awareness, such as warning signs [2] and visualization of robot hardware in the virtual environment (VE) [5, 9], is another common safety measure in ETHDs, but it can disrupt user immersion and focus [3].

On the other hand, external oversight by an expert observer, or overseer, is an effective way to enhance safety while preserving user immersion [6, 17]. The overseer monitors the workspace and can halt the robot before a hazardous situation arises. However, simultaneously observing all system components, such as the robot, VR user(s), and software displays, can overwhelm the overseer, reducing their ability to respond promptly. AR has shown promising results in HRC for improving safety and awareness [8, 13, 16].
To ease the overseer's task, we introduce an AR monitoring system for use with mobile robotic ETHD systems. In our AR-assisted monitoring system, crucial information is overlaid onto the physical workspace, offering intuitive, context-aware visualization. It enables overseers to focus on the physical workspace while monitoring system health through virtual overlays in an AR headset. We demonstrate our approach on CoboDeck [11], a mobile robotic ETHD platform for walkable VR in which users can explore large virtual environments and interact with objects like walls, doors, and furniture. CoboDeck employs an omnidirectional mobile manipulator robot to align physical props with their virtual counterparts. Preliminary tests of our AR monitoring system indicate that it enhances efficiency and safety.

2 SYSTEM DESIGN
The CoboDeck platform originally incorporates a mobile robotic system (Robot) powered by ROS Melodic [12], depicted as Display 1 in Figure 1 (a), and a Unity application (VR Unity) on an external PC handling walkable VR, shown as Display 2 in Figure 1 (a). The Robot is responsible for positioning encountered-type haptic props in the physical environment near the user, corresponding to the virtual objects in the VE handled by VR Unity. These two components are connected via a network, enabling seamless synchronization between the virtual and physical spaces. Additionally, all critical data from both systems are published to the Robot's ROS topics, maximizing accessibility for all system components. In this setup, a human expert overseer equipped with a remote emergency stop continuously monitors the different components for user safety, including the two mentioned displays as well as the physical workspace, during the entire operation.

Our proposed AR monitoring system allows the overseer (see Figure 1 (b)) to access all necessary information at the same time. It employs a Microsoft HoloLens 2 [10], integrated into CoboDeck through a dedicated Unity application (AR Unity) built with Microsoft's Mixed Reality OpenXR Plugin. AR Unity runs on the same machine as VR Unity (Intel Core i7-9900K, Nvidia RTX 2080 Ti, 32 GB memory). Communication between ROS and AR Unity is handled by the ROS TCP Connector for Unity [15], as it is for VR Unity.

The AR interface provides the overseer with a comprehensive view of the whole system's general health and the status of the Robot's localization, navigation, and high-level decisions. The system's general health information includes the battery levels, the operational status of the Robot's onboard computer, and the VR Unity-ROS communication latency.

Additionally, the interface offers real-time, on-demand transparent visualizations of the robot and VR user models, overlaid onto the physical workspace based on their data in ROS. Displaying their positions and orientations, the robot's LiDAR data, and dynamically generated costmaps, including obstacles and their inflation layers, allows the overseer to quickly verify the accuracy of underlying systems such as localization and collision avoidance.
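As an illustration, the following minimal Python (rospy) sketch subscribes to the kind of ROS data these overlays consume. The topic names and message types (/amcl_pose with PoseWithCovarianceStamped, /scan with LaserScan) are ROS navigation-stack defaults and an assumption on our part, not CoboDeck's documented interface.

# Minimal sketch of the ROS-side data feeding the AR overlays.
# Topic names follow ROS navigation-stack defaults and are
# assumptions; CoboDeck's actual topic layout is not given here.
import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped
from sensor_msgs.msg import LaserScan

def on_pose(msg):
    # Robot pose used to place the transparent robot model in AR.
    p = msg.pose.pose.position
    rospy.loginfo('robot at x=%.2f y=%.2f', p.x, p.y)

def on_scan(msg):
    # LiDAR scan projected into the workspace; gross misalignment
    # with the real walls would indicate a localization fault.
    rospy.loginfo('scan received with %d ranges', len(msg.ranges))

if __name__ == '__main__':
    rospy.init_node('ar_overlay_feed')
    rospy.Subscriber('/amcl_pose', PoseWithCovarianceStamped, on_pose)
    rospy.Subscriber('/scan', LaserScan, on_scan)
    rospy.spin()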
Any misalignment between these visualizations and the physical workspace, such as LiDAR points that no longer line up with the walls, is immediately noticeable to the overseer as a potential localization fault, indicating the need for an intervention.

Regarding navigation, the AR system displays the robot's planned paths in real time. These visualizations allow the overseer to verify that the Robot's paths remain reasonable and safe within the operational environment, and to anticipate upcoming changes in the workspace.

High-level decision information is additionally visualized, offering clarity on the Robot's behavior-dependent actions as defined in the CoboDeck system. For example, the Chase space, which represents areas where the robot navigates when not engaged in active interaction, is dynamically visualized. This space can be seen in Figure 1 (c) as a rainbow-colored region. Similarly, Escape points, the robot's dynamically identified escape targets for avoiding user-to-robot collisions, are depicted as red arrows. Finally, during haptic interaction requests, the planned Haptic Rover Poses and the robotic arm's movement trajectories are shown, giving the overseer a complete understanding of the robot's short-term decisions and immediate objectives before and during their execution.

3 PRELIMINARY RESULTS
Preliminary experience with our proposed AR monitoring system by expert overseers demonstrated its potential and effectiveness in managing the CoboDeck platform. The system enabled overseers to maintain their primary focus on the physical workspace, where close collaboration and interaction between the robot and the immersed VR user occurred. Simultaneously, the AR interface provided access to critical system data, including the robot's underlying decision-making processes and real-time status. This dual-layered view enhanced the overseers' situational awareness, allowing them to monitor the robot's actions in context and ensure its responses aligned with system safety protocols. By visualizing the robot's plans, the overseers were able to monitor adherence to predefined user-safety guidelines, contributing to a safer and more controlled environment for the VR user. In rare instances where the robot's planned trajectories or decisions deviated from expectations or posed risks, the overseers could swiftly intervene using the emergency stop mechanism, halting robot motion instantly. This rapid response capability ensured that even in complex scenarios, user safety and system integrity were preserved.

4 CONCLUSION AND FUTURE WORK
The immersive nature of VR enhanced with a mobile robotic ETHD creates a unique challenge, as users remain unaware of the robot's movements and decisions. This emphasizes the critical role of an informed overseer equipped with multiple visualizations in ensuring system reliability. Our AR system effectively addresses this challenge by providing the overseer with real-time access to crucial system data, including localization, navigation, and high-level behaviors, while letting them stay focused on the physical workspace. This enables the overseer to maintain constant awareness of the system's health and intervene promptly when necessary, as sketched below. Preliminary results underline the AR system's potential as a robust monitoring tool that enhances safety, efficiency, and overall operational control in mobile robotic ETHDs similar to CoboDeck.
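To make this intervention path concrete, the following minimal sketch shows one common way a stop signal can be relayed over ROS: cancelling the active move_base goal and commanding zero velocity. The /overseer/estop topic is a hypothetical stand-in for the wireless E-stop; CoboDeck's actual safety chain is not reproduced here and may additionally involve dedicated hardware.

# Minimal sketch of an overseer E-stop relay. /overseer/estop is a
# hypothetical topic for the wireless E-stop button; the real safety
# chain may act in hardware, independently of ROS.
import rospy
from std_msgs.msg import Bool
from geometry_msgs.msg import Twist
from actionlib_msgs.msg import GoalID

class EStopRelay(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        self.cancel_pub = rospy.Publisher('/move_base/cancel', GoalID,
                                          queue_size=1)
        rospy.Subscriber('/overseer/estop', Bool, self.on_estop)

    def on_estop(self, msg):
        if msg.data:
            self.cancel_pub.publish(GoalID())  # empty GoalID cancels all goals
            self.cmd_pub.publish(Twist())      # command zero velocity
            rospy.logwarn('Overseer E-stop engaged: robot halted.')

if __name__ == '__main__':
    rospy.init_node('estop_relay')
    EStopRelay()
    rospy.spin()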
In the future, we aim to conduct extensive evaluations of our AR monitoring system, focusing on the effectiveness and relevance of the visualized data. This will help us refine the selection of information presented to the overseer, ensuring it keeps them focused on the workspace without introducing unnecessary distractions. The insights gained from these planned studies will be presented in an extended future publication, contributing to the broader advancement of AR-based oversight in robotics for safety.

ACKNOWLEDGEMENTS
This work has been funded by the grant F77 (SFB "Advanced Computational Design", SP5) of the Austrian Science Fund FWF.

REFERENCES
[1] M. Abdullah, M. Kim, W. Hassan, Y. Kuroda, and S. Jeon. HapticDrone: An encountered-type kinesthetic haptic interface with controllable force feedback: Example of stiffness and weight rendering. In 2018 IEEE Haptics Symposium (HAPTICS), pp. 334–339. IEEE, 2018.
[2] P. Abtahi, B. Landry, J. Yang, M. Pavone, S. Follmer, and J. A. Landay. Beyond the force: Using quadcopters to appropriate objects and the environment for haptics in virtual reality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1–13, 2019.
[3] V. R. M. Garcia. Contribution to the Study of Usability and Haptic Feedback of Encountered-Type Haptic Displays. PhD thesis, INSA de Rennes, 2021.
[4] M. Hoppe, P. Knierim, T. Kosch, M. Funk, L. Futami, S. Schneegass, N. Henze, A. Schmidt, and T. Machulla. VRHapticDrones: Providing haptics in virtual reality through quadcopters. In Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, pp. 7–18, 2018.
[5] A. Horie, M. Y. Saraiji, Z. Kashino, and M. Inami. EncounteredLimbs: A room-scale encountered-type haptic presentation using wearable robotic arms. In 2021 IEEE Virtual Reality and 3D User Interfaces (VR), pp. 260–269. IEEE, 2021.
[6] International Organization for Standardization. ISO 10218-2:2011 Robots and robotic devices — Safety requirements for industrial robots — Part 2: Robot systems and integration. Standard ISO 10218-2:2011, Geneva, Switzerland, 2011. Available at: https://www.iso.org/standard/41571.html.
[7] International Organization for Standardization. ISO/TS 15066:2016 Robots and robotic devices — Collaborative robots. Technical Specification ISO/TS 15066:2016, Geneva, Switzerland, 2016. Available at: https://www.iso.org/standard/62996.html.
[8] C. Li, P. Zheng, Y. Yin, Y. M. Pang, and S. Huo. An AR-assisted deep reinforcement learning-based approach towards mutual-cognitive safe human-robot interaction. Robotics and Computer-Integrated Manufacturing, 80:102471, 2023.
[9] V. R. Mercado, F. Argelaguet, G. Casiez, and A. Lécuyer. Watch out for the robot! Designing visual feedback safety techniques when interacting with encountered-type haptic displays. Frontiers in Virtual Reality, 3, 2022.
[10] Microsoft Corporation. Microsoft HoloLens 2. https://www.microsoft.com/en-us/hololens, 2019. Accessed: 2023-12-24.
[11] S. Mortezapoor, K. Vasylevska, E. Vonach, and H. Kaufmann. CoboDeck: A large-scale haptic VR system using a collaborative mobile robot. In IEEE Conference on Virtual Reality, 2023.
[12] Open Source Robotics Foundation. Robot Operating System (ROS) Melodic Morenia. https://wiki.ros.org/melodic, 2018. Accessed: 2023-12-24.
[13] S. Papanastasiou, N. Kousi, P. Karagiannis, C. Gkournelos, A. Papavasileiou,
K. Dimoulas, K. Baris, S. Koukas, G. Michalos, and S. Makris. Towards seamless human-robot collaboration: integrating multimodal interaction. The International Journal of Advanced Manufacturing Technology, 105:3881–3897, 2019.
[14] R. Suzuki, H. Hedayati, C. Zheng, J. L. Bohn, D. Szafir, E. Y.-L. Do, M. D. Gross, and D. Leithinger. RoomShift: Room-scale dynamic haptics for VR with furniture-moving swarm robots. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–11, 2020.
[15] Unity Technologies. ROS TCP Connector. https://github.com/Unity-Technologies/ROS-TCP-Connector, 2023. Accessed: 2023-12-24.
[16] C. Vogel, C. Walter, and N. Elkmann. Safeguarding and supporting future human-robot cooperative manufacturing processes by a projection- and camera-based technology. Procedia Manufacturing, 11:39–46, 2017.
[17] E. Vonach, C. Gatterer, and H. Kaufmann. VRRobot: Robot actuated props in an infinite virtual environment. In 2017 IEEE Virtual Reality (VR), pp. 74–83. IEEE, 2017.
[18] Y. Wang, Z. Chen, H. Li, Z. Cao, H. Luo, T. Zhang, K. Ou, J. Raiti, C. Yu, S. Patel, et al. MoveVR: Enabling multiform force feedback in virtual reality using household cleaning robot. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–12, 2020.