Hu, R., Kogler, J., Gelautz, M., Lin, M., & Xia, Y. (2024). A Dynamic Calibration Framework for the Event-Frame Stereo Camera System. IEEE Robotics and Automation Letters, 9(12), 11465–11472. https://doi.org/10.1109/LRA.2024.3491426
E193-01 - Research Unit Computer Vision
E192-05 - Research Unit Theory and Logic
-
Journal:
IEEE Robotics and Automation Letters
-
ISSN:
2377-3766
-
Date (published):
Dec-2024
-
Number of Pages:
8
-
Publisher:
Institute of Electrical and Electronics Engineers (IEEE), Inc.
-
Peer reviewed:
Yes
-
Keywords:
Calibration and Identification; Sensor Fusion; Event Cameras; Event-frame Stereo Camera System
Abstract:
The fusion of event cameras and conventional frame cameras is a novel research field, and a stereo system consisting of an event camera and a frame camera can combine the advantages of both. This paper develops a dynamic calibration framework for the event-frame stereo camera system. In this framework, the first step is the initial detection of a circle-grid calibration pattern, and a sliding-window time matching method is proposed to match event-frame pairs. Then, a refining method is devised for the two cameras to obtain accurate information about the pattern. In particular, for the event camera, a computationally efficient patch-size motion compensation method is designed to achieve time synchronization between the two cameras and to fit circles in an image of warped events. Finally, the pose between the two cameras is globally optimized by constructing a pose-landmark graph with two types of edges. The proposed calibration framework offers high real-time performance and easy deployment, and its effectiveness is verified by experiments on self-recorded datasets. The code is released at: http://github.com/rayhu95/EFSC_calib.
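To illustrate the sliding-window time matching idea mentioned in the abstract, the following minimal Python sketch pairs each frame timestamp with the events that fall inside a sliding time window around it. This is not the authors' released implementation; the function name match_event_frame_pairs and the parameter half_window_s are assumptions made for the example.

import numpy as np

def match_event_frame_pairs(frame_ts, event_ts, half_window_s=0.005):
    """Pair each frame timestamp with the indices of events whose
    timestamps fall inside a +/- half_window_s window around it.

    frame_ts : 1-D array of frame timestamps in seconds, sorted ascending.
    event_ts : 1-D array of event timestamps in seconds, sorted ascending.
    Returns a list of (frame_index, event_index_slice) pairs; frames with
    no events inside their window are skipped.
    """
    pairs = []
    for i, t in enumerate(frame_ts):
        # Binary search over the sorted event timestamps keeps the
        # matching cheap even for millions of events.
        lo = np.searchsorted(event_ts, t - half_window_s, side="left")
        hi = np.searchsorted(event_ts, t + half_window_s, side="right")
        if hi > lo:
            pairs.append((i, slice(lo, hi)))
    return pairs

if __name__ == "__main__":
    frames = np.array([0.033, 0.066, 0.100])
    events = np.sort(np.random.uniform(0.0, 0.12, size=1000))
    for frame_idx, ev_slice in match_event_frame_pairs(frames, events):
        print(frame_idx, events[ev_slice].size, "events in window")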
Project (external):
National Natural Science Foundation of China
-
Project ID:
61836001
-
Research Areas:
Visual Computing and Human-Centered Technology: 70%
Automation and Robotics: 30%