Title: Implementing a Time-of-Flight Camera Interface for Visual Simultaneous Localization and Mapping
Language: English
Authors: Jojic, Peter 
Qualification level: Diploma
Keywords: Autonomous mobile robots; Simultaneous Localization and Mapping; SLAM; Navigation; 3D scene analysis; tracking; 3D camera; Time-of-Flight principle; range-imaging camera; 3D sensor
Advisor: Vincze, Markus
Assisting Advisor: Gemeiner, Peter
Issue Date: 2008
Number of Pages: 88
Abstract: 

To navigate successfully in an unknown environment, mobile robots have to know their location, and they need a map of the scene.
These two requirements cannot be separated; for navigation purposes they have to be solved simultaneously. The combination of these tasks is known within the robotics community as Simultaneous Localization and Mapping (SLAM).
Different sensors can be used to solve SLAM, but we consider a camera the most appealing option because it provides dense information content. Using a standard single perspective-projective camera as the only SLAM sensor has two major disadvantages. First, depth information is lost in the projection: to estimate the robot's location and the positions of scene landmarks, the camera has to move and perceive the environment from several different views. Second, features lying at occlusion boundaries cannot be reliably rejected, and such false features can cause SLAM to collapse.
In this thesis, a recently developed Time-of-Flight (ToF) camera is used as the only sensor input for SLAM. The ToF sensor provides 2D images like a standard perspective-projective camera, but it can also measure the positions of scene features directly. This work presents a new interface for a visual SLAM framework which incorporates ToF sensor readings in real time. However, ToF cameras suffer from several noise effects, e.g. scattering and mixed pixels, and we show how these noise effects influence the localization and mapping problem.
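
As a rough illustration of how a ToF measurement at an image feature could be turned into a 3D landmark position, the following minimal sketch back-projects a pixel using a pinhole model. The function name, intrinsic values, and the assumption that the sensor reports radial distance are illustrative, not the actual interface described in the thesis.

# Hypothetical sketch: back-projecting a ToF range measurement into a 3D point
# in the camera frame, assuming a pinhole model with known intrinsics.
# Names and parameter values are illustrative, not the thesis' actual interface.
import numpy as np

def tof_pixel_to_point(u, v, radial_range, fx, fy, cx, cy):
    """Return the 3D point (camera frame, metres) seen at pixel (u, v)
    when the ToF sensor reports the radial distance `radial_range`."""
    # Unit viewing ray through the pixel
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray /= np.linalg.norm(ray)
    # The ToF sensor measures distance along the ray, so scale the unit ray
    return radial_range * ray

# Example with made-up intrinsics for a 176x144 ToF sensor
p = tof_pixel_to_point(u=100, v=70, radial_range=1.8,
                       fx=140.0, fy=140.0, cx=88.0, cy=72.0)
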
In the experiments, the selected visual SLAM framework using the ToF camera performed well as long as enough nearby features were available. When no new features could be detected, SLAM usually became unstable or lost track.
To tackle the problem of false scene landmarks lying at occlusion boundaries, a concept is presented which directly uses the measured 3D information to analyze the cornerness of a landmark. Simulated results show that such landmarks can be identified with an analysis based on the eigendecomposition, which can improve real-time feature initialization.
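
One possible reading of such an eigendecomposition-based cornerness check is sketched below: the 3D points measured around a candidate feature are summarized by the eigenvalues of their local covariance. The thresholds, labels, and classification rule are illustrative assumptions, not the criteria used in the thesis.

# Hypothetical sketch: classifying a candidate landmark from the eigenvalues of
# the local covariance of its 3D neighbourhood. Thresholds and labels are
# illustrative assumptions, not the criteria used in the thesis.
import numpy as np

def classify_landmark(neighbourhood_points, flat_ratio=0.01):
    """neighbourhood_points: (N, 3) array of ToF measurements around a feature."""
    pts = np.asarray(neighbourhood_points, dtype=float)
    cov = np.cov(pts, rowvar=False)               # 3x3 covariance of the patch
    eigvals = np.sort(np.linalg.eigvalsh(cov))    # ascending: l0 <= l1 <= l2
    l0, l1, _ = eigvals / (eigvals.sum() + 1e-12)
    if l0 < flat_ratio and l1 < flat_ratio:
        return "edge-like"        # spread along a single direction only
    if l0 < flat_ratio:
        return "planar patch"     # thin in one direction, spread in two
    return "3D corner candidate"  # significant spread in all three directions

Under this reading, a candidate whose neighbourhood shows no genuine corner structure (for instance one straddling a depth discontinuity) could be rejected before it is initialized as a landmark.
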
URI: https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-24923
http://hdl.handle.net/20.500.12708/14441
Library ID: AC05039244
Organisation: E376 - Institut für Automatisierungs- und Regelungstechnik 
Publication Type: Thesis
Appears in Collections: Thesis
