<div class="csl-bib-body">
<div class="csl-entry">Jojic, P. (2008). <i>Implementing a Time-of-Flight Camera Interface for Visual Simultaneous Localization and Mapping</i> [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-24923</div>
</div>
To navigate successfully in an unknown environment, a mobile robot has to know its location, and it needs a map of the scene.<br />These two requirements cannot be separated; for navigation purposes they have to be solved simultaneously. The combination of the two tasks is known in the robotics community as Simultaneous Localization and Mapping (SLAM).<br />Different sensors can be used to solve SLAM, but we consider a camera the most appealing option, because it provides dense information content. Using a standard single perspective-projective camera as the only SLAM sensor has two major disadvantages. First, depth information is immediately lost: to estimate the robot's location and the positions of scene landmarks, the camera has to move and perceive the environment from several different views. Second, features lying at occlusion boundaries cannot be reliably rejected, and such false features can cause SLAM to collapse.<br />In this thesis, a recently developed Time-of-Flight (ToF) camera is used as the only sensor input for SLAM. The ToF sensor provides 2D images like a standard perspective-projective camera, but it can also measure the positions of scene features directly. This work presents a new interface for a visual SLAM framework that incorporates ToF sensor readings in real time. However, ToF cameras suffer from several noise effects, e.g. scattering and mixed pixels. We show how these noise effects influence the localization and mapping problem.<br />In the experiments, the selected visual SLAM framework using the ToF camera performed well as long as enough nearby features were available; when no new features were detectable, SLAM usually became unstable or got lost.<br />To tackle the problem of false scene landmarks lying at occlusion boundaries, a concept is presented.
The idea of this concept is to use the measured 3D information directly to analyze the cornerness of a landmark. Simulated results show that landmarks can be identified using an analysis based on the eigendecomposition, and that this can improve real-time feature initialization.
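The direct 3D measurement mentioned in the abstract can be illustrated with a short sketch: given camera intrinsics and the radial distance a ToF pixel reports, the corresponding 3D point in the camera frame follows from the pinhole model. This is a minimal illustration under stated assumptions (undistorted pinhole model, range measured along the viewing ray), not the implementation described in the thesis; the function name and parameters are made up for this example.

```python
import math

def tof_pixel_to_point(u, v, r, fx, fy, cx, cy):
    """Back-project a ToF pixel (u, v) with measured radial distance r
    into a 3D camera-frame point. Assumes an undistorted pinhole model
    with focal lengths (fx, fy) and principal point (cx, cy)."""
    # Direction of the viewing ray in normalized image coordinates.
    x = (u - cx) / fx
    y = (v - cy) / fy
    norm = math.sqrt(x * x + y * y + 1.0)
    # A ToF sensor measures distance along the ray, so the depth (z)
    # is the measured range divided by the ray length.
    z = r / norm
    return (x * z, y * z, z)
```

Unlike the monocular case, no camera motion is needed: a single reading already fixes the landmark's 3D position, up to the sensor's noise.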
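One plausible reading of the eigendecomposition-based cornerness analysis can be sketched as follows: collect the measured 3D points in a landmark's neighbourhood, form their covariance matrix, and accept the landmark only if the point cloud spreads in all three directions, i.e. the smallest eigenvalue is not negligible relative to the largest. A planar or edge-like patch, typical of a false feature at an occlusion boundary, fails this test. All names and the threshold below are illustrative; this is a sketch of the general idea, not the algorithm from the thesis.

```python
import math

def covariance3(points):
    """3x3 covariance matrix of a list of 3D points."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    C = [[0.0] * 3 for _ in range(3)]
    for x, y, z in points:
        d = (x - cx, y - cy, z - cz)
        for i in range(3):
            for j in range(3):
                C[i][j] += d[i] * d[j] / n
    return C

def eigvals_sym3(A):
    """Eigenvalues of a symmetric 3x3 matrix, descending order
    (closed-form solution via the characteristic polynomial)."""
    p1 = A[0][1] ** 2 + A[0][2] ** 2 + A[1][2] ** 2
    q = (A[0][0] + A[1][1] + A[2][2]) / 3.0
    if p1 < 1e-30:  # matrix is already diagonal
        return tuple(sorted((A[0][0], A[1][1], A[2][2]), reverse=True))
    p2 = sum((A[i][i] - q) ** 2 for i in range(3)) + 2.0 * p1
    p = math.sqrt(p2 / 6.0)
    B = [[(A[i][j] - (q if i == j else 0.0)) / p for j in range(3)]
         for i in range(3)]
    detB = (B[0][0] * (B[1][1] * B[2][2] - B[1][2] * B[2][1])
            - B[0][1] * (B[1][0] * B[2][2] - B[1][2] * B[2][0])
            + B[0][2] * (B[1][0] * B[2][1] - B[1][1] * B[2][0]))
    r = max(-1.0, min(1.0, detB / 2.0))
    phi = math.acos(r) / 3.0
    e1 = q + 2.0 * p * math.cos(phi)
    e3 = q + 2.0 * p * math.cos(phi + 2.0 * math.pi / 3.0)
    return (e1, 3.0 * q - e1 - e3, e3)

def is_true_corner(points, ratio=0.1):
    """Accept a landmark only if its local 3D neighbourhood spreads in
    all three directions: smallest/largest eigenvalue above a threshold."""
    e1, _, e3 = eigvals_sym3(covariance3(points))
    if e1 <= 0.0:
        return False
    return e3 / e1 > ratio
```

For a planar patch the smallest eigenvalue vanishes and the test rejects the landmark, while a patch spreading in all three dimensions is accepted; rejecting such landmarks before initialization is what could improve real-time feature initialization.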
dc.language: English
dc.language.iso: en
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject (de): Autonome mobile Roboter; Simultane Lokalisierung und Kartenerstellung; Navigation; 3D Kamera; 3D Sensor
dc.subject (en): Autonomous mobile robots; Simultaneous Localization and Mapping; SLAM; 3D scene analysis; tracking; 3D camera; Time-of-Flight principle; range-imaging camera; 3D sensor
dc.title (en): Implementing a Time-of-Flight Camera Interface for Visual Simultaneous Localization and Mapping
dc.type: Thesis (en); Hochschulschrift (de)
dc.rights.license: In Copyright (en); Urheberrechtsschutz (de)
dc.contributor.affiliation: TU Wien, Österreich
dc.rights.holder: Peter Jojic
tuw.version: vor
tuw.thesisinformation: Technische Universität Wien
dc.contributor.assistant: Gemeiner, Peter
tuw.publication.orgunit: E376 - Institut für Automatisierungs- und Regelungstechnik