<div class="csl-bib-body">
<div class="csl-entry">Gerstweiler, G., Vonach, E., & Kaufmann, H. (2016). HyMoTrack: A Mobile AR Navigation System for Complex Indoor Environments. <i>Sensors</i>. https://doi.org/10.3390/s16010017</div>
</div>
dc.description.abstract
Navigating unknown, large indoor environments with static 2D maps is challenging, especially when time is a critical factor. A mobile assistant capable of supporting people while navigating indoor locations requires an accurate and reliable localization system in almost every corner of the building. We present a solution to this problem: a hybrid tracking system designed specifically for complex indoor spaces, which runs on mobile devices such as smartphones or tablets. The developed algorithm uses only the sensors built into standard mobile devices, in particular the inertial sensors and the RGB camera. Combining multiple optical tracking technologies, such as 2D natural features and features of more complex three-dimensional structures, ensures the robustness of the system. All processing is done locally, and no network connection is needed. State-of-the-art indoor tracking approaches mainly use radio-frequency signals such as Wi-Fi or Bluetooth to localize a user. In contrast to these approaches, the main advantage of the developed system is its capability to deliver a continuous 3D position and orientation of the mobile device with centimeter accuracy. This makes it usable for localization and 3D augmentation purposes, e.g., navigation tasks or location-based information visualization.
en
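The abstract describes fusing high-rate but drifting inertial measurements with absolute optical pose fixes. As a rough illustration of that general idea (not the HyMoTrack algorithm itself), the sketch below blends a dead-reckoned inertial position with occasional visual fixes using a simple complementary filter; all function names and the blending weight `alpha` are assumptions for illustration only.

```python
# Illustrative complementary-filter fusion of inertial dead reckoning
# with occasional absolute visual pose fixes. Hypothetical sketch; the
# weight alpha and all names are assumptions, not the paper's method.

def fuse(inertial_pos, visual_pos, alpha=0.9):
    """Blend per axis: the visual fix corrects accumulated inertial drift."""
    return tuple(alpha * v + (1 - alpha) * i
                 for i, v in zip(inertial_pos, visual_pos))

def track(steps):
    """steps: sequence of (inertial_delta, visual_fix_or_None) tuples."""
    pos = (0.0, 0.0, 0.0)
    for delta, visual_fix in steps:
        # Integrate the inertial displacement at every step...
        pos = tuple(p + d for p, d in zip(pos, delta))
        # ...and pull the estimate toward the absolute fix when one arrives.
        if visual_fix is not None:
            pos = fuse(pos, visual_fix)
    return pos
```

In this toy model the inertial path drifts freely between optical fixes, and each fix snaps the estimate back toward an absolute position, which is the reason a hybrid system can stay accurate even where optical tracking is only intermittently available.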
dc.description.sponsorship
Austrian Research Promotion Agency (FFG)
-
dc.language
English
-
dc.language.iso
en
-
dc.publisher
MDPI
-
dc.relation.ispartof
Sensors
-
dc.rights.uri
http://creativecommons.org/licenses/by/4.0/
-
dc.subject
indoor tracking
en
dc.subject
navigation
en
dc.subject
localization
en
dc.subject
augmented reality
en
dc.subject
mobile
en
dc.title
HyMoTrack: A Mobile AR Navigation System for Complex Indoor Environments
en
dc.type
Article
en
dc.rights.license
Creative Commons Attribution 4.0 International
en
dc.relation.grantno
838548
-
dcterms.dateSubmitted
2015-09-30
-
dc.rights.holder
2015 by the authors
-
dc.type.category
Original Research Article
-
tuw.journal.peerreviewed
true
-
tuw.peerreviewed
true
-
tuw.version
vor
-
dcterms.isPartOf.title
Sensors
-
tuw.publication.orgunit
E188 - Institut für Softwaretechnik und Interaktive Systeme
-
tuw.publisher.doi
10.3390/s16010017
-
dc.date.onlinefirst
2015-12-24
-
dc.identifier.eissn
1424-8220
-
dc.identifier.libraryid
AC11359952
-
dc.identifier.urn
urn:nbn:at:at-ubtuw:3-1400
-
tuw.author.orcid
0000-0002-3869-9799
-
tuw.author.orcid
0000-0002-0322-9869
-
dc.rights.identifier
CC BY 4.0
en
wb.sci
true
-
item.languageiso639-1
en
-
item.cerifentitytype
Publications
-
item.openairecristype
http://purl.org/coar/resource_type/c_18cf
-
item.fulltext
with Fulltext
-
item.openaccessfulltext
Open Access
-
item.grantfulltext
open
-
item.openairetype
Article
-
crisitem.author.dept
E193-02 - Forschungsbereich Computer Graphics
-
crisitem.author.dept
E193-03 - Forschungsbereich Virtual and Augmented Reality
-
crisitem.author.dept
E193-03 - Forschungsbereich Virtual and Augmented Reality
-
crisitem.author.orcid
0000-0002-0322-9869
-
crisitem.author.parentorg
E193 - Institut für Visual Computing and Human-Centered Technology
-
crisitem.author.parentorg
E193 - Institut für Visual Computing and Human-Centered Technology
-
crisitem.author.parentorg
E193 - Institut für Visual Computing and Human-Centered Technology