<div class="csl-bib-body">
<div class="csl-entry">Kattenbeck, M., Giannopoulos, I., Alinaghi, N., Golab, A., & Montello, D. R. (2025). Predicting spatial familiarity by exploiting head and eye movements during pedestrian navigation in the real world. <i>Scientific Reports</i>, <i>15</i>(1), Article 7970. https://doi.org/10.1038/s41598-025-92274-4</div>
</div>
-
dc.identifier.issn
2045-2322
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/215224
-
dc.description.abstract
Spatial familiarity has seen a long history of interest in wayfinding research. To date, however, no studies have systematically assessed the behavioral correlates of spatial familiarity, including eye and body movements. In this study, we take a step towards filling this gap by reporting on the results of an in-situ, within-subject study with N = 52 pedestrian wayfinders that combines eye-tracking and body movement sensors. In our study, participants were required to walk both a familiar route and an unfamiliar route by following auditory, landmark-based route instructions. We monitored participants' behavior using a mobile eye tracker, a high-precision Global Navigation Satellite System receiver, and a high-precision, head-mounted Inertial Measurement Unit. We conducted machine learning experiments using Gradient-Boosted Trees to perform binary classification, testing different feature sets, i.e., gaze only, Inertial Measurement Unit data only, and a combination of the two, to classify a person as familiar or unfamiliar with a particular route. We achieve the highest accuracy of 89.9% using exclusively Inertial Measurement Unit data, exceeding gaze alone at 67.6%, and gaze and Inertial Measurement Unit data together at 85.9%. For the highest accuracy achieved, yaw and acceleration values are most important. This finding indicates that head movements ("looking around to orient oneself") are a particularly valuable indicator to distinguish familiar and unfamiliar environments for pedestrian wayfinders.
en
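The classification setup the abstract describes (Gradient-Boosted Trees separating familiar from unfamiliar walkers based on head-movement features such as yaw and acceleration) could be sketched roughly as below. This is a minimal illustration on synthetic data, not the authors' pipeline: the feature values, class means, and split are all invented stand-ins.

```python
# Hedged sketch of a Gradient-Boosted-Trees binary classifier in the
# spirit of the abstract. All data is synthetic: the two features stand
# in for IMU-derived yaw variability and acceleration magnitude, with
# "unfamiliar" walkers assumed to look around more (higher yaw variance).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_per_class = 200

# Synthetic feature vectors: [yaw variability, acceleration magnitude].
familiar = rng.normal(loc=[0.2, 1.0], scale=0.3, size=(n_per_class, 2))
unfamiliar = rng.normal(loc=[0.8, 1.6], scale=0.3, size=(n_per_class, 2))

X = np.vstack([familiar, unfamiliar])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = familiar, 1 = unfamiliar

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.3f}")
```

On real data the study additionally compared gaze-only and combined feature sets; here the point is only the shape of the experiment: train a boosted-trees model per feature set and compare held-out accuracy.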
dc.description.sponsorship
European Commission
-
dc.language.iso
en
-
dc.publisher
NATURE PORTFOLIO
-
dc.relation.ispartof
Scientific Reports
-
dc.rights.uri
http://creativecommons.org/licenses/by/4.0/
-
dc.subject
Humans
en
dc.subject
Male
en
dc.subject
Female
en
dc.subject
Adult
en
dc.subject
Spatial Navigation
en
dc.subject
Walking
en
dc.subject
Young Adult
en
dc.subject
Machine Learning
en
dc.subject
Eye-Tracking Technology
en
dc.subject
Recognition, Psychology
en
dc.subject
Eye Movements
en
dc.subject
Pedestrians
en
dc.subject
Head Movements
en
dc.title
Predicting spatial familiarity by exploiting head and eye movements during pedestrian navigation in the real world