<div class="csl-bib-body">
<div class="csl-entry">Hu, J., Jiang, H., Liu, D., Xiao, Z., Zhang, Q., Liu, J., & Dustdar, S. (2024). Combining IMU With Acoustics for Head Motion Tracking Leveraging Wireless Earphone. <i>IEEE Transactions on Mobile Computing</i>, <i>23</i>(6), 6835–6847. https://doi.org/10.1109/TMC.2023.3325826</div>
</div>
-
dc.identifier.issn
1536-1233
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/198636
-
dc.description.abstract
Head motion tracking is a promising research field with vast applications in ubiquitous human-computer interaction (HCI) scenarios. Unfortunately, solutions based on vision and wireless sensing have shortcomings in user privacy and tracking range, respectively. To address these issues, we propose IA-Track, a novel head motion tracking system that combines inertial measurement units (IMUs) and acoustic sensing. Our wireless-earphone-based method balances flexibility, computational complexity, and tracking accuracy, requiring only an earphone with an IMU and a smartphone. However, two challenges remain. First, wireless earphones have limited hardware resources, making acoustic Doppler-effect-based methods unsuitable for acoustic tracking. Second, traditional Kalman-filter-based trajectory restoration methods may introduce significant cumulative errors. To tackle these challenges, we rely on IMU sensor data to recover the trajectory and use the smartphone to emit 'inaudible' acoustic signals, which the earphone receives to correct the drift in the IMU-derived track. We conducted extensive experiments involving 50 volunteers in various potential IA-Track usage scenarios, demonstrating that our well-designed system achieves satisfactory head motion tracking performance.
en
-
dc.language.iso
en
-
dc.publisher
IEEE COMPUTER SOC
-
dc.relation.ispartof
IEEE Transactions on Mobile Computing
-
dc.subject
Acoustic signal
en
-
dc.subject
head motion tracking
en
-
dc.subject
human-machine interface
en
-
dc.title
Combining IMU With Acoustics for Head Motion Tracking Leveraging Wireless Earphone