Probabilistic approach to robust wearable gaze tracking
DOI: https://doi.org/10.16910/jemr.10.4.2
Keywords: Wearable gaze tracking, Human eye modeling, Bayesian modeling, Kalman filtering
Abstract
This paper presents a method for computing the gaze point using camera data captured with a wearable gaze tracking device. The method utilizes a physical model of the human eye, advanced Bayesian computer vision algorithms, and Kalman filtering, resulting in high accuracy and low noise. Our C++ implementation can process camera streams of 30 frames per second in real time. The performance of the system is validated in an exhaustive experimental setup with 19 participants, using a self-made device. Owing to the eye model and binocular cameras, the system is accurate at all distances and invariant to device movement. We also test our system against a best-in-class commercial device, which our system outperforms in spatial accuracy and precision. The software and hardware instructions, as well as the experimental data, are published as open source.
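To illustrate the Kalman filtering stage mentioned in the abstract, the following minimal C++ sketch smooths a stream of 2D gaze-point measurements with a constant-velocity Kalman filter applied independently to each coordinate. This is not the authors' implementation; the filter structure, the 30 fps time step, and the noise parameters q and r are illustrative assumptions.

// Minimal sketch (not the paper's code): constant-velocity Kalman filter
// applied independently to the x and y coordinates of a gaze-point stream.
// Parameter values (q, r, dt) are illustrative assumptions.
#include <cstdio>

struct Kalman1D {
    // State: position p and velocity v; covariance P (2x2, symmetric).
    double p = 0.0, v = 0.0;
    double P[2][2] = {{1.0, 0.0}, {0.0, 1.0}};
    double q;   // process noise intensity (assumed)
    double r;   // measurement noise variance (assumed)

    Kalman1D(double q_, double r_) : q(q_), r(r_) {}

    // One predict-update step with time step dt and position measurement z.
    double step(double z, double dt) {
        // Predict with constant-velocity model: p' = p + v*dt, v' = v.
        p += v * dt;
        double P00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q * dt;
        double P01 = P[0][1] + dt * P[1][1];
        double P10 = P[1][0] + dt * P[1][1];
        double P11 = P[1][1] + q * dt;
        // Update with the measured position z.
        double S  = P00 + r;                  // innovation variance
        double K0 = P00 / S, K1 = P10 / S;    // Kalman gain
        double y  = z - p;                    // innovation
        p += K0 * y;
        v += K1 * y;
        P[0][0] = (1.0 - K0) * P00;
        P[0][1] = (1.0 - K0) * P01;
        P[1][0] = P10 - K1 * P00;
        P[1][1] = P11 - K1 * P01;
        return p;   // filtered gaze coordinate
    }
};

int main() {
    const double dt = 1.0 / 30.0;             // 30 fps camera stream
    Kalman1D fx(5.0, 2.0), fy(5.0, 2.0);      // illustrative noise settings
    const double zs[][2] = {{100, 80}, {102, 81}, {120, 95}, {121, 96}};
    for (const auto& z : zs)
        std::printf("filtered gaze: (%.1f, %.1f)\n", fx.step(z[0], dt), fy.step(z[1], dt));
    return 0;
}

Treating the x and y coordinates as decoupled keeps the sketch short; a full implementation would typically track both coordinates, and possibly their velocities, in a single state vector.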
Published: 2017-11-08
Issue: 10(4)
Section: Articles
License
Copyright (c) 2017 Miika Toivanen, Kristian Lukander, Kai Puolamäki
This work is licensed under a Creative Commons Attribution 4.0 International License.
How to Cite
Toivanen, M., Lukander, K., & Puolamäki, K. (2017). Probabilistic approach to robust wearable gaze tracking. Journal of Eye Movement Research, 10(4). https://doi.org/10.16910/jemr.10.4.2