Ways of improving the precision of eye tracking data: Controlling the influence of dirt and dust on pupil detection
Keywords: eye tracking, pupil detection, robustness, dirt simulation, data quality
Abstract
Eye-tracking technology has to date been employed primarily in research. With recent advances in affordable video-based devices, the implementation of gaze-aware smartphones, and marketable driver monitoring systems, a considerable step towards pervasive eye-tracking has been made. However, several new challenges arise with the usage of eye-tracking in the wild and will need to be tackled to increase the acceptance of this technology. The main challenge is still related to the usage of eye-tracking together with eyeglasses, which, in combination with reflections under changing illumination conditions, can make a subject "untrackable". If we really want to bring the technology to the consumer, we cannot simply exclude 30% of the population as potential users only because they wear eyeglasses, nor can we make them clean their glasses and the device regularly. Instead, the pupil detection algorithms need to be made robust to potential sources of noise. We hypothesize that the amount of dust and dirt on the eyeglasses and the eye-tracker camera has a significant influence on the performance of currently available pupil detection algorithms. Therefore, in this work, we present a systematic study of the effect of dust and dirt on pupil detection by simulating various quantities of dirt and dust on eyeglasses. Our results show (1) an overall high robustness to dust in an off-focus layer; (2) the vulnerability of edge-based methods to even small in-focus dust particles; and (3) a trade-off between tolerated particle size and particle amount, where a small number of rather large particles showed only a minor performance impact.
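The abstract mentions simulating various quantities of dirt and dust on eyeglasses. As a minimal sketch of what such an overlay could look like, assuming dust is modeled as randomly placed dark circular occluders on a grayscale eye image, the snippet below darkens the pixels under each particle. The function name and all parameters (`n_particles`, `radius_range`, `opacity`) are hypothetical illustrations, not the authors' actual simulation pipeline.

```python
import numpy as np

def simulate_dust(image, n_particles=20, radius_range=(1, 4),
                  opacity=0.8, rng=None):
    """Overlay random dark circular 'dust' particles on a grayscale
    image (float array in [0, 1]). Hypothetical sketch, not the
    authors' method."""
    rng = np.random.default_rng(rng)
    out = image.copy()
    h, w = out.shape
    yy, xx = np.ogrid[:h, :w]
    for _ in range(n_particles):
        # Random particle center and radius.
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        r = rng.integers(radius_range[0], radius_range[1] + 1)
        # Darken all pixels inside the circular particle footprint.
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
        out[mask] *= (1.0 - opacity)
    return out

# Uniform synthetic "image" stands in for a real eye-camera frame.
img = np.full((64, 64), 0.7)
dirty = simulate_dust(img, n_particles=10, rng=0)
print(dirty.shape, float(dirty.min()))
```

Off-focus dust, to which the study found pupil detection largely robust, could be approximated by additionally blurring such an overlay before compositing it onto the frame.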
How to Cite
Fuhl, W., Hospach, D., Kübler, T., Rosenstiel, W., Bringmann, O., & Kasneci, E. (2017). Ways of improving the precision of eye tracking data: Controlling the influence of dirt and dust on pupil detection. Journal of Eye Movement Research, 10(3). https://doi.org/10.16910/jemr.10.3.1
This work is licensed under a Creative Commons Attribution 4.0 International License.