Image preference estimation with a data-driven approach: A comparative study between gaze and image features
Abstract
Understanding how humans subjectively view and evaluate images is an important task for many applications in multimedia interaction. While it has been repeatedly pointed out that eye movements can be used to infer the internal states of humans, few successes have been reported concerning image understanding. In this paper, we investigate the possibility of estimating image preference from a person’s eye movements in a supervised manner. A dataset of eye movements is collected while participants view pairs of natural images, and it is used to train image preference label classifiers. The input feature is defined as a combination of various fixation and saccade event statistics, and the use of the random forest algorithm allows us to quantitatively assess how each statistic contributes to the classification task. We show that the gaze-based classifier achieved higher accuracy than metadata-based baseline methods and a simple rule-based classifier throughout the experiments. We also present a quantitative comparison with image-based preference classifiers and discuss the potential and limitations of the gaze-based preference estimator.
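The pipeline described in the abstract can be illustrated with a minimal sketch: a random forest trained on per-trial gaze-event statistics, whose impurity-based feature importances indicate how each statistic contributes to the preference classification. The feature names, the synthetic data, and the toy labeling rule below are all assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): a random forest trained on
# synthetic gaze-event statistics, with feature importances inspected.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-trial feature vector: statistics of fixation and
# saccade events (names are assumptions, not the paper's feature set).
feature_names = [
    "fixation_count", "mean_fixation_duration",
    "saccade_count", "mean_saccade_amplitude",
]

n_trials = 200
X = rng.normal(size=(n_trials, len(feature_names)))
# Toy preference labels: assume longer fixations mark the preferred
# image (an assumption made only so the example has learnable structure).
y = (X[:, 1] + 0.3 * rng.normal(size=n_trials) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Mean decrease in impurity quantifies each statistic's contribution.
for name, importance in zip(feature_names, clf.feature_importances_):
    print(f"{name}: {importance:.3f}")
```

Because the synthetic labels depend mainly on `mean_fixation_duration`, that feature dominates the importance scores; on real data the same readout reveals which fixation and saccade statistics drive the preference decision.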
Published
2014-04-18
How to Cite
Sugano, Y., Ozaki, Y., Kasai, H., Ogaki, K., & Sato, Y. (2014). Image preference estimation with a data-driven approach: A comparative study between gaze and image features. Journal of Eye Movement Research, 7(3). https://doi.org/10.16910/jemr.7.3.5
Section
Articles
License
Copyright (c) 2014 Yusuke Sugano, Yasunori Ozaki, Hiroshi Kasai, Keisuke Ogaki, Yoichi Sato
This work is licensed under a Creative Commons Attribution 4.0 International License.