Comparing scanpaths during scene encoding and recognition: A multi-dimensional approach

  • Tom Foulsham Department of Psychology, University of Essex, UK
  • Richard Dewhurst Humanities Lab, Lund University, Sweden
  • Marcus Nyström Humanities Lab, Lund University, Sweden
  • Halszka Jarodzka Centre for Learning Sciences, Heerlen, The Netherlands
  • Roger Johansson Humanities Lab, Lund University, Sweden
  • Geoffrey Underwood School of Psychology, University of Nottingham, UK
  • Kenneth Holmqvist Humanities Lab, Lund University, Sweden
Keywords: scanpaths, scene perception, memory

Abstract

Complex stimuli and tasks elicit particular eye movement sequences. Previous research has focused on comparing these scanpaths, particularly in memory and imagery research, where it has been proposed that observers reproduce their eye movements when recognizing or imagining a stimulus. However, it is not clear whether scanpath similarity is related to memory performance, nor which particular aspects of the eye movements recur. We therefore compared eye movements in a picture memory task, using a recently proposed comparison method, MultiMatch, which quantifies scanpath similarity across multiple dimensions including shape and fixation duration. Scanpaths were more similar when the same participant’s eye movements were compared across two viewings of the same image than between different images or between different participants viewing the same image. In addition, fixation durations were similar within a participant, and this similarity was associated with memory performance.
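The full MultiMatch algorithm (which simplifies and aligns scanpaths before scoring them on several dimensions) is described in the article itself. As a rough, hedged illustration of just one of those dimensions, the sketch below scores the similarity of two fixation-duration sequences by pairing fixations in order and averaging their relative duration differences. This is a deliberately simplified stand-in, not the published method, and the scoring rule and example durations are hypothetical.

```python
def duration_similarity(durations_a, durations_b):
    """Score similarity of two fixation-duration sequences (0..1).

    Simplified illustration only: fixations are paired by index and the
    mean relative duration difference is subtracted from 1. MultiMatch
    proper first simplifies and aligns the scanpaths; that step is
    omitted here.
    """
    n = min(len(durations_a), len(durations_b))
    diffs = [
        abs(a - b) / max(a, b)  # relative difference per fixation pair
        for a, b in zip(durations_a[:n], durations_b[:n])
    ]
    return 1.0 - sum(diffs) / n  # 1 = identical durations, lower = less similar


# Hypothetical fixation durations (ms) from two viewings of the same scene
encoding = [220, 310, 180, 400]
recognition = [200, 330, 190, 380]
print(duration_similarity(encoding, recognition))
```

Scores near 1 would indicate that a participant dwelled for similar amounts of time at corresponding points in the sequence on both viewings, which is the kind of within-participant duration similarity the abstract links to memory performance.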
Published
2012-08-24
How to Cite
Foulsham, T., Dewhurst, R., Nyström, M., Jarodzka, H., Johansson, R., Underwood, G., & Holmqvist, K. (2012). Comparing scanpaths during scene encoding and recognition: A multi-dimensional approach. Journal of Eye Movement Research, 5(4). https://doi.org/10.16910/jemr.5.4.3
Section
Articles
