Dummy eye measurements of microsaccades: Testing the influence of system noise and head movements on microsaccade detection in a popular video-based eye tracker

Frouke Hermens


Whereas early studies of microsaccades relied predominantly on custom-built eye trackers and manual tagging of microsaccades, more recent work tends to use video-based eye tracking and automated detection algorithms. While data from these newer studies suggest that microsaccades can be reliably detected with video-based systems, this has not been systematically evaluated. Here I present a method and data examining microsaccade detection in an often-used video-based system (the Eyelink II system) with a commonly used detection algorithm (Engbert & Kliegl, 2003; Engbert & Mergenthaler, 2006). Recordings from human participants were compared with recordings from a pair of dummy eyes mounted on a pair of glasses worn either by a human participant (i.e., with head motion) or by a dummy head (no head motion). Three experiments were conducted. The first experiment suggests that when measurements use the pupil-only detection mode, microsaccade detections in the absence of eye movements are sparse without head movements, but frequent with head movements (despite the use of a chin rest). A second experiment demonstrates that with measurements combining corneal reflection and pupil detection, false microsaccade detections can be largely avoided as long as a binocular criterion is used. A third experiment examines whether past results may have been affected by incorrect detections due to small head movements. It shows that despite the many detections caused by head movements, the typical modulation of microsaccade rate after stimulus onset is found only when recording from the participants’ eyes.
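The detection method cited above (Engbert & Kliegl, 2003) is a velocity-threshold algorithm: gaze velocities are estimated with a 5-sample moving window, a relative threshold is set at λ times a median-based estimate of the velocity noise, and samples exceeding an elliptic criterion for a minimum duration are tagged as microsaccades; a binocular criterion then keeps only events that overlap in time across the two eyes. A minimal sketch follows, assuming 2-D gaze positions sampled at a fixed rate; function names, the default λ = 6, and the 3-sample minimum duration are illustrative choices, not a definitive implementation of the published algorithm.

```python
import numpy as np

def velocities(pos, dt):
    """5-sample moving-window velocity estimate (Engbert & Kliegl, 2003)."""
    v = np.zeros_like(pos)
    v[2:-2] = (pos[4:] + pos[3:-1] - pos[1:-3] - pos[:-4]) / (6 * dt)
    return v

def detect_microsaccades(x, y, dt, lam=6.0, min_dur=3):
    """Return (start, end) sample indices of candidate microsaccades.

    x, y : 1-D gaze position arrays (e.g., degrees); dt : sample interval (s).
    The threshold is lam times a median-based SD of the velocity, so it
    adapts to the noise level of each trace.
    """
    vx, vy = velocities(x, dt), velocities(y, dt)
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)
    # Elliptic criterion: combined horizontal/vertical velocity test
    crit = (vx / (lam * sx))**2 + (vy / (lam * sy))**2 > 1
    events, start = [], None
    for i, above in enumerate(crit):
        if above and start is None:
            start = i
        elif not above and start is not None:
            if i - start >= min_dur:
                events.append((start, i - 1))
            start = None
    if start is not None and len(crit) - start >= min_dur:
        events.append((start, len(crit) - 1))
    return events

def binocular(events_left, events_right):
    """Keep left-eye events that temporally overlap a right-eye event."""
    return [(s, e) for (s, e) in events_left
            if any(s <= e2 and s2 <= e for (s2, e2) in events_right)]
```

Because the threshold is relative to each trace's own velocity noise, stationary dummy-eye recordings can still yield occasional above-threshold samples; the binocular overlap requirement is what suppresses most of these spurious detections, consistent with the result of the second experiment.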


microsaccades; fixational eye movements; head motion; eye tracking; dummy eyes



DOI: http://dx.doi.org/10.16910/jemr.8.1.1



This work is licensed under a Creative Commons Attribution 4.0 International License.