<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.10.3.3</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Sampling rate influences saccade detection in mobile eye tracking of a reading task</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Leube</surname>
						<given-names>Alexander</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Rifai</surname>
						<given-names>Katharina</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
        <aff id="aff1">
		Institute for Ophthalmic Research, <institution>University of Tuebingen</institution>, <country>Germany</country>
        </aff>
		</contrib-group>
     
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>7</day>  
		<month>6</month>
        <year>2017</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2017</year>
	</pub-date>
      <volume>10</volume>
      <issue>3</issue>
	 <elocation-id>10.16910/jemr.10.3.3</elocation-id>
	<permissions> 
	<copyright-year>2017</copyright-year>
	<copyright-holder>Leube et al.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>The purpose of this study was to compare saccade detection characteristics between two mobile eye trackers with different sampling rates in a natural task. Gaze data of 11 participants were recorded with one 60 Hz and one 120 Hz mobile eye tracker and compared directly to the saccades detected by a 1000 Hz stationary tracker while a reading task was performed. Saccades and fixations were detected using a velocity-based algorithm, and their properties were analyzed. Results showed no significant difference in the number of detected fixations, but mean fixation durations differed between the 60 Hz mobile and the stationary eye tracker. The 120 Hz mobile eye tracker showed a significant increase in the detection rate of saccades and an improved estimation of the mean saccade duration compared to the 60 Hz eye tracker. In conclusion, for the detection and analysis of fast eye movements such as saccades, a 120 Hz mobile eye tracker is preferable to a 60 Hz one.</p>
      </abstract>
      <kwd-group>
        <kwd>eye movement</kwd>
        <kwd>mobile eye tracking</kwd>
        <kwd>saccades</kwd>
        <kwd>reading</kwd>
      </kwd-group>
    </article-meta>
  </front>
  
  <body>

    <sec id="S1">
      <title>Introduction</title>
      
        <p>The investigation of eye movements using eye tracking technology provides a powerful tool for different disciplines. Besides its role in scientific and clinical tasks, eye tracking is widely used for examining visual attention in marketing studies (<xref ref-type="bibr" rid="R59 R60 R61">1-3</xref>), adapting learning behavior in real-time situations (<xref ref-type="bibr" rid="R62">4</xref>), or enhancing the control modalities in computer games (<xref ref-type="bibr" rid="R63 R64 R65">5-7</xref>). Saccadic eye movements and their statistics are of particular interest.</p>
            
        <p>They are, for instance, used to investigate eye movements during reading, scene perception and visual search tasks; see Rayner (<xref ref-type="bibr" rid="R100">42</xref>) for a review. Eye movement abnormalities, such as corrective saccades in a smooth pursuit task, were shown to supplement the clinical diagnosis of schizophrenia (<xref ref-type="bibr" rid="R66 R67 R68">8-10</xref>) and can be linked to cognitive deficits in word processing in schizophrenia patients (<xref ref-type="bibr" rid="R69">11</xref>). Furthermore, saccadic eye movements can be used as objective indicators in mental health screening (<xref ref-type="bibr" rid="R70">12</xref>), for instance for dyslexia (<xref ref-type="bibr" rid="R71 R72 R73">13-15</xref>) or autism (<xref ref-type="bibr" rid="R74 R75">16, 17</xref>). This clearly shows the scientific and clinical importance of detecting the characteristics of saccades accurately. Such eye movement tests are often conducted under controlled conditions with high-accuracy eye trackers that require head stabilization and presentation of stimuli on a fixed display. This, however, is unlike normal visual perception, and it is therefore important to move to more day-to-day tasks. Eye movements in such tasks can only be measured with mobile eye trackers, and it is unclear how well these can measure and detect saccadic eye movements.</p>

		<p>In the analysis of eye tracking data, the algorithm used to detect events such as fixations, blinks and saccades is the crucial factor. Algorithms can be classified into three main categories based on their threshold criteria: dispersion-, velocity- or acceleration-based (<xref ref-type="bibr" rid="R76 R77 R78">18-20</xref>), or a combination of these criteria. Velocity-threshold identification is the fastest algorithm (no back-tracking is required, as in dispersion algorithms); it differentiates fixations and saccades by their point-by-point velocities and requires only one parameter to be specified, the velocity threshold (<xref ref-type="bibr" rid="R76">18</xref>). When using velocity-based algorithms to analyze eye tracking data, the sampling rate of the eye tracking signal becomes the limiting factor (<xref ref-type="bibr" rid="R79">21</xref>). During saccades the eye moves very fast, and at low sampling rates insufficient samples of these fast movements may be available for correct detection. Because of their velocity characteristics, saccades can be classified as &#x201C;outliers&#x201D; in the velocity profile (<xref ref-type="bibr" rid="R80 R81 R82">22-24</xref>) and serve as a robust criterion in analyzing eye tracking data.</p>
            
        <p>According to the Nyquist theorem, an eye tracker with a higher sampling rate detects saccades of shorter duration than one with a lower sampling rate: an event can only be detected reliably if its duration is at least twice the sampling interval. Thus, specifically, an increase of the sampling frequency from 60 Hz to 120 Hz is expected to increase the detection rate of saccades whose durations lie between approximately 16 and 33 ms. The main sequence of saccades shows a linear relationship between saccade amplitude and duration (<xref ref-type="bibr" rid="R82 R83">24, 25</xref>). In reading, which is a common activity and highly important in modern day-to-day life, saccade distributions show a high number of small saccades (<xref ref-type="bibr" rid="R91">33</xref>), which would not be detected if they fall into the interval between 16 and 33 ms. Saccadic behavior in reading tasks is a well-studied characteristic of human eye movements. The task-specific saccade distributions directly impact the detection rate of eye trackers with a limited sampling rate; thus, specifically in this task, an accurate choice of sampling frequency is crucial. We therefore assume that an eye tracker with a higher sampling rate might detect more saccades in a reading task, as it will also detect saccades of short duration. We furthermore hypothesize that the estimation of mean saccade duration is more reliable when estimated from higher sampled gaze data, because more samples will be available to reliably detect saccade start and end. Studies examining eye movements typically rely on high-sampling stationary eye trackers, but novel mobile eye trackers allow recordings in more natural scenarios, especially paradigms in which the subject is behaving freely. With the increasing use of mobile eye trackers, it becomes inevitable to evaluate how well they can detect and measure saccadic eye movements. Thus, this study evaluates the impact of an increase in the sampling rate of a head-worn eye tracker designed for field studies, from 60 Hz to 120 Hz, in a real-world task of high research interest: reading (<xref ref-type="bibr" rid="R84">26</xref>).</p>
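The minimum-duration reasoning above can be made concrete with a small calculation. This is an illustrative sketch, assuming a saccade must span at least two samples to be detected; the function name is ours:

```python
def min_detectable_duration_ms(sampling_rate_hz: float, samples_needed: int = 2) -> float:
    """Shortest event duration (in ms) that still contains at least
    `samples_needed` samples at the given sampling rate."""
    return samples_needed * 1000.0 / sampling_rate_hz

# At 60 Hz a saccade must last roughly 33 ms to be sampled twice;
# at 120 Hz roughly 17 ms, matching the 16-33 ms range in the text.
print(round(min_detectable_duration_ms(60), 1))   # -> 33.3
print(round(min_detectable_duration_ms(120), 1))  # -> 16.7
```

Doubling the sampling rate thus halves the shortest detectable saccade duration, which is why many short reading saccades fall into the 16-33 ms gap between the two devices.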
    </sec>
	
    <sec id="S2">
      <title>Methods</title>
      <sec id="S2a">
        <title>Participants</title>	  

		<p>Eleven eye-healthy participants with a mean age of 34.9 &#xB1; 9.9 years were included in the study. The participants had normal or corrected-to-normal vision. All participants were na&#xEF;ve to the purpose of the study. All procedures followed the tenets of the Declaration of Helsinki. Informed consent was obtained from all participants after explanation of the nature and possible consequences of the study.</p>
	</sec>

      <sec id="S2b">
        <title>Equipment and experimental procedure</title>
		
        <p>Participants wore one of two mobile eye trackers (SMI ETG w, 60 Hz sampling; SMI ETG 2w, 120 Hz sampling; SensoMotoric Instruments GmbH, Teltow, Germany). In order to evaluate saccade detection in these eye trackers, participants placed their head on a chin rest, and a stationary eye tracker (EyeLink 1000, SR Research Ltd., Mississauga, Canada) was used as a reference. This eye tracker was placed below the screen at a distance of 60 cm. For stimulus presentation, a visual display (VIEWPixx/3D, VPixx Technologies, Canada) at a distance of 70 cm was used. Both mobile eye trackers and the stationary eye tracker were calibrated and validated using a 3-point calibration pattern composed of three black rings on a gray background. The stationary and mobile eye trackers recorded the eye positions simultaneously. The stationary eye tracker was set to record binocularly at a sampling rate of 1000 Hz, and the two mobile eye trackers to either 60 Hz or 120 Hz binocular tracking. To minimize the influence of the IR signal from the stationary eye tracker on the tracking ability of the mobile eye tracking glasses, the power of its IR LED was reduced to the minimum of 50% intensity.</p>

        <p>The mobile eye tracking glasses use infrared (IR) LEDs arranged in a ring pattern within the glasses frame, while the IR source of the stationary eye tracker produces a single dot pattern. Because of the differences in shape of the reflection patterns (see Figure 1), and because the intensity of the corneal reflex is much higher for the stationary eye tracker, both reflections were distinguishable from each other and a simultaneous measurement was possible. Moreover, both the 60 Hz and the 120 Hz eye tracker are expected to be equally affected by any potential mutual interference.</p>

<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1</label>
					<caption>
						<p>Comparison of the corneal infrared reflections from the stationary (red arrow) and the mobile (blue arrows) eye tracker. (a) represents the image from the stationary and (b) from the mobile eye tracker in simultaneous use.</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-10-03-c-figure-01.png"/>
				</fig>

        <p>Prior to the start of the experiment, participants were informed that they would read a text about which they would have to answer questions, to ensure attentive reading. To enable an offline temporal synchronization between the two eye trackers for the data analysis, a peripheral fixation point displaced 25&#xB0; from the left side of the text was displayed for three seconds on the screen. Subsequently, the sample text was presented. The letter size was set to 23 pixels, which corresponded to an angular size of 0.5&#xB0; for capital letters in the font Helvetica. A relatively large letter size ensured that every normally sighted participant was able to read the text. Two sample texts were created: text 1 contained information about the human visual system and covered 237 words; text 2 was about Tuebingen and the Eberhard Karls University Tuebingen, with 276 words. An example of the experimental procedure is given in Figure 2. The participants were instructed to read silently and at their normal reading speed. Subsequently, the participants confirmed or rejected five statements regarding the content of the text by pressing a button on the keyboard. All stimuli were programmed and displayed using the Psychophysics Toolbox (Psychtoolbox 3; Kleiner et al., 2007) in the Matlab programming language (Matlab, MathWorks Inc., Natick, Massachusetts).</p>

      </sec>
      <sec id="S2c">
        <title>Analysis</title>
        
         <p>From the eye tracking data, the average number of fixations, their average duration, and the average number of saccades and their durations were calculated. Blinks were excluded from the dataset prior to analysis; they were identified on the basis of the individual eye tracker criteria (the pupil size is very small or zero). Fixations and saccades were identified using an algorithm based on the velocity profile, see equation (1), of the gaze data, calculated as the difference in horizontal eye position between successive samples divided by the inter-sample time interval (<xref ref-type="bibr" rid="R76 R85">18, 27</xref>), without application of a running-average filter prior to the analysis. According to equation (2), a fixation is classified as the gaze points where the eye-velocity signal <italic>v</italic><sup>&#x2192;</sup><sub><italic>n</italic></sub> remains below a threshold of d = 60 &#xB0;/sec for a minimum duration of &#x394;t<sub>Fix</sub> = 100 ms (average fixation durations are around 200&#x2013;300 ms (<xref ref-type="bibr" rid="R91">33</xref>; Starr &amp; Rayner, 2001)). A single fixation and its associated duration were defined as the time interval where equation (2) yielded a &#x2018;false&#x2019; outcome. In addition to the fixation duration, the absolute number of fixations was analyzed.</p>

<fig id="eq01" fig-type="equation" position="anchor">
                    <label>(1)</label>
					<graphic id="equation01" xlink:href="jemr-10-03-c-equation-01.png"/>
				</fig>
				
<fig id="eq02" fig-type="equation" position="anchor">
                    <label>(2)</label>
					<graphic id="equation02" xlink:href="jemr-10-03-c-equation-02.png"/>
				</fig>				
        
         <p>Furthermore, the velocity profile of the gaze data was used for saccade detection. A saccade event was identified as the time interval where the condition of equation (2) was true (i.e., the intervals not assigned to a fixation). The local maximum of the velocity in this time interval was localized using Matlab and marked a saccade event. The saccade duration was calculated as the time interval during which the velocity of the eye remained above the velocity threshold d, i.e., while equation (2) was true.</p>
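The velocity-based event detection described above can be sketched as follows. This is an illustrative reimplementation, not the authors' Matlab code; it assumes uniformly sampled horizontal gaze positions in degrees and uses the thresholds stated in the text (d = 60&#xB0;/s, minimum fixation duration 100 ms):

```python
import numpy as np

def detect_events(x_deg, fs_hz, d=60.0, min_fix_ms=100.0):
    """Split a horizontal gaze trace into fixation and saccade durations (ms).

    Point-by-point velocity is the position difference between successive
    samples divided by the inter-sample interval. Runs of samples below the
    threshold d (deg/s) lasting at least min_fix_ms are fixations; runs at
    or above the threshold are treated as saccade intervals.
    """
    dt = 1.0 / fs_hz
    v = np.abs(np.diff(x_deg)) / dt        # point-by-point velocity (deg/s)
    slow = v < d                           # True where equation (2) is false
    fixations, saccades = [], []
    i = 0
    while i < len(slow):
        j = i
        while j < len(slow) and slow[j] == slow[i]:
            j += 1                         # extend the run of equal labels
        dur_ms = (j - i) * dt * 1000.0
        if slow[i]:
            if dur_ms >= min_fix_ms:       # minimum fixation duration
                fixations.append(dur_ms)
        else:
            saccades.append(dur_ms)
        i = j
    return fixations, saccades

# Synthetic 1000 Hz trace: a 200 ms fixation, a ~30 ms 5-degree saccade,
# then another 200 ms fixation.
x = np.concatenate([np.zeros(200), np.linspace(0.0, 5.0, 30), np.full(200, 5.0)])
fix, sac = detect_events(x, 1000)
print(len(fix), len(sac))  # -> 2 1
```

At 60 Hz the same 30 ms saccade would contribute at most one or two velocity samples, which is the detection limit discussed in the Introduction.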
                
         <p>To analyze the difference in performance between the 60 Hz and the 120 Hz mobile eye tracking glasses, the relative differences in the number and duration of detected fixations and saccades between each mobile eye tracker and the stationary eye tracker were calculated and evaluated. All calculations consider the gaze data of the right eye. Normality of the data was assessed using the Shapiro-Wilk test. For normally distributed data, a t-test (power 1-&#x3B2; = 0.80) was performed to test for differences in detection ability; for non-normally distributed data, a Wilcoxon rank test was performed instead. The critical p-value (&#x3B1; error) was set to 0.05, and the statistical analyses were performed in SPSS (IBM SPSS Statistics 22, IBM, Armonk, USA).</p>
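The normality-gated choice of test described above can be sketched with SciPy. This is an illustrative sketch with made-up data, not the authors' SPSS analysis; the function name and values are ours:

```python
import numpy as np
from scipy import stats

def compare_paired(a, b, alpha=0.05):
    """Shapiro-Wilk on the paired differences decides between a paired
    t-test (normality not rejected) and a Wilcoxon signed-rank test."""
    diffs = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    _, p_normal = stats.shapiro(diffs)
    if p_normal > alpha:
        _, p = stats.ttest_rel(a, b)       # normally distributed differences
        return "paired t-test", p
    _, p = stats.wilcoxon(a, b)            # non-normal differences
    return "Wilcoxon signed-rank test", p

# Hypothetical relative saccade counts (%) for n = 11 participants,
# 60 Hz vs. 120 Hz device.
a = [56, 54, 60, 58, 52, 57, 55, 59, 61, 53, 56]
b = [68, 65, 73, 71, 63, 70, 66, 72, 74, 64, 69]
test_name, p = compare_paired(a, b)
print(p < 0.05)  # -> True
```

The same gate (normality test first, then parametric or rank-based test) reproduces the decision logic stated in the paragraph above.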
      </sec>
    </sec>
	
    <sec id="S3">
      <title>Results</title>
	  
      <p>Figure 2 compares the eye movement data from the stationary eye tracker (a) with the data from the 120 Hz mobile eye tracker (b; the picture is similar for the 60 Hz mobile eye tracker), superimposed on the text that was read. In order to plot the mobile eye tracking data (Figure 2b), the data were manually scaled using an empirically determined scaling factor. Figure 2 shows that the reduced sampling rate of the mobile eye tracker leads to a sparse representation of saccade mid-flight eye positions.</p>

<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2</label>
					<caption>
						<p>Raw gaze traces superimposed on the stimulus text. (a) illustrates the eye position recorded with a 1000 Hz eye tracker (EyeLink 1000) and (b) at a sampling rate of 120 Hz (SMI mobile glasses). Gaze data and text were aligned manually by the author (horizontal and vertical stretching).</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-10-03-c-figure-02.png"/>
				</fig>

      <p>A larger number of saccades was detected with the 120 Hz than with the 60 Hz eye tracker (p = 0.011, two-sided t-test); see Table 1 and Figure 3. The 120 Hz mobile eye tracker also led to a more reliable estimation of the mean saccade duration (&#x394; = 5.91 ms, p = 0.033, two-sided t-test); see Figure 3. Despite these differences, the number of saccades undetected by the stationary eye tracker but detected by the mobile eye trackers was very low, below 1% of the total number of correctly detected saccades. The data therefore show that saccade detection was generally adequate in the mobile eye trackers, although the 120 Hz eye tracker was better at measuring saccade duration than the 60 Hz eye tracker.</p>

      <p>In contrast to the saccades, no significant difference in the number of fixations was found between the 60 Hz and the 120 Hz eye tracker (p = 0.110, Wilcoxon test). Statistical analysis also showed no significant difference between the 60 Hz and the 120 Hz devices in fixation durations (p = 0.088, paired t-test). Nevertheless, there is a trend towards more accurate fixation detection in the 120 Hz device when compared to the stationary eye tracker. Means and standard deviations of the number and duration of saccades and fixations are shown in Figure 3.</p>

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3</label>
					<caption>
						<p>Mean number of fixations (a) and saccades (b), +/- standard deviation (SD). (c) and (d) present the mean fixation and saccade durations, respectively. Asterisks indicate the significance level: * &#x3B1; &#x3C; 0.05, *** &#x3B1; &#x3C; 0.001</p>
					</caption>
					<graphic id="graph03" xlink:href="jemr-10-03-c-figure-03.png"/>
				</fig>

<table-wrap id="t01" position="float">
					<label>Table 1</label>
					<caption>
						<p>Relative comparison of two mobile eye trackers to a stationary eye tracker. Mean and standard deviation (SD) for the number and the duration of saccades and fixations. Asterisks indicate the significance level: * &#x3B1; &#x3C; 0.05; n = 11</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
						<thead>
							<tr>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">60 Hz mobile eye tracker</td>
            <td rowspan="1" colspan="1">120 Hz mobile eye tracker</td>
            <td rowspan="1" colspan="1">Relative difference between mobile eye trackers</td>
							</tr>
						</thead>
						<tbody>
          <tr>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="2">Mean &#xB1; SD</td>
            <td rowspan="1" colspan="1"/>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Number of saccades</td>
            <td rowspan="1" colspan="1">56.11 &#xB1; 12.44 %</td>
            <td rowspan="1" colspan="1">68.37 &#xB1; 13.97 %</td>
            <td rowspan="1" colspan="1">12.25 % *</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Duration of saccades (ms)</td>
            <td rowspan="1" colspan="1">-10.81 &#xB1; 7.51 ms</td>
            <td rowspan="1" colspan="1">-4.89 &#xB1; 2.76 ms</td>
            <td rowspan="1" colspan="1">5.91 ms *</td>
          </tr>

          <tr>
            <td rowspan="1" colspan="1">Number of fixations</td>
            <td rowspan="1" colspan="1">76.72 &#xB1; 18.67 %</td>
            <td rowspan="1" colspan="1">86.41 &#xB1; 15.43 %</td>
            <td rowspan="1" colspan="1">9.69 %</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Duration of fixations (ms)</td>
            <td rowspan="1" colspan="1">10.55 &#xB1; 10.13 ms</td>
            <td rowspan="1" colspan="1">4.30 &#xB1; 14.33 ms</td>
            <td rowspan="1" colspan="1">6.25 ms</td>
          </tr>
						</tbody>
					</table>
					</table-wrap>
				
		<p>Figure 4 illustrates the frequency distributions of fixation durations in the 60 Hz (Figure 4a) and the 120 Hz device (Figure 4b), respectively, compared to the stationary eye tracker. Fixation durations in the silent reading task ranged from 100 ms to 600 ms. The distribution of fixations recorded with the 60 Hz mobile eye tracker showed a shift of the maximum towards shorter fixation durations (p = 0.01, Wilcoxon rank test), while the distribution of the 120 Hz mobile eye tracker shows a trend towards a better assessment of fixation durations (p = 0.59, Wilcoxon rank test), in comparison to the stationary reference eye tracker.</p>

<fig id="fig04" fig-type="figure" position="float">
					<label>Figure 4</label>
					<caption>
						<p>Mean frequency distribution (n = 11) of fixation durations for the stationary and the (a) 60 Hz and (b) 120 Hz mobile eye tracker in milliseconds (ms).</p>
					</caption>
					<graphic id="graph04" xlink:href="jemr-10-03-c-figure-04.png"/>
				</fig>
	</sec>
	
    <sec id="S4">
      <title>Discussion</title>
      
       <p>Previous studies have revealed that the use of stationary eye trackers with lower sampling rates results in significantly impoverished detection and measurement of saccadic eye movements, especially at the border of the stimulus screen (<xref ref-type="bibr" rid="R86">28</xref>). Generally, high-frequency stationary eye trackers should be preferred in investigations of saccades, and the use of eye trackers with lower sampling rates should be restricted to the examination of fixation behavior and pupil size (<xref ref-type="bibr" rid="R87">29</xref>). While the effects of sampling rate are known for stationary eye trackers, no such information is available for mobile eye trackers, which often place the cameras closer to the participants' eyes, use a different pattern of IR lighting, and apply a different calibration method. In the current study we compared the impact of the sampling rate of mobile eye trackers on the extraction rates of saccades and fixations in a reading task, a common task of daily life.</p>
	</sec>
      
      <sec id="S4a">
        <title>Mobile eye tracking and reading</title>
        
         <p>The development of mobile eye trackers in recent years (<xref ref-type="bibr" rid="R88 R89 R90">30-32</xref>) has enabled researchers to examine eye movements during reading in a natural context (<xref ref-type="bibr" rid="R91">33</xref>). Mobile eye tracking of reading may enhance clinical diagnosis, for example by differentiating progressive supranuclear palsy from Parkinson's disease (<xref ref-type="bibr" rid="R92">34</xref>) or by screening for mental or linguistic disorders (<xref ref-type="bibr" rid="R69 R70">11, 12</xref>). The measurement of eye movements in such tasks has furthered the understanding of learning (<xref ref-type="bibr" rid="R62">4</xref>), e.g. in the medical and health professions (<xref ref-type="bibr" rid="R93">35</xref>), and can be extended to e-learning applications (<xref ref-type="bibr" rid="R94">36</xref>). However, research on the reliability of mobile eye trackers in the detection of saccades and fixations, especially in reading, is sparse.</p>
                
         <p>The current analysis of saccadic eye movements demonstrated the benefit of the higher sampling rate of the 120 Hz mobile eye tracker in the detection of saccades. During reading, the amplitude and number of both progressive and return (regression) saccades depend on various intrinsic and extrinsic factors (<xref ref-type="bibr" rid="R91">33</xref>). To evaluate these properties of saccades, it is therefore important to measure the parameters of saccadic eye movements accurately. First, external aspects of the visual information, e.g. the spaces or type of characters between words (<xref ref-type="bibr" rid="R95 R96">37, 38</xref>) or the length and orthographic information of the words (<xref ref-type="bibr" rid="R97 R98 R99">39-41</xref>), impact saccadic amplitudes. Second, higher-level factors, such as spatial coding (<xref ref-type="bibr" rid="R80">22</xref>) or the location of attention (<xref ref-type="bibr" rid="R100 R101">42, 43</xref>), influence saccadic behavior. The main sequence of saccadic eye movements (<xref ref-type="bibr" rid="R83 R102">25, 44</xref>) demonstrates a linear correlation between saccadic amplitude and duration. In reading, people often make small saccadic eye movements (e.g. refixations of the same word); to detect these small saccade amplitudes accurately, it is crucial to use high-frequency equipment that facilitates the recording of short saccade durations. Our results revealed that saccades are better detected with a 120 Hz sampling rate and that the distribution of saccade amplitudes is better measured with this higher sampling rate.</p>
      </sec>

      <sec id="S4b">
        <title>Event detection algorithms for eye
movement data</title>
                
         <p>In the analysis of saccadic eye movements we used the standard approach based on the velocity profile of the gaze traces (<xref ref-type="bibr" rid="R76">18</xref>). Engbert and Kliegl (<xref ref-type="bibr" rid="R82">24</xref>) developed a velocity-based algorithm for the detection of microsaccades involving a noise-dependent detection threshold and a temporal overlap criterion for the binocular occurrence of saccades. The advantage of using a noise-dependent algorithm is that it can be adapted easily to different eye tracking technologies and inter-individual differences (<xref ref-type="bibr" rid="R82">24</xref>). In future work, such noise-dependent algorithms could therefore improve the detection performance for low-sampled eye tracking data if the internal noise distribution differs between eye trackers. A further approach for future work is to use data from both eyes in the analysis (the method by Engbert and Kliegl requires saccades to overlap in both eyes); in a real-world application, a saccade detection algorithm that accounts for binocularity could increase the accuracy of saccade detection. One further extension is to use acceleration in addition to velocity to detect saccades (<xref ref-type="bibr" rid="R103">45</xref>) and to combine this with noise-dependent saccade thresholds (<xref ref-type="bibr" rid="R104">46</xref>). Future work can also examine the use of more complex algorithms, including continuous wavelet analysis and principal component analysis (PCA), or exploit the fact that saccades can be identified as local singularities (<xref ref-type="bibr" rid="R105">47</xref>). Although the focus of the present study was not the comparison of algorithms for event detection in mobile eye tracking, advanced computations might improve the performance of event detection.</p>
      </sec>
	  
      <sec id="S4c">
        <title>Head-worn vs. head-fixed eye tracking</title>
        
         <p>The accuracy of eye trackers strongly depends on the conditions of the planned experiment (<xref ref-type="bibr" rid="R106">48</xref>) and furthermore on restrictions of head movements (<xref ref-type="bibr" rid="R107">49</xref>); hence it is important to consider application-oriented parameters as well. Generally, head-worn eye trackers are not restricted to a certain head position, but the eye movements show a more complex pattern compared to a head-fixed situation, as for instance the vestibulo-ocular reflex (<xref ref-type="bibr" rid="R108">50</xref>) or optokinetic nystagmus (<xref ref-type="bibr" rid="R109">51</xref>) occur. In inter-device comparisons between mobile eye trackers, further studies will have to clarify whether event detection in mobile eye tracking depends on the type of tracking (e.g. pupil/glint tracking vs. a 3D eye model), the number of tracked eyes, and the use of advanced calibration methods or algorithms, as discussed above. The current study reports on results of laboratory work with a head-fixed measurement setup. Mobile eye trackers enable a head-free acquisition of eye movement data, and it was shown that head movements strongly contribute to the processing of the visual input, and thus to oculomotor behavior (<xref ref-type="bibr" rid="R110 R111">52, 53</xref>). Furthermore, in head-free scenarios, dislocation of the position or orientation of the eye tracker on the head was shown to have a significant influence on the accuracy of eye trackers (<xref ref-type="bibr" rid="R106">48</xref>). Given that, upcoming studies will have to investigate the sampling-rate dependence of mobile eye trackers in head-free scenarios and real-world tasks.</p>
      </sec>
	  
      <sec id="S4d">
        <title>Fixation statistics in reading</title>
        
         <p>Analysis of the fixation statistics during a common
reading task showed no difference in the number of
detected fixations or the mean fixation duration between the
two mobile eye trackers. The observed mean fixation
duration of 220 ms to 240 ms is comparable to other
studies that used silent reading tasks (
          <xref ref-type="bibr" rid="R91 R112 R113">33, 54, 55</xref>
          ).
Yang et al. (
          <xref ref-type="bibr" rid="R96">38</xref>
			) reported a shorter mean fixation duration of
211 ms for a reading task. The higher sampling rate of the
120 Hz mobile eye tracker captured the frequency
distribution of fixation durations more closely, which is
also reflected in the significantly smaller mean fixation
duration compared to the 60 Hz eye tracker. In all cases,
the frequency distribution of fixation durations showed
the typical right-tailed shape (
          <xref ref-type="bibr" rid="R81 R96">23, 38</xref>
          ) with a maximum
around 200 ms. The choice of a typical and realistic
reading task, representing a common daily visual activity,
suggests that this estimate also holds for the saccade
amplitude distribution in reading and possibly in other
tasks.</p>
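
<p>The link between sampling rate and the resolution of measured durations can be made concrete with a small numeric sketch. This is an illustration of the general quantization effect only, assuming an idealized tracker that reports events on whole sample intervals; the function name and the 205 ms example value are ours, not data from the study:</p>

```python
# Sketch: how sampling rate limits the resolution of measured durations.
# All numbers are illustrative, not data from the study.

def measured_duration_ms(true_ms: float, fs_hz: float) -> float:
    """Duration as recovered from evenly spaced samples: an event
    spanning n sample intervals is measured as n / fs."""
    dt = 1000.0 / fs_hz        # sample interval in ms (16.7 ms at 60 Hz)
    n = round(true_ms / dt)    # whole sample intervals covered
    return n * dt

true_fixation = 205.0          # ms, a plausible reading fixation
for fs in (60.0, 120.0):
    m = measured_duration_ms(true_fixation, fs)
    print(f"{fs:.0f} Hz: resolution {1000/fs:.1f} ms, "
          f"measured {m:.1f} ms (error {m - true_fixation:+.1f} ms)")
```

<p>Doubling the sampling rate halves the sample interval (16.7 ms at 60 Hz vs. 8.3 ms at 120 Hz) and thus halves the worst-case quantization error of any measured fixation or saccade duration.</p>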

      <sec id="S4e">
        <title>Future implications in virtual reality applications</title>
       
         <p>Mobile eye tracking is becoming progressively more
important with the introduction of head-mounted displays
(HMD) and virtual reality (VR) glasses to enable a more
realistic interaction mediated by human-computer
interfaces (
          <xref ref-type="bibr" rid="R114 R115 R116 R117 R118 R119">56-61</xref>
          ). Thus, there is an increasing need for eye
trackers that combine usability in field studies with high
accuracy and high sampling rates in a miniaturized form
suitable for incorporation into HMD or VR systems (
          <xref ref-type="bibr" rid="R120 R121">62, 63</xref>
			). Real-time gaze estimation based on precise and
fast eye tracking enables accurate and thus comfortable
stereo image presentation in virtual reality simulations
and in highly interactive virtual reality scenarios.
Specifically, the detection of fast eye movements, like
saccades, can enable gaze pointing to virtual objects.
Juhola et al. (
          <xref ref-type="bibr" rid="R79">21</xref>
          ) showed that a
velocity-based algorithm for saccade analysis requires data
sampled at a minimum of 70 Hz. DiScenna et al. (
          <xref ref-type="bibr" rid="R122">64</xref>
			) stated that video cameras with frame rates above
120 Hz are necessary for reliable measurement of all kinds
of eye movements. The results of the current study suggest
that a 120 Hz mobile eye tracker also yields more reliable
measurements in a task-specific evaluation of saccade and
fixation statistics during reading.</p>       
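
<p>The velocity-based detection discussed above can be sketched with a toy I-VT-style detector run on a synthetic gaze trace. Everything here is an illustrative assumption (the 30°/s threshold, the 20 ms/2° saccade, the linear ramp), not the proprietary algorithms of the trackers compared in this study; the sketch only shows why a higher sampling rate leaves more supra-threshold samples to define a brief saccade:</p>

```python
# Toy velocity-threshold (I-VT-style) saccade detector on synthetic data.
# Threshold and trace parameters are illustrative assumptions, not the
# algorithms or data of the eye trackers compared in the study.

def detect_saccade_samples(x_deg, fs_hz, threshold_deg_s=30.0):
    """Indices whose sample-to-sample velocity exceeds the threshold."""
    dt = 1.0 / fs_hz
    vel = [abs(x_deg[i + 1] - x_deg[i]) / dt for i in range(len(x_deg) - 1)]
    return [i for i, v in enumerate(vel) if v > threshold_deg_s]

def synthetic_trace(fs_hz, saccade_ms=20.0, amplitude_deg=2.0):
    """Fixation, a brief linear 2-degree saccade, then fixation again."""
    n_fix = int(0.1 * fs_hz)                         # 100 ms fixations
    n_sac = max(1, round(saccade_ms / 1000.0 * fs_hz))
    trace = [0.0] * n_fix
    trace += [amplitude_deg * (k + 1) / n_sac for k in range(n_sac)]
    trace += [amplitude_deg] * n_fix
    return trace

for fs in (60.0, 120.0):
    hits = detect_saccade_samples(synthetic_trace(fs), fs)
    print(f"{fs:.0f} Hz: saccade spans {len(hits)} supra-threshold samples")
```

<p>At 60 Hz a 20 ms saccade falls on roughly one sample interval, so the velocity profile is reduced to a single supra-threshold step; at 120 Hz the same saccade covers twice as many intervals, which is what allows the detector to resolve its duration and peak velocity more reliably.</p>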
      </sec>
      </sec>	  
	  
    <sec id="S5">
      <title>Conclusions</title>	  
	  
        <p>The study reports on a relative performance
comparison between two mobile video-based eye trackers during
reading. Low-sampling-rate eye tracking (60 Hz) led to an
underestimation of the number of detected saccades, while
120 Hz sampling resulted in higher accuracy in the
detection of fast eye movements and fixation durations.
Dependable detection of the short saccade durations that
occur in reading requires higher sampling rates of the eye
trackers used. Reliable and robust saccade detection by
fast and accurate mobile eye trackers will lead to novel
developments in gaze-contingent protocols, e.g. for virtual
reality simulations. Furthermore, increased sampling
rates in eye tracking technology might enable
advancements in new fields, such as clinical applications
for eye movement training in visually impaired patients or
the analysis of eye movement markers in the diagnosis of
diseases.</p>
      </sec>

    <sec id="S6" sec-type="COI-statement">
      <title>Ethics and Conflict of Interest</title>	  
	  	  
        <p>The author(s) declare(s) that the contents of the article
are in agreement with the ethics described in
http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html 
and that there is no conflict of interest regarding the
publication of this paper.</p>
      </sec>
	  
    <sec id="S7">
      <title>Acknowledgements</title>		  
	  
        <p>This work was done in an industry-on-campus
cooperation between the University of Tuebingen and Carl
Zeiss Vision International GmbH. The work was supported
by third-party funding (ZUK 63). The beta version of
the 120 Hz mobile eye tracker was developed and kindly
provided by SensoMotoric Instruments GmbH, D-14513
Teltow, Germany.</p>
    </sec>	
  </body>
  <back>
   <ref-list>
<ref id="R59"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Oliveira</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Machín</surname>, <given-names>L.</given-names></string-name>, <string-name><surname>Deliza</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Rosenthal</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Walter</surname>, <given-names>E. H.</given-names></string-name>, <string-name><surname>Giménez</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Ares</surname>, <given-names>G.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Consumers’ attention to functional food labels: Insights from eye-tracking and change detection in a case study with probiotic milk.</article-title> <source>Lebensmittel-Wissenschaft + Technologie</source>, <volume>68</volume>, <fpage>160</fpage>–<lpage>167</lpage>. <pub-id pub-id-type="doi">10.1016/j.lwt.2015.11.066</pub-id><issn>0023-6438</issn></mixed-citation></ref>
<ref id="R60"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Lahey</surname>, <given-names>J. N.</given-names></string-name>, &amp; <string-name><surname>Oxley</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2016</year>). <article-title>The Power of Eye Tracking in Economics Experiments.</article-title> <source>The American Economic Review</source>, <volume>106</volume>(<issue>5</issue>), <fpage>309</fpage>–<lpage>313</lpage>. <pub-id pub-id-type="doi">10.1257/aer.p20161009</pub-id><issn>0002-8282</issn></mixed-citation></ref>
<ref id="R61"><mixed-citation publication-type="other" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Wedel</surname>, <given-names>M.</given-names></string-name></person-group> (<year>2013</year>). <article-title>Attention research in marketing: A review of eye tracking studies.</article-title> <source>Robert H. Smith School Research Paper No. RHS, 2460289</source>. <pub-id pub-id-type="doi">10.2139/ssrn.2460289</pub-id></mixed-citation></ref>
<ref id="R62"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Rosch</surname>, <given-names>J. L.</given-names></string-name>, &amp; <string-name><surname>Vogel-Walcutt</surname>, <given-names>J. J.</given-names></string-name></person-group> (<year>2013</year>). <article-title>A review of eye-tracking applications as tools for training.</article-title> <source>Cognition Technology and Work</source>, <volume>15</volume>(<issue>3</issue>), <fpage>313</fpage>–<lpage>327</lpage>. <pub-id pub-id-type="doi">10.1007/s10111-012-0234-7</pub-id><issn>1435-5558</issn></mixed-citation></ref>
<ref id="R63"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Vickers</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Istance</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Hyrskykari</surname>, <given-names>A.</given-names></string-name></person-group> (<year>2013</year>). <article-title>Performing locomotion tasks in immersive computer games with an adapted eye-tracking interface.</article-title> <comment>[TACCESS]</comment>. <source>ACM Transactions on Accessible Computing</source>, <volume>5</volume>(<issue>1</issue>), <fpage>2</fpage>. <pub-id pub-id-type="doi">10.1145/2514856</pub-id><issn>1936-7228</issn></mixed-citation></ref>
<ref id="R64"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Isokoski</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Joos</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Spakov</surname>, <given-names>O.</given-names></string-name>, &amp; <string-name><surname>Martin</surname>, <given-names>B.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Gaze controlled games.</article-title> <source>Universal Access in the Information Society</source>, <volume>8</volume>(<issue>4</issue>), <fpage>323</fpage>–<lpage>337</lpage>. <pub-id pub-id-type="doi">10.1007/s10209-009-0146-3</pub-id><issn>1615-5289</issn></mixed-citation></ref>
<ref id="R65"><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Isokoski</surname>, <given-names>P.</given-names></string-name>, &amp; <string-name><surname>Martin</surname>, <given-names>B.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Eye tracker input in first person shooter games.</article-title> <source>Paper presented at the Proceedings of the 2nd Conference on Communication by Gaze Interaction: Communication by Gaze Interaction-COGAIN 2006: Gazing into the Future</source>.</mixed-citation></ref>
<ref id="R66"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Benson</surname>, <given-names>P. J.</given-names></string-name>, <string-name><surname>Beedie</surname>, <given-names>S. A.</given-names></string-name>, <string-name><surname>Shephard</surname>, <given-names>E.</given-names></string-name>, <string-name><surname>Giegling</surname>, <given-names>I.</given-names></string-name>, <string-name><surname>Rujescu</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>St Clair</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Simple viewing tests can detect eye movement abnormalities that distinguish schizophrenia cases from controls with exceptional accuracy.</article-title> <source>Biological Psychiatry</source>, <volume>72</volume>(<issue>9</issue>), <fpage>716</fpage>–<lpage>724</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsych.2012.04.019</pub-id><pub-id pub-id-type="pmid">22621999</pub-id><issn>0006-3223</issn></mixed-citation></ref>
<ref id="R67"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Sereno</surname>, <given-names>A. B.</given-names></string-name>, &amp; <string-name><surname>Holzman</surname>, <given-names>P. S.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Antisaccades and smooth pursuit eye movements in schizophrenia.</article-title> <source>Biological Psychiatry</source>, <volume>37</volume>(<issue>6</issue>), <fpage>394</fpage>–<lpage>401</lpage>. <pub-id pub-id-type="doi">10.1016/0006-3223(94)00127-O</pub-id><pub-id pub-id-type="pmid">7772648</pub-id><issn>0006-3223</issn></mixed-citation></ref>
<ref id="R68"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Shulgovskiy</surname>, <given-names>V. V.</given-names></string-name>, <string-name><surname>Slavutskaya</surname>, <given-names>M. V.</given-names></string-name>, <string-name><surname>Lebedeva</surname>, <given-names>I. S.</given-names></string-name>, <string-name><surname>Karelin</surname>, <given-names>S. A.</given-names></string-name>, <string-name><surname>Moiseeva</surname>, <given-names>V. V.</given-names></string-name>, <string-name><surname>Kulaichev</surname>, <given-names>A. P.</given-names></string-name>, &amp; <string-name><surname>Kaleda</surname>, <given-names>V. G.</given-names></string-name></person-group> (<year>2015</year>). <article-title>Saccadic responses to consecutive visual stimuli in healthy people and patients with schizophrenia.</article-title> <source>Human Physiology</source>, <volume>41</volume>(<issue>4</issue>), <fpage>372</fpage>–<lpage>377</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1134/s0362119715040143</pub-id> <pub-id pub-id-type="doi">10.1134/S0362119715040143</pub-id><issn>0362-1197</issn></mixed-citation></ref>
<ref id="R69"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Fernández</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Sapognikoff</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Guinjoan</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Orozco</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>Agamennoni</surname>, <given-names>O.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Word processing during reading sentences in patients with schizophrenia: Evidences from the eyetracking technique.</article-title> <source>Comprehensive Psychiatry</source>, <volume>68</volume>, <fpage>193</fpage>–<lpage>200</lpage>. <pub-id pub-id-type="doi">10.1016/j.comppsych.2016.04.018</pub-id><pub-id pub-id-type="pmid">27234202</pub-id><issn>0010-440X</issn></mixed-citation></ref>
<ref id="R70"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Vidal</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Turner</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Bulling</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Gellersen</surname>, <given-names>H.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Wearable eye tracking for mental health monitoring.</article-title> <source>Computer Communications</source>, <volume>35</volume>(<issue>11</issue>), <fpage>1306</fpage>–<lpage>1311</lpage>. <pub-id pub-id-type="doi">10.1016/j.comcom.2011.11.002</pub-id><issn>0140-3664</issn></mixed-citation></ref>
<ref id="R71"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Nilsson Benfatto</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Öqvist Seimyr</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Ygge</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Pansell</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Rydberg</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Jacobson</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Screening for Dyslexia Using Eye Tracking during Reading.</article-title> <source>PLoS One</source>, <volume>11</volume>(<issue>12</issue>), <fpage>e0165508</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0165508</pub-id><pub-id pub-id-type="pmid">27936148</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="R72"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Biscaldi</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Gezeck</surname>, <given-names>S.</given-names></string-name>, &amp; <string-name><surname>Stuhr</surname>, <given-names>V.</given-names></string-name></person-group> (<year>1998</year>). <article-title>Poor saccadic control correlates with dyslexia.</article-title> <source>Neuropsychologia</source>, <volume>36</volume>(<issue>11</issue>), <fpage>1189</fpage>–<lpage>1202</lpage>. <pub-id pub-id-type="doi">10.1016/S0028-3932(97)00170-X</pub-id><pub-id pub-id-type="pmid">9842764</pub-id><issn>0028-3932</issn></mixed-citation></ref>
<ref id="R73"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Eden</surname>, <given-names>G. F.</given-names></string-name>, <string-name><surname>Stein</surname>, <given-names>J. F.</given-names></string-name>, <string-name><surname>Wood</surname>, <given-names>H. M.</given-names></string-name>, &amp; <string-name><surname>Wood</surname>, <given-names>F. B.</given-names></string-name></person-group> (<year>1994</year>). <article-title>Differences in eye movements and reading problems in dyslexic and normal children.</article-title> <source>Vision Research</source>, <volume>34</volume>(<issue>10</issue>), <fpage>1345</fpage>–<lpage>1358</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(94)90209-7</pub-id><pub-id pub-id-type="pmid">8023443</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R74"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Rosenhall</surname>, <given-names>U.</given-names></string-name>, <string-name><surname>Johansson</surname>, <given-names>E.</given-names></string-name>, &amp; <string-name><surname>Gillberg</surname>, <given-names>C.</given-names></string-name></person-group> (<year>1988</year>). <article-title>Oculomotor findings in autistic children.</article-title> <source>The Journal of Laryngology and Otology</source>, <volume>102</volume>(<issue>5</issue>), <fpage>435</fpage>–<lpage>439</lpage>. <pub-id pub-id-type="doi">10.1017/S0022215100105286</pub-id><pub-id pub-id-type="pmid">3397639</pub-id><issn>0022-2151</issn></mixed-citation></ref>
<ref id="R75"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kemner</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Verbaten</surname>, <given-names>M. N.</given-names></string-name>, <string-name><surname>Cuperus</surname>, <given-names>J. M.</given-names></string-name>, <string-name><surname>Camfferman</surname>, <given-names>G.</given-names></string-name>, &amp; <string-name><surname>van Engeland</surname>, <given-names>H.</given-names></string-name></person-group> (<year>1998</year>). <article-title>Abnormal saccadic eye movements in autistic children.</article-title> <source>Journal of Autism and Developmental Disorders</source>, <volume>28</volume>(<issue>1</issue>), <fpage>61</fpage>–<lpage>67</lpage>. <pub-id pub-id-type="doi">10.1023/A:1026015120128</pub-id><pub-id pub-id-type="pmid">9546303</pub-id><issn>0162-3257</issn></mixed-citation></ref>
<ref id="R76"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Salvucci</surname>, <given-names>D. D.</given-names></string-name>, &amp; <string-name><surname>Goldberg</surname>, <given-names>J. H.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Identifying fixations and saccades in eye-tracking protocols.</article-title> <source>Paper presented at the Proceedings of the 2000 symposium on Eye tracking research &amp; applications</source>, <publisher-loc>Palm Beach Gardens, Florida, USA</publisher-loc>. <pub-id pub-id-type="doi">10.1145/355017.355028</pub-id></mixed-citation></ref>
<ref id="R77"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Duchowski</surname>, <given-names>A.</given-names></string-name></person-group> (<year>2007</year>). <source>Eye Tracking Methodology: Theory and Practice</source>: <publisher-name>Springer</publisher-name> <publisher-loc>London</publisher-loc>.</mixed-citation></ref>
<ref id="R78"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Nyström</surname>, <given-names>M.</given-names></string-name>, &amp; <string-name><surname>Holmqvist</surname>, <given-names>K.</given-names></string-name></person-group> (<year>2010</year>). <article-title>An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data.</article-title> <source>Behavior Research Methods</source>, <volume>42</volume>(<issue>1</issue>), <fpage>188</fpage>–<lpage>204</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.3758/brm.42.1.188</pub-id> <pub-id pub-id-type="doi">10.3758/BRM.42.1.188</pub-id><pub-id pub-id-type="pmid">20160299</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="R79"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Juhola</surname>, <given-names>M.</given-names></string-name>, &amp; <string-name><surname>Pyykkö</surname>, <given-names>I.</given-names></string-name></person-group> (<year>1987</year>). <article-title>Effect of sampling frequencies on the velocity of slow and fast phases of nystagmus.</article-title> <source>International Journal of Bio-Medical Computing</source>, <volume>20</volume>(<issue>4</issue>), <fpage>253</fpage>–<lpage>263</lpage>. Retrieved from <ext-link ext-link-type="uri" xlink:href="http://www.ncbi.nlm.nih.gov/pubmed/3308713">http://www.ncbi.nlm.nih.gov/pubmed/3308713</ext-link> <pub-id pub-id-type="doi">10.1016/0020-7101(87)90036-5</pub-id><pub-id pub-id-type="pmid">3308713</pub-id><issn>0020-7101</issn></mixed-citation></ref>
<ref id="R80"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Liversedge</surname>, <given-names>S. P.</given-names></string-name>, &amp; <string-name><surname>Findlay</surname>, <given-names>J. M.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Saccadic eye movements and cognition.</article-title> <source>Trends in Cognitive Sciences</source>, <volume>4</volume>(<issue>1</issue>), <fpage>6</fpage>–<lpage>14</lpage>. <pub-id pub-id-type="doi">10.1016/S1364-6613(99)01418-7</pub-id><pub-id pub-id-type="pmid">10637617</pub-id><issn>1364-6613</issn></mixed-citation></ref>
<ref id="R81"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Inhoff</surname>, <given-names>A. W.</given-names></string-name>, <string-name><surname>Seymour</surname>, <given-names>B. A.</given-names></string-name>, <string-name><surname>Schad</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>Greenberg</surname>, <given-names>S.</given-names></string-name></person-group> (<year>2010</year>). <article-title>The size and direction of saccadic curvatures during reading.</article-title> <source>Vision Research</source>, <volume>50</volume>(<issue>12</issue>), <fpage>1117</fpage>–<lpage>1130</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2010.03.025</pub-id><pub-id pub-id-type="pmid">20381515</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R82"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Engbert</surname>, <given-names>R.</given-names></string-name>, &amp; <string-name><surname>Kliegl</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2003</year>). <article-title>Microsaccades uncover the orientation of covert attention.</article-title> <source>Vision Research</source>, <volume>43</volume>(<issue>9</issue>), <fpage>1035</fpage>–<lpage>1045</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(03)00084-1</pub-id><pub-id pub-id-type="pmid">12676246</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R83"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Bahill</surname>, <given-names>A. T.</given-names></string-name>, <string-name><surname>Clark</surname>, <given-names>M. R.</given-names></string-name>, &amp; <string-name><surname>Stark</surname>, <given-names>L.</given-names></string-name></person-group> (<year>1975</year>). <article-title>The main sequence, a tool for studying human eye movements.</article-title> <source>Mathematical Biosciences</source>, <volume>24</volume>(<issue>3-4</issue>), <fpage>191</fpage>–<lpage>204</lpage>. <pub-id pub-id-type="doi">10.1016/0025-5564(75)90075-9</pub-id><issn>0025-5564</issn></mixed-citation></ref>
<ref id="R84"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Rayner</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Pollatsek</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Ashby</surname>, <given-names>J.</given-names></string-name>, &amp; <string-name><surname>Clifton</surname>, <given-names>C.</given-names>, <suffix>Jr</suffix>.</string-name></person-group> (<year>2012</year>). <source>Psychology of reading</source>. <publisher-name>Psychology Press</publisher-name>.</mixed-citation></ref>
<ref id="R85"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>van der Geest</surname>, <given-names>J. N.</given-names></string-name>, &amp; <string-name><surname>Frens</surname>, <given-names>M. A.</given-names></string-name></person-group> (<year>2002</year>). <article-title>Recording eye movements with video-oculography and scleral search coils: A direct comparison of two methods.</article-title> <source>Journal of Neuroscience Methods</source>, <volume>114</volume>(<issue>2</issue>), <fpage>185</fpage>–<lpage>195</lpage>. <pub-id pub-id-type="doi">10.1016/S0165-0270(01)00527-1</pub-id><pub-id pub-id-type="pmid">11856570</pub-id><issn>0165-0270</issn></mixed-citation></ref>
<ref id="R86"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ooms</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Dupont</surname>, <given-names>L.</given-names></string-name>, <string-name><surname>Lapon</surname>, <given-names>L.</given-names></string-name>, &amp; <string-name><surname>Popelka</surname>, <given-names>S.</given-names></string-name></person-group> (<year>2015</year>). <article-title>Accuracy and precision of fixation locations recorded with the low-cost Eye Tribe tracker in different experimental setups.</article-title> <source>Journal of Eye Movement Research</source>, <volume>8</volume>(<issue>1</issue>).<issn>1995-8692</issn></mixed-citation></ref>
<ref id="R87"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Dalmaijer</surname>, <given-names>E.</given-names></string-name></person-group> (<year>2014</year>). <article-title>Is the low-cost EyeTribe eye tracker any good for research?</article-title> <source>PeerJ PrePrints, 2:e585v1</source>. <pub-id pub-id-type="doi">10.7287/peerj.preprints.585v1</pub-id></mixed-citation></ref>
<ref id="R88"><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Babcock</surname>, <given-names>J. S.</given-names></string-name>, &amp; <string-name><surname>Pelz</surname>, <given-names>J. B.</given-names></string-name></person-group> (<year>2004</year>). <article-title>Building a light-weight eyetracking headgear.</article-title> <source>Proceedings of the 2004 symposium on Eye tracking research &amp; applications</source>, <fpage>109</fpage>–<lpage>114</lpage>. <pub-id pub-id-type="doi">10.1145/968363.968386</pub-id></mixed-citation></ref>
<ref id="R89"><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Li</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Babcock</surname>, <given-names>J.</given-names></string-name>, &amp; <string-name><surname>Parkhurst</surname>, <given-names>D. J.</given-names></string-name></person-group> (<year>2006</year>). <article-title>openEyes: a low-cost head-mounted eye-tracking solution</article-title>. Paper presented at the <source>Proceedings of the 2006 symposium on Eye tracking research &amp; applications</source>, <conf-loc>San Diego, California</conf-loc>.</mixed-citation></ref>
<ref id="R90"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Pfeiffer</surname>, <given-names>T.</given-names></string-name>, &amp; <string-name><surname>Renner</surname>, <given-names>P.</given-names></string-name></person-group> (<year>2014</year>). <article-title>EyeSee3D: a low-cost approach for analyzing mobile 3D eye tracking data using computer vision and augmented reality technology.</article-title> Paper presented at the <source>Proceedings of the Symposium on Eye Tracking Research and Applications</source>, <conf-loc>Safety Harbor, Florida</conf-loc>. <pub-id pub-id-type="doi">10.1145/2578153.2578183</pub-id></mixed-citation></ref>
<ref id="R91"><mixed-citation publication-type="web-page" specific-use="unparsed">Rayner. (<year>1998</year>). <article-title>Eye movements in reading and information processing: 20 years of research</article-title>. <source>Psychol Bull</source>, <volume>124</volume>(<issue>3</issue>), <fpage>372</fpage>–<lpage>422</lpage>. Retrieved from <ext-link ext-link-type="uri" xlink:href="http://www.ncbi.nlm.nih.gov/pubmed/9849112">http://www.ncbi.nlm.nih.gov/pubmed/9849112</ext-link></mixed-citation></ref>
<ref id="R92"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Marx</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Respondek</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Stamelou</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Dowiasch</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Stoll</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Bremmer</surname>, <given-names>F.</given-names></string-name>, <etal>. . .</etal> <string-name><surname>Einhäuser</surname>, <given-names>W.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Validation of mobile eye-tracking as novel and efficient means for differentiating progressive supranuclear palsy from Parkinson’s disease.</article-title> <source>Frontiers in Behavioral Neuroscience</source>, <volume>6</volume>(<issue>88</issue>), <fpage>88</fpage>. <pub-id pub-id-type="doi">10.3389/fnbeh.2012.00088</pub-id><pub-id pub-id-type="pmid">23248593</pub-id><issn>1662-5153</issn></mixed-citation></ref>
<ref id="R93"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kok</surname>, <given-names>E. M.</given-names></string-name>, &amp; <string-name><surname>Jarodzka</surname>, <given-names>H.</given-names></string-name></person-group> (<year>2017</year>). <article-title>Before your very eyes: The value and limitations of eye tracking in medical education.</article-title> <source>Medical Education</source>, <volume>51</volume>(<issue>1</issue>), <fpage>114</fpage>–<lpage>122</lpage>. <pub-id pub-id-type="doi">10.1111/medu.13066</pub-id><pub-id pub-id-type="pmid">27580633</pub-id><issn>0308-0110</issn></mixed-citation></ref>
<ref id="R94"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Molina</surname>, <given-names>A. I.</given-names></string-name>, <string-name><surname>Redondo</surname>, <given-names>M. A.</given-names></string-name>, <string-name><surname>Lacave</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name><surname>Ortega</surname>, <given-names>M.</given-names></string-name></person-group> (<year>2014</year>). <article-title>Assessing the effectiveness of new devices for accessing learning materials: An empirical analysis based on eye tracking and learner subjective perception.</article-title> <source>Computers in Human Behavior</source>, <volume>31</volume>, <fpage>475</fpage>–<lpage>490</lpage>. <pub-id pub-id-type="doi">10.1016/j.chb.2013.04.022</pub-id><issn>0747-5632</issn></mixed-citation></ref>
<ref id="R95"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Pollatsek</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Rayner</surname>, <given-names>K.</given-names></string-name></person-group> (<year>1982</year>). <article-title>Eye movement control in reading: The role of word boundaries.</article-title> <source>Journal of Experimental Psychology. Human Perception and Performance</source>, <volume>8</volume>(<issue>6</issue>), <fpage>817</fpage>–<lpage>833</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.8.6.817</pub-id><issn>0096-1523</issn></mixed-citation></ref>
<ref id="R96"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Yang</surname>, <given-names>S. N.</given-names></string-name>, &amp; <string-name><surname>McConkie</surname>, <given-names>G. W.</given-names></string-name></person-group> (<year>2001</year>). <article-title>Eye movements during reading: A theory of saccade initiation times.</article-title> <source>Vision Research</source>, <volume>41</volume>(<issue>25-26</issue>), <fpage>3567</fpage>–<lpage>3585</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(01)00025-6</pub-id><pub-id pub-id-type="pmid">11718796</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R97"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Rayner</surname>, <given-names>K.</given-names></string-name>, &amp; <string-name><surname>McConkie</surname>, <given-names>G. W.</given-names></string-name></person-group> (<year>1976</year>). <article-title>What guides a reader's eye movements?</article-title> <source>Vision Research</source>, <volume>16</volume>(<issue>8</issue>), <fpage>829</fpage>–<lpage>837</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(76)90143-7</pub-id></mixed-citation></ref>
<ref id="R98"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Joseph</surname>, <given-names>H. S.</given-names></string-name>, <string-name><surname>Liversedge</surname>, <given-names>S. P.</given-names></string-name>, <string-name><surname>Blythe</surname>, <given-names>H. I.</given-names></string-name>, <string-name><surname>White</surname>, <given-names>S. J.</given-names></string-name>, &amp; <string-name><surname>Rayner</surname>, <given-names>K.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Word length and landing position effects during reading in children and adults.</article-title> <source>Vision Research</source>, <volume>49</volume>(<issue>16</issue>), <fpage>2078</fpage>–<lpage>2086</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2009.05.015</pub-id><pub-id pub-id-type="pmid">19481566</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R99"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Vitu</surname>, <given-names>F.</given-names></string-name>, <string-name><surname>O’Regan</surname>, <given-names>J. K.</given-names></string-name>, <string-name><surname>Inhoff</surname>, <given-names>A. W.</given-names></string-name>, &amp; <string-name><surname>Topolski</surname>, <given-names>R.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Mindless reading: Eye-movement characteristics are similar in scanning letter strings and reading texts.</article-title> <source>Perception &amp; Psychophysics</source>, <volume>57</volume>(<issue>3</issue>), <fpage>352</fpage>–<lpage>364</lpage>. Retrieved from <ext-link ext-link-type="uri" xlink:href="http://www.ncbi.nlm.nih.gov/pubmed/7770326">http://www.ncbi.nlm.nih.gov/pubmed/7770326</ext-link> <pub-id pub-id-type="doi">10.3758/BF03213060</pub-id><pub-id pub-id-type="pmid">7770326</pub-id><issn>0031-5117</issn></mixed-citation></ref>
<ref id="R100"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Rayner</surname>, <given-names>K.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Eye movements and attention in reading, scene perception, and visual search.</article-title> <source>The Quarterly Journal of Experimental Psychology</source>, <volume>62</volume>(<issue>8</issue>), <fpage>1457</fpage>–<lpage>1506</lpage>. <pub-id pub-id-type="doi">10.1080/17470210902816461</pub-id></mixed-citation></ref>
<ref id="R101"><mixed-citation publication-type="book-chapter" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Schneider</surname>, <given-names>W. X.</given-names></string-name>, &amp; <string-name><surname>Deubel</surname>, <given-names>H.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Visual attention and saccadic eye movements: Evidence for obligatory and selective spatial coupling.</article-title> In <source>J. M. Findlay, R. Walker, &amp; R. W. Kentridge (Eds.), Studies in Visual Information Processing</source>, <volume>6</volume>, <fpage>317</fpage>–<lpage>324</lpage>. <publisher-loc>North-Holland</publisher-loc>.</mixed-citation></ref>
<ref id="R102"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Harris</surname>, <given-names>C. M.</given-names></string-name>, &amp; <string-name><surname>Wolpert</surname>, <given-names>D. M.</given-names></string-name></person-group> (<year>2006</year>). <article-title>The main sequence of saccades optimizes speed-accuracy trade-off.</article-title> <source>Biological Cybernetics</source>, <volume>95</volume>(<issue>1</issue>), <fpage>21</fpage>–<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1007/s00422-006-0064-x</pub-id><pub-id pub-id-type="pmid">16555070</pub-id><issn>0340-1200</issn></mixed-citation></ref>
<ref id="R103"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Behrens</surname>, <given-names>F.</given-names></string-name>, &amp; <string-name><surname>Weiss</surname>, <given-names>L. R.</given-names></string-name></person-group> (<year>1992</year>). <article-title>An algorithm separating saccadic from nonsaccadic eye movements automatically by use of the acceleration signal.</article-title> <source>Vision Research</source>, <volume>32</volume>(<issue>5</issue>), <fpage>889</fpage>–<lpage>893</lpage>. Retrieved from <ext-link ext-link-type="uri" xlink:href="http://www.ncbi.nlm.nih.gov/pubmed/1604857">http://www.ncbi.nlm.nih.gov/pubmed/1604857</ext-link> <pub-id pub-id-type="doi">10.1016/0042-6989(92)90031-D</pub-id><pub-id pub-id-type="pmid">1604857</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R104"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Behrens</surname>, <given-names>F.</given-names></string-name>, <string-name><surname>Mackeben</surname>, <given-names>M.</given-names></string-name>, &amp; <string-name><surname>Schröder-Preikschat</surname>, <given-names>W.</given-names></string-name></person-group> (<year>2010</year>). <article-title>An improved algorithm for automatic detection of saccades in eye movement data and for calculating saccade parameters.</article-title> <source>Behavior Research Methods</source>, <volume>42</volume>(<issue>3</issue>), <fpage>701</fpage>–<lpage>708</lpage>. <pub-id pub-id-type="doi">10.3758/BRM.42.3.701</pub-id><pub-id pub-id-type="pmid">20805592</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="R105"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Bettenbühl</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Paladini</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Mergenthaler</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Kliegl</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Engbert</surname>, <given-names>R.</given-names></string-name>, &amp; <string-name><surname>Holschneider</surname>, <given-names>M.</given-names></string-name></person-group> (<year>2010</year>). <article-title>Microsaccade characterization using the continuous wavelet transform and principal component analysis.</article-title> <source>Journal of Eye Movement Research</source>, <volume>3</volume>(<issue>5</issue>). <pub-id pub-id-type="doi">10.16910/jemr.3.5.1</pub-id></mixed-citation></ref>
<ref id="R106"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Niehorster</surname>, <given-names>D. C.</given-names></string-name>, <string-name><surname>Cornelissen</surname>, <given-names>T. H. W.</given-names></string-name>, <string-name><surname>Holmqvist</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Hooge</surname>, <given-names>I. T. C.</given-names></string-name>, &amp; <string-name><surname>Hessels</surname>, <given-names>R. S.</given-names></string-name></person-group> (<year>2017</year>). <article-title>What to expect from your remote eye-tracker when participants are unrestrained.</article-title> <source>Behavior Research Methods</source>, <fpage>1</fpage>–<lpage>15</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-017-0863-0</pub-id><pub-id pub-id-type="pmid">28205131</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="R107"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Hessels</surname>, <given-names>R. S.</given-names></string-name>, <string-name><surname>Cornelissen</surname>, <given-names>T. H. W.</given-names></string-name>, <string-name><surname>Kemner</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name><surname>Hooge</surname>, <given-names>I. T. C.</given-names></string-name></person-group> (<year>2015</year>). <article-title>Qualitative tests of remote eyetracker recovery and performance during head rotation.</article-title> <source>Behavior Research Methods</source>, <volume>47</volume>(<issue>3</issue>), <fpage>848</fpage>–<lpage>859</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-014-0507-6</pub-id><pub-id pub-id-type="pmid">25033759</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="R108"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Fetter</surname>, <given-names>M.</given-names></string-name></person-group> (<year>2007</year>). <article-title>Vestibuloocular reflex.</article-title> <source>Neuro-Ophthalmology</source> (<publisher-name>Karger Publishers</publisher-name>), <volume>40</volume>, <fpage>35</fpage>–<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1159/000100348</pub-id><pub-id pub-id-type="pmid">17314478</pub-id><issn>0165-8107</issn></mixed-citation></ref>
<ref id="R109"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Crawford</surname>, <given-names>J. D.</given-names></string-name>, <string-name><surname>Martinez-Trujillo</surname>, <given-names>J. C.</given-names></string-name>, &amp; <string-name><surname>Klier</surname>, <given-names>E. M.</given-names></string-name></person-group> (<year>2003</year>). <article-title>Neural control of three-dimensional eye and head movements.</article-title> <source>Current Opinion in Neurobiology</source>, <volume>13</volume>(<issue>6</issue>), <fpage>655</fpage>–<lpage>662</lpage>. <pub-id pub-id-type="doi">10.1016/j.conb.2003.10.009</pub-id><pub-id pub-id-type="pmid">14662365</pub-id><issn>0959-4388</issn></mixed-citation></ref>
<ref id="R110"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Rifai</surname>, <given-names>K.</given-names></string-name>, &amp; <string-name><surname>Wahl</surname>, <given-names>S.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Specific eye-head coordination enhances vision in progressive lens wearers.</article-title> <source>Journal of Vision</source> (<publisher-loc>Charlottesville, Va.</publisher-loc>), <volume>16</volume>(<issue>11</issue>), <fpage>5</fpage>. <pub-id pub-id-type="doi">10.1167/16.11.5</pub-id><pub-id pub-id-type="pmid">27604068</pub-id><issn>1534-7362</issn></mixed-citation></ref>
<ref id="R111"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>’t Hart</surname>, <given-names>B. M.</given-names></string-name>, <string-name><surname>Vockeroth</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Schumann</surname>, <given-names>F.</given-names></string-name>, <string-name><surname>Bartl</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Schneider</surname>, <given-names>E.</given-names></string-name>, <string-name><surname>König</surname>, <given-names>P.</given-names></string-name>, &amp; <string-name><surname>Einhäuser</surname>, <given-names>W.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Gaze allocation in natural stimuli: Comparing free exploration to head-fixed viewing conditions.</article-title> <source>Visual Cognition</source>, <volume>17</volume>(<issue>6-7</issue>), <fpage>1132</fpage>–<lpage>1158</lpage>. <pub-id pub-id-type="doi">10.1080/13506280902812304</pub-id><issn>1350-6285</issn></mixed-citation></ref>
<ref id="R112"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kliegl</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Nuthmann</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Engbert</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Tracking the mind during reading: The influence of past, present, and future words on fixation durations.</article-title> <source>Journal of Experimental Psychology. General</source>, <volume>135</volume>(<issue>1</issue>), <fpage>12</fpage>–<lpage>35</lpage>. <pub-id pub-id-type="doi">10.1037/0096-3445.135.1.12</pub-id><pub-id pub-id-type="pmid">16478314</pub-id><issn>0096-3445</issn></mixed-citation></ref>
<ref id="R113"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Vitu</surname>, <given-names>F.</given-names></string-name>, <string-name><surname>McConkie</surname>, <given-names>G. W.</given-names></string-name>, <string-name><surname>Kerr</surname>, <given-names>P.</given-names></string-name>, &amp; <string-name><surname>O’Regan</surname>, <given-names>J. K.</given-names></string-name></person-group> (<year>2001</year>). <article-title>Fixation location effects on fixation durations during reading: An inverted optimal viewing position effect.</article-title> <source>Vision Research</source>, <volume>41</volume>(<issue>25-26</issue>), <fpage>3513</fpage>–<lpage>3533</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(01)00166-3</pub-id><pub-id pub-id-type="pmid">11718792</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R114"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Quinlivan</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Butler</surname>, <given-names>J. S.</given-names></string-name>, <string-name><surname>Beiser</surname>, <given-names>I.</given-names></string-name>, <string-name><surname>Williams</surname>, <given-names>L.</given-names></string-name>, <string-name><surname>McGovern</surname>, <given-names>E.</given-names></string-name>, <string-name><surname>O’Riordan</surname>, <given-names>S.</given-names></string-name>, <etal>. . .</etal> <string-name><surname>Reilly</surname>, <given-names>R. B.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Application of virtual reality head mounted display for investigation of movement: A novel effect of orientation of attention.</article-title> <source>Journal of Neural Engineering</source>, <volume>13</volume>(<issue>5</issue>), <fpage>056006</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2560/13/5/056006</pub-id><pub-id pub-id-type="pmid">27518212</pub-id><issn>1741-2560</issn></mixed-citation></ref>
<ref id="R115"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Pfeiffer</surname>, <given-names>T.</given-names></string-name>, &amp; <string-name><surname>Memili</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Model-based real-time visualization of realistic three-dimensional heat maps for mobile eye tracking and eye tracking in virtual reality.</article-title> Paper presented at the <source>Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research &amp; Applications</source>, <conf-loc>Charleston, South Carolina</conf-loc>. <pub-id pub-id-type="doi">10.1145/2857491.2857541</pub-id></mixed-citation></ref>
<ref id="R116"><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Duchowski</surname>, <given-names>A. T.</given-names></string-name>, <string-name><surname>Shivashankaraiah</surname>, <given-names>V.</given-names></string-name>, <string-name><surname>Rawls</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Gramopadhye</surname>, <given-names>A. K.</given-names></string-name>, <string-name><surname>Melloy</surname>, <given-names>B. J.</given-names></string-name>, &amp; <string-name><surname>Kanki</surname>, <given-names>B.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Binocular eye tracking in virtual reality for inspection training.</article-title> Paper presented at the <source>Proceedings of the 2000 symposium on Eye tracking research &amp; applications</source>, <conf-loc>Palm Beach Gardens, Florida, USA</conf-loc>.</mixed-citation></ref>
<ref id="R117"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Tanriverdi</surname>, <given-names>V.</given-names></string-name>, &amp; <string-name><surname>Jacob</surname>, <given-names>R. J. K.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Interacting with eye movements in virtual environments.</article-title> Paper presented at the <source>Proceedings of the SIGCHI conference on Human Factors in Computing Systems</source>, <conf-loc>The Hague, The Netherlands</conf-loc>. <pub-id pub-id-type="doi">10.1145/332040.332443</pub-id></mixed-citation></ref>
<ref id="R118"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Boukhalfi</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Joyal</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Bouchard</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Neveu</surname>, <given-names>S. M.</given-names></string-name>, &amp; <string-name><surname>Renaud</surname>, <given-names>P.</given-names></string-name></person-group> (<year>2015</year>). <article-title>Tools and Techniques for Real-time Data Acquisition and Analysis in Brain Computer Interface studies using qEEG and Eye Tracking in Virtual Reality Environment.</article-title> <source>IFAC-PapersOnLine</source>, <volume>48</volume>(<issue>3</issue>), <fpage>46</fpage>–<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1016/j.ifacol.2015.06.056</pub-id></mixed-citation></ref>
<ref id="R119"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Krejtz</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Biele</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Chrzastowski</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Kopacz</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Niedzielska</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Toczyski</surname>, <given-names>P.</given-names></string-name>, &amp; <string-name><surname>Duchowski</surname>, <given-names>A.</given-names></string-name></person-group> (<year>2014</year>). <article-title>Gaze-controlled gaming: Immersive and difficult but not cognitively overloading.</article-title> Paper presented at the <source>Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication</source>. <pub-id pub-id-type="doi">10.1145/2638728.2641690</pub-id></mixed-citation></ref>
<ref id="R120"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Bulling</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Gellersen</surname>, <given-names>H.</given-names></string-name></person-group> (<year>2010</year>). <article-title>Toward mobile eye-based human-computer interaction.</article-title> <source>IEEE Pervasive Computing</source>, <volume>9</volume>(<issue>4</issue>), <fpage>8</fpage>–<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1109/MPRV.2010.86</pub-id><issn>1536-1268</issn></mixed-citation></ref>
<ref id="R121"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Kassner</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Patera</surname>, <given-names>W.</given-names></string-name>, &amp; <string-name><surname>Bulling</surname>, <given-names>A.</given-names></string-name></person-group> (<year>2014</year>). <article-title>Pupil: An open source platform for pervasive eye tracking and mobile gaze-based interaction.</article-title> Paper presented at the <source>Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing: Adjunct publication</source>. <pub-id pub-id-type="doi">10.1145/2638728.2641695</pub-id></mixed-citation></ref>
<ref id="R122"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>DiScenna</surname>, <given-names>A. O.</given-names></string-name>, <string-name><surname>Das</surname>, <given-names>V.</given-names></string-name>, <string-name><surname>Zivotofsky</surname>, <given-names>A. Z.</given-names></string-name>, <string-name><surname>Seidman</surname>, <given-names>S. H.</given-names></string-name>, &amp; <string-name><surname>Leigh</surname>, <given-names>R. J.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Evaluation of a video tracking device for measurement of horizontal and vertical eye rotations during locomotion.</article-title> <source>Journal of Neuroscience Methods</source>, <volume>58</volume>(<issue>1-2</issue>), <fpage>89</fpage>–<lpage>94</lpage>. <pub-id pub-id-type="doi">10.1016/0165-0270(94)00162-A</pub-id><pub-id pub-id-type="pmid">7475237</pub-id><issn>0165-0270</issn></mixed-citation></ref>
</ref-list>
  </back>
</article>
