<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.10.4.1</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Using Smooth Pursuit Calibration for Difficult-to-Calibrate Participants</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Blignaut</surname>
						<given-names>Pieter</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				
        <aff id="aff1">
		<institution>University of the Free State Bloemfontein</institution>, <country>South Africa</country>
        </aff>
		</contrib-group>
     
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>4</day>  
		<month>10</month>
        <year>2017</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2017</year>
	</pub-date>
      <volume>10</volume>
      <issue>4</issue>
	  <elocation-id>10.16910/jemr.10.4.1</elocation-id> 
	<permissions> 
	<copyright-year>2017</copyright-year>
	<copyright-holder>Blignaut</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
	<abstract>
<p>Although the 45-dots calibration routine of a previous study ( <xref ref-type="bibr" rid="R2">2</xref>) provided very good accuracy, it requires intense mental effort, and the routine proved to be unsuccessful for young children who struggle to maintain concentration. The calibration procedures that are normally used for difficult-to-calibrate participants, such as autistic children and infants, do not suffice since they are not accurate enough, and the reliability of research results might be jeopardised.
Smooth pursuit has been used for calibration before and is applied in this paper as an alternative routine for participants who are difficult to calibrate with conventional routines. Gaze data is captured at regular intervals while the eyes follow a moving target, generating many calibration targets. The procedure can take between 30 s and 60 s to complete, but since an interesting target and/or a conscious task may be used, participants are helped to maintain concentration.
It was shown that the accuracy that can be attained through calibration with a moving target along an even horizontal path is not significantly worse than the accuracy that can be attained with a standard method of watching dots appearing in random order. The routine was applied successfully for a group of children with ADD, ADHD and learning disabilities.
This result is important as it provides for easier calibration, especially in the case of participants who struggle to keep their gaze focused and stable on a stationary target for long enough.
</p>
      </abstract>
	  
	  
	   <kwd-group>
         <kwd>Calibration</kwd>
        <kwd>Smooth pursuit</kwd>
      </kwd-group>
    </article-meta>
  </front> 	 		 	
  
  
  
  
  
  <body>
    
   
    <sec id="s1">
      <title>Introduction</title>
      <p>
        Video-based eye tracking is based on the principle that
near-infrared light shone onto the eyes is reflected off the
different structures in the eye to create four Purkinje
reflections (
        <xref ref-type="bibr" rid="R1">1</xref>
        ). The standard way of calibrating such eye
trackers is through presentation of a series of dots (or gaze
targets) at known positions on the display and expect the
participant to watch the dots until enough gaze data is
sampled (
        <xref ref-type="bibr" rid="R3">3</xref>
        ). While expensive commercial systems utilise a
model of the eye to compute the gaze direction (
        <xref ref-type="bibr" rid="R4">4</xref>
        ),
self-assembled eye trackers mostly use polynomial expressions
to map the relative position of the pupil to the corneal
reflections (the so-called pupil-glint vector) to gaze
coordinates. A least squares estimation is used to minimise the
distances between the observed points and the actual
points in the calibration grid (
        <xref ref-type="bibr" rid="R5">5</xref>
        ).
      </p>
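The interpolation-based mapping described above can be illustrated with a short sketch. This is not the implementation used in the paper; the second-order polynomial term set and all function names are assumptions chosen for illustration.

```python
import numpy as np

def design_matrix(vectors):
    # Second-order polynomial terms of the pupil-glint vector (vx, vy);
    # this particular term set is an assumed, common choice.
    vx, vy = vectors[:, 0], vectors[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def calibrate(vectors, targets):
    # Least-squares fit minimising the distance between the mapped points
    # and the known calibration targets, one coefficient column per screen axis.
    coeffs, *_ = np.linalg.lstsq(design_matrix(vectors), targets, rcond=None)
    return coeffs

def map_gaze(vectors, coeffs):
    # Convert pupil-glint vectors to gaze coordinates in the stimulus space.
    return design_matrix(vectors) @ coeffs
```

With 23 or more calibration targets the system is well over-determined, so the least-squares fit can absorb much of the systematic error.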
      <p>
Normally, five or nine dots are used. The more dots
that are used, the better the accuracy of the system should
be. Good accuracy is important when stimuli are close
to each other, as in reading, where a researcher wants to
determine the number of fixations on individual syllables.
A procedure is described in a previous study (
        <xref ref-type="bibr" rid="R2">2</xref>
        ) where 45
dots are displayed in a 9&#xD7;5 grid. Twenty-three of the dots
are used as calibration targets, while the complete set of
dots is used to select the best possible regression
polynomial. The dots are displayed in random order to prevent
participants from pre-empting the position of the next dot and
moving their eyes away from a dot before the gaze is
registered.
      </p>
      <p>
        While the procedure described in Blignaut (
        <xref ref-type="bibr" rid="R2">2</xref>
        ) is
accurate with a reported average offset of 0.32&#xB0;, it requires
intense and prolonged concentration and participants do not
always understand that they have to keep their eyes fixated
on a dot until the next one appears. Unsurprisingly, the
routine proved to be unsuccessful for young children with
Attention Deficit Disorder (ADD), Attention Deficit
Hyperactivity Disorder (ADHD) and learning disabilities. An
occupational therapist using the system complained that
the young children with these conditions did not
understand exactly what was expected of them and some of them
could not maintain concentration for the entire period.
      </p>
      <p>The challenge is, therefore, to capture gaze data at as
many known locations as possible, with the least possible
mental effort while maintaining attention on the target. In
this paper, a smooth pursuit calibration routine is proposed
with a target moving across the display at a constant speed.
The target could also be an animated image of something
of interest to a small child, such as a butterfly or an
airplane. In order to further motivate the child participant to
watch the target closely, it could change colour, shape or
image at varying intervals and the child could be
challenged to count the number of changes.</p>
      <p>The need for calibration and existing calibration
procedures are discussed in the following section. The
difficulties that are experienced with the standard routines to
calibrate certain groups of participants (collectively referred
to as difficult-to-calibrate (DC) participants) are
highlighted and previous attempts to solve the problem are
discussed. Thereafter, the presentation of a moving target
with a related task is offered as a solution to capture the
attention of the DC participants for long enough so that the
procedure can be completed.</p>
      <p>The evaluation of smooth pursuit calibration (SPC) is
done in two phases: First, the accuracy of the approach is
validated based on comparison with a standard calibration
procedure using able and cooperating participants. Second,
the applicability of the approach is validated for a group of
early primary school children with one or more
cognitive disorders.</p>
      <p>The paper concludes with a discussion of the results.</p>
    </sec>
    <sec id="s2">
      <title>The Role of Calibration</title>
      <sec id="s2a">
        <title>The need for calibration</title>
        <p>
          The output from eye-tracking devices varies with
individual differences in the shape or size of the eyes, such as
the corneal bulge and the relationship between the eye
features (pupil and corneal reflections) and the foveal region
on the retina. Ethnicity, viewing angle, head pose, colour,
texture, light conditions, position of the iris within the eye
socket and the state of the eye (open or closed) all
influence the appearance of the eye (
          <xref ref-type="bibr" rid="R4">4</xref>
          ) and, therefore, the
quality of eye-tracking data (
          <xref ref-type="bibr" rid="R6">6</xref>
          ). In particular, the individual
shapes of participant eye balls, and the varying positions
of cameras and illumination require all eye-trackers to be
calibrated.
        </p>
      </sec>
      <sec id="s2b">
        <title>The procedure</title>
        <p>
          Calibration refers to a procedure to gather data so that
the coordinates of the pupil and one or more corneal
reflections in the coordinate system of the eye-video can be
converted to x- and y-coordinates that represent the
participant&#x2019;s point of regard in the stimulus space. The
procedure usually consists of asking the participant to look at a
number of pre-defined points at known angular positions
while storing samples of the measured quantity (
          <xref ref-type="bibr" rid="R7 R8 R9">7-9</xref>
          ).
There is no consensus on exactly when to collect these
samples, but Nystr&#xF6;m, Andersson (
          <xref ref-type="bibr" rid="R3">3</xref>
          ) showed that
participants know better than the operator or the system when
they are looking at a target.
        </p>
      </sec>
      <sec id="s2c">
        <title>Mapping to point of regard</title>
        <p>
          The transformation from eye-position to point of
regard can be either model-based (geometric) or
interpolation-based (
          <xref ref-type="bibr" rid="R4">4</xref>
          ). With model-based gaze estimation, a model
of the eye is built from the observable eye features (pupil,
corneal reflection, etc.) to compute the gaze direction. In
this case, calibration is not used to determine the actual
gaze position but rather to record the eye features from
different angles. See Hansen and Ji (
          <xref ref-type="bibr" rid="R4">4</xref>
          ) for a comprehensive
overview of possible transformations.
        </p>
        <p>
          Interpolation might involve, for example, a linear
regression between the known data set and the
corresponding raw data, using a least squares estimation to minimize
the distances between the observed points and the actual
points (
          <xref ref-type="bibr" rid="R5">5</xref>
          ). Other examples of 2-dimensional interpolation
schemes can be found in McConkie (
          <xref ref-type="bibr" rid="R10">10</xref>
          ) as well as Kliegl
and Olson (
          <xref ref-type="bibr" rid="R8">8</xref>
          ) while a cascaded polynomial curve fit
method is described in Sheena and Borah (
          <xref ref-type="bibr" rid="R11">11</xref>
          ).
        </p>
        <p>
          Theoretically, the transformation should remove any
systematic error, but the limited number of calibration
points that are normally used limits the accuracy that can
be achieved. Typical calibration schemes require 5 or 9
pre-defined points, and rarely use more than 20 points
(
          <xref ref-type="bibr" rid="R12">12</xref>
          ).
        </p>
      </sec>
      <sec id="s2d">
        <title>Auto-calibration</title>
        <p>
          Huang, Kwok (
          <xref ref-type="bibr" rid="R13">13</xref>
          ) presented an auto-calibrating
system that identifies and collects gaze data unobtrusively
during user interaction events since there is a likely
correlation between gaze and cursor and caret locations. The
procedure presented by Huang, Kwok (
          <xref ref-type="bibr" rid="R13">13</xref>
          ) recalibrates
continuously and becomes more and more accurate with
additional use. They reported an average error of 2.56&#xB0;,
which is poor, but the approach has the advantage that no
explicit calibration phase is needed.
        </p>
        <p>
          Swirski and Dodgson (
          <xref ref-type="bibr" rid="R14">14</xref>
          ) describe a procedure that
fits a pupil motion model to a set of eye images.
Information from multiple frames is combined to build a 3D eye
model that is based on assumptions on how the motion of
the pupil is constrained. No calibration is needed and since
the procedure is based on pupil ellipse geometry alone, it
is not necessary to illuminate the eyes to create a corneal
reflex. At best, a mean error of 1.68&#xB0; was reported.
        </p>
        <p>In summary, while auto-calibration might solve the
problem of calibrating for participants who struggle to
maintain concentration, it is not good enough for studies
where high accuracy is needed.</p>
      </sec>
    </sec>
    <sec id="s3">
      <title>Previous Attempts to Track Difficult-to-Calibrate Participants</title>
      <p>In the quest for a solution to calibrate young children
who find it difficult to concentrate on a target for the
duration of a calibration routine, one can learn from the
experience of others who faced similar challenges, for example
tracking infants, toddlers and children with autism.</p>
      <p>
Tracking infants and toddlers poses a challenge as it is
hardly ever possible to get them to sit down long enough
to focus on calibration targets. Aslin (
        <xref ref-type="bibr" rid="R15">15</xref>
        ) mentioned that
small flashing (or shrinking) targets work well with
infants, but argued that accuracy is unlikely to ever be better
than 1&#xB0; because infants are unable to precisely and reliably
fixate small stimuli. He further asserted that if 1&#xB0; of
accuracy is insufficient to answer a particular question, then an
eye tracker should not be used for the research and
alternative methods should be implemented.
      </p>
      <p>
        Sasson and Elison (
        <xref ref-type="bibr" rid="R16">16</xref>
        ) indicated that eye tracking of
young children with autism involves unique challenges
that are not present when tracking normal-developing
older children or adults. They used the normal calibration
routines provided by the manufacturer to track the gaze
data of their participants, but used large stimuli, spanning
more than 5&#xB0;. Although participants find such stimuli
pleasing to look at, the researcher cannot be sure exactly
where the participant looked at the time of data capture.
This will almost certainly result in poor accuracy that is
inadequate for tasks where high accuracy is required,
such as reading.
      </p>
      <p>
        In a study by Pierce, Conant (
        <xref ref-type="bibr" rid="R17">17</xref>
        ), toddlers were seated
on their parent&#x2019;s lap in front of a Tobii T120 eye tracker
and a partition separated the operator from the toddler. To
obtain calibration information, toddlers were shown
images of an animated cat that appeared in 9 locations on the
screen. Using a software facility that superimposes the
point of regard on the test image in real time, the operator
observed the infant&#x2019;s gaze position and head position on a
secondary monitor, making note of obvious deviations
from expected gaze positions. The entire process was
repeated if the infant&#x2019;s eyes were no longer picked up. No
mention was made of the achieved accuracy, but it is
reasonable to expect that the accuracy could not be better than
the size of the calibration stimulus (the animated cat).
      </p>
      <p>
        Franchak, Kretch (
        <xref ref-type="bibr" rid="R18">18</xref>
        ) used a head-mounted eye
tracker but displayed stimuli on a computer screen. A
sounding target appeared at a single location within a 3&#xD7;3
matrix on the monitor to induce eye movements.
Calibration involved as few as 3 and as many as 9 points spread
across visual space. Subjective judgement was used to
determine whether fixations deviated from targets by more
than about 2&#xB0; and the procedure was repeated if necessary.
Although the spatial accuracy is lower than that of typical
desk-mounted systems, it was regarded as adequate for
determining the target of fixations in natural settings. The
entire process of preparing the equipment and calibrating the
infant took about 15 minutes.
      </p>
      <p>
        Corbetta, Guan (
        <xref ref-type="bibr" rid="R19">19</xref>
        ) followed a similar procedure to
calibrate an ETL-500 head-mounted eye tracker through
timely coordination between one experimenter facing the
child and another experimenter running the interactive
calibration software of the eye tracker. The experimenter
facing the child was presenting a small, visually attractive and
sounding toy at one of the five predefined spatial positions.
When the child was staring at the toy in that position, the
experimenter running the calibration software was
prompted to capture the gaze data. The researchers did not
report the accuracy achieved, but one can once again
assume that the accuracy could not be better than the size of
the toy used as calibration stimulus.
      </p>
      <p>In summary, it is clear that in an attempt to calibrate
so-called difficult-to-calibrate participants, various
researchers used sound and animation of larger objects as
calibration targets. Furthermore, the number of calibration
points is mostly limited and the accuracy that can be
achieved is not expected to be better than 2&#xB0;. It is also
difficult to tell the actual accuracy that was obtained during a
specific experimental set-up or participant recording. The
need exists, therefore, for a calibration routine that is easy
to execute and can be used for difficult-to-calibrate
participants, yet accurate enough to provide reliable research
results &#x2013; especially if the experiment involves smaller or
closely spaced targets.</p>
    </sec>
    <sec id="s4">
      <title>Smooth Pursuit Calibration</title>
      <sec id="s4a">
        <title>Smooth pursuit eye movements</title>
        <p>
          Smooth-pursuit eye movements are continuous, slow
rotations of the eyes (
          <xref ref-type="bibr" rid="R20">20</xref>
          ) that are used to stabilise the
image of a moving object of interest on the fovea, thus
maintaining high acuity (
          <xref ref-type="bibr" rid="R21 R22">21, 22</xref>
          ). Conscious attention is needed
to maintain accurate smooth pursuit (
          <xref ref-type="bibr" rid="R23 R24">23, 24</xref>
          ).
        </p>
        <p>
          Smooth pursuit gain is expressed as the ratio of smooth
eye movement velocity to the velocity of a foveal target
(
          <xref ref-type="bibr" rid="R25">25</xref>
          ). If the gain is less than 1, gaze will fall behind the
target to create a retinal slip that will have to be reduced
by one or more "catch-up" saccades (
          <xref ref-type="bibr" rid="R26">26</xref>
          ). According to
Meyer, Lasker (
          <xref ref-type="bibr" rid="R27">27</xref>
          ), normal subjects can follow a target
with a gain of 90% up to a target velocity of 100 deg/s.
        </p>
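As a toy numerical illustration of the gain definition above (a sketch with hypothetical position samples, not data from the study):

```python
def pursuit_gain(eye_pos, target_pos, dt):
    # Velocity gain: ratio of mean smooth eye velocity to mean target
    # velocity, both estimated from successive position samples (degrees).
    eye_vel = [(b - a) / dt for a, b in zip(eye_pos, eye_pos[1:])]
    target_vel = [(b - a) / dt for a, b in zip(target_pos, target_pos[1:])]
    return (sum(eye_vel) / len(eye_vel)) / (sum(target_vel) / len(target_vel))
```

An eye covering 0.9&#xB0; for every 1.0&#xB0; of target motion yields a gain of 0.9; the resulting retinal slip is what catch-up saccades must reduce.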
        <p>
          Smooth pursuit gain increases with age, especially for
the first 3 months of an infant&#x2019;s life (
          <xref ref-type="bibr" rid="R28 R29">28, 29</xref>
          ). Accardo,
Pensiero (
          <xref ref-type="bibr" rid="R30">30</xref>
) found that the velocity gain of children aged
7-12 is slightly lower than that of adults.
        </p>
        <p>
          Smooth pursuit can also be affected by attention (
          <xref ref-type="bibr" rid="R31 R32">31,
32</xref>
          ). More specifically, performance with smooth pursuit
tasks can be dramatically improved when subjects are
asked to analyse some or other changing characteristic of
the target, such as reading a changing letter or number on
the target (
          <xref ref-type="bibr" rid="R33">33</xref>
          ) or pressing a button (
          <xref ref-type="bibr" rid="R34">34</xref>
          ).
        </p>
		
		
		
		
		
		
		
		
		
		
		
        <p>
          Smooth pursuit performance is also affected by
stimulus background (
          <xref ref-type="bibr" rid="R35 R36 R37">35-37</xref>
          ), target position (
          <xref ref-type="bibr" rid="R38">38</xref>
          ), target
velocity (
          <xref ref-type="bibr" rid="R27 R39">27, 39</xref>
          ), target visibility (
          <xref ref-type="bibr" rid="R40 R41">40, 41</xref>
          ), target direction (
          <xref ref-type="bibr" rid="R42">42</xref>
          )
and predictability of target direction (
          <xref ref-type="bibr" rid="R43">43</xref>
          ).
        </p>
        <p>
          Smooth pursuit impairment and dysfunction can be
linked to mental illnesses such as schizophrenia (
          <xref ref-type="bibr" rid="R33 R44 R45 R46">33,44,45,46</xref>
          ), autism (
          <xref ref-type="bibr" rid="R47">47</xref>
          ), physical anhedonia and perceptual
aberrations (
          <xref ref-type="bibr" rid="R48 R49 R50">48-50</xref>
          ), Alzheimer&#x2019;s disease (
          <xref ref-type="bibr" rid="R51">51</xref>
          ) and
attention-deficit hyperactivity disorder (ADHD) (
          <xref ref-type="bibr" rid="R52">52</xref>
          ).
        </p>
      </sec>
      <sec id="s4b">
        <title>Smooth pursuit calibration in general</title>
        <p>
          The concept of calibrating while a participant follows
a moving target has been exploited with success in the past.
Pfeuffer, Vidal (
          <xref ref-type="bibr" rid="R53">53</xref>
) describe a procedure in which gaze data
for calibration is sampled only when the participant is
attending to the target, as indicated by high correlation
between eye and target movement. They showed that pursuit
calibration is tolerant to interruption and can be used to
calibrate without participants being aware of the
procedure.
        </p>
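The correlation criterion of Pfeuffer, Vidal ( <xref ref-type="bibr" rid="R53">53</xref>) can be sketched as follows; the Pearson formulation, the 0.9 threshold and the function names are assumptions for illustration, not the authors' published parameters.

```python
import math

def pearson(a, b):
    # Pearson correlation between eye and target position traces
    # (assumes non-constant traces, i.e. a moving target).
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

def attending(eye_x, target_x, threshold=0.9):
    # Sample gaze for calibration only while the eye closely follows
    # the moving target (hypothetical threshold).
    return pearson(eye_x, target_x) >= threshold
```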
        <p>
          Pfeuffer, Vidal (
          <xref ref-type="bibr" rid="R53">53</xref>
          ) used a Tobii TX300 eye tracker to
test their calibration procedure and collected gaze data at
60 Hz. At a target speed of 5.8&#xB0;/s, it took 20 seconds to
complete the target trajectory and an accuracy of just under
0.6&#xB0; was achieved. The results were compared with
the 5-point calibration routine of Tobii, which took 19
seconds to complete and delivered an accuracy of &#8776;0.7&#xB0;.
        </p>
        <p>
          Celebi, Kim (
          <xref ref-type="bibr" rid="R54">54</xref>
) follow the approach of Pfeuffer,
Vidal (
          <xref ref-type="bibr" rid="R53">53</xref>
) but argue that an Archimedean spiral would
provide better spatial coverage of the stimulus plane with
little redundancy. At a linear velocity of 6.4&#xB0;/s, the
calibration procedure took 27 seconds during which 1600 data
points were collected. Upon testing 10 healthy adults on
an EyeLink 1000 eye tracker running at 500 Hz, their
approach delivered an average accuracy of 0.84&#xB0; compared
to 1.39&#xB0; with a standard 9-point calibration procedure
(which took 23 seconds to complete).
        </p>
        <p>
          Celebi, Kim (
          <xref ref-type="bibr" rid="R54">54</xref>
          ) stated explicitly that the goal of their
smooth pursuit approach towards calibration is to improve
the calibration for toddlers and children with or without
developmental disabilities although they did not test the
approach with such participants.
        </p>
        <p>
          Gredeb&#xE4;ck, Johnson (
          <xref ref-type="bibr" rid="R55">55</xref>
) describe a calibration
routine that makes use of a moving object to lure infants&#x2019; eyes
to 2 or 5 calibration targets, but they reported accuracy
according to the manufacturer&#x2019;s specifications of 0.5&#xB0; - a
value which has been computed with a conventional
calibration routine under ideal circumstances and with
cooperating adult participants.
        </p>
      </sec>
      <sec id="s4c">
        <title>Smooth pursuit calibration for DC participants</title>
        <p>
Participants who struggle to maintain concentration on
tedious tasks can be cognitively stimulated by indicating
or counting the number of transitions of the target from one
stimulus to another (
          <xref ref-type="bibr" rid="R33">33</xref>
          ). In this study, participants were
requested to follow a grey disk of 1.5&#xB0; diameter on a white
background (Figure 1) that contained a coloured dot (0.2&#xB0;)
in the centre. The dot changed colour in cycles of blue (2
seconds) and red (500 ms) and the participants were then
asked to say the word "Red" aloud every time that the disk
changed to red.
        </p>
        <p>The target is initially displayed statically in the top left
corner and the participant can be prepared as to the
direction and nature of the motion that could be expected. The
experimenter can initiate movement with a button as soon
as the participant is ready. Three alternative trajectories
were tested with the target moving along an even
horizontal path (<xref ref-type="fig" rid="fig01">Figure 1a</xref>), a wavy horizontal path (<xref ref-type="fig" rid="fig02">Figure 1b</xref>) and a vertical path (<xref ref-type="fig" rid="fig03">Figure 1c</xref>).</p>

	  
<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1a</label>
					<caption>
<p>Trajectory of moving target: even horizontal path</p>
						</caption>
					<graphic id="graph01" xlink:href="jemr-10-04-a-figure-01.png"/>
				</fig>	


<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 1b</label>
					<caption>
<p>Trajectory of moving target: wavy horizontal path</p>
						</caption>
					<graphic id="graph02" xlink:href="jemr-10-04-a-figure-02.png"/>
				</fig>	

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 1c</label>
					<caption>
<p>Trajectory of moving target: vertical path</p>
						</caption>
					<graphic id="graph03" xlink:href="jemr-10-04-a-figure-03.png"/>
				</fig>	




        <p>
          Depending on the speed of movement, the interval
between windows and the trajectory, a large number of
targets can be extracted from the continuous gaze data. In the
procedure of Pfeuffer, Vidal (
          <xref ref-type="bibr" rid="R53">53</xref>
          ), gaze data is collected
whenever a participant attends to the moving target. In our
approach, gaze samples (or more specifically, pupil-glint
vectors for each eye) are captured for very short windows
(100 ms) at intervals of 500 ms. When data is not available
at a specific interval, the point is ignored. This means that,
at a framerate of 200 Hz, 20 samples were recorded within
a 100 ms window. At a velocity of 6.65&#xB0;/s (gaze distance
700 mm, 300 px/s on a 19.5", 1600&#xD7;900 screen), the
samples would span 0.665&#xB0; in the direction of movement.
        </p>
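The arithmetic of this sampling scheme can be checked with a small helper (an illustrative sketch; the function name is an assumption):

```python
def window_stats(frame_rate_hz, window_ms, interval_ms, velocity_deg_s, duration_s):
    # Samples captured per window at the given frame rate, the visual
    # angle those samples span in the direction of movement, and the
    # number of calibration windows in the full trajectory.
    samples_per_window = int(frame_rate_hz * window_ms / 1000)
    span_deg = velocity_deg_s * window_ms / 1000
    n_windows = int(duration_s * 1000 / interval_ms)
    return samples_per_window, span_deg, n_windows
```

At 200 Hz, with 100 ms windows every 500 ms at 6.65&#xB0;/s, this reproduces the figures in the text: 20 samples per window spanning 0.665&#xB0;, and 76 windows in 38 s.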
        <p>The radius of curvature was set so that the horizontal
and vertical trajectories would cover 6 and 9 distinct Y and
X coordinates respectively (cf Figure 1). This resulted in
76 targets (in 38 s) being captured for both the even and
wavy horizontal movements and 66 targets (in 33 s) for the
vertical movement.</p>
        <p>Since the eyes move smoothly to follow the target, it
can be expected that a convex hull around the sample
points would be elongated along the direction of
movement. The samples within every window were sorted
according to the x and y dimensions of the pupil-glint vectors
and only the intersection of the centre 80% of samples
around the median in each dimension is retained.</p>
        <p>In contrast with a standard 5-point or 9-point
calibration procedure where all points are needed for the
regression, the multitude of points that are available with this
procedure allows the removal of points where participants
blinked or where their attention was distracted. All
windows for which the dispersion, Max(maxX-minX,
maxY-minY), of the contained samples is above 5&#xB0; are also
removed.
samples, the average location and the average pupil-glint
vector are calculated.</p>
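A sketch of the dispersion filter and per-window averaging described above (assumed helper names; units in degrees):

```python
def accept_window(samples, max_dispersion_deg=5.0):
    # Reject windows whose samples spread more than the threshold in
    # either dimension: Max(maxX - minX, maxY - minY) > 5 degrees.
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    dispersion = max(max(xs) - min(xs), max(ys) - min(ys))
    return dispersion <= max_dispersion_deg

def window_centroid(samples):
    # Average location (or average pupil-glint vector) for a retained window.
    n = len(samples)
    return (sum(s[0] for s in samples) / n, sum(s[1] for s in samples) / n)
```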
        <p>
          From here on, the procedure as explained in Blignaut
(
          <xref ref-type="bibr" rid="R2">2</xref>
          ) is followed. The gaze data windows represent
calibration points at known locations and are used to determine a
gaze mapping polynomial set per participant. Regression
coefficients are recalculated in real-time &#x2013; based on a
subset of calibration points in the region of the current gaze.
Real-time localized corrections are done that are based on
calibration targets in the same region. See Blignaut (
          <xref ref-type="bibr" rid="R2">2</xref>
          ) for
a detailed discussion of the procedure.
        </p>
      </sec>
    </sec>
    <sec id="s5">
      <title>Accuracy of Smooth Pursuit Calibration</title>
      <p>In this section, the accuracy of the approach is
validated based on a comparison with a standard calibration
procedure using healthy and cooperating adult
participants. The applicability of the approach for
difficult-to-calibrate participants will be addressed in the next section.</p>
      <sec id="s5a">
        <title>Equipment</title>
        <p>For this study, an eye tracker with two infrared
illuminators, 480 mm apart, and the UI-1550LE camera from
IDS Imaging (https://en.ids-imaging.com) was assembled.
All recordings were made at a framerate of 200 Hz.</p>
        <p>Every frame that is captured by the eye camera was
analysed and the centres of the pupils and the corneal
reflections (glints) were identified. A regression-based approach
was followed to map the pupil-glint vector to a point of
regard in display coordinates. The regression coefficients
are determined through a calibration process.</p>
      </sec>
      <sec id="s5b">
        <title>Method</title>
        <p>
Seventeen healthy and cooperating adult participants
were recruited through convenience sampling and
presented with four calibration routines: a target moving along
an even horizontal path, a target moving along a wavy
horizontal path, and a target moving vertically (cf
Figure 1). The 45-dots routine as proposed in a previous
study (
          <xref ref-type="bibr" rid="R2">2</xref>
          ) was presented as the fourth routine, for comparison purposes. Each
procedure was executed only once per participant.
        </p>
        <p>After every routine, a 7&#xD7;4 grid of dots was displayed
to determine the accuracy of the procedure. As with the 45
dots, the 28 dots appeared in random order to prevent
participants from pre-empting the position of the next dot and
looking away prematurely. The regression coefficients
determined in the preceding calibration routine were used to
map the gaze data to screen coordinates. The accuracy for
a specific participant was calculated as the average offset
between the known locations and the reported gaze
coordinates across the 28 dots. The performance of a
calibration routine is expressed as the average accuracy over all
participants.</p>
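<p>The accuracy computation described above amounts to a simple average of Euclidean offsets. The sketch below assumes that both the known dot locations and the reported gaze coordinates are expressed in degrees of visual angle.</p>

```python
import numpy as np

def participant_accuracy(known_deg, gaze_deg):
    # Average offset (degrees) between the known dot locations and the
    # reported gaze coordinates, across all validation dots.
    offsets = np.linalg.norm(np.asarray(known_deg) - np.asarray(gaze_deg), axis=1)
    return float(offsets.mean())

def routine_performance(per_participant_accuracies):
    # Performance of a calibration routine: the average accuracy
    # over all participants.
    return float(np.mean(per_participant_accuracies))
```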
      </sec>
      <sec id="s5c">
        <title>Recording of calibration points</title>
        <p> <xref ref-type="fig" rid="fig04">Figure 2</xref> shows the calibration points that were
recorded for a specific participant while the target was
moving along an even horizontal path. The mapped gaze
coordinates of the sample data are enclosed by convex hulls &#x2013;
green for the left eye and red for the right eye.</p>


<fig id="fig04" fig-type="figure" position="float">
					<label>Figure 2</label>
					<caption>
						<p>Calibration points with accompanying gaze data samples as captured with a horizontally moving target. Missing points are marked with circles.</p>
						</caption>
					<graphic id="graph04" xlink:href="jemr-10-04-a-figure-04.png"/>
				</fig>	





        <p>In the example presented in Figure 2, five of the
calibration windows did not contain enough sample data &#x2013;
probably due to blinks. Table 1 shows that for routines that
involve a moving target, on average between 2 and 4
calibration windows were lost in this way. Since there are more
than enough other points to be used in the subsequent
regression, and because the lost points are seldom at
successive locations, this does not pose a problem.</p>
<table-wrap id="t1" position="anchor">
					<label>Table 1</label>
					<caption>
						<p>Average number of points (across participants) with enough samples and with mapped gaze coordinates within 1&#x0B0; of the target per calibration routine (SD: Standard deviation)</p>
					</caption>
<table frame="hsides" rules="groups" cellpadding="3">
<tbody>
          <tr>
          <td rowspan="1" colspan="1"></td>
				<td rowspan="1" colspan="1"></td>
			
            <td align="center" rowspan="1" colspan="3">Points with enough samples</td>
            
            <td align="center" rowspan="1" colspan="4">Points within 1&#x0B0; of target</td>
			    
          </tr>
		   
          <tr>
           <td rowspan="1" colspan="1">Routine</td>
            <td rowspan="1" colspan="1">Possible points</td> 
            <td rowspan="1" colspan="1">Avg</td>
            <td rowspan="1" colspan="1">SD  </td>
            <td rowspan="1" colspan="1">%</td>
            <td rowspan="1" colspan="1"> </td>
            <td rowspan="1" colspan="1">Avg</td>
            <td rowspan="1" colspan="1">SD  </td>
            <td rowspan="1" colspan="1">%  </td>
          </tr>
		   </tbody><tbody>
		  
          <tr>
            <td rowspan="1" colspan="1">45 dots</td>
            <td rowspan="1" colspan="1">45</td>
            <td rowspan="1" colspan="1">45</td>
            <td rowspan="1" colspan="1">0</td>
            <td rowspan="1" colspan="1">100</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">39.3</td>
            <td rowspan="1" colspan="1">10.0</td>
            <td rowspan="1" colspan="1">87.3</td>
          </tr> </tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">Even hor</td>
            <td rowspan="1" colspan="1">76</td>
            <td rowspan="1" colspan="1">72.5</td>
            <td rowspan="1" colspan="1">6.1</td>
            <td rowspan="1" colspan="1">95.4</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">69.1</td>
            <td rowspan="1" colspan="1">8.2</td>
            <td rowspan="1" colspan="1">90.9</td>
          </tr> </tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">Wavy hor</td>
            <td rowspan="1" colspan="1">76</td>
            <td rowspan="1" colspan="1">74.1</td>
            <td rowspan="1" colspan="1">2.9</td>
            <td rowspan="1" colspan="1">97.5</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">69.7</td>
            <td rowspan="1" colspan="1">7.1</td>
            <td rowspan="1" colspan="1">91.6</td>
          </tr> </tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">Vertical</td>
            <td rowspan="1" colspan="1">66</td>
            <td rowspan="1" colspan="1">63.9</td>
            <td rowspan="1" colspan="1">3.3</td>
            <td rowspan="1" colspan="1">96.9</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">60.9</td>
            <td rowspan="1" colspan="1">6.0</td>
            <td rowspan="1" colspan="1">92.3</td>
          </tr> 
		  </tbody>
        </table>
		</table-wrap>






        <p>Initially, the set of calibration points was also used as
validation targets and the offsets between the calibration
points and the mapped gaze coordinates were calculated.
All points with offsets larger than 1.0&#xB0; were then removed
from the set of calibration targets and the regression
procedure was repeated. The remaining calibration points are
shown in blue in Figure 2, while black dots indicate points
that were excluded from real-time interpolation. Table 1 also
shows the final number of points that were used to
determine the polynomial coefficients through regression.</p>
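<p>This refinement step can be sketched generically: fit once on all points, drop the points whose mapped coordinates fall more than 1.0&#xB0; from their own targets, and refit on the survivors. The <monospace>fit</monospace> and <monospace>predict</monospace> callables are placeholders for the regression described above.</p>

```python
import numpy as np

def refine(features, targets, fit, predict, threshold_deg=1.0):
    # First pass: regression over all calibration points.
    coeffs = fit(features, targets)
    # Offset of each point's mapped coordinates from its own target.
    offsets = np.linalg.norm(predict(features, coeffs) - targets, axis=1)
    # Keep only points within the threshold and repeat the regression.
    keep = ~(offsets > threshold_deg)
    return fit(features[keep], targets[keep]), keep
```

<p>A single refinement pass is performed, as in the text; the boolean mask that is returned identifies the retained points.</p>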

 </sec>
 
 
 
 
 
 
      <sec id="s5d">
        <title>Validation results</title>
        <p> <xref ref-type="fig" rid="fig05">Figure 3</xref> shows the validation points for a calibration that was done with a target moving along an even horizontal path. The average of all samples within a window is shown for the left and right eyes. A + indicates the average position between the left and right eyes.</p>



<fig id="fig05" fig-type="figure" position="float">
					<label>Figure 3</label>
					<caption>
						<p>Validation points with left (green) and right (red) eye averages of samples per point. The average between eyes is indicated with a blue +. A point where the participant was distracted is also shown.</p>
						</caption>
					<graphic id="graph05" xlink:href="jemr-10-04-a-figure-05.png"/>
				</fig>	



        <p>The example in Figure 3 was specifically selected to
illustrate the occurrence of outliers. These outliers may
occur if the participant loses concentration or is distracted by
external stimuli. Sometimes (some of) the samples are
captured during a blink, in which case the samples for the left
and right eyes appear to be disconnected. Validation points
were excluded from the calculation of average offset if the
offset was larger than 3&#xB0; or if the mapped gaze coordinates
for the two eyes were more than 3&#xB0; apart. These thresholds
were set large enough not to exclude valid data but small
enough to ensure that unwanted gaze behaviour was excluded.
Table 2 shows the average number of validation points that
were included for each of the calibration routines.</p>
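<p>These two exclusion criteria can be sketched as a simple filter. Whether the offset is measured from the binocular average (the blue + in Figure 3) is an assumption of this sketch.</p>

```python
import numpy as np

def valid_validation_points(targets, left_gaze, right_gaze,
                            max_offset_deg=3.0, max_disparity_deg=3.0):
    # A validation point is kept only if the binocular average lies
    # within max_offset_deg of its target AND the two eyes agree to
    # within max_disparity_deg (both thresholds in degrees).
    targets = np.asarray(targets)
    left_gaze, right_gaze = np.asarray(left_gaze), np.asarray(right_gaze)
    avg = (left_gaze + right_gaze) / 2.0
    offset_ok = ~(np.linalg.norm(avg - targets, axis=1) > max_offset_deg)
    disparity_ok = ~(np.linalg.norm(left_gaze - right_gaze, axis=1)
                     > max_disparity_deg)
    return offset_ok & disparity_ok
```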
      
	  <p>Table 2 also shows the average error across the 28
validation points and 17 participants per calibration routine.
The column of interest for comparing the four calibration
routines is boldfaced.
<xref ref-type="fig" rid="fig06">Figure 4</xref>
provides a visualisation of the same results. The vertical
bars denote the 95% confidence intervals of the means.

<table-wrap id="t2" position="anchor">
					<label>Table 2</label>
					<caption>
						<p>Average number of validation points that was included and the
average error (over participants and validation targets) for each
of the calibration routines. (SD: Standard deviation, SEM:
Standard error of the mean)</p>
					</caption>
	
		<table frame="hsides" rules="groups" cellpadding="3">
		<tbody>
	
          <tr>            
			
				<td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="4">Number of points</td>
            <td rowspan="1" colspan="1"> </td>
            <td rowspan="1" colspan="5">Error (degrees)</td>
          </tr></tbody> <tbody>
          <tr>
			<td rowspan="1" colspan="1">Routine</td>
            <td rowspan="1" colspan="1">Min</td>
            <td rowspan="1" colspan="1">Max</td>
            <td rowspan="1" colspan="1">Avg</td>
            <td rowspan="1" colspan="1">SD</td>
            <td rowspan="1" colspan="1"> </td>
            <td rowspan="1" colspan="1">Min</td>
            <td rowspan="1" colspan="1">Max</td>
            <td rowspan="1" colspan="1">Avg</td>
            <td rowspan="1" colspan="1">SD</td>
            <td rowspan="1" colspan="1">SEM</td>
          </tr></tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">45 dots</td>
            <td rowspan="1" colspan="1">14</td>
            <td rowspan="1" colspan="1">28</td>
            <td rowspan="1" colspan="1">25.7</td>
            <td rowspan="1" colspan="1">4.4</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">0.31</td>
            <td rowspan="1" colspan="1">0.65</td>
            <td rowspan="1" colspan="1">0.47</td>
            <td rowspan="1" colspan="1">0.10</td>
            <td rowspan="1" colspan="1">0.024</td>
          </tr></tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">Even hor</td>
            <td rowspan="1" colspan="1">24</td>
            <td rowspan="1" colspan="1">28</td>
            <td rowspan="1" colspan="1">27.1</td>
            <td rowspan="1" colspan="1">1.2</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">0.41</td>
            <td rowspan="1" colspan="1">0.68</td>
            <td rowspan="1" colspan="1">0.53</td>
            <td rowspan="1" colspan="1">0.08</td>
            <td rowspan="1" colspan="1">0.019</td>
          </tr></tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">Wavy hor</td>
            <td rowspan="1" colspan="1">19</td>
            <td rowspan="1" colspan="1">28</td>
            <td rowspan="1" colspan="1">26.6</td>
            <td rowspan="1" colspan="1">2.3</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">0.39</td>
            <td rowspan="1" colspan="1">1.03</td>
            <td rowspan="1" colspan="1">0.61</td>
            <td rowspan="1" colspan="1">0.17</td>
            <td rowspan="1" colspan="1">0.041</td>
          </tr></tbody>
		  
		  <tbody>	
          <tr>
            <td rowspan="1" colspan="1">Vertical</td>
            <td rowspan="1" colspan="1">23</td>
            <td rowspan="1" colspan="1">28</td>
            <td rowspan="1" colspan="1">26.6</td>
            <td rowspan="1" colspan="1">1.6</td>
            <td rowspan="1" colspan="1"></td>
            <td rowspan="1" colspan="1">0.45</td>
            <td rowspan="1" colspan="1">1.18</td>
            <td rowspan="1" colspan="1">0.68</td>
            <td rowspan="1" colspan="1">0.18</td>
            <td rowspan="1" colspan="1">0.043</td>
          </tr> </tbody>
        </table>
      </table-wrap>




</p>


<fig id="fig06" fig-type="figure" position="float">
					<label>Figure 4</label>
					<caption>
						<p> Average error over participants and validation targets for four calibration routines. The vertical bars denote the 95% confidence intervals of the means</p>
						</caption>
					<graphic id="graph06" xlink:href="jemr-10-04-a-figure-06.png"/>
				</fig>	
    
        <p>
          A repeated measures analysis of variance (each
participant calibrated with four different routines) showed that
the calibration routine has a significant (&#x3B1; = .001) effect on
the magnitude of the error (F(3,48) = 9.74, p &lt; .001). Table
3 shows the results of Tukey&#x2019;s test for the honestly
significant difference between pairs of means. The differences
between the means for the 45-dots and vertical movement,
as well as the difference between 45-dots and horizontal
movement along a wavy path, were significant (&#x3B1; = .01).
Although the accuracy for the moving target along an even
horizontal path was worse than that of the 45-dots routine,
it was not significantly so (&#x3B1; = .05).
        </p>
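<p>For reference, the F-statistic of a one-way, within-participants ANOVA can be computed from sums of squares as sketched below. The sketch assumes a complete design with one error value per participant and routine; with 17 participants and 4 routines it yields the degrees of freedom (3, 48) reported above.</p>

```python
import numpy as np

def rm_anova(data):
    # data: 2-D array, rows = participants, columns = conditions
    # (here: the four calibration routines).  Returns (F, df1, df2).
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_total = ((data - grand) ** 2).sum()
    ss_subjects = k * ((data.mean(axis=1) - grand) ** 2).sum()
    ss_treat = n * ((data.mean(axis=0) - grand) ** 2).sum()
    # Error term: residual after removing subject and treatment effects.
    ss_error = ss_total - ss_subjects - ss_treat
    df_treat = k - 1
    df_error = (n - 1) * (k - 1)
    f_stat = (ss_treat / df_treat) / (ss_error / df_error)
    return f_stat, df_treat, df_error
```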
        <p>It can, therefore, be concluded that a moving target
along an even horizontal path has the potential to be used
as an alternative calibration routine. This will be tested in the
next section with a sample of difficult-to-calibrate
participants.

<table-wrap id="t3" position="anchor">
					<label>Table 3</label>
					<caption>
						<p>p-Values for the significance of the difference in error between pairs of means</p>
					</caption>
		<table frame="hsides" rules="groups" cellpadding="3">
		<tbody>
          <tr>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">Even hor</td>
            <td rowspan="1" colspan="1">Wavy hor</td>
            <td rowspan="1" colspan="1">Vertical</td>
          </tr> </tbody><tbody>
          <tr>
            <td rowspan="1" colspan="1">45 dots</td>
            <td rowspan="1" colspan="1">0.465</td>
            <td rowspan="1" colspan="1">0.005</td>
            <td rowspan="1" colspan="1">0.000</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Even hor</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">0.183</td>
            <td rowspan="1" colspan="1">0.005</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Wavy hor </td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">0.457</td>
          </tr></tbody>
        </table>
		    </table-wrap>

		</p>
</sec>
      <sec id="s6">
        <title>Applicability of Smooth Pursuit Calibration
for Difficult-to-Calibrate Participants</title>
      </sec>



      
      <sec id="s6a">
        <title>Equipment</title>
		<p>The same self-assembled eye tracker was used as in the
previous section.</p>
      </sec>
      <sec id="s6b">
        <title>Method</title>
        
        <p>A school for learners with special education needs
was visited and all learners from Grade 1 to Grade 3 (ages
6&#x2013;11) for whom parental permission was obtained
were tested. The school accommodates learners with
cerebral palsy, physical disabilities and/or learning disabilities.
The school has specially qualified remedial teachers as
well as a multi-disciplinary support structure that includes
psychologists, social workers, occupational therapists,
speech therapists, physiotherapists and a
professional nurse. The school follows the normal mainstream
syllabi, but there are no more than 10 learners in a class so
that teachers can provide specialised and individual
attention.</p>
        <p>Several lessons were learned in the process of
capturing data. Initially, learners were requested to follow the
moving target without any further instruction. It soon
became evident that they struggled to maintain focus on the
target for the duration of the trajectory. The target was then
programmed to change colour in cycles of blue (2 seconds)
and red (500 ms), and learners were instructed to call out
the word "Red" whenever the target changed to red. For the
procedures where dots were involved, every dot appeared
in a different colour and learners were instructed to call out
the colour of each dot.</p>
        <p>Although the system allows moderate head
movements, many learners showed excessive sideways and
back-and-forth head movements &#x2013; some of which were
involuntary. A chinrest was then used to maintain head position,
but it caused instability of the eyes whenever the
learners vocalised their response to a colour change of the
target. Finally, the learners were instructed to push their
foreheads against a barrier that was set such that a fixed
gaze distance of 700 mm was maintained.</p>
        <p>It was also realised that the sets of 45 dots for
calibration and 28 dots for validation were too exhausting, and
these were therefore limited to 23 calibration targets (in
rows of 5, 4, 5, 4 and 5 targets) and 15 validation targets
in a grid of 5&#xD7;3.</p>
        <p>Eventually, 24 participants were tested with the final
configuration of target movement, headrest and calibration
sets. The number of learners per grade and condition is
shown in Table 4. Note that some learners had more than
one condition.

	<table-wrap id="t4" position="anchor">
					<label>Table 4</label>
					<caption>
						<p>Number of learners per grade and condition
(ADD: Attention deficit disorder; ADHD: Attention deficit hyperactivity disorder; ASP: Asperger syndrome; DSL: Dyslexia; EPSY: Epilepsy; LD: Learning disability)</p>
					</caption>
	<table frame="hsides" rules="groups" cellpadding="3">
	<tbody>
          <tr>
            <td rowspan="1" colspan="1">Grade</td>
            <td rowspan="1" colspan="1">n</td>
            <td rowspan="1" colspan="1">Age</td>
            <td rowspan="1" colspan="1">ADD</td>
            <td rowspan="1" colspan="1">ADHD</td>
            <td rowspan="1" colspan="1">ASP</td>
            <td rowspan="1" colspan="1">DSL</td>
            <td rowspan="1" colspan="1">EPSY</td>
            <td rowspan="1" colspan="1">LD</td>
          </tr>
		   </tbody><tbody> 
          <tr>
            <td rowspan="1" colspan="1">1</td>
            <td rowspan="1" colspan="1">4</td>
            <td rowspan="1" colspan="1">6.75</td>
            <td rowspan="1" colspan="1">1</td>
            <td rowspan="1" colspan="1">3</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">1</td>
            <td rowspan="1" colspan="1">1</td>
            <td rowspan="1" colspan="1">1</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">2</td>
            <td rowspan="1" colspan="1">13</td>
            <td rowspan="1" colspan="1">8.62</td>
            <td rowspan="1" colspan="1">5</td>
            <td rowspan="1" colspan="1">2</td>
            <td rowspan="1" colspan="1">2</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">6</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">3</td>
            <td rowspan="1" colspan="1">7</td>
            <td rowspan="1" colspan="1">9.86</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">4</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">4</td>
          </tr></tbody>
        </table>
      </table-wrap>



</p>
<p>
The learners were presented with a moving target along
an even horizontal path (SP) (<italic>cf</italic> Figure 1) and the 23-dots
routine. After every routine, the 5&#xD7;3 grid of dots was
displayed to determine the accuracy of the procedure. As was
the case for the validation of accuracy with healthy adults,
the dots appeared in random order to prevent learners from
pre-empting the position of the next dot and looking away
prematurely. The performance of both the 23-dots and SP
calibration routines was expressed as the average accuracy
over all participants as determined through the 15-dot
validation routine.</p>
      </sec>
      <sec id="s6c">
        <title>Validation results</title>
        <p>
          As for the validation of accuracy with healthy adults,
validation points were excluded from the calculation of
average offset if the offset was larger than 3&#xB0; or if the mapped
gaze coordinates for the two eyes were more than 3&#xB0; apart.
Table 5 shows the average number of validation points that
were included for each of the calibration routines. A
repeated measures analysis of variance for the effect of
calibration routine (23-dots vs SP) on the number of valid
points showed that the SP routine leads to more reliable
data, as significantly fewer points had to be
discarded (F(1,47) = 47.7, p &lt; .001).
        </p>
        <p>
          Table 5 also shows the average error across the 15
validation points and 24 participants per calibration routine.
A repeated measures analysis of variance for the effect of
calibration routine (23-dots vs SP) on the accuracy of
tracking showed that the SP routine is significantly better
than a dots-based routine (F(1,47) = 12.57, p &lt; .001) for
difficult-to-calibrate participants.

<table-wrap id="t5" position="anchor">
					<label>Table 5</label>
					<caption>
						<p>Average number of validation points that was included and the average error (over participants and validation targets) for each of the calibration routines. (SD: Standard deviation, SEM: Standard error of the mean)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
					<tbody>
          <tr>
                <td rowspan="1" colspan="1"> </td> 
            <td rowspan="1" colspan="4">Number of points</td>
            <td rowspan="1" colspan="1"> </td>
            <td rowspan="1" colspan="5">Error (degrees)</td>
          </tr> </tbody><tbody> 
          <tr>
           <td rowspan="1" colspan="1">Routine</td> 	
            <td rowspan="1" colspan="1">Min</td>
            <td rowspan="1" colspan="1">Max</td>
            <td rowspan="1" colspan="1">Avg</td>
            <td rowspan="1" colspan="1">SD</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">Min</td>
            <td rowspan="1" colspan="1">Max</td>
            <td rowspan="1" colspan="1">Avg</td>
            <td rowspan="1" colspan="1">SD</td>
            <td rowspan="1" colspan="1">SEM</td>
          </tr> </tbody><tbody> 
          <tr>
            <td rowspan="1" colspan="1">23 dots</td>
            <td rowspan="1" colspan="1">2</td>
            <td rowspan="1" colspan="1">15</td>
            <td rowspan="1" colspan="1">10.6</td>
            <td rowspan="1" colspan="1">3.34</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">0.60</td>
            <td rowspan="1" colspan="1">2.49</td>
            <td rowspan="1" colspan="1">1.15</td>
            <td rowspan="1" colspan="1">0.39</td>
            <td rowspan="1" colspan="1">0.080</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Even hor</td>
            <td rowspan="1" colspan="1">9</td>
            <td rowspan="1" colspan="1">15</td>
            <td rowspan="1" colspan="1">13.7</td>
            <td rowspan="1" colspan="1">1.67</td>
            <td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1">0.53</td>
            <td rowspan="1" colspan="1">1.77</td>
            <td rowspan="1" colspan="1">0.94</td>
            <td rowspan="1" colspan="1">1.15</td>
            <td rowspan="1" colspan="1">0.235</td>
          </tr></tbody> 
        </table>
      </table-wrap>
        </p>
      </sec>
    </sec>
    <sec id="s7">
      <title>Summary</title>
      <p>
        The conventional way of calibrating remote
video-based eye trackers is through presentation of a series of
gaze targets at known positions while participants are
expected to watch the targets. For regression-based mapping
of eye features to gaze coordinates, more gaze targets
normally mean better (more accurate) calibration.
Unfortunately, more gaze targets also require more mental effort
from participants. Through informal observations, it was
realised that, although the 45-dots routine of a previous
study (
        <xref ref-type="bibr" rid="R2">2</xref>
        ) provided very good accuracy, it demanded too
much mental effort from participants who struggle to
maintain concentration.
      </p>
      <p>Depending on the type of experiment, better accuracy
might be required than can be achieved with calibration-free
or auto-calibrating systems. The calibration
procedures that are normally used for infants, toddlers and
autistic children also do not suffice, since they are not
accurate enough and the reliability of research results might be
jeopardised.</p>
      <p>
        In this paper, the use of smooth pursuit with a target
moving across the display at a constant speed is proposed.
This approach is motivated by the fact that attention to a
moving target can be maintained more easily &#x2013; especially
if accompanied by a concurrent and related task such as
monitoring some changing characteristic of the
target (
        <xref ref-type="bibr" rid="R33 R34">33, 34</xref>
        ).
      </p>
      <p>While the participant is following the target, gaze data
is captured at regular intervals and many calibration targets
are saved that can be used in subsequent regression and
interpolation. Because of the abundance of points, the
procedure allows the exclusion of points of dubious quality.
Depending on the speed of movement and the trajectory,
the procedure could take anything between 30 s and 60 s
to complete.</p>
      <p>Validation of the proposed routine was done in two
phases: the accuracy of the routine was validated by
comparing its performance with that of a standard calibration
procedure for healthy and cooperating adults. Thereafter,
the applicability of the approach for participants who are
normally difficult to calibrate was validated by applying it
to a group of early primary school children with one or
more cognitive disorders.</p>
      <p>
        A repeated measures, within-participants analysis of
variance showed that the accuracy that can be
attained through calibration with a moving target along an
even horizontal path is not significantly worse than the
accuracy that can be attained with a standard method of
watching dots. Accuracies of around 0.5&#xB0; were
obtained for both routines for a group of seventeen adults,
which is comparable with the 0.6&#xB0; attained by Pfeuffer,
Vidal (
        <xref ref-type="bibr" rid="R53">53</xref>
        ) and better than the 0.84&#xB0; attained by Celebi,
Kim (
        <xref ref-type="bibr" rid="R54">54</xref>
        ).
      </p>
      <p>
        For a group of young children with various forms of
cognitive disorders such as ADD, ADHD and learning
disabilities, smooth pursuit calibration proved to be superior
to the standard routine. For this group, an average accuracy
of below 1&#xB0; could be achieved with SP, while this was not
the case with the standard dots routine. This is a significant
improvement on the 1.5&#xB0;&#x2013;2.5&#xB0; errors that can be attained
by calibration-free or auto-calibrating routines such as
those of Huang, Kwok (
        <xref ref-type="bibr" rid="R13">13</xref>
        ) and Swirski and Dodgson (
        <xref ref-type="bibr" rid="R14">14</xref>
        ).
      </p>
    </sec>
    <sec id="s8">
      <title>Future Research</title>
      <p>
        Since smooth pursuit ability develops until the age of
adolescence (
        <xref ref-type="bibr" rid="R30">30</xref>
        ), one can expect that older children will
benefit even more from the smooth pursuit approach. This
needs to be investigated.
      </p>
      <p>
        Furthermore, the smooth pursuit approach was tested
above for children with ADD, ADHD and learning
disabilities. No children with autism were tested, and it remains
to be seen whether the approach will work for such conditions,
since it is known that smooth pursuit is impaired in autism
and similar conditions (
        <xref ref-type="bibr" rid="R17 R47">17, 47</xref>
        ).
      </p>
    </sec>
  </body>
  
  
  
  
  <back>
   <ref-list>
<ref id="R7"><label>7</label><mixed-citation publication-type="book-chapter" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Abe</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Ohi</surname>, <given-names>S.</given-names></string-name>, &amp; <string-name><surname>Ohyama</surname>, <given-names>M.</given-names></string-name></person-group> (<year>2007</year>). <chapter-title>An eye-gaze input system using information on eye movement history</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>C.</given-names> <surname>Stephanidis</surname></string-name> (<role>Ed.</role>),</person-group> <source>Universal Access in HCI, Part II. HCII2007, LNCS 4555, 721-729</source>. <publisher-loc>Berlin, Heidelberg</publisher-loc>: <publisher-name>Springer Berlin Heidelberg</publisher-name>. <pub-id pub-id-type="doi">10.1007/978-3-540-73281-5_79</pub-id></mixed-citation></ref>
<ref id="R30"><label>30</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Accardo</surname>, <given-names>A. P.</given-names></string-name>, <string-name><surname>Pensiero</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Da Pozzo</surname>, <given-names>S.</given-names></string-name>, &amp; <string-name><surname>Perissutti</surname>, <given-names>P.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Characteristics of horizontal smooth pursuit eye movements to sinusoidal stimulation in children of primary school age.</article-title> <source>Vision Research</source>, <volume>35</volume>(<issue>4</issue>), <fpage>539</fpage>–<lpage>548</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(94)00145-C</pub-id><pub-id pub-id-type="pmid">7900294</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R15"><label>15</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Aslin</surname>, <given-names>R. N.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Infant eyes: A window on cognitive development.</article-title> <source>Infancy</source>, <volume>17</volume>(<issue>1</issue>), <fpage>126</fpage>–<lpage>140</lpage>. <pub-id pub-id-type="doi">10.1111/j.1532-7078.2011.00097.x</pub-id><pub-id pub-id-type="pmid">22267956</pub-id><issn>1525-0008</issn></mixed-citation></ref>
<ref id="R40"><label>40</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Becker</surname>, <given-names>W.</given-names></string-name>, &amp; <string-name><surname>Fuchs</surname>, <given-names>A. F.</given-names></string-name></person-group> (<year>1985</year>). <article-title>Prediction in the oculomotor system: Smooth pursuit during transient disappearance of a visual target.</article-title> <source>Experimental Brain Research</source>, <volume>57</volume>(<issue>3</issue>), <fpage>562</fpage>–<lpage>575</lpage>. <pub-id pub-id-type="doi">10.1007/BF00237843</pub-id><pub-id pub-id-type="pmid">3979498</pub-id><issn>0014-4819</issn></mixed-citation></ref>
<ref id="R2"><label>2</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Blignaut</surname>, <given-names>P. J.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Idiosyncratic feature-based gaze mapping.</article-title> <source>Journal of Eye Movement Research</source>, <volume>9</volume>(<issue>3</issue>), <fpage>1</fpage>–<lpage>17</lpage>. <pub-id pub-id-type="doi">10.16910/jemr.9.3.2</pub-id><issn>1995-8692</issn></mixed-citation></ref>
<ref id="R12"><label>12</label><mixed-citation publication-type="book" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Borah</surname>, <given-names>J.</given-names></string-name></person-group> (<year>1998</year>). <source>Technology and application of head based control.</source> RTO Lecture series on Alternative Control Technologies, 7-8 October 1998, Brétigny, France and 14-15 October 1998, Ohio, USA.</mixed-citation></ref>
<ref id="R35"><label>35</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Brenner</surname>, <given-names>E.</given-names></string-name>, <string-name><surname>Smeets</surname>, <given-names>J. B.</given-names></string-name>, &amp; <string-name><surname>van den Berg</surname>, <given-names>A. V.</given-names></string-name></person-group> (<year>2001</year>). <article-title>Smooth eye movements and spatial localisation.</article-title> <source>Vision Research</source>, <volume>41</volume>(<issue>17</issue>), <fpage>2253</fpage>–<lpage>2259</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(01)00018-9</pub-id><pub-id pub-id-type="pmid">11448717</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R54"><label>54</label><mixed-citation publication-type="conference" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Celebi</surname>, <given-names>F. M.</given-names></string-name>, <string-name><surname>Kim</surname>, <given-names>E. S.</given-names></string-name>, <string-name><surname>Wang</surname>, <given-names>Q.</given-names></string-name>, <string-name><surname>Wall</surname>, <given-names>C. A.</given-names></string-name>, &amp; <string-name><surname>Shic</surname>, <given-names>F.</given-names></string-name></person-group> (<year>2014</year>). <article-title>A smooth pursuit calibration technique.</article-title> <source>Eye Tracking Research and Applications (ETRA), 26-28 March 2014</source>, <conf-loc>Safety Harbor, Florida</conf-loc>.</mixed-citation></ref>
<ref id="R19"><label>19</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Corbetta</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Guan</surname>, <given-names>Y.</given-names></string-name>, &amp; <string-name><surname>Williams</surname>, <given-names>J. L.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Infant eye-tracking in the context of goal-directed actions.</article-title> <source>Infancy</source>, <volume>17</volume>(<issue>1</issue>), <fpage>102</fpage>–<lpage>125</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1111/j.15327078.2011.0093.x</pub-id> <pub-id pub-id-type="doi">10.1111/j.1532-7078.2011.00093.x</pub-id><pub-id pub-id-type="pmid">22563297</pub-id><issn>1525-0008</issn></mixed-citation></ref>
<ref id="R1"><label>1</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Crane</surname>, <given-names>H. D.</given-names></string-name>, &amp; <string-name><surname>Steele</surname>, <given-names>C. M.</given-names></string-name></person-group> (<year>1985</year>). <article-title>Generation-V dual-Purkinje-image eyetracker.</article-title> <source>Applied Optics</source>, <volume>24</volume>(<issue>4</issue>), <fpage>527</fpage>–<lpage>537</lpage>. <pub-id pub-id-type="doi">10.1364/AO.24.000527</pub-id><pub-id pub-id-type="pmid">18216982</pub-id><issn>0003-6935</issn></mixed-citation></ref>
<ref id="R42"><label>42</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Engel</surname>, <given-names>K. C.</given-names></string-name>, <string-name><surname>Anderson</surname>, <given-names>J. H.</given-names></string-name>, &amp; <string-name><surname>Soechting</surname>, <given-names>J. F.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Similarity in the response of smooth pursuit and manual tracking to a change in the direction of target motion.</article-title> <source>Journal of Neurophysiology</source>, <volume>84</volume>(<issue>3</issue>), <fpage>1149</fpage>–<lpage>1156</lpage>. <pub-id pub-id-type="doi">10.1152/jn.2000.84.3.1149</pub-id><pub-id pub-id-type="pmid">10979990</pub-id><issn>0022-3077</issn></mixed-citation></ref>
<ref id="R51"><label>51</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Fletcher</surname>, <given-names>W. A.</given-names></string-name>, &amp; <string-name><surname>Sharpe</surname>, <given-names>J. A.</given-names></string-name></person-group> (<year>1988</year>). <article-title>Smooth pursuit dysfunction in Alzheimer’s disease.</article-title> <source>Neurology</source>, <volume>38</volume>(<issue>2</issue>), <fpage>272</fpage>–<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1212/WNL.38.2.272</pub-id><pub-id pub-id-type="pmid">3340292</pub-id><issn>0028-3878</issn></mixed-citation></ref>
<ref id="R18"><label>18</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Franchak</surname>, <given-names>J. M.</given-names></string-name>, <string-name><surname>Kretch</surname>, <given-names>K. S.</given-names></string-name>, <string-name><surname>Soska</surname>, <given-names>K. C.</given-names></string-name>, &amp; <string-name><surname>Adolph</surname>, <given-names>K. E.</given-names></string-name></person-group> (<year>2011</year>). <article-title>Head-mounted eye tracking: A new method to describe infant looking.</article-title> <source>Child Development</source>, <volume>82</volume>(<issue>6</issue>), <fpage>1738</fpage>–<lpage>1750</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1111/j.14678624.2011.01670.x</pub-id> <pub-id pub-id-type="doi">10.1111/j.1467-8624.2011.01670.x</pub-id><pub-id pub-id-type="pmid">22023310</pub-id><issn>0009-3920</issn></mixed-citation></ref>
<ref id="R52"><label>52</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Fried</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Tsitsiashvili</surname>, <given-names>E.</given-names></string-name>, <string-name><surname>Bonneh</surname>, <given-names>Y. S.</given-names></string-name>, <string-name><surname>Sterkin</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Wygnanski-Jaffe</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Epstein</surname>, <given-names>T.</given-names></string-name>, &amp; <string-name><surname>Polat</surname>, <given-names>U.</given-names></string-name></person-group> (<year>2014</year>). <article-title>ADHD subjects fail to suppress eye blinks and microsaccades while anticipating visual stimuli but recover with medication.</article-title> <source>Vision Research</source>, <volume>101</volume>, <fpage>62</fpage>–<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2014.05.004</pub-id><pub-id pub-id-type="pmid">24863585</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R48"><label>48</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Gooding</surname>, <given-names>D. C.</given-names></string-name>, <string-name><surname>Miller</surname>, <given-names>M. D.</given-names></string-name>, &amp; <string-name><surname>Kwapil</surname>, <given-names>T. R.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Smooth pursuit eye tracking and visual fixation in psychosis-prone individuals.</article-title> <source>Psychiatry Research</source>, <volume>93</volume>(<issue>1</issue>), <fpage>41</fpage>–<lpage>54</lpage>. <pub-id pub-id-type="doi">10.1016/S0165-1781(00)00113-X</pub-id><pub-id pub-id-type="pmid">10699227</pub-id><issn>0165-1781</issn></mixed-citation></ref>
<ref id="R55"><label>55</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Gredebäck</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Johnson</surname>, <given-names>S.</given-names></string-name>, &amp; <string-name><surname>von Hofsten</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2010</year>). <article-title>Eye tracking in infancy research.</article-title> <source>Developmental Neuropsychology</source>, <volume>35</volume>(<issue>1</issue>), <fpage>1</fpage>–<lpage>19</lpage>. <pub-id pub-id-type="doi">10.1080/87565640903325758</pub-id><pub-id pub-id-type="pmid">20390589</pub-id><issn>8756-5641</issn></mixed-citation></ref>
<ref id="R4"><label>4</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Hansen</surname>, <given-names>D. W.</given-names></string-name>, &amp; <string-name><surname>Ji</surname>, <given-names>Q.</given-names></string-name></person-group> (<year>2010</year>). <article-title>In the eye of the beholder: A survey of models for eyes and gaze.</article-title> <source>IEEE Transactions on Pattern Analysis and Machine Intelligence</source>, <volume>32</volume>(<issue>3</issue>), <fpage>478</fpage>–<lpage>500</lpage>. <pub-id pub-id-type="doi">10.1109/TPAMI.2009.30</pub-id><pub-id pub-id-type="pmid">20075473</pub-id><issn>0162-8828</issn></mixed-citation></ref>
<ref id="R6"><label>6</label><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Holmqvist</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Nyström</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Andersson</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Dewhurst</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Jarodzka</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Van de Weijer</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2011</year>). <source>Eye tracking: A comprehensive guide to methods and measures</source>. <publisher-loc>London</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>.</mixed-citation></ref>
<ref id="R33"><label>33</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Holzman</surname>, <given-names>P. S.</given-names></string-name>, <string-name><surname>Levy</surname>, <given-names>D. L.</given-names></string-name>, &amp; <string-name><surname>Proctor</surname>, <given-names>L. R.</given-names></string-name></person-group> (<year>1976</year>). <article-title>Smooth pursuit eye movements, attention, and schizophrenia.</article-title> <source>Archives of General Psychiatry</source>, <volume>33</volume>(<issue>12</issue>), <fpage>1415</fpage>–<lpage>1420</lpage>. <pub-id pub-id-type="doi">10.1001/archpsyc.1976.01770120019001</pub-id><pub-id pub-id-type="pmid">999447</pub-id><issn>0003-990X</issn></mixed-citation></ref>
<ref id="R44"><label>44</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Holzman</surname>, <given-names>P. S.</given-names></string-name>, &amp; <string-name><surname>Levy</surname>, <given-names>D. L.</given-names></string-name></person-group> (<year>1977</year>). <article-title>Smooth pursuit eye movements and functional psychoses: A review.</article-title> <source>Schizophrenia Bulletin</source>, <volume>3</volume>(<issue>1</issue>), <fpage>15</fpage>–<lpage>27</lpage>. <pub-id pub-id-type="doi">10.1093/schbul/3.1.15</pub-id><pub-id pub-id-type="pmid">325640</pub-id><issn>0586-7614</issn></mixed-citation></ref>
<ref id="R5"><label>5</label><mixed-citation publication-type="journal" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Hoormann</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Jainta</surname>, <given-names>S.</given-names></string-name>, &amp; <string-name><surname>Jaschinski</surname>, <given-names>W.</given-names></string-name></person-group> (<year>2008</year>). <article-title>The effect of calibration errors on the accuracy of the eye movement recordings.</article-title> <source>Journal of Eye Movement Research</source>, <volume>1</volume>(<issue>2</issue>):3, <fpage>1</fpage>–<lpage>7</lpage>. <pub-id pub-id-type="doi">10.16910/jemr.1.2.3</pub-id></mixed-citation></ref>
<ref id="R13"><label>13</label><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Huang</surname>, <given-names>M.X.</given-names></string-name>, <string-name><surname>Kwok</surname>, <given-names>T.C.K.</given-names></string-name>, <string-name><surname>Ngai</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Chan</surname>, <given-names>S.C.F.</given-names></string-name>, &amp; <string-name><surname>Leong</surname>, <given-names>H.V.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Building a personalized, autocalibrating eye tracker from user interactions.</article-title> <source>CHI 2016</source>, San Jose, California, 7-12 May 2016. <pub-id pub-id-type="doi">10.1145/2858036.2858404</pub-id></mixed-citation></ref>
<ref id="R23"><label>23</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Hutton</surname>, <given-names>S. B.</given-names></string-name>, &amp; <string-name><surname>Tegally</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2005</year>). <article-title>The effects of dividing attention on smooth pursuit eye tracking.</article-title> <source>Experimental Brain Research</source>, <volume>163</volume>(<issue>3</issue>), <fpage>306</fpage>–<lpage>313</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-004-2171-z</pub-id><pub-id pub-id-type="pmid">15654587</pub-id><issn>0014-4819</issn></mixed-citation></ref>
<ref id="R34"><label>34</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Iacono</surname>, <given-names>W. G.</given-names></string-name>, &amp; <string-name><surname>Lykken</surname>, <given-names>D. T.</given-names></string-name></person-group> (<year>1979</year>). <article-title>Electro-oculographic recording and scoring of smooth pursuit and saccadic eye tracking: A parametric study using monozygotic twins.</article-title> <source>Psychophysiology</source>, <volume>16</volume>(<issue>2</issue>), <fpage>94</fpage>–<lpage>107</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-8986.1979.tb01451.x</pub-id><pub-id pub-id-type="pmid">570714</pub-id><issn>0048-5772</issn></mixed-citation></ref>
<ref id="R36"><label>36</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kerzel</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Souto</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>Ziegler</surname>, <given-names>N. E.</given-names></string-name></person-group> (<year>2008</year>). <article-title>Effects of attention shifts to stationary objects during steady-state smooth pursuit eye movements.</article-title> <source>Vision Research</source>, <volume>48</volume>(<issue>7</issue>), <fpage>958</fpage>–<lpage>969</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2008.01.015</pub-id><pub-id pub-id-type="pmid">18295816</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R8"><label>8</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kliegl</surname>, <given-names>R.</given-names></string-name>, &amp; <string-name><surname>Olson</surname>, <given-names>R. K.</given-names></string-name></person-group> (<year>1981</year>). <article-title>Reduction and calibration of eye monitor data.</article-title> <source>Behavior Research Methods and Instrumentation</source>, <volume>13</volume>(<issue>2</issue>), <fpage>107</fpage>–<lpage>111</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.3758/bf03207917</pub-id> <pub-id pub-id-type="doi">10.3758/BF03207917</pub-id><issn>0005-7878</issn></mixed-citation></ref>
<ref id="R39"><label>39</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kowler</surname>, <given-names>E.</given-names></string-name>, &amp; <string-name><surname>McKee</surname>, <given-names>S. P.</given-names></string-name></person-group> (<year>1987</year>). <article-title>Sensitivity of smooth eye movement to small differences in target velocity.</article-title> <source>Vision Research</source>, <volume>27</volume>(<issue>6</issue>), <fpage>993</fpage>–<lpage>1015</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(87)90014-9</pub-id><pub-id pub-id-type="pmid">3660658</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R46"><label>46</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Levin</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Luebke</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Zee</surname>, <given-names>D. S.</given-names></string-name>, <string-name><surname>Hain</surname>, <given-names>T. C.</given-names></string-name>, <string-name><surname>Robinson</surname>, <given-names>D. A.</given-names></string-name>, &amp; <string-name><surname>Holzman</surname>, <given-names>P. S.</given-names></string-name></person-group> (<year>1988</year>). <article-title>Smooth pursuit eye movements in schizophrenics: Quantitative measurements with the search-coil technique.</article-title> <source>Journal of Psychiatric Research</source>, <volume>22</volume>(<issue>3</issue>), <fpage>195</fpage>–<lpage>206</lpage>. <pub-id pub-id-type="doi">10.1016/0022-3956(88)90005-2</pub-id><pub-id pub-id-type="pmid">3225789</pub-id><issn>0022-3956</issn></mixed-citation></ref>
<ref id="R37"><label>37</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Lindner</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Schwarz</surname>, <given-names>U.</given-names></string-name>, &amp; <string-name><surname>Ilg</surname>, <given-names>U. J.</given-names></string-name></person-group> (<year>2001</year>). <article-title>Cancellation of self-induced retinal image motion during smooth pursuit eye movements.</article-title> <source>Vision Research</source>, <volume>41</volume>(<issue>13</issue>), <fpage>1685</fpage>–<lpage>1694</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(01)00050-5</pub-id><pub-id pub-id-type="pmid">11348650</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R24"><label>24</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Madelain</surname>, <given-names>L.</given-names></string-name>, <string-name><surname>Krauzlis</surname>, <given-names>R. J.</given-names></string-name>, &amp; <string-name><surname>Wallman</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2005</year>). <article-title>Spatial deployment of attention influences both saccadic and pursuit tracking.</article-title> <source>Vision Research</source>, <volume>45</volume>(<issue>20</issue>), <fpage>2685</fpage>–<lpage>2703</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2005.05.009</pub-id><pub-id pub-id-type="pmid">16005932</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R10"><label>10</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>McConkie</surname>, <given-names>G. W.</given-names></string-name></person-group> (<year>1981</year>). <article-title>Evaluating and reporting data quality in eye movement research.</article-title> <source>Behavior Research Methods and Instrumentation</source>, <volume>13</volume>(<issue>2</issue>), <fpage>97</fpage>–<lpage>106</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.3758/bf03207916</pub-id> <pub-id pub-id-type="doi">10.3758/BF03207916</pub-id><issn>0005-7878</issn></mixed-citation></ref>
<ref id="R27"><label>27</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Meyer</surname>, <given-names>C. H.</given-names></string-name>, <string-name><surname>Lasker</surname>, <given-names>A. G.</given-names></string-name>, &amp; <string-name><surname>Robinson</surname>, <given-names>D. A.</given-names></string-name></person-group> (<year>1985</year>). <article-title>The upper limit of human smooth pursuit velocity.</article-title> <source>Vision Research</source>, <volume>25</volume>(<issue>4</issue>), <fpage>561</fpage>–<lpage>563</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(85)90160-9</pub-id><pub-id pub-id-type="pmid">4060608</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R21"><label>21</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Nagel</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Sprenger</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Steinlechner</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Binkofski</surname>, <given-names>F.</given-names></string-name>, &amp; <string-name><surname>Lencer</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Altered velocity processing in schizophrenia during pursuit eye tracking.</article-title> <source>PLoS One</source>, <volume>7</volume>(<issue>6</issue>), <fpage>e38494</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0038494</pub-id><pub-id pub-id-type="pmid">22693639</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="R3"><label>3</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Nyström</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Andersson</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Holmqvist</surname>, <given-names>K.</given-names></string-name>, &amp; <string-name><surname>van de Weijer</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2013</year>). <article-title>The influence of calibration method and eye physiology on eyetracking data quality.</article-title> <source>Behavior Research Methods</source>, <volume>45</volume>(<issue>1</issue>), <fpage>272</fpage>–<lpage>288</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-012-0247-4</pub-id><pub-id pub-id-type="pmid">22956394</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="R45"><label>45</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>O’Driscoll</surname>, <given-names>G. A.</given-names></string-name>, &amp; <string-name><surname>Callahan</surname>, <given-names>B. L.</given-names></string-name></person-group> (<year>2008</year>). <article-title>Smooth pursuit in schizophrenia: A meta-analytic review of research since 1993.</article-title> <source>Brain and Cognition</source>, <volume>68</volume>(<issue>3</issue>), <fpage>359</fpage>–<lpage>370</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2008.08.023</pub-id><pub-id pub-id-type="pmid">18845372</pub-id><issn>0278-2626</issn></mixed-citation></ref>
<ref id="R49"><label>49</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>O’Driscoll</surname>, <given-names>G. A.</given-names></string-name>, <string-name><surname>Lenzenweger</surname>, <given-names>M. F.</given-names></string-name>, &amp; <string-name><surname>Holzman</surname>, <given-names>P. S.</given-names></string-name></person-group> (<year>1998</year>). <article-title>Antisaccades and smooth pursuit eye tracking and schizotypy.</article-title> <source>Archives of General Psychiatry</source>, <volume>55</volume>(<issue>9</issue>), <fpage>837</fpage>–<lpage>843</lpage>. <pub-id pub-id-type="doi">10.1001/archpsyc.55.9.837</pub-id><pub-id pub-id-type="pmid">9736011</pub-id><issn>0003-990X</issn></mixed-citation></ref>
<ref id="R53"><label>53</label><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Pfeuffer</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Vidal</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Turner</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Bulling</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Gellersen</surname>, <given-names>H.</given-names></string-name></person-group> (<year>2013</year>). <article-title>Pursuit calibration: Making gaze calibration less tedious and more flexible.</article-title> <source>Proceedings of the 26th annual ACM symposium on user interface software and technology</source>, <fpage>261</fpage>–<lpage>270</lpage>. <pub-id pub-id-type="doi">10.1145/2501988.2501998</pub-id></mixed-citation></ref>
<ref id="R17"><label>17</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Pierce</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Conant</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Hazin</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Stoner</surname>, <given-names>R.</given-names></string-name>, &amp; <string-name><surname>Desmond</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2011</year>). <article-title>Preference for geometric patterns early in life as a risk factor for autism.</article-title> <source>Archives of General Psychiatry</source>, <volume>68</volume>(<issue>1</issue>), <fpage>101</fpage>–<lpage>109</lpage>. <pub-id pub-id-type="doi">10.1001/archgenpsychiatry.2010.113</pub-id><pub-id pub-id-type="pmid">20819977</pub-id><issn>0003-990X</issn></mixed-citation></ref>
<ref id="R41"><label>41</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Pola</surname>, <given-names>J.</given-names></string-name>, &amp; <string-name><surname>Wyatt</surname>, <given-names>H. J.</given-names></string-name></person-group> (<year>1997</year>). <article-title>Offset dynamics of human smooth pursuit eye movements: Effects of target presence and subject attention.</article-title> <source>Vision Research</source>, <volume>37</volume>(<issue>18</issue>), <fpage>2579</fpage>–<lpage>2595</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(97)00058-8</pub-id><pub-id pub-id-type="pmid">9373690</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R38"><label>38</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Pola</surname>, <given-names>J.</given-names></string-name>, &amp; <string-name><surname>Wyatt</surname>, <given-names>H. J.</given-names></string-name></person-group> (<year>2001</year>). <article-title>The role of target position in smooth pursuit deceleration and termination.</article-title> <source>Vision Research</source>, <volume>41</volume>(<issue>5</issue>), <fpage>655</fpage>–<lpage>669</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(00)00280-7</pub-id><pub-id pub-id-type="pmid">11226509</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R29"><label>29</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Richards</surname>, <given-names>J. E.</given-names></string-name>, &amp; <string-name><surname>Holley</surname>, <given-names>F. B.</given-names></string-name></person-group> (<year>1999</year>). <article-title>Infant attention and the development of smooth pursuit tracking.</article-title> <source>Developmental Psychology</source>, <volume>35</volume>(<issue>3</issue>), <fpage>856</fpage>–<lpage>867</lpage>. <pub-id pub-id-type="doi">10.1037/0012-1649.35.3.856</pub-id><pub-id pub-id-type="pmid">10380875</pub-id><issn>0012-1649</issn></mixed-citation></ref>
<ref id="R16"><label>16</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Sasson</surname>, <given-names>N. J.</given-names></string-name>, &amp; <string-name><surname>Elison</surname>, <given-names>J. T.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Eye tracking young children with autism.</article-title> <source>Journal of Visualized Experiments</source>, (<issue>61</issue>), <elocation-id>e3675</elocation-id>. <pub-id pub-id-type="doi">10.3791/3675</pub-id><pub-id pub-id-type="pmid">22491039</pub-id><issn>1940-087X</issn></mixed-citation></ref>
<ref id="R25"><label>25</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Sharpe</surname>, <given-names>J. A.</given-names></string-name></person-group> (<year>2008</year>). <article-title>Neurophysiology and neuroanatomy of smooth pursuit: Lesion studies.</article-title> <source>Brain and Cognition</source>, <volume>68</volume>(<issue>3</issue>), <fpage>241</fpage>–<lpage>254</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2008.08.015</pub-id><pub-id pub-id-type="pmid">19004537</pub-id><issn>0278-2626</issn></mixed-citation></ref>
<ref id="R11"><label>11</label><mixed-citation publication-type="book-chapter" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Sheena</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>Borah</surname>, <given-names>J.</given-names></string-name></person-group> (<year>1981</year>). <chapter-title>Compensation for some second order effects to improve eye position measurements</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>D. F.</given-names> <surname>Fisher</surname></string-name>, <string-name><given-names>R. A.</given-names> <surname>Monty</surname></string-name>, &amp; <string-name><given-names>J. W.</given-names> <surname>Senders</surname></string-name> (<role>Eds.</role>),</person-group> <source>Eye Movements: Cognition and Visual Perception</source>. <publisher-loc>Hillsdale</publisher-loc>: <publisher-name>Lawrence Erlbaum Associates</publisher-name>.</mixed-citation></ref>
<ref id="R50"><label>50</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Simons</surname>, <given-names>R. F.</given-names></string-name>, &amp; <string-name><surname>Katkin</surname>, <given-names>W.</given-names></string-name></person-group> (<year>1985</year>). <article-title>Smooth pursuit eye movements in subjects reporting physical anhedonia and perceptual aberrations.</article-title> <source>Psychiatry Research</source>, <volume>14</volume>(<issue>4</issue>), <fpage>275</fpage>–<lpage>289</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/01651781(85)90096-4</pub-id> <pub-id pub-id-type="doi">10.1016/0165-1781(85)90096-4</pub-id><pub-id pub-id-type="pmid">3860882</pub-id><issn>0165-1781</issn></mixed-citation></ref>
<ref id="R43"><label>43</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Soechting</surname>, <given-names>J. F.</given-names></string-name>, <string-name><surname>Rao</surname>, <given-names>H. M.</given-names></string-name>, &amp; <string-name><surname>Juveli</surname>, <given-names>J. Z.</given-names></string-name></person-group> (<year>2010</year>). <article-title>Incorporating prediction in models for two-dimensional smooth pursuit.</article-title> <source>PLoS One</source>, <volume>5</volume>(<issue>9</issue>), <fpage>e12574</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0012574</pub-id><pub-id pub-id-type="pmid">20838450</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="R31"><label>31</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Souto</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>Kerzel</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2011</year>). <article-title>Attentional constraints on target selection for smooth pursuit eye movements.</article-title> <source>Vision Research</source>, <volume>51</volume>(<issue>1</issue>), <fpage>13</fpage>–<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2010.09.017</pub-id><pub-id pub-id-type="pmid">20869380</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R20"><label>20</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Spering</surname>, <given-names>M.</given-names></string-name>, &amp; <string-name><surname>Gegenfurtner</surname>, <given-names>K. R.</given-names></string-name></person-group> (<year>2008</year>). <article-title>Contextual effects on motion perception and smooth pursuit eye movements.</article-title> <source>Brain Research</source>, <volume>1225</volume>, <fpage>76</fpage>–<lpage>85</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2008.04.061</pub-id><pub-id pub-id-type="pmid">18538748</pub-id><issn>0006-8993</issn></mixed-citation></ref>
<ref id="R14"><label>14</label><mixed-citation publication-type="conference" specific-use="parsed"><person-group person-group-type="author"><string-name><surname>Swirski</surname>, <given-names>L.</given-names></string-name>, &amp; <string-name><surname>Dodgson</surname>, <given-names>N.</given-names></string-name></person-group> (<year>2013</year>). <article-title>A fully automatic, temporal approach to single camera, glint-free 3D eye model fitting.</article-title> <source>European Conference on Eye Movements (ECEM)</source>, <conf-loc>Lund, Sweden</conf-loc>.</mixed-citation></ref>
<ref id="R47"><label>47</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Takarae</surname>, <given-names>Y.</given-names></string-name>, <string-name><surname>Minshew</surname>, <given-names>N. J.</given-names></string-name>, <string-name><surname>Luna</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Krisky</surname>, <given-names>C. M.</given-names></string-name>, &amp; <string-name><surname>Sweeney</surname>, <given-names>J. A.</given-names></string-name></person-group> (<year>2004</year>). <article-title>Pursuit eye movement deficits in autism.</article-title> <source>Brain</source>, <volume>127</volume>(<issue>Pt 12</issue>), <fpage>2584</fpage>–<lpage>2594</lpage>. <pub-id pub-id-type="doi">10.1093/brain/awh307</pub-id><pub-id pub-id-type="pmid">15509622</pub-id><issn>0006-8950</issn></mixed-citation></ref>
<ref id="R22"><label>22</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Thier</surname>, <given-names>P.</given-names></string-name>, &amp; <string-name><surname>Ilg</surname>, <given-names>U. J.</given-names></string-name></person-group> (<year>2005</year>). <article-title>The neural basis of smooth-pursuit eye movements.</article-title> <source>Current Opinion in Neurobiology</source>, <volume>15</volume>(<issue>6</issue>), <fpage>645</fpage>–<lpage>652</lpage>. <pub-id pub-id-type="doi">10.1016/j.conb.2005.10.013</pub-id><pub-id pub-id-type="pmid">16271460</pub-id><issn>0959-4388</issn></mixed-citation></ref>
<ref id="R9"><label>9</label><mixed-citation publication-type="book" specific-use="unparsed"><person-group person-group-type="author"><collab>Tobii Technology</collab></person-group>. (<year>2010</year>). <source>Product description, Tobii T/X series eye trackers, rev. 2.1</source>. June 2010. <publisher-name>Tobii Technology AB</publisher-name>.</mixed-citation></ref>
<ref id="R32"><label>32</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Van Donkelaar</surname>, <given-names>P.</given-names></string-name>, &amp; <string-name><surname>Drew</surname>, <given-names>A. S.</given-names></string-name></person-group> (<year>2002</year>). <article-title>The allocation of attention during smooth pursuit eye movements.</article-title> <source>Progress in Brain Research</source>, <volume>140</volume>, <fpage>267</fpage>–<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1016/S0079-6123(02)40056-8</pub-id><pub-id pub-id-type="pmid">12508596</pub-id><issn>0079-6123</issn></mixed-citation></ref>
<ref id="R26"><label>26</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Van Gelder</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Lebedev</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Liu</surname>, <given-names>P. M.</given-names></string-name>, &amp; <string-name><surname>Tsui</surname>, <given-names>W. H.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Anticipatory saccades in smooth pursuit: Task effects and pursuit vector after saccades.</article-title> <source>Vision Research</source>, <volume>35</volume>(<issue>5</issue>), <fpage>667</fpage>–<lpage>678</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(94)00161-E</pub-id><pub-id pub-id-type="pmid">7900305</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="R28"><label>28</label><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>von Hofsten</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name><surname>Rosander</surname>, <given-names>K.</given-names></string-name></person-group> (<year>1997</year>). <article-title>Development of smooth pursuit tracking in young infants.</article-title> <source>Vision Research</source>, <volume>37</volume>(<issue>13</issue>), <fpage>1799</fpage>–<lpage>1810</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(96)00332-X</pub-id><pub-id pub-id-type="pmid">9274766</pub-id><issn>0042-6989</issn></mixed-citation></ref>
</ref-list>
 </back>
</article>
