<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.10.1.2</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Cyclopean, Dominant, and Non-dominant Gaze Tracking for Smooth Pursuit Gaze Interaction</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Elbaum</surname>
						<given-names>Tomer</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Wagner</surname>
						<given-names>Michael</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Botzer</surname>
						<given-names>Assaf</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
        <aff id="aff1">
		<institution>Ariel University</institution>, <country>Israel</country>
        </aff>
		</contrib-group>
     
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>25</day>  
		<month>1</month>
        <year>2017</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2017</year>
	</pub-date>
      <volume>10</volume>
      <issue>1</issue>
	  <elocation-id>10.16910/jemr.10.1.2</elocation-id>
	<permissions> 
	<copyright-year>2017</copyright-year>
	<copyright-holder>Elbaum et al.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
	<abstract>
          <p>User-centered design questions in gaze interfaces have been explored in a multitude of empirical investigations. Interestingly, the question of which eye should serve as the input device has never been studied. We compared tracking accuracy between the &#x201C;cyclopean&#x201D; (i.e., midpoint between the eyes), dominant, and non-dominant eye. In two experiments, participants performed tracking tasks. In Experiment 1, participants did not use a crosshair. Results showed that the mean distance from the target was smaller with the cyclopean eye than with the dominant or non-dominant eye. In Experiment 2, participants controlled a crosshair with their cyclopean, dominant, and non-dominant eye intermittently and had to align the crosshair with the target. Overall tracking accuracy was highest with the cyclopean eye, yet similar between the cyclopean and dominant eye in the second half of the experiment. From a theoretical viewpoint, our findings correspond with the cyclopean eye theory of egocentric direction and provide an indication of eye dominance, in accordance with the hemispheric laterality approach. From a practical viewpoint, we show that which eye to use as input should be a design consideration in gaze interfaces.</p>
      </abstract>
	 <kwd-group>
        <kwd>eye movement</kwd>
        <kwd>gaze interaction</kwd>
        <kwd>interactive eye tracking</kwd>
        <kwd>smooth pursuit</kwd>
        <kwd>usability</kwd>
        <kwd>cyclopean eye</kwd>
        <kwd>dominant eye</kwd>
        <kwd>human-computer interaction</kwd>
      </kwd-group>
    </article-meta>
  </front>

  <body>

    <sec id="s1">
      <title>Introduction</title>
      <p>
        Eye-gaze interaction with computerized systems holds
a number of benefits. For instance, users&#x2019; hands are free to
perform other tasks while interacting with the computer (
        <xref ref-type="bibr" rid="R56">1</xref>
        )
and individuals with severe motor disabilities can
communicate with their environment more easily (
        <xref ref-type="bibr" rid="R57 R58">2, 3</xref>
        ). In
addition, gaze interaction can be highly useful with larger
screens and with objects in motion, because the time to
move one's eyes between objects changes very little with
distance (
        <xref ref-type="bibr" rid="R59">4</xref>
        ), and tracking objects when they are moving (as
in video games) draws upon an inherent and especially
adept eye-brain mechanism (
        <xref ref-type="bibr" rid="R60 R61">5, 6</xref>
        ).
      </p>
   
   <p>
        The interest in using gaze interfaces has led to
empirical investigations of user-centered design questions. For
instance, how should users select on-screen objects (e.g.,
icons) that they would like to interact with (
        <xref ref-type="bibr" rid="R62">7</xref>
        )? Should users receive feedback on where they are
looking (
          <xref ref-type="bibr" rid="R56">1</xref>
          ), and if so, what kind of feedback (
          <xref ref-type="bibr" rid="R58 R63">3, 8</xref>
          )? Findings
have shown that when users selected objects for interaction
by dwelling on them for a certain duration, selection times
were faster than with the "traditional" mouse (
          <xref ref-type="bibr" rid="R58 R59">3, 4</xref>
          ). Yet,
other studies have demonstrated that when targets were
smaller than 4&#xB0; of visual angle, users had to confirm
choices by key press or by moving their facial muscles in
order to compete with the computer mouse in speed and in
accuracy (
          <xref ref-type="bibr" rid="R64 R65">9, 10</xref>
          ). Finally, Alonso, Causse (
          <xref ref-type="bibr" rid="R56">1</xref>
          ) found that for
targets smaller than 2.14&#xB0;, cursor feedback on where users
were looking improved their accuracy in selecting these
targets.
        </p>
		
        <p>
          Interestingly, although pointing accuracy on smaller
objects has been identified as a key factor in the
effectiveness of gaze interaction, the question of which eye points
more accurately at targets has not been studied. This
question may hold even greater importance in gaze interaction
with moving targets, which currently suffers from low success rates
in target acquisition (
          <xref ref-type="bibr" rid="R65 R66">10, 11</xref>
          ). In the current study, we
compared tracking accuracy between the &#x201C;cyclopean&#x201D;,
dominant, and non-dominant eye.
        </p>
		
		
        <sec id="s1a">
          <title>Missed targets and the higher accuracy of the &#x201C;cyclopean eye&#x201D;</title>
          <p>
            Cui and Hondzinski (
            <xref ref-type="bibr" rid="R67">12</xref>
            ) conducted an experiment
where they tested the gaze accuracy of participants.
Participants viewed targets (i.e., weighted fishing anchors)
suspended from the ceiling at three different heights while
their binocular points of gaze were recorded at 60Hz.
Errors were quantified as the absolute and angular distances
between targets and points of gaze of the right and of the
left eye. Then, a third type of error was defined as the
absolute and angular distances between targets and the
average of the positions of the right and left eye. Findings
showed that the mean error of the averaged positions was either
smaller than or not significantly different from the mean error
of the right or of the left eye alone. Based on these
findings, the conclusion from this study was that for a range of
viewing conditions, averaged gaze positions would
produce the most accurate results for viewing tasks.
          </p>
		  
          <p>
            From a broader theoretical perspective, Cui and
Hondzinski (
            <xref ref-type="bibr" rid="R67">12</xref>
            ) suggested that their findings resonate
with the "cyclopean eye" theory that accounts for how
people set their relative direction to objects in space.
According to this theory, people set their egocentric visual
direction according to a line connecting the target and a point
on an imaginary line between their eyes. In other words,
when one assesses one's relative position to targets, it is a
point between the eyes that designates one's position. This
point was metaphorically termed the "cyclopean eye" (
            <xref ref-type="bibr" rid="R68">13</xref>
            ) and numerous studies have indeed demonstrated that
individuals set &#x201C;cyclopean&#x201D; direction to objects in their field
of view (e.g., 
            <xref ref-type="bibr" rid="R69 R70 R71 R72">14, 15, 16, 17</xref>
			). Cyclopean eye position, in turn,
may be approximated by averaging left and right eye
positions, as in Cui and Hondzinski&#x2019;s (
            <xref ref-type="bibr" rid="R67">12</xref>
            ) study.
          </p>
		  		  
          <p>
            Although Cui and Hondzinski (
            <xref ref-type="bibr" rid="R67">12</xref>
            ) did not account for
why the right and left eye would miss targets in the first
place, their findings do correspond with a
well-documented phenomenon in optometry and the human vision
and perception domains, termed &#x201C;fixation disparity&#x201D;. In
fixation disparity, vergence eye movements fail to
intersect both lines of sight on the intended targets and
consequently, eyes do not land on the same spot, but rather fixate
on slightly different locations from each other and from the
intended targets (
            <xref ref-type="bibr" rid="R73 R74">18, 19</xref>
            ). Hence, while the right and left eyes
may sometimes miss targets, the "cyclopean eye", which sets
the direction to targets, may be the one that is placed on
them more accurately. Cyclopean eye theory, therefore,
resonates with the notion that averaged gaze positions, or cyclopean
positions, may "land" closer to targets than single-eye gaze
positions. Still, another theory, that of eye dominance,
suggests that at least in some cases gaze positions of the
dominant eye may land closer to targets.
          </p>
		  
		  
        </sec>
		
        <sec id="s1b">
          <title>Eye dominance</title>
          <p>
            The concept of "eye dominance" can be traced back to
Kepler&#x2019;s (
            <xref ref-type="bibr" rid="R75">20</xref>
            ) determination that visual direction is set by an
optical line from the viewed object to the retina. This
determination was considered undisputed, as the eyes are the
ultimate source of vision (
            <xref ref-type="bibr" rid="R76">21</xref>
            ). Later theorists argued that
direction is not only determined by an optical line to the
retinas, but is determined by an optical line to the retina of
the dominant eye (
            <xref ref-type="bibr" rid="R77 R78">22, 23</xref>
            ). Their view was supported by
repeated empirical observations that individuals align
targets with one eye and not the other, for instance, in
Dolman's peephole test (e.g., 
            <xref ref-type="bibr" rid="R79 R80 R81">24, 25, 26</xref>
			). This eye is
considered to be the dominant one.
          </p>
		  
          <p>
            Subsequent studies supported the concept of eye
dominance, demonstrating a preference for one eye over the
other. For instance, one of the eyes usually suppresses
sensory input from the other in the case of rivalrous inputs;
visual acuity is sometimes better in one eye than in
the other; and finally, sensory-motor coordination is better
with one eye than with the other (see reviews by
            <xref ref-type="bibr" rid="R82 R83">27, 28</xref>
			). However, the concept of eye dominance has also
suffered considerable criticism when repeated empirical
investigations demonstrated that the interrelationships
between different measures of dominance are very low (see
reviews by 
            <xref ref-type="bibr" rid="R70 R83">15, 28</xref>
			). Further, it was also demonstrated that
dominance might even change with the same measure
when task characteristics are different (
            <xref ref-type="bibr" rid="R84">29</xref>
            ). Finally, a
series of sophisticated experiments demonstrated that even
though sighting or alignment of targets is usually done to
a sighting eye, egocentric visual direction is closely
associated with the "cyclopean eye" (e.g., 
            <xref ref-type="bibr" rid="R69 R83 R85 R86">14, 28, 30, 31</xref>
			).
          </p>
		  
          <p>
            It appears, then, that the possible role of the dominant
eye in vision has not yet been strongly established. Still,
researchers point to the hemispheric laterality that
characterizes other established phenomena, such as handedness
and footedness, as a possible source of "eyedness", or eye
dominance. For instance, in a large meta-analysis,
Bourassa, Mcmanus (
            <xref ref-type="bibr" rid="R82">27</xref>
            ) convincingly showed strong
relationships between measures of eye dominance and
measures of hand and foot dominance. These relationships
may suggest that dominant eyes may be superior to
non-dominant eyes in certain tasks, just as dominant hands or
feet are (
            <xref ref-type="bibr" rid="R82">27</xref>
            ). This view, in turn, has gained some support
from empirical findings.
          </p>
		  
          <p>
            For instance, Han, Seideman (
            <xref ref-type="bibr" rid="R87">32</xref>
            ) showed that
dominant eyes (i.e., the "sighting" eyes in tests like Dolman's)
make more accurate vergence movements in response to
different viewing conditions. In Van Leeuwen, Westen
(
            <xref ref-type="bibr" rid="R88">33</xref>
            ), individuals sometimes preferred to make short
saccades to nearby objects with only their dominant eyes.
Next, in Moiseeva, Slavutskaya (
            <xref ref-type="bibr" rid="R89">34</xref>
            ), pre-saccadic
processes appeared earlier in the dominant than in the
non-dominant eye, possibly suggesting faster sensory
processing and attention disengagement for the dominant eye.
Finally, Kawata and Ohtsuka (
            <xref ref-type="bibr" rid="R90">35</xref>
            ) showed that when
individuals tracked an X shaped target moving on a rail at
different speeds, vergence movements were first initiated
with the dominant eye and were faster with the dominant
eye than with the non-dominant eye.
          </p>
		  		  
          <p>It seems, then, that dominant eyes may have certain
qualities in some tasks and thus, although the collective
evidence in support of eye dominance is currently not very
strong, it is possible that dominant eyes will still be more
accurate in motor tasks such as the tracking of targets.</p>
          
        </sec>
        <sec id="s1c">
          <title>The question of what eye should be the input device in gaze interfaces</title>
          <p>
            The question of whether it is the cyclopean or the
dominant eye that fixates more accurately on targets has
theoretical significance, but also practical implications for the
design of gaze interfaces. Efficient human-computer
interaction requires rapid and seamless capturing of on-screen
targets to avoid missed commands and long selection times
(
            <xref ref-type="bibr" rid="R56 R65 R66">1, 10, 11</xref>
            ). Vidal, Bulling (
            <xref ref-type="bibr" rid="R91">36</xref>
            ) developed a promising
technique in this respect, &#x2018;Pursuits&#x2019;, which is based on the
similarity of trajectories between the eye and the object it
pursues. When the correlation coefficient between a
sample of eye and object coordinates is greater than a
predefined threshold, &#x2018;Pursuits&#x2019; detects that the object is being
pursued. Usability tests of Pursuits-based interaction,
in which users interacted with circular- and linear-trajectory
objects, showed a high percentage of successful detections.
          </p>
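<p>As an illustration, the correlation test at the heart of &#x2018;Pursuits&#x2019; can be sketched as follows. This is a minimal sketch with hypothetical names and an assumed threshold, not the authors&#x2019; implementation:</p>

```python
# Minimal Pursuits-style detection sketch (hypothetical names and
# threshold; the original technique is described by Vidal, Bulling).
from statistics import mean

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    denom = (var_a * var_b) ** 0.5
    return cov / denom if denom else 0.0

def is_pursued(eye_xy, obj_xy, threshold=0.8):
    """Flag pursuit when both x and y correlations exceed the threshold."""
    eye_x, eye_y = zip(*eye_xy)
    obj_x, obj_y = zip(*obj_xy)
    return min(pearson(eye_x, obj_x), pearson(eye_y, obj_y)) > threshold
```

<p>A window of recent samples would be passed to such a test for each candidate on-screen object, and an object with a supra-threshold correlation is taken as the one being pursued.</p>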
		  
          <p>The most widely used technique of gaze interaction to
date, with both stationary and moving objects, is
gaze-based interaction. That is, users can select and interact with
objects when they point at them with their eyes.
Therefore, testing which eye-input method is most accurate
may assist in facilitating more successful gaze-based user
interaction. In the current study, we compared tracking
performance between the dominant, non-dominant and
cyclopean eye.</p>
        </sec>
		
      </sec>
   
    <sec id="s2">
      <title>Experiment 1: Exploratory study</title>
      <p>The purpose of the first experiment was to obtain a first
impression of which eye tracks a moving target more
accurately, before testing this question with gaze-interface
tracking.</p>

      <sec id="s2a">
        <title>Method</title>
		<sec id="s2aa">
        <title>Participants</title>
        <p>
27 undergraduate psychology and engineering students
participated in the experiment in partial fulfillment of the
requirements of a course in human factors engineering.
Age ranged from 21 to 31 years (Mean=26, SD=2.7). 48%
of the participants were males. We tested participants for
normal binocular vision using the Snellen test and for
binocular stability using the &#x201C;Parallel infinity balance test&#x201D;
(PTIB) (
          <xref ref-type="bibr" rid="R92">37</xref>
          ).
        </p>
		
        <p>
          Participants' ocular dominance was tested using
Dolman's hole-in-the-card (peephole) test (e.g., 
            <xref ref-type="bibr" rid="R79 R80 R81">24, 25, 26</xref>
			).
19 of the 27 participants (70 %) were right-eyed. 24 of the
27 participants (89%) were right-handed. 6 of the 27
participants (22%) had an opposite eye-hand lateral
dominance (i.e., a right dominant eye with a left dominant hand, or
vice versa). All of these proportions comply with the
proportions reported in the Bourassa, Mcmanus (
          <xref ref-type="bibr" rid="R82">27</xref>
          )
meta-analysis.
		</p>
		</sec>
		<sec id="s2ab">
        <title>Task and procedure</title>
        
        <p>Participants arrived at the lab for individual sessions
that lasted approximately 20 minutes. Upon arrival, they
were briefed about the procedure by the experimenter, who
encouraged participants to ask questions throughout and
after the briefing. Participants signed the informed consent
form only after the experimenter confirmed that they
understood the procedure. Then, the experimenter tested
participants for normal binocular vision and eye-dominance.
The experiment was conducted in a sound-attenuated and
darkened room. Participants sat in front of the display
screen and the binocular eye tracker&#x2019;s desktop camera
(&#x201C;EyeLink 1000&#x201D;; see Apparatus).</p>

        <p>
          Participants performed a free gaze-tracking task (see
Figure 1). They were instructed to &#x201C;track the moving target
with their eyes&#x201D;. The moving target was a red circle, 80
pixels (1.87&#xB0;) in diameter at a viewing distance of
65 cm. Mean percent time on a target of a similar size in a
previous study we conducted with joystick tracking was
approximately 55% (
          <xref ref-type="bibr" rid="R93">38</xref>
          ) and we therefore anticipated that
participants in the current study would be able to track the
target with their eyes. We created six tracking conditions:
3 target velocities X 2 maneuvering types. Target
velocities were: 1.7&#xB0;/sec, 3.1&#xB0;/sec and 4.5&#xB0;/sec. Maneuvering
types were straight lines and curved lines. The lowest and
medium velocities were also adapted from Wagner, Sahar
(
          <xref ref-type="bibr" rid="R93">38</xref>
          ) and maneuvering types were chosen to create lower
(straight lines) and higher (curved lines) degrees of
difficulty (
          <xref ref-type="bibr" rid="R94">39</xref>
          ).
        </p>
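<p>The reported target size can be checked with the standard visual-angle formula, angle = 2&#xB7;atan(s/2d), for an object of physical size s at viewing distance d. The screen width of ~50.9 cm used below is an assumption derived from a 16:9 23'' display, as described under Apparatus:</p>

```python
# Sanity check of the stated 1.87-degree target size (the ~50.9 cm
# screen width is an assumption for a 16:9, 23-inch monitor).
import math

def visual_angle_deg(size_cm, distance_cm):
    """Visual angle of an object of size_cm viewed from distance_cm."""
    return math.degrees(2.0 * math.atan(size_cm / (2.0 * distance_cm)))

pixel_pitch_cm = 50.9 / 1920                  # cm per pixel, assumed
target_deg = visual_angle_deg(80 * pixel_pitch_cm, 65.0)  # ~1.87 degrees
```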
		
		<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1</label>
					<caption>
						<p>The experimental task.</p>
						</caption>
					<graphic id="graph01" xlink:href="jemr-10-01-b-figure-01.png"/>
				</fig>	  
	  
 
		
        <p>In the straight lines maneuvering type, the target
moved in a straight path, changing angles every 2-5
seconds. The experimental program randomly selected both
the angle size and the timing of turns. In the curved lines
maneuvering type, the target moved along a curve, yet every 2-5
seconds it made a turn and started moving along a new
curve. In the experimental program, curves were
arcs of circles with radii of 200-600 pixels, and the program randomly
selected the radius of the circles and the timing of turns. In
both the straight and curved lines movement, whenever the
target hit the edges of the monitor it turned in the opposite
direction at an angle equal to the impact angle, relative to
the perpendicular. Figures 2 and 3 show examples of curved
and straight lines movements. The different
maneuvering-velocity combinations allowed us to test our hypothesis
across six different movement profiles, as summarized in
Table 1. Each profile was equivalent to a single
experimental trial of 45 seconds. The experiment was composed
of 2 blocks. Each block contained 6 trials of 45 seconds,
according to the 6 movement profiles in Table 1. The order
of trials in each block was randomized. Overall,
participants performed 12 trials, experiencing each tracking
condition (i.e., movement profile) twice, once in each block.</p>

<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2</label>
					<caption>
						<p>Straight lines maneuvering types.</p>
						</caption>
					<graphic id="graph02" xlink:href="jemr-10-01-b-figure-02.png"/>
				</fig>	  
<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3</label>
					<caption>
						<p>Curved lines maneuvering types.</p>
						</caption>
					<graphic id="graph03" xlink:href="jemr-10-01-b-figure-03.png"/>
				</fig>	  




<table-wrap id="t01" position="float">
					<label>Table 1</label>
					<caption>
						<p>The 6 tracking conditions within an experimental block according to 3 velocities X 2 maneuvering types</p>
					</caption>
<table frame="hsides" rules="groups" cellpadding="3">
<tbody>

          <tr>
            <td rowspan="1" colspan="1"/>
			<td rowspan="1" colspan="1"/>
            <td rowspan="1" colspan="1"> 
			
              <bold>Velocity</bold>
            </td>
            <td rowspan="1" colspan="1"/>
          </tr>
		  </tbody>
         
<tbody>
		 <tr>
            <td rowspan="1" colspan="1">
              <bold>Maneuver</bold>
            </td>
            <td rowspan="1" colspan="1">Slow (1.7&#xB0;/sec.)</td>
            <td rowspan="1" colspan="1">Medium (3.1&#xB0;/sec.)</td>
            <td rowspan="1" colspan="1">Fast (4.5&#xB0;/sec.)</td>
          </tr>
		  </tbody>
		  <tbody>
          <tr>
            <td rowspan="1" colspan="1">Straight Lines</td>
            <td rowspan="1" colspan="1">Slow &amp; Straight</td>
            <td rowspan="1" colspan="1">Medium &amp; Straight</td>
            <td rowspan="1" colspan="1">Fast &amp; Straight</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Curved Lines</td>
            <td rowspan="1" colspan="1">Slow &amp; Curved</td>
            <td rowspan="1" colspan="1">Medium &amp; Curved</td>
            <td rowspan="1" colspan="1">Fast &amp; Curved</td>
          </tr>
		  </tbody>
        </table>
		</table-wrap>

	</sec>
		<sec id="s2ac">
        <title>Apparatus</title>

        <p>Data Collection and Stimulus Presentation: Binocular
eye-movements were tracked with the EyeLink 1000
system (SR Research Ltd., Mississauga, Canada) with a
sampling rate of 250Hz. To avoid head movements and to
ensure a constant viewing distance of 65 cm, participants
rested their chins on a rest with a forehead support band.
We performed a calibration procedure based on a
nine-point grid at the beginning of each block, using the
manufacturer&#x2019;s software. The calibration was binocular, yet the
mathematical models of gaze positions were fitted to each
eye independently of the other, as described in Stampe (
          <xref ref-type="bibr" rid="R95">40</xref>
          )
and in accordance with previous studies with binocular
measurements (
          <xref ref-type="bibr" rid="R96 R97">41, 42</xref>
          ). Practical calibration error was
23.24 (SD=6.37) and 22.75 (SD=7.31), in minutes of arc,
for the left and right eye, respectively. Following each
trial, we performed a &#x201C;drift correction&#x201D; procedure, where
participants fixated on a calibration point for a few seconds
while the system corrected any drifts it had from initial
calibration.
        </p>
		
        <p>Two interfaced computers managed data collection
and stimulus presentation in the experiment: the
EyeLink 1000 host computer and the task computer. The task
computer controlled stimulus presentation and managed task
intervals via self-developed software (C#). The stimulus
(a moving target) was presented on an Alienware OptX AW2310
23'' monitor with 1920 x 1080 resolution and a 120 Hz
refresh rate. The EyeLink 1000 host computer was set as the
main experimental computer, coordinating and recording
all aspects of the experiment.</p>


		</sec>
	
		<sec id="s2b">
        <title>Design</title>
       
          <p>
            Tracking performance was the dependent variable. It
was quantified as the "mean absolute distance" between
eye and target measured in minutes of arc (usually termed
arc min). "Mean absolute distance", often referred to as
"mean absolute error" is a common measure of tracking
performance (
            <xref ref-type="bibr" rid="R98">43</xref>
            ). It is calculated by summing the eye-to-target
distances across all samples in a trial and dividing
this sum by the number of samples in that trial,
as shown in Formula 1. The higher the mean absolute
distance between eye and target positions, the lower the
tracking performance.</p>


<fig id="eq01" fig-type="figure" position="anchor">
					<graphic id="equation01" xlink:href="jemr-10-01-b-equation-01.png"/>
				</fig>		





 <p>Where: m = Mean absolute distance in minutes of arc </p>
 <p>
<italic>n</italic> = Number of samples for each trial </p>
 <p>
<italic>i</italic> = Sample index </p>
 <p>
e = Eye position
 </p>
 <p>t = Target position</p>
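<p>Formula 1 amounts to the following computation; the names are illustrative:</p>

```python
# Mean absolute eye-to-target distance (Formula 1): the Euclidean
# eye-to-target distances are summed over the n samples of a trial
# and divided by n. Positions are assumed to be in minutes of arc.
import math

def mean_absolute_distance(eye_positions, target_positions):
    """m = (1/n) * sum of ||e_i - t_i|| over all samples in a trial."""
    n = len(eye_positions)
    total = 0.0
    for (ex, ey), (tx, ty) in zip(eye_positions, target_positions):
        total += math.hypot(ex - tx, ey - ty)
    return total / n
```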
          <p>We computed tracking performance separately for the
dominant, non-dominant and cyclopean eye.</p>

          <p>The four independent variables in the experiment were:
<italic>Eye classification</italic>: dominant, non-dominant and
cyclopean eye. <italic>Target velocity</italic>: 1.7&#xB0;/sec, 3.1&#xB0;/sec and 4.5&#xB0;/sec.
<italic>Maneuvering type</italic>: straight or curved lines. <italic>Experimental
block</italic>: first block or second block.</p>

          <p>This yielded a 3 X 3 X 2 X 2 within subjects design.
Cyclopean eye was defined as the averaged x-y
coordinates of the dominant and non-dominant eye.</p>
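<p>The cyclopean position used throughout the design is simply the per-sample average of the two eyes&#x2019; coordinates, e.g.:</p>

```python
# Cyclopean gaze position as defined in the Design section: the
# average of the dominant and non-dominant eyes' x-y coordinates.
def cyclopean(dominant_xy, non_dominant_xy):
    (dx, dy), (nx, ny) = dominant_xy, non_dominant_xy
    return ((dx + nx) / 2.0, (dy + ny) / 2.0)
```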

        </sec>
		
      
	  
      <sec id="s2c">
        <title>Results</title>
        <p>
          As a preliminary step to our analyses, certain data had
to be excluded for being irrelevant for our study.
Participants in our study were essentially engaged in a smooth
pursuit task. However, the purpose of our study was not to
investigate the underlying mechanisms of smooth pursuit.
Rather, we aimed to compare tracking accuracy between
dominant, non-dominant and cyclopean eyes to learn about
expected performance in gaze control interfaces.
Therefore, saccades, which for all participants constituted
attempts by the oculomotor system to recapture targets that
moved outside their foveae (
          <xref ref-type="bibr" rid="R99">44</xref>
          ), had to be regarded as
noise and be filtered out. Essentially, eye-to-target
distance during a saccade is irrelevant for studying gaze
control, because visual information processing is largely
suppressed during saccades (
          <xref ref-type="bibr" rid="R100 R101">45, 46</xref>
          ) and thus, very little
control (if any) is possible. In this respect, our research
resembles the study of eye movements in real-life reading
conditions, where in many instances saccades are regarded as
noise (
          <xref ref-type="bibr" rid="R102">47</xref>
          ).
        </p>
		
        <p>
          To identify saccades, we used the online SR research
event detection algorithm, which is the most widely used
event detection algorithm for academic research (
          <xref ref-type="bibr" rid="R102">47</xref>
          ). The
algorithm was set according to the following parameters:
saccadic velocity threshold of 30&#xB0;/sec, saccadic
acceleration threshold of 8000&#xB0;/sec&#xB2;, and saccadic motion threshold of
0.2&#xB0;. This setting is considered conservative and is
widely used in eye-movement research (
            <xref ref-type="bibr" rid="R103">48</xref>
			, pp. 89-94).
The data exclusion procedure resulted in filtering out ~8.00%
of the original data.
        </p>
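<p>The exclusion step can be approximated with a simple velocity-threshold filter. This sketch is not SR Research&#x2019;s actual event detector, and the names are illustrative:</p>

```python
# Simplified velocity-threshold saccade filter: consecutive gaze
# samples whose implied velocity exceeds 30 deg/sec are dropped.
import math

def drop_saccadic_samples(gaze_deg, rate_hz=250.0, vel_threshold=30.0):
    """gaze_deg: list of (x, y) gaze positions in degrees of visual angle."""
    kept = [gaze_deg[0]]
    for prev, cur in zip(gaze_deg, gaze_deg[1:]):
        velocity = math.hypot(cur[0] - prev[0], cur[1] - prev[1]) * rate_hz
        if vel_threshold > velocity:          # keep sub-threshold samples
            kept.append(cur)
    return kept
```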
		
        <p>
          Finally, we used Linear Mixed Models (LMM) in all
statistical analyses. LMM is recommended for eye
tracking data that are often unbalanced due to instances where
trackers fail to capture participants' eyes (
          <xref ref-type="bibr" rid="R93 R102">38, 47</xref>
          ).
        </p>
		  <sec id="s2ca">
       <title>Tracking Accuracy</title>     
        <p>To compare tracking accuracy between the dominant,
non-dominant and cyclopean eye, we conducted a Linear
Mixed Model (LMM) analysis with a random intercept on
the mean distance from target. The random effect was the
participants themselves and the fixed effects were eye
classification (dominant, non-dominant and cyclopean), target
velocity (1.7&#xB0;/sec, 3.1&#xB0;/sec and 4.5&#xB0;/sec), maneuvering
profile (straight or curved lines) and experimental block
(first block or second block). We included all second-,
third-, and fourth-order interactions between the fixed
effects in the model.</p>
       
	   <p>
          Our analysis of tracking accuracy showed that mean
absolute distance from target was smallest with the
cyclopean eye (Mean=47.27 arc min, SE=1.03 arc min). We
also found that the mean distance from target with the dominant
and non-dominant eyes was nearly identical (Mean=53.56
arc min, SE=1.03 arc min; Mean=53.59 arc min, SE=1.03
arc min, respectively). Figure 4 summarizes these means
and SEs. The main effect for "eye classification" was
significant, F (2, 810) = 12.174, p&lt;.001. Subsequent pairwise
comparisons using Sidak correction revealed significant
differences between the cyclopean and both the dominant
and non-dominant eyes (p&lt;.05). Thus, findings show that,
on average, the cyclopean eye was closest to target.
Finally, no significant differences in mean distance from
target were found between dominant and non-dominant eyes.
        </p>
		<fig id="fig04" fig-type="figure" position="float">
					<label>Figure 4</label>
					<caption>
						<p>Mean absolute distance from target with cyclopean, dominant and non-dominant eyes. Error bars represent standard errors.</p>
						</caption>
					<graphic id="graph04" xlink:href="jemr-10-01-b-figure-04.png"/>
				</fig>	  
	  
        <p>
          Velocity also affected tracking accuracy. Mean
distance from target was largest at the greatest velocity
(Mean=75.64 arc min, SE=1.03 arc min), smaller at the
medium velocity (Mean=51.25 arc min, SE=1.03 arc min),
and smallest at the lowest velocity (Mean=46.66 arc min,
SE=1.03 arc min). The main effect for "velocity" was
significant, F (2, 810) = 22.307, p&lt;.001. All pairwise
comparisons between velocity levels, using Sidak correction,
were also significant (p&lt;.05). Thus, the faster the target
moved, the more difficult it became to track. No other
significant effects were found.
        </p>
      </sec>
	  </sec>
      <sec id="s2d">
        <title>Discussion</title>
        <p>
          Findings in Experiment 1 showed that cyclopean gaze
positions were closest to the target. We also found that
velocity, but not maneuvering profile, affected tracking
accuracy. Although the timing of the turns and the radii in the
curved-lines maneuvering type were random, the target still
travelled within a constant radius along each curve, so its
path could have been relatively predictable. Predictability,
in turn, may lead to similar performance across maneuvers
(e.g., straight vs. curved lines), while shifts from one
constant velocity to the next still generate changes in
performance (
          <xref ref-type="bibr" rid="R104">49</xref>
          ). Such a pattern, in which velocity affects
performance while changes in path do not, corresponds with
our findings.
        </p>
		
        <p>
          Our main finding regarding the higher accuracy of
cyclopean gaze positions replicates the findings of Cui and
Hondzinski (
          <xref ref-type="bibr" rid="R67">12</xref>
          )
and extends them to moving targets. It also corresponds
with the fixation disparity phenomenon, whereby
&#x201C;real&#x201D; eyes sometimes miss targets (
          <xref ref-type="bibr" rid="R73 R74">18, 19</xref>
          ). From a
theoretical perspective, our findings lend further support to the
cyclopean eye theory of egocentric direction. Essentially,
it appears more likely that visual direction is determined
according to a locus that is more often aligned with the
target than according to another locus (i.e., the dominant
eye) that is less often aligned with it. Our findings,
however, did not support the hemispheric laterality approach,
according to which dominant eyes may be superior to
non-dominant eyes in certain tasks (
          <xref ref-type="bibr" rid="R82">27</xref>
          ). From a practical
perspective, the higher accuracy with the cyclopean eye
suggests that performance with gaze interfaces should be better
when the cyclopean eye is the input device. At the same
time, however, Experiment 1 was a preliminary investigation
with free tracking, and the implications of our findings for
actual gaze control should therefore be further
investigated.
        </p>
		
        <p>One important question pertains to the difference in
percent time on target between cyclopean and single-eye
control. If we were to set a perimeter that designates when
users can interact with the target (e.g., a crosshair
indicating that they are on target), how often would this
perimeter overlap with the target under cyclopean compared
to single-eye tracking? Findings from Experiment 1
showed that the average difference in accuracy between
the cyclopean and real eyes was ~6 arc min, and therefore
smaller than the calibration error we reported in the
Method (23.24 and 22.75 arc min for the left and right eye,
respectively). Thus, although the mean difference in
accuracy between the eyes is robust (it was computed on an
extremely large sample, at a 250 Hz sampling rate),
calibration error suggests that single measurements may
sometimes be biased in favor of one eye or the other. Such bias
may even increase near the edges of the monitor. One
should therefore test how often the tracker indeed detects
the cyclopean eye closer to the target than the other eyes,
such that it can interact with the target while the other eyes
cannot. The frequency of such instances can be tested by
asking users to place a crosshair or cursor on the target.</p>
        <p>
          Second, one may indeed have a cursor or a crosshair
when using a gaze interface, just as one usually does when
operating a computer mouse or a joystick. Alonso et al. (
          <xref ref-type="bibr" rid="R56">1</xref>
          ) tested gaze control in ATC (air traffic control)
and found that target selection accuracy improved greatly
when users received feedback on their gaze
positions. At the same time, however, Jacob (
          <xref ref-type="bibr" rid="R105">50</xref>
          ) noted that
when cursor and target do not completely overlap, as a
result of system errors, users may turn their attention to the
cursor instead of gazing at the target. It is thus unclear how
cyclopean control would compare to single-eye control if
the eyes sometimes pursue the cursor instead of the target. In
Experiment 2 we compared cyclopean to single-eye
control in gaze-interface tracking.
        </p>
      </sec>
    </sec>
	 </sec>
    <sec id="s3">
      <title>Experiment 2</title>
      <p>Based on the results of Experiment 1, we designed a
follow-up study in which users tracked a target with a crosshair.</p>
      <sec id="s3a">
        <title>Method</title>
		
		 <sec id="s3aa">
        <title>Participants</title>
    
        <p>All participants from Experiment 1 (see the Participants
sub-section of Experiment 1) also took part in
Experiment 2, after an interval of one to five days.</p>
 </sec>

<sec id="s3ab">
        <title>Task and Procedure</title>
		
        <p>As in Experiment 1, participants arrived at the
lab for individual sessions. Upon arrival, they were briefed
about the procedure by the experimenter and were
encouraged to ask questions throughout and after the briefing.
Participants signed the informed consent form only after
the experimenter confirmed that they understood the
procedure. The task was identical to that of Experiment 1 in that
participants had to track a moving target. However, unlike
Experiment 1, where we examined tracking under free
gaze conditions, in Experiment 2 participants performed
the tracking task with a gaze interface. This meant that
participants tracked the target with a crosshair (see Figure
5) and were instructed to &#x201C;track the moving target with the
crosshair&#x201D;. The experimenter explained to them that they
controlled the crosshair with their eyes.
 </p>
		<fig id="fig05" fig-type="figure" position="float">
					<label>Figure 5</label>
					<caption>
						<p>The experimental task.</p>
						</caption>
					<graphic id="graph05" xlink:href="jemr-10-01-b-figure-05.png"/>
				</fig>	  
	  
        <p>Experiment 2 was composed of six blocks, as shown in
Figure 6. Each block contained the six tracking conditions
of Experiment 1 (3 velocities X 2 maneuvers). In each
block, the crosshair was controlled by one of the
eyes, according to the three eye classification categories:
dominant, non-dominant, and cyclopean. Hence,
participants experienced each of the three eye-crosshair
coupling conditions twice. We randomized the order of
eye-crosshair coupling across blocks. However, complete
randomization could have resulted in sequences where the
same eye controls the crosshair in the last two or the first
two blocks. Such instances could have led to training
effects, and thus to a possible confound in our results: they
could have produced extra training prior to some eye
classification conditions while generating none prior to
others. We therefore chose to perform
semi-randomization.</p>
<fig id="fig06" fig-type="figure" position="float">
					<label>Figure 6</label>
					<caption>
						<p>Experimental structure.</p>
						</caption>
					<graphic id="graph06" xlink:href="jemr-10-01-b-figure-06.png"/>
				</fig>	 
        <p>Essentially, we did not randomize all six blocks as a
group. Instead, we defined the first three blocks and the
last three blocks as two halves, as depicted in Figure 6,
each containing all three control options (dominant,
non-dominant, and cyclopean), and randomized each half
separately. This way, no sequence could place the same
eye in control of the crosshair in the first two or last two
blocks. Following the first half of the experiment,
participants received a five-minute break.</p>
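<p>A minimal sketch of this semi-randomization (hypothetical helper name): each half of the session is an independent shuffle of the three eye-crosshair coupling conditions, which guarantees that every condition appears exactly once per half.</p>

```python
import random

CONDITIONS = ["dominant", "non-dominant", "cyclopean"]

def semi_randomized_blocks(rng=random):
    """Shuffle each half of the session independently so that all
    three coupling conditions appear in blocks 1-3 and again in 4-6."""
    first_half = CONDITIONS[:]
    second_half = CONDITIONS[:]
    rng.shuffle(first_half)
    rng.shuffle(second_half)
    return first_half + second_half

order = semi_randomized_blocks()
# Each condition occurs exactly twice, once per half, so the same eye
# can never control the crosshair in both of the first two blocks
# (or both of the last two).
```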
        <p>
          Although participants knew they controlled the
crosshair with their eyes, they were not informed which of the
eyes controlled the crosshair in each block. This was
because we were concerned that such information might
disrupt participants' natural interaction with the interface.
Essentially, users in real-life settings are not expected to
think about how they move their eyes to interact with gaze
interfaces (
          <xref ref-type="bibr" rid="R105">50</xref>
          ). The experiment ended after the sixth block. The
experimenter then debriefed participants about
the main research questions and thanked them for their
participation. The entire procedure lasted approximately
45 minutes.
        </p>
		 </sec>
		 <sec id="s3ac">
        <title>Apparatus</title>
		
        <p>The apparatus in Experiment 2 was identical to that of
Experiment 1, except that an additional software function
was activated. In each block, the experimental software coupled
the crosshair to one of the three eyes (dominant,
non-dominant, or cyclopean). This function enabled us to compare
gaze-interface tracking performance between the three
eyes. Calibration and drift correction procedures were also
identical to those of Experiment 1. Practical calibration error was
24.11 (SD=7.03) and 23.77 (SD=7.56) arc min for the left
and right eye, respectively.</p>
      </sec>
		 <sec id="s3ad">
        <title>Design</title>
		
            <p> The dependent variable was the percent of time
the crosshair and target overlapped (termed &#x201C;percent time on
target&#x201D;). Percent time on target is often used as a measure
of tracking accuracy when a crosshair is used (
          <xref ref-type="bibr" rid="R98 R106 R107">43, 51, 52</xref>
          ). To estimate
percent time on target, the crosshair was tagged &#x201C;on&#x201D; for every
data sample in which crosshair and target overlapped (partly or
fully) and &#x201C;off&#x201D; when they did not overlap,
where &#x201C;on&#x201D;=1 and &#x201C;off&#x201D;=0 (
          <xref ref-type="bibr" rid="R107">52</xref>
          ). Sample values were then
summed and divided by the number of samples, as
shown in Formula 2. This measure allowed us to
estimate the percent of time during each trial in which participants
succeeded in &#x201C;capturing the target&#x201D;.

        </p>
		<fig id="eq02" fig-type="figure" position="anchor">
					<graphic id="equation02" xlink:href="jemr-10-01-b-equation-02.png"/>
				</fig>	

<p>Where: <italic>p</italic> = percent time on target</p>
<p><italic>n</italic> = number of samples in the trial</p>
<p><italic>i</italic> = index of sample</p>
<p><italic>o</italic> = &#x201C;on target&#x201D; (binary variable)</p>
<p><italic>o</italic><sub>i</sub>=1 if target and crosshair overlap
(partly or fully) </p>
<p><italic>o</italic><sub>i</sub>=0 if target and crosshair do not
overlap </p>
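<p>Formula 2 can be sketched in a few lines of code (hypothetical helper name; the per-sample overlap flags would come from comparing crosshair and target positions at each sample, and the result is expressed as a percentage):</p>

```python
def percent_time_on_target(on_target_flags):
    """Formula 2: p = (100 / n) * sum(o_i), where o_i is 1 when the
    crosshair and target overlap at sample i and 0 otherwise."""
    n = len(on_target_flags)
    if n == 0:
        raise ValueError("trial contains no samples")
    return 100.0 * sum(on_target_flags) / n

# Toy trial of four samples: on, on, off, on gives 75% time on target.
p = percent_time_on_target([1, 1, 0, 1])
```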
        
        <p>The four independent variables in the experiment were:
eye classification (dominant, cyclopean, or non-dominant
eye), target velocity (1.7&#xB0;/sec, 3.1&#xB0;/sec, or 4.5&#xB0;/sec),
maneuvering type (straight or curved lines), and experiment
half (first or second). This yielded a 3 X 3 X 2 X 2
within-subjects design.</p>
      </sec>
	   </sec>
      <sec id="s3b">
        <title>Results</title>
        <p>Data exclusion was similar to that of Experiment 1 and
resulted in a similar proportion of excluded data (~8%).</p>
       
 <sec id="s3ba">
        <title>Tracking Accuracy</title>
	 
        <p>We conducted a Linear Mixed Model (LMM) analysis
on percent time on target, with a random intercept for
participants. The fixed effects were eye classification
(dominant, non-dominant, or cyclopean), target velocity
(1.7&#xB0;/sec, 3.1&#xB0;/sec, or 4.5&#xB0;/sec), target maneuver
(straight or curved lines), and experiment half (first or
second). We included all second-, third-, and fourth-order
interactions between the fixed effects in the model.</p>
        <p>
          The greatest percent time on target was achieved when
the crosshair was controlled by the cyclopean eye (Mean=62.13,
SE=1.42), compared to when it was controlled by
either the dominant (Mean=55.95, SE=1.38) or the
non-dominant eye (Mean=54.04, SE=1.36). The main effect for
"eye classification" was significant, F (2,797) = 9.10,
p&lt;.001. Subsequent pairwise comparisons using Sidak
correction revealed significant differences between the
cyclopean eye and both the dominant and non-dominant eyes
(p&lt;.01). We found no significant difference in percent
time on target between the dominant and non-dominant eyes.
        </p>
        <p>
          We found a significant Experiment half X Eye
classification interaction, F (2,797) = 3.12, p&lt;.05. Figure 7
shows that while the differences in mean percent time
on target between the cyclopean eye and the two other eyes
were quite large in the first half of the experiment, mean
percent time on target became more similar between the
cyclopean and dominant eyes in the second half of the
experiment (Mean=61.08, SE=2.06 and Mean=58.82, SE=2.11,
respectively). Pairwise comparisons using Sidak correction
revealed no significant difference in percent time on target
between the cyclopean and dominant eyes in the second half
of the experiment. Only the difference between the
cyclopean and non-dominant eyes (Mean=61.08, SE=2.06 and
Mean=52.41, SE=1.92, respectively) remained statistically
significant (p&lt;.01).
        </p>
		<fig id="fig07" fig-type="figure" position="float">
					<label>Figure 7</label>
					<caption>
						<p>Tracking-performance with cyclopean, dominant, and non-dominant eye-control in the first and second halves of the experiment. Error bars represent standard errors.</p>
						</caption>
					<graphic id="graph07" xlink:href="jemr-10-01-b-figure-07.png"/>
				</fig>	
        <p>
          To test whether dominant-eye tracking had indeed
improved significantly between the first half (Mean=53.09,
SE=1.8) and the second half of the experiment (Mean=58.82,
SE=2.11), we ran another Linear Mixed Model (LMM)
analysis, similar to the first, but on the dominant
eye alone. In other words, eye classification was not an
independent variable in this model because it had only one
level (i.e., the dominant eye). We found that the
improvement in dominant-eye tracking was indeed significant, F
(1, 269) = 4.96, p&lt;.05.
        </p>
        <p>
          Last, using the full model we described at the
beginning of the Results section, we also found that percent
time on target was highest when the target traveled at 1.7&#xB0;/sec
(Mean=60.36, SE=1.39), lower when it traveled at
3.1&#xB0;/sec (Mean=58.23, SE=1.38), and lowest when it
traveled at 4.5&#xB0;/sec (Mean=53.52, SE=1.38). The main
effect for "velocity" was significant, F (2,797) = 6.33,
p&lt;.01. Yet, subsequent pairwise comparisons using Sidak
correction revealed a significant difference only between
the greatest and smallest velocities (4.5&#xB0;/sec vs. 1.7&#xB0;/sec),
(p&lt;.01). No other significant main effects or interactions
were found.
        </p>
      </sec>
	   </sec>
      <sec id="s3d">
        <title>Discussion</title>
        <p>The main finding of Experiment 2 replicated that of
Experiment 1, namely, that tracking accuracy
was best with the cyclopean eye. Thus, we expect
cyclopean tracking to be more accurate than single-eye tracking
also in cases where the eyes control a crosshair. In contrast to
Experiment 1, however, findings in Experiment 2 did
indicate that dominant eyes might have unique qualities in
motor tasks. In the General Discussion, we present a wider
theoretical view of our findings and discuss the possible
limitations of this study.</p>
      </sec>
    </sec>
    <sec id="s4">
      <title>General Discussion</title>
      <p>We tested which of the eyes leads to the greatest
accuracy when tracking a moving target: the dominant eye,
the non-dominant eye, or the metaphorical &#x201C;cyclopean
eye&#x201D;, whose estimated projection we derived by
averaging the x-y coordinates of the two real eyes. Findings from
both Experiment 1 and Experiment 2 showed that
cyclopean-eye tracking was the most accurate: mean
cyclopean distance from target was the smallest in
Experiment 1, and mean percent time on target was highest with
the cyclopean eye in Experiment 2. At the same time,
however, a significant interaction between eye classification
and experiment half in Experiment 2 suggested
that the advantage of the cyclopean eye was limited to the first
half of the experiment. These findings have both
theoretical and practical implications.</p>
      
      <p>
        From a theoretical view, our findings replicate Cui and
Hondzinski's (
        <xref ref-type="bibr" rid="R67">12</xref>
        ) finding that the average gaze position
of the two eyes is closer to targets than the single gaze
position of either eye alone. These findings resonate with
the cyclopean eye theory of egocentric direction (e.g., 
            <xref ref-type="bibr" rid="R69 R70 R71 R72">14, 15, 16, 17</xref>
			). They also show that average gaze positions (or
&#x201C;cyclopean positions&#x201D;) are closer not only to stationary
targets, as in Cui and Hondzinski (
        <xref ref-type="bibr" rid="R67">12</xref>
        ), but also to moving
targets.
      </p>
      <p>
        In addition, our findings provide an indication of eye
dominance, in accordance with the hemispheric laterality
approach that dominant eyes may be superior to
non-dominant eyes in certain tasks, just as dominant hands or feet
are (
        <xref ref-type="bibr" rid="R82 R108">27, 53</xref>
        ). As we mentioned in the Introduction, the idea
of hemispheric laterality with respect to ocular dominance
has drawn great criticism (e.g., 
            <xref ref-type="bibr" rid="R70 R84">15, 29</xref>
			), yet a number of
empirical reports did show indications for it (
        <xref ref-type="bibr" rid="R87 R88 R89 R90">32-35</xref>
        ). This
also seems to be the case in the current empirical report.
      </p>
      <p>Tracking accuracy with the dominant eye in
Experiment 2 improved with time and became more similar to
tracking accuracy with the cyclopean eye. No such
improvement was found in Experiment 1, which included only
two blocks, and no such improvement was found with the
non-dominant eye in either experiment. Thus, training
improved performance, but only with the dominant eye. It
appears, therefore, that evidence of asymmetric motor
performance between the dominant and non-dominant eye is
accumulating, and we believe that further empirical
investigation of this phenomenon is highly necessary.</p>

      <p>The practical implications of our study relate to the
design of gaze interfaces. We showed in two experiments
that cyclopean tracking is more accurate than single-eye
tracking, and designers of gaze interfaces may therefore
want to consider cyclopean control. Tracking accuracy
will of course depend on task characteristics such as, for
example, the size of targets. The mean difference in percent time
on target between cyclopean and dominant-eye tracking in
the task we designed for Experiment 2 was ~6%, and would
probably increase with smaller targets and decrease with
larger targets. Our main interest in this study was in the
question of whether the effectiveness of gaze interaction
may depend on which eye one uses as the input device. We
therefore used a relatively small target and did not explore
the relative effects of different target sizes and other task
characteristics that may affect tracking accuracy.
Designers of gaze interfaces should decide which eye to use
as the input device according to the characteristics of the
task and the rewards and penalties for different
outcomes. For instance, would a 6% difference (or less/more)
in percent time on target be enough to justify cyclopean
control for reducing missed commands and selection times
in a video game? What about reducing missed commands
and selection times in ATC or in combat piloting? Our
study does not answer these questions. It does show,
however, that which eye to use as input should be a design
consideration in gaze interfaces.</p>
      <p>
        We focused in this study on tracking, where gaze
control holds great promise in replacing the less natural
tracking with a joystick or a mouse, while at the same time
it has been reported to call for methods that improve
accuracy (e.g., 
            <xref ref-type="bibr" rid="R65 R66">10, 11</xref>
			). In addition, our task did not require
participants to select targets, for instance by pressing a bar (
        <xref ref-type="bibr" rid="R62">7</xref>
        ),
or by waiting a predefined dwell time before selection (
        <xref ref-type="bibr" rid="R58 R109">3, 54</xref>
        ). We demonstrated that crosshair and target overlapped
for a greater percentage of time with cyclopean than with
single-eye control, and therefore that selection of targets
should reasonably be faster under cyclopean control. Cyclopean
fixations are also expected to be more accurate when
focusing on stationary targets (
        <xref ref-type="bibr" rid="R67">12</xref>
        ), and not only when targets
are moving. In future studies we intend to compare
cyclopean and single-eye control when users select targets and
when targets are stationary (e.g., on-screen icons). Future
studies should also test the relative accuracy of the eyes in
free interaction, when users move their heads. Tracking
error in such cases can sometimes exceed 1.5&#xB0; (
        <xref ref-type="bibr" rid="R110">55</xref>
        ), and one
should therefore test whether the distribution of errors
biases the measured position of one eye more strongly than
that of the other.
      </p>
      <p>In addition, we invited participants to single
experimental sessions, and we therefore could not assess whether
they retained any eye-tracking skill they may have
acquired during the experiment. Retaining such
skill may imply that expert users of gaze-control interfaces
will be as accurate when capturing targets with their
dominant eye as with their cyclopean eye. Future empirical
investigations should look at the longer-term effects of
training on target-capturing accuracy with gaze-control
interfaces.</p>
      <p>Last, our estimated projection of the cyclopean eye was
based on an unweighted average of the x-y coordinates of
the dominant and non-dominant eyes. A weighted
average with a greater weight for the dominant eye, however,
would have driven the crosshair closer to the target in the
second half of Experiment 2, where dominant-eye control
improved. Gaze-interface interaction may therefore benefit
from the development of more sophisticated eye-crosshair
coupling algorithms that adjust the weights according to
real-time data about tracking accuracy.</p>
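<p>Such a weighted coupling could be sketched as follows (a minimal illustration with hypothetical names; a real implementation would update the dominant-eye weight from ongoing accuracy estimates rather than fix it):</p>

```python
def weighted_cyclopean(dominant_xy, nondominant_xy, w_dominant=0.5):
    """Weighted average of the two eyes' gaze coordinates.
    w_dominant is the weight given to the dominant eye; 0.5 reproduces
    the unweighted cyclopean projection used in this study, while 1.0
    is pure dominant-eye control."""
    w_non = 1.0 - w_dominant
    return (w_dominant * dominant_xy[0] + w_non * nondominant_xy[0],
            w_dominant * dominant_xy[1] + w_non * nondominant_xy[1])

# Equal weights give the midpoint; a larger dominant-eye weight pulls
# the crosshair toward the dominant eye's gaze position.
mid = weighted_cyclopean((100.0, 40.0), (110.0, 60.0))          # (105.0, 50.0)
biased = weighted_cyclopean((100.0, 40.0), (110.0, 60.0), 0.8)  # ~(102.0, 44.0)
```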
    </sec>
    <sec id="s5">
      <title>Conclusions</title>
      <p>In two experiments, we demonstrated that tracking
accuracy was better with the cyclopean eye than with either the
dominant or the non-dominant eye. We also showed similar
tracking accuracy with the cyclopean and dominant eyes in
the second half of Experiment 2. Our findings correspond
with the cyclopean eye theory of egocentric direction and
provide an indication of eye dominance, in accordance with
the hemispheric laterality approach. From a practical
viewpoint, we showed that which eye to use as input should be a
design consideration in gaze interfaces.</p>
    </sec>
    <sec id="s6" sec-type="COI-statement">
      <title>Acknowledgements</title>
      <p>We wish to thank Lior Lahav for developing and
implementing all programming aspects of this project.</p>
      <p>The authors declare that there is no conflict of interest
regarding the publication of this paper.</p>


   
    </sec>
 </body>
  <back>
  <ref-list>
   <ref id="R56"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Alonso</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Causse</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Vachon</surname>, <given-names>F.</given-names></string-name>, <string-name><surname>Parise</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Dehais</surname>, <given-names>F.</given-names></string-name>, &amp; <string-name><surname>Terrier</surname>, <given-names>P.</given-names></string-name></person-group> (<year>2013</year>). <article-title>Evaluation of head-free eye tracking as an input device for air traffic control.</article-title> <source>Ergonomics</source>, <volume>56</volume>(<issue>2</issue>), <fpage>246</fpage>–<lpage>255</lpage>. <pub-id pub-id-type="doi">10.1080/00140139.2012.744473</pub-id><pub-id pub-id-type="pmid">23231634</pub-id><issn>0014-0139</issn></mixed-citation></ref>
   <ref id="R57"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Bates</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Donegan</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Istance</surname>, <given-names>H. O.</given-names></string-name>, <string-name><surname>Hansen</surname>, <given-names>J. P.</given-names></string-name>, &amp; <string-name><surname>Räihä</surname>, <given-names>K.-J.</given-names></string-name></person-group> (<year>2007</year>). <article-title>Introducing COGAIN: Communication by gaze interaction.</article-title> <source>Universal Access in the Information Society</source>, <volume>6</volume>(<issue>2</issue>), <fpage>159</fpage>–<lpage>166</lpage>. <pub-id pub-id-type="doi">10.1007/s10209-007-0077-9</pub-id><issn>1615-5289</issn></mixed-citation></ref>
   <ref id="R58"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Majaranta</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>MacKenzie</surname>, <given-names>I. S.</given-names></string-name>, <string-name><surname>Aula</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Räihä</surname>, <given-names>K.-J.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Effects of feedback and dwell time on eye typing speed and accuracy.</article-title> <source>Universal Access in the Information Society</source>, <volume>5</volume>(<issue>2</issue>), <fpage>199</fpage>–<lpage>208</lpage>. <pub-id pub-id-type="doi">10.1007/s10209-006-0034-z</pub-id><issn>1615-5289</issn></mixed-citation></ref>
   <ref id="R59"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Sibert</surname>, <given-names>L. E.</given-names></string-name>, &amp; <string-name><surname>Jacob</surname>, <given-names>R. J.</given-names></string-name></person-group> (<year>2000</year>). <article-title>Evaluation of eye gaze interaction.</article-title> <source>Paper presented at the Human Factors in Computing Systems: CHI 2000 Conference Proceedings</source>, <conf-loc>Den Haag: the Netherlands</conf-loc>. <pub-id pub-id-type="doi">10.1145/332040.332445</pub-id></mixed-citation></ref>
   <ref id="R60"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Barnes</surname>, <given-names>G. R.</given-names></string-name></person-group> (<year>2008</year>). <article-title>Cognitive processes involved in smooth pursuit eye movements.</article-title> <source>Brain and Cognition</source>, <volume>68</volume>(<issue>3</issue>), <fpage>309</fpage>–<lpage>326</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2008.08.020</pub-id><pub-id pub-id-type="pmid">18848744</pub-id><issn>0278-2626</issn></mixed-citation></ref>
   <ref id="R61"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Krauzlis</surname>, <given-names>R. J.</given-names></string-name></person-group> (<year>2004</year>). <article-title>Recasting the smooth pursuit eye movement system.</article-title> <source>Journal of Neurophysiology</source>, <volume>91</volume>(<issue>2</issue>), <fpage>591</fpage>–<lpage>603</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00801.2003</pub-id><pub-id pub-id-type="pmid">14762145</pub-id><issn>0022-3077</issn></mixed-citation></ref>
   <ref id="R62"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Jochems</surname>, <given-names>N.</given-names></string-name>, <string-name><surname>Vetter</surname>, <given-names>S.</given-names></string-name>, &amp; <string-name><surname>Schlick</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2013</year>). <article-title>A comparative study of information input devices for aging computer users.</article-title> <source>Behaviour &amp; Information Technology</source>, <volume>32</volume>(<issue>9</issue>), <fpage>902</fpage>–<lpage>919</lpage>. <pub-id pub-id-type="doi">10.1080/0144929X.2012.692100</pub-id><issn>0144-929X</issn></mixed-citation></ref>
   <ref id="R63"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Majaranta</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Isokoski</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Rantala</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Špakov</surname>, <given-names>O.</given-names></string-name>, <string-name><surname>Akkil</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Kangas</surname>, <given-names>J.</given-names></string-name>, &amp; <string-name><surname>Raisamo</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2016</year>). <article-title>Haptic feedback in eye typing.</article-title> <source>Journal of Eye Movement Research</source>, <volume>9</volume>(<issue>1</issue>). <pub-id pub-id-type="doi">10.16910/jemr.9.1.3</pub-id><issn>1995-8692</issn></mixed-citation></ref>
   <ref id="R64"><mixed-citation publication-type="book-chapter" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>MacKenzie</surname>, <given-names>I. S.</given-names></string-name></person-group> (<year>2011</year>). <chapter-title>Evaluating eye tracking systems for computer input</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>P.</given-names> <surname>Majaranta</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Aoki</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Donegan</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Hansen</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Hansen</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Hyrskykari</surname></string-name>, &amp; <string-name><given-names>K.</given-names> <surname>Räihä</surname></string-name> (<role>Eds.</role>),</person-group> <source>Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies: Advances in Assistive Technologies</source> (pp. <fpage>205</fpage>–<lpage>225</lpage>). <publisher-loc>Hershey</publisher-loc>: <publisher-name>IGI Global</publisher-name>.</mixed-citation></ref>
   <ref id="R65"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>San Agustin</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Mateo</surname>, <given-names>J. C.</given-names></string-name>, <string-name><surname>Hansen</surname>, <given-names>J. P.</given-names></string-name>, &amp; <string-name><surname>Villanueva</surname>, <given-names>A.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Evaluation of the potential of gaze input for game interaction.</article-title> <source>PsychNology Journal</source>, <volume>7</volume>(<issue>2</issue>), <fpage>213</fpage>–<lpage>236</lpage>.<issn>1720-7525</issn></mixed-citation></ref>
   <ref id="R66"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Smith</surname>, <given-names>J. D.</given-names></string-name>, &amp; <string-name><surname>Graham</surname>, <given-names>T.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Use of eye movements for video game control.</article-title> Paper presented at the <source>Proceedings of the SIGCHI international conference on Advances in computer entertainment technology</source>, <conf-loc>New York, USA</conf-loc>. <pub-id pub-id-type="doi">10.1145/1178823.1178847</pub-id></mixed-citation></ref>
   <ref id="R67"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Cui</surname>, <given-names>Y.</given-names></string-name>, &amp; <string-name><surname>Hondzinski</surname>, <given-names>J. M.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Gaze tracking accuracy in humans: Two eyes are better than one.</article-title> <source>Neuroscience Letters</source>, <volume>396</volume>(<issue>3</issue>), <fpage>257</fpage>–<lpage>262</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2005.11.071</pub-id><pub-id pub-id-type="pmid">16423465</pub-id><issn>0304-3940</issn></mixed-citation></ref>
    <ref id="R68"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Hering</surname>, <given-names>E.</given-names></string-name></person-group> (<year>1942</year>). <source>Spatial sense and movements of the eye</source>. <publisher-loc>Baltimore, MD</publisher-loc>: <publisher-name>American Academy of Optometry</publisher-name>.</mixed-citation></ref>
   <ref id="R69"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Khokhotva</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Mapp</surname>, <given-names>A. P.</given-names></string-name></person-group> (<year>2005</year>). <article-title>The cyclopean eye is relevant for predicting visual direction.</article-title> <source>Vision Research</source>, <volume>45</volume>(<issue>18</issue>), <fpage>2339</fpage>–<lpage>2345</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2005.04.007</pub-id><pub-id pub-id-type="pmid">15921718</pub-id><issn>0042-6989</issn></mixed-citation></ref>
   <ref id="R70"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Mapp</surname>, <given-names>A. P.</given-names></string-name>, <string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Barbeito</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2003</year>). <article-title>What does the dominant eye dominate? A brief and somewhat contentious review.</article-title> <source>Perception &amp; Psychophysics</source>, <volume>65</volume>(<issue>2</issue>), <fpage>310</fpage>–<lpage>317</lpage>. <pub-id pub-id-type="doi">10.3758/BF03194802</pub-id><pub-id pub-id-type="pmid">12713246</pub-id><issn>0031-5117</issn></mixed-citation></ref>
   <ref id="R71"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name>, <string-name><surname>Mapp</surname>, <given-names>A. P.</given-names></string-name>, &amp; <string-name><surname>Howard</surname>, <given-names>I. P.</given-names></string-name></person-group> (<year>2002</year>). <article-title>The cyclopean eye in vision: The new and old data continue to hit you right between the eyes.</article-title> <source>Vision Research</source>, <volume>42</volume>(<issue>10</issue>), <fpage>1307</fpage>–<lpage>1324</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(01)00281-4</pub-id><pub-id pub-id-type="pmid">12044760</pub-id><issn>0042-6989</issn></mixed-citation></ref>
   <ref id="R72"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Wade</surname>, <given-names>N. J.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Two historical strands in studying visual direction.</article-title> <source>The Japanese Psychological Research</source>, <volume>54</volume>(<issue>1</issue>), <fpage>71</fpage>–<lpage>88</lpage>. <pub-id pub-id-type="doi">10.1111/j.1468-5884.2011.00506.x</pub-id><issn>0021-5368</issn></mixed-citation></ref>
    <ref id="R73"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Howard</surname>, <given-names>I.</given-names></string-name>, &amp; <string-name><surname>Rogers</surname>, <given-names>B.</given-names></string-name></person-group> (<year>2012</year>). <series>Perceiving in depth</series>: <volume>2</volume>. <source>Stereoscopic vision</source>. <publisher-loc>Oxford, UK</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>.</mixed-citation></ref>
   <ref id="R74"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Stidwill</surname>, <given-names>D.</given-names></string-name>, &amp; <string-name><surname>Fletcher</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2011</year>). <source>Normal binocular vision: Theory, investigation and practical aspects</source>. <publisher-loc>West Sussex, UK</publisher-loc>: <publisher-name>John Wiley &amp; Sons</publisher-name>.</mixed-citation></ref>
   <ref id="R75"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kepler</surname>, <given-names>J.</given-names></string-name></person-group> (<year>1611</year>). <source>Dioptrice</source>. <publisher-loc>Augsburg</publisher-loc>: <publisher-name>D. Francus</publisher-name>.</mixed-citation></ref>
   <ref id="R76"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Wade</surname>, <given-names>N. J.</given-names></string-name>, <string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Mapp</surname>, <given-names>A. P.</given-names></string-name></person-group> (<year>2006</year>). <article-title>The lost direction in binocular vision: The neglected signs posted by Wells, Towne, and LeConte.</article-title> <source>Journal of the History of the Behavioral Sciences</source>, <volume>42</volume>(<issue>1</issue>), <fpage>61</fpage>–<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1002/jhbs.20135</pub-id><pub-id pub-id-type="pmid">16345004</pub-id><issn>0022-5061</issn></mixed-citation></ref>
   <ref id="R77"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Rubin</surname>, <given-names>M. L.</given-names></string-name>, &amp; <string-name><surname>Walls</surname>, <given-names>G. L.</given-names></string-name></person-group> (<year>1969</year>). <source>Fundamentals of visual science</source>. <publisher-loc>Springfield</publisher-loc>: <publisher-name>Thomas</publisher-name>.</mixed-citation></ref>
   <ref id="R78"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Walls</surname>, <given-names>G. L.</given-names></string-name></person-group> (<year>1951</year>). <article-title>A theory of ocular dominance.</article-title> <source>Archives of Ophthalmology</source>, <volume>45</volume>(<issue>4</issue>), <fpage>387</fpage>–<lpage>412</lpage>. <pub-id pub-id-type="doi">10.1001/archopht.1951.01700010395005</pub-id><pub-id pub-id-type="pmid">14818494</pub-id><issn>0003-9950</issn></mixed-citation></ref>
   <ref id="R79"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Dolman</surname>, <given-names>P.</given-names></string-name></person-group> (<year>1920</year>). <article-title>The relation of the sighting eye to the measurement of heterophoria. A preliminary report.</article-title> <source>American Journal of Ophthalmology</source>, <volume>3</volume>(<issue>4</issue>), <fpage>258</fpage>–<lpage>261</lpage>. <pub-id pub-id-type="doi">10.1016/S0002-9394(20)90168-X</pub-id><issn>0002-9394</issn></mixed-citation></ref>
   <ref id="R80"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ehrenstein</surname>, <given-names>W. H.</given-names></string-name>, <string-name><surname>Arnold-Schulz-Gahmen</surname>, <given-names>B. E.</given-names></string-name>, &amp; <string-name><surname>Jaschinski</surname>, <given-names>W.</given-names></string-name></person-group> (<year>2005</year>). <article-title>Eye preference within the context of binocular functions.</article-title> <source>Graefes Archive for Clinical and Experimental Ophthalmology</source>, <volume>243</volume>(<issue>9</issue>), <fpage>926</fpage>–<lpage>932</lpage>. <pub-id pub-id-type="doi">10.1007/s00417-005-1128-7</pub-id><pub-id pub-id-type="pmid">15838666</pub-id><issn>0721-832X</issn></mixed-citation></ref>
   <ref id="R81"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Li</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Lam</surname>, <given-names>C. S.</given-names></string-name>, <string-name><surname>Yu</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Hess</surname>, <given-names>R. F.</given-names></string-name>, <string-name><surname>Chan</surname>, <given-names>L. Y.</given-names></string-name>, <string-name><surname>Maehara</surname>, <given-names>G.</given-names></string-name>, <etal>. . .</etal> <string-name><surname>Thompson</surname>, <given-names>B.</given-names></string-name></person-group> (<year>2010</year>). <article-title>Quantifying sensory eye dominance in the normal visual system: A new technique and insights into variation across traditional tests.</article-title> <source>Investigative Ophthalmology &amp; Visual Science</source>, <volume>51</volume>(<issue>12</issue>), <fpage>6875</fpage>–<lpage>6881</lpage>. <pub-id pub-id-type="doi">10.1167/iovs.10-5549</pub-id><pub-id pub-id-type="pmid">20610837</pub-id><issn>0146-0404</issn></mixed-citation></ref>
    <ref id="R82"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Bourassa</surname>, <given-names>D. C.</given-names></string-name>, <string-name><surname>McManus</surname>, <given-names>I. C.</given-names></string-name>, &amp; <string-name><surname>Bryden</surname>, <given-names>M. P.</given-names></string-name></person-group> (<year>1996</year>). <article-title>Handedness and eye-dominance: A meta-analysis of their relationship.</article-title> <source>Laterality: Asymmetries of Body, Brain and Cognition</source>, <volume>1</volume>(<issue>1</issue>), <fpage>5</fpage>–<lpage>34</lpage>. <pub-id pub-id-type="doi">10.1080/713754206</pub-id><pub-id pub-id-type="pmid">15513026</pub-id><issn>1357-650X</issn></mixed-citation></ref>
   <ref id="R83"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Porac</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name><surname>Coren</surname>, <given-names>S.</given-names></string-name></person-group> (<year>1986</year>). <article-title>Sighting dominance and egocentric localization.</article-title> <source>Vision Research</source>, <volume>26</volume>(<issue>10</issue>), <fpage>1709</fpage>–<lpage>1713</lpage>. <pub-id pub-id-type="doi">10.1016/0042-6989(86)90057-X</pub-id><pub-id pub-id-type="pmid">3617511</pub-id><issn>0042-6989</issn></mixed-citation></ref>
   <ref id="R84"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Khan</surname>, <given-names>A. Z.</given-names></string-name>, &amp; <string-name><surname>Crawford</surname>, <given-names>J. D.</given-names></string-name></person-group> (<year>2001</year>). <article-title>Ocular dominance reverses as a function of horizontal gaze angle.</article-title> <source>Vision Research</source>, <volume>41</volume>(<issue>14</issue>), <fpage>1743</fpage>–<lpage>1748</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(01)00079-7</pub-id><pub-id pub-id-type="pmid">11369037</pub-id><issn>0042-6989</issn></mixed-citation></ref>
   <ref id="R85"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Mapp</surname>, <given-names>A. P.</given-names></string-name>, &amp; <string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name></person-group> (<year>1999</year>). <article-title>Wondering about the wandering cyclopean eye.</article-title> <source>Vision Research</source>, <volume>39</volume>(<issue>14</issue>), <fpage>2381</fpage>–<lpage>2386</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(98)00278-8</pub-id><pub-id pub-id-type="pmid">10367058</pub-id><issn>0042-6989</issn></mixed-citation></ref>
   <ref id="R86"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ono</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Barbeito</surname>, <given-names>R.</given-names></string-name></person-group> (<year>1982</year>). <article-title>The cyclopean eye vs. the sighting-dominant eye as the center of visual direction.</article-title> <source>Perception &amp; Psychophysics</source>, <volume>32</volume>(<issue>3</issue>), <fpage>201</fpage>–<lpage>210</lpage>. <pub-id pub-id-type="doi">10.3758/BF03206224</pub-id><pub-id pub-id-type="pmid">7177758</pub-id><issn>0031-5117</issn></mixed-citation></ref>
   <ref id="R87"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Han</surname>, <given-names>Y.</given-names></string-name>, <string-name><surname>Seideman</surname>, <given-names>M.</given-names></string-name>, &amp; <string-name><surname>Lennerstrand</surname>, <given-names>G.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Dynamics of accommodative vergence movements controlled by the dominant and non-dominant eye.</article-title> <source>Acta Ophthalmologica Scandinavica</source>, <volume>73</volume>(<issue>4</issue>), <fpage>319</fpage>–<lpage>324</lpage>. <pub-id pub-id-type="doi">10.1111/j.1600-0420.1995.tb00034.x</pub-id><pub-id pub-id-type="pmid">8646576</pub-id><issn>1395-3907</issn></mixed-citation></ref>
   <ref id="R88"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>van Leeuwen</surname>, <given-names>A. F.</given-names></string-name>, <string-name><surname>Westen</surname>, <given-names>M. J.</given-names></string-name>, <string-name><surname>van der Steen</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>de Faber</surname>, <given-names>J.-T. H.</given-names></string-name>, &amp; <string-name><surname>Collewijn</surname>, <given-names>H.</given-names></string-name></person-group> (<year>1999</year>). <article-title>Gaze-shift dynamics in subjects with and without symptoms of convergence insufficiency: Influence of monocular preference and the effect of training.</article-title> <source>Vision Research</source>, <volume>39</volume>(<issue>18</issue>), <fpage>3095</fpage>–<lpage>3107</lpage>. <pub-id pub-id-type="doi">10.1016/S0042-6989(99)00066-8</pub-id><pub-id pub-id-type="pmid">10664807</pub-id><issn>0042-6989</issn></mixed-citation></ref>
   <ref id="R89"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Moiseeva</surname>, <given-names>V. V.</given-names></string-name>, <string-name><surname>Slavutskaya</surname>, <given-names>M. V.</given-names></string-name>, &amp; <string-name><surname>Shul’govskii</surname>, <given-names>V. V.</given-names></string-name></person-group> (<year>2000</year>). <article-title>The effects of visual stimulation of the dominant and non-dominant eyes on the latent period of saccades and the latency of the peak of rapid pre-saccade potentials.</article-title> <source>Neuroscience and Behavioral Physiology</source>, <volume>30</volume>(<issue>4</issue>), <fpage>379</fpage>–<lpage>382</lpage>. <pub-id pub-id-type="doi">10.1007/BF02463089</pub-id><pub-id pub-id-type="pmid">10981938</pub-id><issn>0097-0549</issn></mixed-citation></ref>
   <ref id="R90"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Kawata</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Ohtsuka</surname>, <given-names>K.</given-names></string-name></person-group> (<year>2001</year>). <article-title>Dynamic asymmetries in convergence eye movements under natural viewing conditions.</article-title> <source>Japanese Journal of Ophthalmology</source>, <volume>45</volume>(<issue>5</issue>), <fpage>437</fpage>–<lpage>444</lpage>. <pub-id pub-id-type="doi">10.1016/S0021-5155(01)00405-1</pub-id><pub-id pub-id-type="pmid">11583663</pub-id><issn>0021-5155</issn></mixed-citation></ref>
    <ref id="R91"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Vidal</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Bulling</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Gellersen</surname>, <given-names>H.</given-names></string-name></person-group> (<year>2013</year>). <article-title>Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets.</article-title> Paper presented at the <source>Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing</source>, <conf-loc>New York, USA</conf-loc>. <pub-id pub-id-type="doi">10.1145/2493432.2493477</pub-id></mixed-citation></ref>
    <ref id="R92"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Shapiro</surname>, <given-names>I. J.</given-names></string-name></person-group> (<year>1995</year>). <article-title>Parallel testing infinity balance. Instrument and technique for the parallel testing of binocular vision.</article-title> <source>Optometry and Vision Science</source>, <volume>72</volume>(<issue>12</issue>), <fpage>916</fpage>–<lpage>923</lpage>. <pub-id pub-id-type="doi">10.1097/00006324-199512000-00012</pub-id><pub-id pub-id-type="pmid">8749341</pub-id><issn>1040-5488</issn></mixed-citation></ref>
   <ref id="R93"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Wagner</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Sahar</surname>, <given-names>Y.</given-names></string-name>, <string-name><surname>Elbaum</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Botzer</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Berliner</surname>, <given-names>E.</given-names></string-name></person-group> (<year>2015</year>). <article-title>Grip Force as a Measure of Stress in Aviation.</article-title> <source>The International Journal of Aviation Psychology</source>, <volume>25</volume>(<issue>3-4</issue>), <fpage>157</fpage>–<lpage>170</lpage>. <pub-id pub-id-type="doi">10.1080/10508414.2015.1162632</pub-id><issn>1050-8414</issn></mixed-citation></ref>
    <ref id="R94"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Wickens</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name><surname>Hollands</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2000</year>). <source>Engineering psychology and human performance</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>HarperCollins</publisher-name>.</mixed-citation></ref>
    <ref id="R95"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Stampe</surname>, <given-names>D. M.</given-names></string-name></person-group> (<year>1993</year>). <article-title>Heuristic filtering and reliable calibration methods for video-based pupil-tracking systems.</article-title> <source>Behavior Research Methods, Instruments, &amp; Computers</source>, <volume>25</volume>(<issue>2</issue>), <fpage>137</fpage>–<lpage>142</lpage>. <pub-id pub-id-type="doi">10.3758/BF03204486</pub-id><issn>0743-3808</issn></mixed-citation></ref>
    <ref id="R96"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Nuthmann</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Kliegl</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2009</year>). <article-title>An examination of binocular reading fixations based on sentence corpus data.</article-title> <source>Journal of Vision</source>, <volume>9</volume>(<issue>5</issue>), <fpage>1</fpage>–<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1167/9.5.31</pub-id><pub-id pub-id-type="pmid">19757909</pub-id><issn>1534-7362</issn></mixed-citation></ref>
   <ref id="R97"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Paterson</surname>, <given-names>K. B.</given-names></string-name>, <string-name><surname>Jordan</surname>, <given-names>T. R.</given-names></string-name>, &amp; <string-name><surname>Kurtev</surname>, <given-names>S.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Binocular fixation disparity in single word displays.</article-title> <source>Journal of Experimental Psychology. Human Perception and Performance</source>, <volume>35</volume>(<issue>6</issue>), <fpage>1961</fpage>–<lpage>1968</lpage>. <pub-id pub-id-type="doi">10.1037/a0016889</pub-id><pub-id pub-id-type="pmid">19968446</pub-id><issn>0096-1523</issn></mixed-citation></ref>
   <ref id="R98"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Jagacinski</surname>, <given-names>R. J.</given-names></string-name>, &amp; <string-name><surname>Flach</surname>, <given-names>J. M.</given-names></string-name></person-group> (<year>2003</year>). <source>Control theory for humans: Quantitative approaches to modeling performance</source>. <publisher-loc>Mahwah, NJ</publisher-loc>: <publisher-name>Lawrence Erlbaum Associates</publisher-name>.</mixed-citation></ref>
   <ref id="R99"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Leigh</surname>, <given-names>R. J.</given-names></string-name>, &amp; <string-name><surname>Zee</surname>, <given-names>D. S.</given-names></string-name></person-group> (<year>2015</year>). <source>The neurology of eye movements</source>. <publisher-loc>Oxford, UK</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>. <pub-id pub-id-type="doi">10.1093/med/9780199969289.001.0001</pub-id></mixed-citation></ref>
   <ref id="R100"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Bridgeman</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Van der Heijden</surname>, <given-names>A.</given-names></string-name>, &amp; <string-name><surname>Velichkovsky</surname>, <given-names>B. M.</given-names></string-name></person-group> (<year>1994</year>). <article-title>A theory of visual stability across saccadic eye movements.</article-title> <source>Behavioral and Brain Sciences</source>, <volume>17</volume>(<issue>2</issue>), <fpage>247</fpage>–<lpage>257</lpage>. <pub-id pub-id-type="doi">10.1017/S0140525X00034361</pub-id><issn>0140-525X</issn></mixed-citation></ref>
 	<ref id="R101"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Vallines</surname>, <given-names>I.</given-names></string-name>, &amp; <string-name><surname>Greenlee</surname>, <given-names>M. W.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Saccadic suppression of retinotopically localized blood oxygen level-dependent responses in human primary visual area V1.</article-title> <source>The Journal of Neuroscience</source>, <volume>26</volume>(<issue>22</issue>), <fpage>5965</fpage>–<lpage>5969</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.0817-06.2006</pub-id><pub-id pub-id-type="pmid">16738238</pub-id><issn>0270-6474</issn></mixed-citation></ref>
	<ref id="R102"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Holmqvist</surname>, <given-names>K.</given-names></string-name>, <string-name><surname>Nyström</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Andersson</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Dewhurst</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Jarodzka</surname>, <given-names>H.</given-names></string-name>, &amp; <string-name><surname>Van de Weijer</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2011</year>). <source>Eye tracking: A comprehensive guide to methods and measures</source>. <publisher-loc>Oxford, UK</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>.</mixed-citation></ref>
 	<ref id="R103"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><collab>SR Research</collab></person-group> (<year>2010</year>). <source>EyeLink 1000 User’s Manual Version 1.5.2</source>. <publisher-loc>Mississauga, Ontario, Canada</publisher-loc>: <publisher-name>SR Research Ltd.</publisher-name></mixed-citation></ref>
	<ref id="R104"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Goldreich</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Krauzlis</surname>, <given-names>R. J.</given-names></string-name>, &amp; <string-name><surname>Lisberger</surname>, <given-names>S. G.</given-names></string-name></person-group> (<year>1992</year>). <article-title>Effect of changing feedback delay on spontaneous oscillations in smooth pursuit eye movements of monkeys.</article-title> <source>Journal of Neurophysiology</source>, <volume>67</volume>(<issue>3</issue>), <fpage>625</fpage>–<lpage>638</lpage>. <pub-id pub-id-type="doi">10.1152/jn.1992.67.3.625</pub-id><pub-id pub-id-type="pmid">1578248</pub-id><issn>0022-3077</issn></mixed-citation></ref>
	<ref id="R105"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Jacob</surname>, <given-names>R. J.</given-names></string-name></person-group> (<year>1993</year>). <article-title>Eye movement-based human-computer interaction techniques: Toward non-command interfaces.</article-title> <source>Advances in Human-Computer Interaction</source>, <volume>4</volume>, <fpage>151</fpage>–<lpage>190</lpage>.<issn>1687-5893</issn></mixed-citation></ref>
	<ref id="R106"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ellson</surname>, <given-names>D.</given-names></string-name></person-group> (<year>1947</year>). <source>The independence of tracking in two and three dimensions with the B-29 pedestal sight (TSEAA-694-2G)</source>. <publisher-loc>Dayton, Ohio</publisher-loc>: <publisher-name>Aero Medical Laboratory</publisher-name>.</mixed-citation></ref>
	<ref id="R107"><mixed-citation publication-type="conference" specific-use="parsed"><person-group person-group-type="author"><string-name><surname>Klochek</surname>, <given-names>C.</given-names></string-name>, &amp; <string-name><surname>MacKenzie</surname>, <given-names>I. S.</given-names></string-name></person-group> (<year>2006</year>). <article-title>Performance measures of game controllers in a three-dimensional environment.</article-title> Paper presented at the <source>Proceedings of Graphics Interface 2006</source>, <conf-loc>Canada</conf-loc>.</mixed-citation></ref>
	<ref id="R108"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Gundogan</surname>, <given-names>N. U.</given-names></string-name>, <string-name><surname>Yazici</surname>, <given-names>A. C.</given-names></string-name>, &amp; <string-name><surname>Simsek</surname>, <given-names>A.</given-names></string-name></person-group> (<year>2009</year>). <article-title>Study on dominant eye measurement.</article-title> <source>International Journal of Ophthalmology</source>, <volume>2</volume>(<issue>3</issue>), <fpage>271</fpage>–<lpage>277</lpage>.<issn>2222-3959</issn></mixed-citation></ref>
 	<ref id="R109"><mixed-citation publication-type="conference" specific-use="parsed"><person-group person-group-type="author"><string-name><surname>Räihä</surname>, <given-names>K.-J.</given-names></string-name>, &amp; <string-name><surname>Ovaska</surname>, <given-names>S.</given-names></string-name></person-group> (<year>2012</year>). <article-title>An exploratory study of eye typing fundamentals: dwell time, text entry rate, errors, and workload.</article-title> Paper presented at the <source>Proceedings of the SIGCHI Conference on Human Factors in Computing Systems</source>, <conf-loc>New York</conf-loc>.</mixed-citation></ref>
	<ref id="R110"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Zhu</surname>, <given-names>Z.</given-names></string-name>, &amp; <string-name><surname>Ji</surname>, <given-names>Q.</given-names></string-name></person-group> (<year>2007</year>). <article-title>Novel eye gaze tracking techniques under natural head movement.</article-title> <source>IEEE Transactions on Biomedical Engineering</source>, <volume>54</volume>(<issue>12</issue>), <fpage>2246</fpage>–<lpage>2260</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2007.895750</pub-id><pub-id pub-id-type="pmid">18075041</pub-id><issn>0018-9294</issn></mixed-citation></ref>

</ref-list>
  </back>
</article>
