<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.11.6.3</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Eye Movement Parameters for Performance Evaluation in Projection-based Stereoscopic Display</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Lin</surname>
						<given-names>Chiuhsiang Joe</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Prasetyo</surname>
						<given-names>Yogi Tri</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>	
				<contrib contrib-type="author">
					<name>
						<surname>Widyaningrum</surname>
						<given-names>Retno</given-names>
					</name>
					<xref ref-type="aff" rid="aff2">2</xref>
				</contrib>        			
        <aff id="aff1">
		<institution>Department of Industrial Management, National Taiwan University of Science and Technology</institution>,   <country>Taiwan</country>
        </aff>
        <aff id="aff2">
		<institution>Department of Industrial Engineering, Sepuluh Nopember Institute of Technology, Kampus ITS Sukolilo, Surabaya 60111</institution>,   <country>Indonesia</country>
        </aff>        
		</contrib-group>   

		
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>20</day>  
		<month>11</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2018</year>
	</pub-date>
      <volume>11</volume>
      <issue>6</issue>
	 <elocation-id>10.16910/jemr.11.6.3</elocation-id> 
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Lin, C.J., Prasetyo, Y.T. &#x26; Widyaningrum, R.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>The current study applied Structural Equation Modeling (SEM) to analyze the relationship of index of difficulty (ID) and parallax with eye gaze movement time (EMT), fixation duration (FD), time to first fixation (TFF), number of fixation (NF), and eye gaze accuracy (AC) simultaneously. EMT, FD, TFF, NF, and AC were measured in a projection-based stereoscopic display by utilizing a Tobii eye tracker system. Ten participants were recruited to perform a multi-directional tapping task in a within-subject design with three levels of parallax and six levels of ID. SEM showed that ID had significant direct effects on EMT, NF, and FD, as well as a significant indirect effect on NF. However, ID was not found to be a strong predictor of AC. SEM also showed that parallax had significant direct effects on EMT, NF, FD, TFF, and AC. Apart from these direct effects, parallax also had significant indirect effects on NF and AC. Regarding the interrelationship among the dependent variables, there were significant indirect effects of FD and TFF on AC. We conclude that higher AC was achieved with lower parallax (at the screen), longer EMT, higher NF, longer FD, and longer TFF.</p>
        <p><bold>Practitioner Summary</bold>: The SEM could provide valuable theoretical foundations on the interrelationship among eye movement parameters for VR researchers and human-virtual-reality interface developers, especially for predicting eye gaze accuracy.</p>        
      </abstract>
      <kwd-group>
        <kwd>Structural equation modeling</kwd>
        <kwd>mediator effect</kwd>
        <kwd>eye movement parameters</kwd>
        <kwd>stereoscopic</kwd>
        <kwd>parallax</kwd>
        <kwd>virtual reality</kwd>	
        <kwd>eye movement</kwd>
        <kwd>eye tracking</kwd>	        	
      </kwd-group>
    </article-meta>
  </front>	
  <body>

    <sec id="S1">
      <title>Introduction</title>

<p>Virtual reality (VR) has developed significantly over the past two
decades. It is designed to enable human sensorimotor and
cognitive activity in a digitally created artificial world, which can be
imaginary, symbolic, or a simulation of certain aspects of the real
world (<xref ref-type="bibr" rid="b1">1</xref>). Manufacturers and researchers from different disciplines are
paying more and more attention to VR, seeking to maximize the image
quality while also considering the diverse applications. Recent research
has explored promising diverse applications of VR, particularly in
3D geovisualization (<xref ref-type="bibr" rid="b2">2</xref>), 3D animated media (<xref ref-type="bibr" rid="b3">3</xref>), and even 3D
laparoscopic surgery (<xref ref-type="bibr" rid="b4">4</xref>). One of the most common techniques for creating VR
is the projection-based stereoscopic display.</p>

<p>The projection-based stereoscopic display has been commercialized
for implementation in VR (<xref ref-type="bibr" rid="b5">5</xref>). It generates 3D images by creating
depth perception via a cue called binocular disparity, which refers to a
lateral shift or difference between the spatial positions of
corresponding left and right eye images (<xref ref-type="bibr" rid="b6">6</xref>). This binocular disparity of
the two images between the left and right eye is commonly referred to as parallax
(<xref ref-type="bibr" rid="b7">7</xref>). Parallax creates binocular disparity in the human visual system
that gives a stereoscopic effect of depth, with each eye receiving an
image similar, but not identical, to that of real spatial vision (<xref ref-type="bibr" rid="b1">1</xref>).
One common device for evaluating the effectiveness of a projection-based
stereoscopic display is the eye tracker (<xref ref-type="bibr" rid="b8 b9">8, 9</xref>).</p>

<p>The eye tracker is becoming widely popular for evaluating projection-based
stereoscopic 3D displays, especially for collecting and analyzing
information about users. It is a tool that allows user experience
researchers to observe the position of the eyes and understand which area of
interest an individual is looking at (<xref ref-type="bibr" rid="b10">10</xref>). An eye tracker measures
variables commonly named eye movement measures or eye movement
parameters. Research in different fields might focus on different eye
movement parameters (<xref ref-type="bibr" rid="b11">11</xref>).</p>

<fig id="fig01" fig-type="figure" position="float">
					<label>Fig. 1.</label>
					<caption>
						<p>Structural Equation Modeling of Chinese character complicacy
using eye movement parameters (<xref ref-type="bibr" rid="b12">12</xref>).</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-11-06-c-figure-01.png"/>
				</fig>

<p>Despite the availability of many eye tracker publications over the
past two decades, there is still little information available on the
interrelationship among eye movement parameters. Previously, Ma &#x26;
Chuang (<xref ref-type="bibr" rid="b38">38</xref>) investigated the correlations between Chinese character
stroke complicacy and eye movement parameters by utilizing structural
equation modeling (SEM). However, the path coefficient was found to be low (β:
0.17) and the loading factor of saccade amplitude was less than 0.70
(p-value &#x3C; 0.10), indicating that saccade amplitude was not a strong
predictor for eye movement parameters in the model (Figure 1). Moreover,
the interrelationship between two eye movement parameters (NF and
saccade amplitude) and Chinese character information was not analyzed
further. Unema et al. (<xref ref-type="bibr" rid="b13">13</xref>) mentioned that there was a strong but
nonlinear relationship between saccade amplitude and fixation duration.
Similarly, Pannasch et al. (<xref ref-type="bibr" rid="b14">14</xref>) demonstrated a systematic change in
saccade amplitude and fixation duration over time. However, these
studies were limited to only two eye movement parameters. A further
investigation incorporating more eye movement parameters could be
very valuable for VR researchers in different fields and human-virtual-reality
interface developers. Goldberg conducted a study to investigate
the impact of several page design factors on perceived ratings of page
clarity, completion time, emotional valence from video, and several eye
movement parameters (<xref ref-type="bibr" rid="b15">15</xref>). In addition, Goldberg also explored the
relationship among selected eye movement parameters using Pearson
correlation (Table 1) (<xref ref-type="bibr" rid="b15">15</xref>). This study could be improved by utilizing
an SEM approach, since this method can go beyond a simple correlation
analysis.</p>

<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>Correlation matrix among selected eye movement parameters (<xref ref-type="bibr" rid="b15">15</xref>).</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th></th>
        <th>SO</th>
        <th>JF</th>
        <th>SR</th>
        <th>CT</th>
        <th>EV</th>
        <th>TFF</th>
        <th>FD</th>
        <th>NF</th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>JF</td>
        <td>.17***</td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>SR</td>
        <td>.21***</td>
        <td>ns</td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>CT</td>
        <td>.16***</td>
        <td>.10*</td>
        <td>.45***</td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>EV</td>
        <td>ns</td>
        <td>ns</td>
        <td>ns</td>
        <td>ns</td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>TFF</td>
        <td>.15**</td>
        <td>.24***</td>
        <td>.23***</td>
        <td>.43***</td>
        <td>ns</td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>FD</td>
        <td>.14**</td>
        <td>ns</td>
        <td>ns</td>
        <td>ns</td>
        <td>ns</td>
        <td>ns</td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>NF</td>
        <td>.25***</td>
        <td>ns</td>
        <td>.41***</td>
        <td>.82***</td>
        <td>ns</td>
        <td>.40***</td>
        <td>ns</td>
        <td></td>
      </tr>
      <tr>
        <td>SA</td>
        <td>ns</td>
        <td>ns</td>
        <td>.21***</td>
        <td>.30***</td>
        <td>ns</td>
        <td>.41***</td>
        <td>ns</td>
        <td>.35***</td>
      </tr>
    </tbody>
  </table>
					<table-wrap-foot>
						<fn id="FN1">
						<p><italic>Note:</italic> JF=JPEG file size; SR=subjective ratings;
CT=task completion time; EV=emotional valence; TFF=time to first
fixation; FD=fixation duration; NF=number of fixations; SA=search area.
*p&#x3C;0.05. **p&#x3C;0.01. ***p&#x3C;0.001.</p>
						</fn>
					</table-wrap-foot>  
</table-wrap>

<p>SEM is a very useful technique for investigating the interrelationship
among eye movement parameters, since the relationships between variables
are assessed simultaneously via covariance analysis (<xref ref-type="bibr" rid="b16">16</xref>). It examines
the structure of interrelationships expressed in a series of equations,
similar to a series of multiple regressions (<xref ref-type="bibr" rid="b17">17</xref>). A structural model with
a hypothesized mediating effect can also produce direct and indirect
effects (<xref ref-type="bibr" rid="b17">17</xref>). Direct effects are relationships linking two parameters
with a single path, whereas indirect effects are relationships that
involve a sequence of paths with at least one intervening
parameter (<xref ref-type="bibr" rid="b17">17</xref>). Two of our previous studies collected several
eye movement parameters in a projection-based stereoscopic display,
consisting of eye gaze movement time (EMT), fixation duration (FD), time to
first fixation (TFF), number of fixation (NF), and eye gaze accuracy
(AC) (<xref ref-type="bibr" rid="b8 b9">8, 9</xref>). Our previous one-way repeated-measures ANOVA analyses had to decompose
chains of relationships among three or more constructs into separate tests of
relationships to derive the mediating effects. Moreover, those
analyses could not investigate further the
interrelationship among the dependent eye movement parameters. By utilizing
the SEM approach, the mediating effects that arise when a third parameter intervenes
between two other related parameters and the interrelationship among eye
movement parameters can be analyzed simultaneously.</p>
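<p>As a minimal numerical illustration of these definitions, the indirect effect along a mediated chain is the product of the path coefficients on that chain, and the total effect adds the direct path. The sketch below uses made-up coefficients, not estimates from this study:</p>

```python
# Mediation arithmetic for a chain X -> M -> Y plus a direct path X -> Y.
# All coefficients are illustrative placeholders, not this study's estimates.
a = 0.50   # path coefficient X -> M
b = 0.40   # path coefficient M -> Y
c = 0.30   # direct path coefficient X -> Y

indirect_effect = a * b              # effect of X on Y through the mediator M
total_effect = c + indirect_effect   # direct effect plus indirect effect

print(indirect_effect)  # 0.2
print(total_effect)     # 0.5
```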

<p>The purpose of the current study is to analyze the interrelationship
among eye movement parameters in a projection-based stereoscopic display
by utilizing the SEM approach. The interrelationship among variables could
be used to predict AC, which was defined as the distance between the
recorded fixation locations and the actual location of the projection of
the image (<xref ref-type="bibr" rid="b9">9</xref>). In terms of engineering application, AC is one of the
most important eye movement parameters and is commonly used for the
performance evaluation of eye trackers, since it can be an objective
indicator to distinguish good from bad designs (<xref ref-type="bibr" rid="b9 b18 b19">9, 18, 19</xref>). AC could also
provide valuable theoretical foundations for VR researchers and
human-virtual-reality interface developers.</p>

<fig id="fig02" fig-type="figure" position="float">
					<label>Fig. 2.</label>
					<caption>
						<p>The SEM hypothesis constructs for eye movement parameters in
stereoscopic display.</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-11-06-c-figure-02.png"/>
				</fig>

<p>For the hypothesized SEM model, the current study proposed 20
hypotheses (Figure 2). ID was hypothesized to have significant direct
effects on EMT (Hypothesis 1), AC (Hypothesis 2), NF (Hypothesis 3), FD
(Hypothesis 4), and TFF (Hypothesis 5). Hypothesis 4 was supported by
Walshe &#x26; Nuthmann, who mentioned that FD was found to be under the direct
control of stimulus content (<xref ref-type="bibr" rid="b20">20</xref>). Following our two previous
publications regarding the parallax effect on eye movement parameters in
stereoscopic display (<xref ref-type="bibr" rid="b8 b9">8, 9</xref>), parallax was hypothesized to have significant
direct effects on EMT (Hypothesis 6), AC (Hypothesis 7), NF (Hypothesis
8), FD (Hypothesis 9), and TFF (Hypothesis 10). EMT was hypothesized to have
direct effects on TFF (Hypothesis 11), NF (Hypothesis 13), and FD
(Hypothesis 14). Based on the speed-accuracy trade-off in Fitts’ law, EMT was
also hypothesized to have a significant direct effect on AC (Hypothesis 12).
Rodrigues &#x26; Rosa (<xref ref-type="bibr" rid="b11">11</xref>) and Castner &#x26; Eastman (<xref ref-type="bibr" rid="b21">21</xref>) mentioned
that FD is highly correlated with NF; therefore, FD was hypothesized to have
a direct effect on NF (Hypothesis 16). FD was also hypothesized to have
significant direct effects on AC (Hypothesis 15) and TFF (Hypothesis
17). TFF was hypothesized to have significant direct effects on AC
(Hypothesis 18) and NF (Hypothesis 19), as supported by Goldberg (<xref ref-type="bibr" rid="b15">15</xref>).
Finally, NF was hypothesized to have a significant effect on AC (Hypothesis
20), as supported by Togami (<xref ref-type="bibr" rid="b22">22</xref>).</p>
    </sec>
	
    <sec id="S2">
      <title>Methods</title>

<p>The current study applied Structural Equation Modeling (SEM) to
analyze the interrelationship among index of difficulty (ID), parallax,
and eye movement parameters which include eye gaze movement time (EMT),
fixation duration (FD), time to first fixation (TFF), number of fixation
(NF), and eye gaze accuracy (AC) simultaneously. The main focus of the
study is to analyze the causal relationship among all of the parameters
for predicting AC.</p>

    <sec id="S2a">
      <title>Participants</title>

<p>A total of ten participants (7 male and 3 female) from National
Taiwan University of Science and Technology voluntarily took part in this
experiment; they were not paid or compensated with academic
credits. All participants were graduate students (mean: 25 years; sd: 4
years) and had normal or corrected-to-normal visual acuity (1.0 in
decimal units). Prior to the experiment, participants filled out
a consent form and were screened for the capability to see the object clearly in
the stereoscopic display.</p>
    </sec>
	
    <sec id="S2b">
      <title>Apparatus</title>

<p>Eye movements were recorded using the Tobii X2-60 remote eye tracking
system at a sampling rate of 60 Hz. Tobii Studio
version 3.3.2 was used for calibration, testing, and data analysis. Raw
eye fixation data were filtered using an I-VT fixation filter with a
velocity threshold of 30 degrees per second.</p>
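<p>The velocity-threshold classification performed by an I-VT filter can be sketched as follows. This is an illustrative simplification (assuming gaze positions already expressed as visual angles in degrees, sampled at 60 Hz), not Tobii Studio's actual implementation:</p>

```python
# Sketch of an I-VT (velocity-threshold) fixation filter. Illustrative only;
# real filters also merge adjacent fixations and discard short ones.

def classify_ivt(angles_deg, sample_rate_hz=60.0, threshold_deg_per_s=30.0):
    """Label each inter-sample interval 'fixation' or 'saccade' by comparing
    the angular velocity between consecutive samples to the threshold."""
    dt = 1.0 / sample_rate_hz
    labels = []
    for prev, curr in zip(angles_deg, angles_deg[1:]):
        velocity = abs(curr - prev) / dt  # angular velocity in deg/s
        labels.append("fixation" if velocity < threshold_deg_per_s else "saccade")
    return labels

# A drift of 0.1 deg per sample (6 deg/s) stays below threshold; a 2.0 deg
# jump between samples (120 deg/s) exceeds it.
print(classify_ivt([0.0, 0.1, 0.2, 2.2]))  # ['fixation', 'fixation', 'saccade']
```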

<p>During the experiment, participants were asked to wear a pair of ViewSonic
PDF-250 3D glasses to perceive the stereoscopic 3D environment.
The 3D glasses were integrated with an NVIDIA 3D Vision IR emitter and a
ViewSonic (PJD 6251) 3D projector. The NVIDIA 3D Vision IR emitter was located
under the table at a certain distance from the Tobii X2 to eliminate the
shuttering effect, so that the signal of the NVIDIA 3D emitter and the infrared from
the Tobii X2 did not affect the eye movement registration. The length and
width of the projection screen were 143 cm and 108 cm, respectively. In
addition, a Logitech C-920 webcam integrated with Tobii Studio was used
to record the eye movement data from the screen display.</p>

<p>Figure 3 represents the experimental layout of this study. The
distance between the participant and the screen was 181 cm. The ViewSonic
3D projector was placed 89 cm in front of the screen and the Tobii
eye tracker was placed 64 cm in front of the participant. To maintain
the consistency of the relative distances to the participant, all devices
were kept fixed and marked using adhesive tape. The participant
performed the entire task in a dark room (3.6 × 3.2 × 2.5 m) covered by black
curtains to block out light and create a high-quality stereoscopic
environment.</p>

<fig id="fig03" fig-type="figure" position="float">
					<label>Fig. 3.</label>
					<caption>
						<p>An illustration of experimental layout of the current study (<xref ref-type="bibr" rid="b9">9</xref>).</p>
					</caption>
					<graphic id="graph03" xlink:href="jemr-11-06-c-figure-03.png"/>
				</fig>

    </sec>
	
    <sec id="S2c">
      <title>Independent variables</title>

<p>There were two independent variables in the current study: ID and
parallax. ID represents the task difficulty and precision level determined
by movement distance and object width during the tapping task (<xref ref-type="bibr" rid="b23">23</xref>). ISO
9241-9 classifies task precision into three levels to measure the
accuracy of a tapping task: low, medium, and high (<xref ref-type="bibr" rid="b24">24</xref>). Table 2 shows the
details of ID and task precision level, which were similar to our previous
study (<xref ref-type="bibr" rid="b9">9</xref>). Parallax represents the horizontal display disparity of the two
images between the right and left eyes that creates 3D images (<xref ref-type="bibr" rid="b7">7</xref>). When the
observed object is located virtually in front of the screen, the
parallax is negative (<xref ref-type="bibr" rid="b1">1</xref>). In the current study, we used zero
parallax (at the screen), negative parallax 20 cm, and negative parallax
50 cm in front of the screen (<xref ref-type="bibr" rid="b9">9</xref>). This range was selected to minimize
the effect of visual fatigue (<xref ref-type="bibr" rid="b8">8</xref>). Since there were six levels of ID and
three levels of parallax, there were 18 different combinations
to be completed by each participant.</p>

<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>ID and task precision level</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th>Distance (unity unit)</th>
        <th>Width (unity unit)</th>
        <th>ID (bits)</th>
        <th>Task Precision Level</th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>20</td>
        <td>3.3</td>
        <td>2.8</td>
        <td>Low</td>
      </tr>
      <tr>
        <td>20</td>
        <td>2.3</td>
        <td>3.3</td>
        <td>Low</td>
      </tr>
      <tr>
        <td>20</td>
        <td>0.6</td>
        <td>5.1</td>
        <td>Medium</td>
      </tr>
      <tr>
        <td>40</td>
        <td>3.3</td>
        <td>3.7</td>
        <td>Low</td>
      </tr>
      <tr>
        <td>40</td>
        <td>2.3</td>
        <td>4.2</td>
        <td>Medium</td>
      </tr>
      <tr>
        <td>40</td>
        <td>0.6</td>
        <td>6.1</td>
        <td>High</td>
      </tr>
    </tbody>
  </table>
</table-wrap>
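<p>The ID values in Table 2 are consistent with the Shannon formulation of Fitts’ index of difficulty, ID = log2(D/W + 1). The sketch below reproduces the table’s third column under that assumption (the text does not state the formulation explicitly):</p>

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits
    (assumed formulation; it matches the ID values in Table 2)."""
    return math.log2(distance / width + 1)

# Reproduce Table 2 (distance and width in Unity units).
for d, w in [(20, 3.3), (20, 2.3), (20, 0.6), (40, 3.3), (40, 2.3), (40, 0.6)]:
    print(f"D={d}, W={w}: ID = {index_of_difficulty(d, w):.1f} bits")
```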

<table-wrap id="t03" position="float">
					<label>Table 3.</label>
					<caption>
						<p>Descriptive statistics and correlation coefficients among the observed variables.</p>
					</caption>
					<graphic id="graph07" xlink:href="jemr-11-06-c-table-03.png"/>
					</table-wrap>

    </sec>
	
    <sec id="S2d">
      <title>Dependent variables</title>

<p>There were five dependent variables in the current study: eye gaze
movement time (EMT), fixation duration (FD), time to first fixation
(TFF), number of fixation (NF), and eye gaze accuracy (AC). EMT was the
elapsed time from the eye fixation point on the origin to the fixation
point on the next target (<xref ref-type="bibr" rid="b8">8</xref>). FD, or average fixation duration (<xref ref-type="bibr" rid="b25">25</xref>), was
defined as the average duration of the fixations made by the participant to
click the virtual target from the origin to the next target. TFF was the
elapsed time from the slide presentation until the first fixation on the
virtual target (<xref ref-type="bibr" rid="b15">15</xref>). NF was the total number of fixations counted
from the origin virtual ball to the destination virtual ball. As mentioned in the
introduction, AC was defined as the distance between the recorded
fixation locations and the actual location of the projection of the
image (<xref ref-type="bibr" rid="b9">9</xref>). Following our previous publication (<xref ref-type="bibr" rid="b9">9</xref>), AC was calculated
using the following formula:</p>

<fig id="eq01" fig-type="equation" position="anchor">
					<graphic id="equation01" xlink:href="jemr-11-06-c-equation-01.png"/>
				</fig>

<p>Where,</p>
<p>EFp = Eye fixation position</p>
<p>IPp = Image projection position</p>

<p>Eye fixation positions (EFp) were recorded in pixels by utilizing Tobii
Studio, and the coordinate positions were converted into mm (<xref ref-type="bibr" rid="b8">8</xref>). The
image projection position (IPp) was measured from the location of the
projection image on the screen in mm. Both EFp and IPp were measured on the
x-axis and y-axis (2D). The y-axis was measured from bottom to
top and the x-axis from left to right. We did not measure
AC on the z-axis; as an independent variable (parallax), we
manipulated the z-axis at three different levels: 0 cm, 20 cm, and 50 cm in
front of the screen. The detailed calculation of EMT had been published
in Lin &#x26; Widyaningrum (<xref ref-type="bibr" rid="b8">8</xref>), as had the detailed calculations of FD, TFF,
and NF (<xref ref-type="bibr" rid="b8">8</xref>). Table 3
represents the descriptive statistics and Pearson correlation
coefficients among the observed variables, as recommended by (<xref ref-type="bibr" rid="b15">15</xref>).</p>
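<p>Since the formula above is rendered as an image, the sketch below follows the stated definition, assuming AC is the 2D Euclidean distance (in mm) between the eye fixation position (EFp) and the image projection position (IPp); the coordinate values are hypothetical:</p>

```python
import math

def eye_gaze_accuracy(ef_pos, ip_pos):
    """AC as the 2D Euclidean distance between the eye fixation position
    (EFp) and the image projection position (IPp), both as (x, y) in mm.
    Assumed reading of the formula, consistent with the text's definition."""
    return math.hypot(ef_pos[0] - ip_pos[0], ef_pos[1] - ip_pos[1])

# Hypothetical coordinates in mm (x: left to right, y: bottom to top).
print(eye_gaze_accuracy((715.0, 540.0), (712.0, 536.0)))  # 5.0
```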
    </sec>
	
    <sec id="S2e">
      <title>Experiment Procedures</title>

<p>The experiment was conducted according to the ethical guidelines of
the National Taiwan University Research Ethics Committee. Prior to the
experiment, participants filled out a consent form that described
the purpose of the study, the experimental tasks, and
the confidentiality of participant data. Each participant then sat stably in the
chair and wore the 3D glasses.</p>

<p>A calibration was conducted to ensure that the Tobii eye tracker could detect
the participant’s eye movements. Participants were asked to look at the red
calibration dots as precisely as possible until the red dots disappeared.
The regular calibration setting of the Tobii eye tracker, with five red dots,
was used as the default to capture the participant’s eye gaze binocularly.
The experiment continued only when the quality of the calibration was
excellent.</p>

<fig id="fig04" fig-type="figure" position="float">
					<label>Fig. 4.</label>
					<caption>
						<p>The pointing sequence of the virtual red balls (shown as ball 1).</p>
					</caption>
					<graphic id="graph04" xlink:href="jemr-11-06-c-figure-04.png"/>
				</fig>

<p>A tapping task using 3D virtual balls was conducted following ISO
9241-9 (<xref ref-type="bibr" rid="b24">24</xref>). This task is also widely known as the multi-directional tapping
task (<xref ref-type="bibr" rid="b26">26</xref>). The virtual balls were arranged in concentric circles and the
sequence is presented in Figure 4 (<xref ref-type="bibr" rid="b9">9</xref>). The virtual balls were created
using the Unity 3D platform version 4.3.4. The participants were
instructed to click the virtual red ball as fast and accurately as
possible (<xref ref-type="bibr" rid="b24">24</xref>). Each trial had twelve virtual red balls, and the
participants were instructed to click all of them. The
experiment took about 60 minutes. Participants started the task by
fixating their eyes on a virtual cube and clicking it using a virtual 3D
mouse, which was also developed in the Unity 3D platform version 4.3.4.
The Tobii eye tracker simultaneously recorded the participant’s eye gaze
movement and eye fixation point in each trial.</p>
    </sec>
	
    <sec id="S2f">
      <title>Structural Equation Modeling</title>

<p>Figure 2 shows that the eye parameters model had seven variables,
including two exogenous variables (index of difficulty and parallax) and
five endogenous variables (eye gaze movement time, fixation duration,
time to first fixation, number of fixation, and accuracy).</p>

<p>The structural equation model was derived using AMOS 22 with the Maximum
Likelihood estimation approach. The difference between the hypothesized
model and the observed data was examined by four sets of tests: a full
model test, incremental fit indices, goodness of fit indices, and badness
of fit indices (<xref ref-type="bibr" rid="b17">17</xref>). For the full model test, a normed Chi-Square
(<italic>χ</italic>²/df) of less than 2.0 (p-value &#x3E; 0.05) indicated
no significant difference between the observed sample and the SEM-estimated
covariance matrices (<xref ref-type="bibr" rid="b17">17</xref>). Incremental fit was measured by the Normed
Fit Index (NFI), Tucker-Lewis Index (TLI), and Comparative Fit Index
(CFI). Goodness of fit was measured by the Goodness of Fit Index (GFI) and
Adjusted Goodness of Fit Index (AGFI), which are similar to the R<sup>2</sup>
values used in regression analysis. Finally, badness of fit
was measured by the Root Mean Square Error of Approximation (RMSEA) and
Standardized Root Mean Square Residual (SRMR). Values greater than 0.95 for
NFI, TLI, CFI, GFI, and AGFI (<xref ref-type="bibr" rid="b27 b28 b29">27, 28, 29</xref>), smaller than 0.07 for RMSEA, and
smaller than 0.08 for SRMR indicated a good fit (<xref ref-type="bibr" rid="b17 b30">17, 30</xref>).</p>
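<p>The cut-offs above can be collected into a small checking routine. The sketch below is illustrative (the function and its name are not part of AMOS or any SEM package), evaluated here with the fit statistics reported in Table 4:</p>

```python
# Illustrative checker for the SEM fit cut-offs cited in the text.

def evaluate_fit(chi_square, df, nfi, tli, cfi, gfi, agfi, rmsea, srmr):
    """Return the normed chi-square and a dict of pass/fail cut-off checks."""
    normed = chi_square / df
    checks = {
        "normed chi-square < 2": normed < 2.0,
        "NFI > 0.95": nfi > 0.95,
        "TLI > 0.95": tli > 0.95,
        "CFI > 0.95": cfi > 0.95,
        "GFI > 0.95": gfi > 0.95,
        "AGFI > 0.95": agfi > 0.95,
        "RMSEA < 0.07": rmsea < 0.07,
        "SRMR < 0.08": srmr < 0.08,
    }
    return normed, checks

# Fit statistics as reported in Table 4.
normed, checks = evaluate_fit(chi_square=7.102, df=7, nfi=0.978, tli=0.999,
                              cfi=1.000, gfi=0.989, agfi=0.956,
                              rmsea=0.009, srmr=0.010)
print(round(normed, 3), all(checks.values()))  # 1.015 True
```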

<p>Since there were 18 combinations tested on 10 participants, a total
of 180 data points was analyzed. Bootstrapping has been advocated
when sample sizes are under 250 (<xref ref-type="bibr" rid="b31">31</xref>). Bootstrapping is a
technique that generates an empirical representation of the sampling
distribution of the data (<xref ref-type="bibr" rid="b32">32</xref>): the data are repeatedly resampled during
analysis as a means of duplicating the original sampling process (<xref ref-type="bibr" rid="b32">32</xref>).
This study applied the bootstrapping technique with bias-corrected 95%
confidence intervals based on bootstrap percentiles.</p>
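<p>A plain percentile bootstrap of this kind can be sketched as follows; the bias-corrected variant used in AMOS additionally adjusts the percentile positions (omitted here for brevity), and the data below are hypothetical:</p>

```python
import random
import statistics

def percentile_bootstrap_ci(sample, stat=statistics.mean,
                            n_resamples=2000, alpha=0.05, seed=42):
    """Plain percentile bootstrap confidence interval for a statistic.
    (The bias-corrected variant shifts the percentile positions; omitted.)"""
    rng = random.Random(seed)
    estimates = sorted(
        stat(rng.choices(sample, k=len(sample))) for _ in range(n_resamples)
    )
    lower = estimates[int(n_resamples * alpha / 2)]
    upper = estimates[int(n_resamples * (1 - alpha / 2)) - 1]
    return lower, upper

# Hypothetical data standing in for the 180 observations (18 combinations
# x 10 participants) of a single eye movement parameter.
rng0 = random.Random(0)
data = [rng0.gauss(0.5, 0.1) for _ in range(180)]
lo, hi = percentile_bootstrap_ci(data)
print(f"95% percentile-bootstrap CI for the mean: [{lo:.3f}, {hi:.3f}]")
```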
    </sec>
    </sec>

    <sec id="S3">
      <title>Results &#x26; Discussion</title>

<fig id="fig05" fig-type="figure" position="float">
					<label>Fig. 5.</label>
					<caption>
						<p>Initial SEM model for eye movement parameters in
projection-based stereoscopic display.</p>
					</caption>
					<graphic id="graph05" xlink:href="jemr-11-06-c-figure-05.png"/>
				</fig>      

<p>The initial SEM model for eye movement parameters is presented in
Figure 5. Based on this figure, seven hypotheses were found to be
nonsignificant. Therefore, a revised model was derived by removing these
seven paths: ID-AC (Hypothesis 2), ID-TFF (Hypothesis 5), EMT-TFF
(Hypothesis 11), EMT-FD (Hypothesis 14), FD-AC (Hypothesis 15), FD-TFF
(Hypothesis 17) (<xref ref-type="bibr" rid="b15">15</xref>), and TFF-AC (Hypothesis 18).</p>

<fig id="fig06" fig-type="figure" position="float">
					<label>Fig. 6.</label>
					<caption>
						<p>Final SEM model for eye movement parameters in
    projection-based stereoscopic display.</p>
					</caption>
					<graphic id="graph06" xlink:href="jemr-11-06-c-figure-06.png"/>
				</fig>

<p>The final SEM analysis of eye movement parameters is presented in
Figure 6. As presented in Table 4, the full-model-fit normed
<italic>χ</italic>² was smaller than 2 (normed <italic>χ</italic>² = 1.976, p = 0.922)
and all incremental fit indices were greater than 0.97, which indicates
that the hypothesized model was a very good representation of the
observed data. The GFI and AGFI values were 0.989 and 0.956, respectively,
which were also greater than the recommended value of 0.95. Regarding badness of
fit, RMSEA and SRMR were 0.009 and 0.010, respectively, which were also
smaller than the recommended values.</p>

<table-wrap id="t04" position="float">
					<label>Table 4.</label>
					<caption>
<p>Parameter Estimates and Goodness of Fit</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th><bold>Goodness of fit measures of the SEM</bold></th>
        <th><bold>Parameter Estimates</bold></th>
        <th><bold>Suggested cut-off</bold></th>
        <th><bold>Recommended by</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>p-value for Chi-square (<italic>χ</italic>²)</td>
        <td>0.418</td>
        <td>&#x3E; 0.05</td>
        <td>(<xref ref-type="bibr" rid="b17">17</xref>)</td>
      </tr>
      <tr>
        <td>Chi-square statistic (<italic>χ</italic>²)</td>
        <td>7.102</td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Degree of freedom (df)</td>
        <td>7</td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Normed chi-square (<italic>χ</italic>²/df)</td>
        <td>1.015</td>
        <td>&#x3C; 2</td>
        <td>(<xref ref-type="bibr" rid="b12">12</xref>)</td>
      </tr>
      <tr>
        <td><bold>Incremental Fit Indices</bold></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Normed Fit Index (NFI)</td>
        <td>0.978</td>
        <td>&#x3E; 0.95</td>
        <td>(<xref ref-type="bibr" rid="b27">27</xref>)</td>
      </tr>
      <tr>
        <td>Tucker Lewis Index (TLI)</td>
        <td>0.999</td>
        <td>&#x3E; 0.95</td>
        <td>(<xref ref-type="bibr" rid="b27">27</xref>)</td>
      </tr>
      <tr>
        <td>Comparative Fit Index (CFI)</td>
        <td>1.000</td>
        <td>&#x3E; 0.96</td>
        <td>(<xref ref-type="bibr" rid="b28">28</xref>)</td>
      </tr>
      <tr>
        <td><bold>Goodness-of-fit index</bold></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Goodness of Fit Index (GFI)</td>
        <td>0.989</td>
        <td>&#x3E; 0.95</td>
        <td>(<xref ref-type="bibr" rid="b29">29</xref>)</td>
      </tr>
      <tr>
        <td>Adjusted Goodness of Fit Index (AGFI)</td>
        <td>0.956</td>
        <td>&#x3E; 0.95</td>
        <td>(<xref ref-type="bibr" rid="b29">29</xref>)</td>
      </tr>
      <tr>
        <td><bold>Badness-of-fit index</bold></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Root Mean Square Error of Approximation (RMSEA)</td>
        <td>0.009</td>
        <td>&#x3C; 0.07</td>
        <td>(<xref ref-type="bibr" rid="b30">30</xref>)</td>
      </tr>
      <tr>
        <td>Root Mean Square Residual (RMR)</td>
        <td>0.010</td>
        <td>&#x3C; 0.08</td>
        <td>(<xref ref-type="bibr" rid="b17">17</xref>)</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

<p>Based on Table 5, SEM indicates that ID had significant direct
effects on EMT (β: 0.371, p=0.003), FD (β: 0.680, p=0.003), and NF (β:
0.380, p=0.003). Interestingly, ID was found to have no significant
direct or indirect effect on AC. Therefore, when designing a task
in a stereoscopic display, it is advocated to set ID between 2.8
and 6.1 bits, since it would not significantly affect AC. Another very
interesting relationship was found between ID and NF. ID had a
positive significant direct effect on NF (β: 0.380, p=0.003); however,
ID also had a negative significant indirect effect on NF (β:
-0.271, p=0.004). As a result, the total effect of ID on NF (β: 0.109,
p=0.080) became non-significant owing to the indirect effect through EMT.</p>

<p>Consistent with our previous studies on the effect of parallax using
one-way repeated-measures ANOVA (<xref ref-type="bibr" rid="b8 b9">8, 9</xref>), parallax had significant direct effects
on EMT (β: 0.156, p=0.039), FD (β: 0.281, p=0.003), NF (β: -0.298,
p=0.002), and AC (β: -0.222, p=0.001). Apart from these significant direct
effects, interestingly, parallax was also found to have significant
indirect effects on NF (β: -0.169, p=0.002) and AC (β: -0.133, p=0.003).
Despite the application of different statistical techniques, the direct
effects of parallax in the current SEM analysis match the previous
one-way repeated-measures ANOVA analysis. In addition, SEM can also reveal
significant indirect effects, which cannot be obtained with
one-way repeated-measures ANOVA.</p>

<p>The total effect of one parameter on another is the sum of the direct
and the indirect relationships between them (<xref ref-type="bibr" rid="b17">17</xref>). Based on Table 5,
parallax had the highest total effect on AC compared to the other
parameters (β: -0.355, p=0.002), indicating that parallax is a key factor
when designing a stereoscopic display. The highest accuracy was achieved when
the virtual ball was projected at the screen (<xref ref-type="bibr" rid="b9">9</xref>). Therefore, it is also
advocated to project at the screen rather than at 20 or 50 cm in front
of the screen. This finding is also supported by
Fuchs (<xref ref-type="bibr" rid="b1">1</xref>), who mentioned that parallax should be small so as not to
create difficulties for stereoscopic viewing.</p>
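<p>The decomposition total effect = direct effect + indirect effect can be verified directly against the standardized coefficients in Table 5, as this short sketch shows.</p>

```python
# Total effect = direct + indirect, using standardized betas from Table 5
paths = {
    "PAR->AC": {"direct": -0.222, "indirect": -0.133},
    "PAR->NF": {"direct": -0.298, "indirect": -0.169},
    "ID->NF":  {"direct":  0.380, "indirect": -0.271},
}
totals = {name: round(p["direct"] + p["indirect"], 3)
          for name, p in paths.items()}
print(totals)  # matches the "Total effect" column of Table 5
```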

<table-wrap id="t05" position="float">
					<label>Table 5.</label>
					<caption>
						<p>Total Effects of ID and Parallax on Eye Movement
Parameters</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th><bold>No</bold></th>
        <th><bold>Variables</bold></th>
        <th><bold>Direct effect</bold></th>
        <th><bold>P value</bold></th>
        <th><bold>Indirect effect</bold></th>
        <th><bold>P value</bold></th>
        <th><bold>Total effect</bold></th>
        <th><bold>P value</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>1</td>
        <td>ID -&#x3E; EMT</td>
        <td>0.371</td>
        <td>0.003</td>
        <td>No path</td>
        <td>------</td>
        <td>0.371</td>
        <td>0.003</td>
      </tr>
      <tr>
        <td>2</td>
        <td>ID -&#x3E; AC</td>
        <td>No path</td>
        <td>------</td>
        <td>-0.080</td>
        <td>0.104</td>
        <td>-0.080</td>
        <td>0.104</td>
      </tr>
      <tr>
        <td>3</td>
        <td>ID -&#x3E; NF</td>
        <td>0.380</td>
        <td>0.003</td>
        <td>-0.271</td>
        <td>0.004</td>
        <td>0.109</td>
        <td>0.080</td>
      </tr>
      <tr>
        <td>4</td>
        <td>ID -&#x3E; FD</td>
        <td>0.680</td>
        <td>0.003</td>
        <td>No path</td>
        <td>------</td>
        <td>0.680</td>
        <td>0.003</td>
      </tr>
      <tr>
        <td>5</td>
        <td>PAR -&#x3E; EMT</td>
        <td>0.156</td>
        <td>0.039</td>
        <td>No path</td>
        <td>------</td>
        <td>0.156</td>
        <td>0.039</td>
      </tr>
      <tr>
        <td>6</td>
        <td>PAR -&#x3E; AC</td>
        <td>-0.222</td>
        <td>0.001</td>
        <td>-0.133</td>
        <td>0.003</td>
        <td>-0.355</td>
        <td>0.002</td>
      </tr>
      <tr>
        <td>7</td>
        <td>PAR-&#x3E; NF</td>
        <td>-0.298</td>
        <td>0.002</td>
        <td>-0.169</td>
        <td>0.002</td>
        <td>-0.467</td>
        <td>0.002</td>
      </tr>
      <tr>
        <td>8</td>
        <td>PAR -&#x3E; FD</td>
        <td>0.281</td>
        <td>0.003</td>
        <td>No path</td>
        <td>------</td>
        <td>0.281</td>
        <td>0.003</td>
      </tr>
      <tr>
        <td>9</td>
        <td>PAR -&#x3E; TFF</td>
        <td>0.249</td>
        <td>0.002</td>
        <td>No path</td>
        <td>------</td>
        <td>0.249</td>
        <td>0.002</td>
      </tr>
      <tr>
        <td>10</td>
        <td>EMT-&#x3E; AC</td>
        <td>-0.274</td>
        <td>0.011</td>
        <td>-0.044</td>
        <td>0.007</td>
        <td>-0.318</td>
        <td>0.002</td>
      </tr>
      <tr>
        <td>11</td>
        <td>EMT -&#x3E; NF</td>
        <td>-0.224</td>
        <td>0.002</td>
        <td>No path</td>
        <td>-</td>
        <td>-0.224</td>
        <td>0.002</td>
      </tr>
      <tr>
        <td>12</td>
        <td>FD -&#x3E; AC</td>
        <td>No path</td>
        <td>------</td>
        <td>-0.054</td>
        <td>0.007</td>
        <td>-0.054</td>
        <td>0.007</td>
      </tr>
      <tr>
        <td>13</td>
        <td>FD -&#x3E; NF</td>
        <td>-0.276</td>
        <td>0.004</td>
        <td>No path</td>
        <td>------</td>
        <td>-0.276</td>
        <td>0.004</td>
      </tr>
      <tr>
        <td>14</td>
        <td>TFF -&#x3E; AC</td>
        <td>No path</td>
        <td>------</td>
        <td>-0.044</td>
        <td>0.007</td>
        <td>-0.044</td>
        <td>0.007</td>
      </tr>
      <tr>
        <td>15</td>
        <td>TFF -&#x3E; NF</td>
        <td>-0.228</td>
        <td>0.003</td>
        <td>No path</td>
        <td>------</td>
        <td>-0.228</td>
        <td>0.003</td>
      </tr>
      <tr>
        <td>16</td>
        <td>NF -&#x3E; AC</td>
        <td>0.194</td>
        <td>0.009</td>
        <td>No path</td>
        <td>------</td>
        <td>0.194</td>
        <td>0.009</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

<p>SEM can analyze mediating effects between constructs
simultaneously (<xref ref-type="bibr" rid="b17">17</xref>). There are two types of mediator: full and
partial. Our results indicate that EMT was a partial mediator
between parallax and AC, a partial mediator between ID and NF, and a full
mediator between ID and AC. In addition, NF was found to be a full mediator
between FD and AC and between TFF and AC.</p>

<p>Another advantage of the SEM approach is that the direct effects of
two exogenous variables on one endogenous variable can be analyzed
simultaneously (<xref ref-type="bibr" rid="b17">17</xref>). Comparing the direct effects on EMT, ID had a
higher effect (β: 0.371, p=0.003) than parallax (β: 0.156, p=0.039). Regarding
the effect on FD, ID affected FD (β: 0.680, p=0.003) more
than parallax did (β: 0.281, p=0.003). Longer FD indicated that the
participants faced greater cognitive processing difficulty in the
stereoscopic display and required more effort to process the
virtual red ball’s position in order to perceive it clearly (<xref ref-type="bibr" rid="b33 b34">33, 34</xref>). 
Another interesting relationship was found when comparing the
effects on NF. In terms of the direct effect, ID affected NF (β:
0.380, p=0.003) more than parallax (β: -0.298, p=0.002). However, in terms of
the total effect on NF, parallax actually
affected NF (β: -0.467, p=0.002) more than ID (β: 0.109, p=0.080), since the
effect of ID on NF was reduced by its indirect effect through EMT
(β: -0.271, p=0.004).</p>

<p>There were significant indirect effects of FD (β: -0.054, p=0.007)
and TFF (β: -0.044, p=0.007) on AC. The indirect effect of FD on AC was
slightly higher than that of TFF. However, these indirect effects
were very small compared to the effect of parallax. Our results also
indicate that NF is much more strongly correlated with AC than FD is. This
result contradicts Togami, who reported that AC is more related to FD than to
NF (<xref ref-type="bibr" rid="b22">22</xref>). This could probably be explained by the difference in the
task environment: Togami measured the eye movement parameters
on a 2D screen, whereas the current study measured the eye movement
parameters in a stereoscopic 3D display (<xref ref-type="bibr" rid="b22">22</xref>). Our findings indicate
that in a stereoscopic display, higher NF is strongly correlated with higher
AC.</p>

<p>Similar to the finding of Goldberg (<xref ref-type="bibr" rid="b15">15</xref>), there was a significant direct
effect of TFF on NF (β: -0.228, p=0.003). However, our study indicated
that higher TFF was associated with lower NF, whereas Goldberg found
that higher TFF was associated with higher NF (<xref ref-type="bibr" rid="b15">15</xref>). This
could probably also be explained by the difference in the task
environment.</p>

<p>Our results indicate that higher AC was associated with lower
parallax (projection at the screen), longer EMT, higher NF, longer FD, and longer TFF.
This finding is in line with Schoonard et al., who reported that longer FD
and higher NF lead to higher AC (<xref ref-type="bibr" rid="b35">35</xref>).</p>

<p>The current study is the first attempt to analyze the interrelationships
among eye movement parameters in a projection-based stereoscopic
display by utilizing the SEM approach. This approach can uncover
causal relationships among the selected eye movement parameters that could
not be discovered with simple correlation analysis, such as the study
conducted by Goldberg (<xref ref-type="bibr" rid="b15">15</xref>). The derived SEM model could provide valuable
theoretical foundations on the interrelationships among eye movement
parameters for VR researchers and human-virtual reality interface
developers.</p>

<p>As powerful as it seems, there are several limitations to
generalizing the research findings derived from the current SEM
model. First, the current study measured eye movement
parameters in a projection-based stereoscopic display with negative
parallax. The derived SEM model could differ depending on the type
of environment used to measure the eye movement parameters; for
instance, a head-mounted display (<xref ref-type="bibr" rid="b5 b36 b37">5, 36, 37</xref>) could probably produce a
different SEM model than our projection-based stereoscopic model.
In addition, differences in task parameters and stimulus materials
could affect the eye movement parameters (<xref ref-type="bibr" rid="b13">13</xref>). The derived
SEM model is therefore also limited to negative parallax. Second, the current
study only measured EMT, NF, FD, TFF, and AC, which describe only a portion
of the potential universe of eye movement parameters. Other parameters, such
as pupil size (<xref ref-type="bibr" rid="b39 b40">39, 40</xref>) and eye correction phase time, might reveal more
information regarding the interrelationships among eye movement
parameters.</p>
    </sec>

    <sec id="S4">
      <title>Conclusions</title>

<p>Virtual reality (VR) has developed significantly over
the past two decades. The current study is the first attempt to analyze
the interrelationships among eye movement parameters in a
projection-based stereoscopic display by utilizing the SEM approach. SEM
simultaneously analyzed the effects of index of difficulty (ID) and
parallax on eye gaze movement time (EMT), fixation duration (FD), time
to first fixation (TFF), number of fixations (NF), and eye gaze accuracy
(AC) in a projection-based stereoscopic display, using a Tobii eye
tracker system. Ten participants were recruited to
perform a multi-directional tapping task in a within-subject design with
three levels of parallax and six levels of ID. SEM
showed that ID had significant direct effects on EMT, NF, and FD, as well as a
significant indirect effect on NF. However, ID was not found to be a strong
predictor of AC. SEM also showed that parallax had significant direct
effects on EMT, NF, FD, TFF, and AC. Apart from the direct effects,
parallax also had significant indirect effects on NF and AC. Regarding
the interrelationships among the dependent variables, there were significant
indirect effects of FD and TFF on AC. The results of the SEM can be used to
evaluate all of the above factors for predicting eye gaze
accuracy. Our results indicate that higher AC was associated with lower
parallax (projection at the screen), longer EMT, higher NF, longer FD, and longer TFF.
These findings could provide valuable
theoretical foundations on the interrelationships among eye movement
parameters for VR researchers and human-virtual reality interface
developers.</p>
    </sec>

    <sec id="S5">
      <title>Acknowledgements</title>

<p>This work was supported by the Ministry of Science and Technology of
Taiwan (MOST 103-2221-E-011-100-MY3).</p>

    </sec>
</body>
<back>
<ref-list>
<ref id="b10"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Bergstrom</surname>, <given-names>J. R.</given-names></name>, &#x26; <name><surname>Schall</surname>, <given-names>A. J.</given-names></name></person-group> (<year>2014</year>). <source>Eye Tracking in User Experience Design</source>. <publisher-name>Elsevier</publisher-name>.</mixed-citation></ref>
<ref id="b21"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Castner</surname>, <given-names>H. W.</given-names></name>, &#x26; <name><surname>Eastman</surname>, <given-names>R. J.</given-names></name></person-group> (<year>1984</year>). <article-title>Eye-Movement Parameters and Perceived Map Complexity&#8212;I.</article-title> <source>American Cartographer</source>, <volume>11</volume>(<issue>2</issue>), <fpage>107</fpage>&#8211;<lpage>117</lpage>. <pub-id pub-id-type="doi">10.1559/152304084783914768</pub-id><issn>0094-1689</issn></mixed-citation></ref>
<ref id="b16"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Chang</surname>, <given-names>Y.-H.</given-names></name>, &#x26; <name><surname>Yeh</surname>, <given-names>C.-H.</given-names></name></person-group> (<year>2010</year>). <article-title>Human performance interfaces in air traffic control.</article-title> <source>Applied Ergonomics</source>, <volume>41</volume>(<issue>1</issue>), <fpage>123</fpage>&#8211;<lpage>129</lpage>. <pub-id pub-id-type="doi">10.1016/j.apergo.2009.06.002</pub-id><pub-id pub-id-type="pmid">19580957</pub-id><issn>0003-6870</issn></mixed-citation></ref>
<ref id="b39"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Choe</surname>, <given-names>K. W.</given-names></name>, <name><surname>Blake</surname>, <given-names>R.</given-names></name>, &#x26; <name><surname>Lee</surname>, <given-names>S.-H.</given-names></name></person-group> (<year>2016</year>). <article-title>Pupil size dynamics during fixation impact the accuracy and precision of video-based gaze estimation.</article-title> <source>Vision Research</source>, <volume>118</volume>, <fpage>48</fpage>&#8211;<lpage>59</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2014.12.018</pub-id><pub-id pub-id-type="pmid">25578924</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b1"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Fuchs</surname>, <given-names>P.</given-names></name>, &#x26; <name><surname>Guez</surname>, <given-names>J.</given-names></name></person-group> (<year>2017</year>). <source>Virtual reality headsets: a theoretical and pragmatic approach</source>. <publisher-name>CRC Press/Balkema</publisher-name>. <pub-id pub-id-type="doi">10.1201/9781315208244</pub-id></mixed-citation></ref>
<ref id="b33"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Goldberg</surname>, <given-names>J. H.</given-names></name>, &#x26; <name><surname>Kotval</surname>, <given-names>X. P.</given-names></name></person-group> (<year>1999</year>). <article-title>Computer interface evaluation using eye movements: Methods and constructs.</article-title> <source>International Journal of Industrial Ergonomics</source>, <volume>24</volume>(<issue>6</issue>), <fpage>631</fpage>&#8211;<lpage>645</lpage>. <pub-id pub-id-type="doi">10.1016/S0169-8141(98)00068-7</pub-id><issn>0169-8141</issn></mixed-citation></ref>
<ref id="b15"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Goldberg</surname>, <given-names>J. H.</given-names></name></person-group> (<year>2014</year>). <article-title>Measuring Software Screen Complexity: Relating Eye Tracking, Emotional Valence, and Subjective Ratings.</article-title> <source>International Journal of Human-Computer Interaction</source>, <volume>30</volume>(<issue>7</issue>), <fpage>518</fpage>&#8211;<lpage>532</lpage>. <pub-id pub-id-type="doi">10.1080/10447318.2014.906156</pub-id><issn>1044-7318</issn></mixed-citation></ref>
<ref id="b17"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hair</surname>, <given-names>J.</given-names></name>, <name><surname>Anderson</surname>, <given-names>R.</given-names></name>, <name><surname>Tatham</surname>, <given-names>R.</given-names></name>, &#x26; <name><surname>Black</surname>, <given-names>W.</given-names></name></person-group> (<year>2006</year>). <source>Multivariate data analysis</source> (<edition>6th ed.</edition>). <publisher-name>Prentice Hall</publisher-name>.</mixed-citation></ref>
<ref id="b32"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hayes</surname>, <given-names>A. F.</given-names></name></person-group> (<year>2009</year>). <article-title>Beyond Baron and Kenny: Statistical Mediation Analysis in the New Millennium.</article-title> <source>Communication Monographs</source>, <volume>76</volume>(<issue>4</issue>), <fpage>408</fpage>&#8211;<lpage>420</lpage>. <pub-id pub-id-type="doi">10.1080/03637750903310360</pub-id><issn>0363-7751</issn></mixed-citation></ref>
<ref id="b2"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Herman</surname>, <given-names>L.</given-names></name>, <name><surname>Popelka</surname>, <given-names>S.</given-names></name>, &#x26; <name><surname>Hejlova</surname>, <given-names>V.</given-names></name></person-group> (<year>2017</year>). <article-title>Eye-tracking analysis of interactive 3D geovisualitzation.</article-title> <source>Journal of Eye Movement Research</source>, <volume>10</volume>(<issue>3</issue>), <fpage>2</fpage>.<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b29"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hoelter</surname>, <given-names>J. W.</given-names></name></person-group> (<year>1983</year>). <article-title>The Analysis of Covariance Structures.</article-title> <source>Sociological Methods &#x26; Research</source>, <volume>11</volume>(<issue>3</issue>), <fpage>325</fpage>&#8211;<lpage>344</lpage>. <pub-id pub-id-type="doi">10.1177/0049124183011003003</pub-id><issn>0049-1241</issn></mixed-citation></ref>
<ref id="b34"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Holmqvist</surname>, <given-names>K.</given-names></name>, <name><surname>Nystr</surname>, <given-names>M.</given-names></name>, <name><surname>Andersson</surname>, <given-names>R.</given-names></name>, <name><surname>Dewhurst</surname>, <given-names>R.</given-names></name>, <name><surname>Jarodzka</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Weijer</surname>, <given-names>J. V. D.</given-names></name></person-group> (<year>2011</year>). <source>Eye tracking: A comprehensive guide to methods and measures</source>. <publisher-name>Oxford University Press</publisher-name>.</mixed-citation></ref>
<ref id="b27"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hooper</surname>, <given-names>D.</given-names></name>, <name><surname>Coughlan</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Mullen</surname>, <given-names>M. R.</given-names></name></person-group> (<year>2008</year>). <article-title>Structural equation modelling: Guidelines for determining model fit.</article-title> <source>Electronic Journal of Business Research Methods</source>, <volume>6</volume>(<issue>1</issue>), <fpage>53</fpage>&#8211;<lpage>60</lpage>.<issn>1477-7029</issn></mixed-citation></ref>
<ref id="b28"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hu</surname>, <given-names>L. T.</given-names></name>, &#x26; <name><surname>Bentler</surname>, <given-names>P. M.</given-names></name></person-group> (<year>1999</year>). <article-title>Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives.</article-title> <source>Structural Equation Modeling</source>, <volume>6</volume>(<issue>1</issue>), <fpage>1</fpage>&#8211;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1080/10705519909540118</pub-id><issn>1070-5511</issn></mixed-citation></ref>
<ref id="b24"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><collab>ISO</collab></person-group>. DIS 9241-11: Ergonomic requirements for office work with visual display terminals (VDTs). The International organization for standardization, 45; <year>2000</year>.</mixed-citation></ref>
<ref id="b5"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Jeong</surname>, <given-names>S.</given-names></name>, <name><surname>Jung</surname>, <given-names>E. S.</given-names></name>, &#x26; <name><surname>Im</surname>, <given-names>Y.</given-names></name></person-group> (<year>2016</year>). <article-title>Ergonomic evaluation of interaction techniques and 3D menus for the practical design of 3D stereoscopic displays.</article-title> <source>International Journal of Industrial Ergonomics</source>, <volume>53</volume>, <fpage>205</fpage>&#8211;<lpage>218</lpage>. <pub-id pub-id-type="doi">10.1016/j.ergon.2016.01.001</pub-id><issn>0169-8141</issn></mixed-citation></ref>
<ref id="b36"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Kim</surname>, <given-names>J-H.</given-names></name>, <name><surname>Son</surname>, <given-names>H-J.</given-names></name>, <name><surname>Lee</surname>, <given-names>S-J.</given-names></name>, <name><surname>Yun</surname>, <given-names>D-Y.</given-names></name>, <name><surname>Kwon</surname>, <given-names>S-C.</given-names></name>, <name><surname>Lee</surname>, <given-names>S-H.</given-names></name></person-group> <article-title>Effectiveness of virtual reality head-mounted display system-based developmental eye movement test.</article-title> Journal of Eye Movement Research. <year>2016</year>;9(6):4, 1-14.&#160;</mixed-citation></ref>
<ref id="b25"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lai</surname>, <given-names>M.-L.</given-names></name>, <name><surname>Tsai</surname>, <given-names>M.-J.</given-names></name>, <name><surname>Yang</surname>, <given-names>F.-Y.</given-names></name>, <name><surname>Hsu</surname>, <given-names>C.-Y.</given-names></name>, <name><surname>Liu</surname>, <given-names>T.-C.</given-names></name>, <name><surname>Lee</surname>, <given-names>S. W.-Y.</given-names></name>, <name><surname>Lee</surname>, <given-names>M.-H.</given-names></name>, <name><surname>Chiou</surname>, <given-names>G.-L.</given-names></name>, <name><surname>Liang</surname>, <given-names>J.-C.</given-names></name>, &#x26; <name><surname>Tsai</surname>, <given-names>C.-C.</given-names></name></person-group> (<year>2013</year>). <article-title>A review of using eye-tracking technology in exploring learning from 2000 to 2012.</article-title> <source>Educational Research Review</source>, <volume>10</volume>, <fpage>90</fpage>&#8211;<lpage>115</lpage>. <pub-id pub-id-type="doi">10.1016/j.edurev.2013.10.001</pub-id><issn>1747-938X</issn></mixed-citation></ref>
<ref id="b19"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lin</surname>, <given-names>C. J.</given-names></name>, <name><surname>Chang</surname>, <given-names>C.-C.</given-names></name>, &#x26; <name><surname>Lee</surname>, <given-names>Y.-H.</given-names></name></person-group> (<year>2014</year>). <article-title>Evaluating camouflage design using eye movement data.</article-title> <source>Applied Ergonomics</source>, <volume>45</volume>(<issue>3</issue>), <fpage>714</fpage>&#8211;<lpage>723</lpage>. <pub-id pub-id-type="doi">10.1016/j.apergo.2013.09.012</pub-id><pub-id pub-id-type="pmid">24139724</pub-id><issn>0003-6870</issn></mixed-citation></ref>
<ref id="b26"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lin</surname>, <given-names>C. J.</given-names></name>, <name><surname>Ho</surname>, <given-names>S.-H.</given-names></name>, &#x26; <name><surname>Chen</surname>, <given-names>Y.-J.</given-names></name></person-group> (<year>2015</year>). <article-title>An investigation of pointing postures in a 3D stereoscopic environment.</article-title> <source>Applied Ergonomics</source>, <volume>48</volume>, <fpage>154</fpage>&#8211;<lpage>163</lpage>. <pub-id pub-id-type="doi">10.1016/j.apergo.2014.12.001</pub-id><pub-id pub-id-type="pmid">25683543</pub-id><issn>0003-6870</issn></mixed-citation></ref>
<ref id="b8"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lin</surname>, <given-names>C. J.</given-names></name>, &#x26; <name><surname>Widyaningrum</surname>, <given-names>R.</given-names></name></person-group> (<year>2016</year>). <article-title>Eye pointing in stereoscopic displays.</article-title> <source>Journal of Eye Movement Research</source>, <volume>9</volume>(<issue>5</issue>), <fpage>4</fpage>.<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b4"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lin</surname>, <given-names>C. J.</given-names></name>, <name><surname>Cheng</surname>, <given-names>C.-F.</given-names></name>, <name><surname>Chen</surname>, <given-names>H.-J.</given-names></name>, &#x26; <name><surname>Wu</surname>, <given-names>K.-Y.</given-names></name></person-group> (<year>2017</year>, <month>April</month>). <article-title>Training Performance of Laparoscopic Surgery in Two- and Three-Dimensional Displays.</article-title> <source>Surgical Innovation</source>, <volume>24</volume>(<issue>2</issue>), <fpage>162</fpage>&#8211;<lpage>170</lpage>. <pub-id pub-id-type="doi">10.1177/1553350617692638</pub-id><pub-id pub-id-type="pmid">28190372</pub-id><issn>1553-3506</issn></mixed-citation></ref>
<ref id="b7"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lin</surname>, <given-names>C. J.</given-names></name>, &#x26; <name><surname>Woldegiorgis</surname>, <given-names>B. H.</given-names></name></person-group> (<year>2018</year>). <article-title>Kinematic analysis of direct pointing in projection-based stereoscopic environments.</article-title> <source>Human Movement Science</source>, <volume>57</volume>, <fpage>21</fpage>&#8211;<lpage>31</lpage>. <pub-id pub-id-type="doi">10.1016/j.humov.2017.11.002</pub-id><pub-id pub-id-type="pmid">29132076</pub-id><issn>0167-9457</issn></mixed-citation></ref>
<ref id="b9"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lin</surname>, <given-names>C. J.</given-names></name>, &#x26; <name><surname>Widyaningrum</surname>, <given-names>R.</given-names></name></person-group> (<year>2018</year>). <article-title>The effect of parallax on eye fixation parameter in projection-based stereoscopic displays.</article-title> <source>Applied Ergonomics</source>, <volume>69</volume>, <fpage>10</fpage>&#8211;<lpage>16</lpage>. <pub-id pub-id-type="doi">10.1016/j.apergo.2017.12.020</pub-id><pub-id pub-id-type="pmid">29477316</pub-id><issn>0003-6870</issn></mixed-citation></ref>
<ref id="b38"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ma</surname>, <given-names>M.-Y.</given-names></name>, &#x26; <name><surname>Chuang</surname>, <given-names>H.-C.</given-names></name></person-group> (<year>2005</year>). <article-title>A legibility study of Chinese character complicacy and eye movement data.</article-title> <source>Perceptual and Motor Skills</source>, <volume>120</volume>(<issue>1</issue>), <fpage>232</fpage>&#8211;<lpage>246</lpage>.<issn>0031-5125</issn></mixed-citation></ref>
<ref id="b12"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ma</surname>, <given-names>Q.</given-names></name>, <name><surname>Chan</surname>, <given-names>A. H.</given-names></name>, &#x26; <name><surname>Chen</surname>, <given-names>K.</given-names></name></person-group> (<year>2016</year>). <article-title>Personal and other factors affecting acceptance of smartphone technology by older Chinese adults.</article-title> <source>Applied Ergonomics</source>, <volume>54</volume>, <fpage>62</fpage>&#8211;<lpage>71</lpage>. <pub-id pub-id-type="doi">10.1016/j.apergo.2015.11.015</pub-id><pub-id pub-id-type="pmid">26851465</pub-id><issn>0003-6870</issn></mixed-citation></ref>
<ref id="b3"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Naour</surname>, <given-names>T. L.</given-names></name>, &#x26; <name><surname>Bresciani</surname>, <given-names>J.-P.</given-names></name></person-group> (<year>2017</year>). <article-title>A skeleton-based approach to analyzing oculomotor behavior when viewing animated characters.</article-title> <source>Journal of Eye Movement Research</source>, <volume>10</volume>(<issue>5</issue>), <fpage>7</fpage>.<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b31"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Nevitt</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Hancock</surname>, <given-names>G.</given-names></name></person-group> (<year>2001</year>, <month>January</month>). <article-title>Performance of Bootstrapping Approaches to Model Test Statistics and Parameter Standard Error Estimation in Structural Equation Modeling.</article-title> <source>Structural Equation Modeling</source>, <volume>8</volume>(<issue>3</issue>), <fpage>353</fpage>&#8211;<lpage>377</lpage>. <pub-id pub-id-type="doi">10.1207/S15328007SEM0803_2</pub-id><issn>1070-5511</issn></mixed-citation></ref>
<ref id="b40"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Nystr&#246;m</surname>, <given-names>M.</given-names></name>, <name><surname>Hooge</surname>, <given-names>I.</given-names></name>, &#x26; <name><surname>Andersson</surname>, <given-names>R.</given-names></name></person-group> (<year>2016</year>). <article-title>Pupil size influences the eye-tracker signal during saccades.</article-title> <source>Vision Research</source>, <volume>121</volume>, <fpage>95</fpage>&#8211;<lpage>103</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2016.01.009</pub-id><pub-id pub-id-type="pmid">26940030</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b18"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ooms</surname>, <given-names>K.</given-names></name>, <name><surname>Dupont</surname>, <given-names>L.</given-names></name>, <name><surname>Lapon</surname>, <given-names>L.</given-names></name>, &#x26; <name><surname>Popelka</surname>, <given-names>S.</given-names></name></person-group> (<year>2015</year>). <article-title>Accuracy and precision of fixation locations recorded with the low-cost Eye Tribe tracker in different experimental set-ups.</article-title> <source>Journal of Eye Movement Research</source>, <volume>8</volume>(<issue>1</issue>), <fpage>1</fpage>&#8211;<lpage>24</lpage>.<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b14"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Pannasch</surname>, <given-names>S.</given-names></name>, <name><surname>Helmert</surname>, <given-names>J. R.</given-names></name>, <name><surname>Roth</surname>, <given-names>K.</given-names></name>, <name><surname>Herbold</surname>, <given-names>A.-K.</given-names></name>, &#x26; <name><surname>Walter</surname>, <given-names>H.</given-names></name></person-group> (<year>2008</year>). <article-title>Visual fixation durations and saccade amplitudes: Shifting relationship in a variety of conditions.</article-title> <source>Journal of Eye Movement Research</source>, <volume>2</volume>(<issue>2</issue>), <fpage>4</fpage>.<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b23"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Park</surname>, <given-names>K. S.</given-names></name>, <name><surname>Hong</surname>, <given-names>G. B.</given-names></name>, &#x26; <name><surname>Lee</surname>, <given-names>S.</given-names></name></person-group> (<year>2012</year>). <article-title>Fatigue problems in remote pointing and the use of an upper-arm support.</article-title> <source>International Journal of Industrial Ergonomics</source>, <volume>42</volume>(<issue>3</issue>), <fpage>293</fpage>&#8211;<lpage>303</lpage>. <pub-id pub-id-type="doi">10.1016/j.ergon.2012.02.005</pub-id><issn>0169-8141</issn></mixed-citation></ref>
<ref id="b6"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Patterson</surname>, <given-names>R. E.</given-names></name></person-group> (<year>2016</year>). <source>Human factors of stereoscopic 3D displays</source>. <publisher-name>Springer</publisher-name>.</mixed-citation></ref>
<ref id="b11"><mixed-citation publication-type="book-chapter" specific-use="linked"><person-group person-group-type="author"><name><surname>Rodrigues</surname>, <given-names>P.</given-names></name>, &#x26; <name><surname>Rosa</surname>, <given-names>P. J.</given-names></name></person-group> (<year>2017</year>). <chapter-title>Eye-tracking as a research methodology in educational context: A spanning framework.</chapter-title>&#160;In C. Was, F. Sansosti, &#x26; B. Morris (Eds.), <source>Eye Tracking Technology Applications in Educational Research</source> (pp. <fpage>1</fpage>&#8211;<lpage>26</lpage>). <publisher-name>IGI Global</publisher-name>. <pub-id pub-id-type="doi">10.4018/978-1-5225-1005-5.ch001</pub-id></mixed-citation></ref>
<ref id="b35"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Schoonard</surname>, <given-names>J. W.</given-names></name>, <name><surname>Gould</surname>, <given-names>J. D.</given-names></name>, &#x26; <name><surname>Miller</surname>, <given-names>L. A.</given-names></name></person-group> (<year>1973</year>). <article-title>Studies of visual inspection.</article-title> <source>Ergonomics</source>, <volume>16</volume>(<issue>4</issue>), <fpage>365</fpage>&#8211;<lpage>379</lpage>. <pub-id pub-id-type="doi">10.1080/00140137308924528</pub-id><issn>0014-0139</issn></mixed-citation></ref>
<ref id="b37"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Sharples</surname>, <given-names>S.</given-names></name>, <name><surname>Cobb</surname>, <given-names>S.</given-names></name>, <name><surname>Moody</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Wilson</surname>, <given-names>J. R.</given-names></name></person-group> (<year>2008</year>). <article-title>Virtual reality induced symptoms and effects (VRISE): Comparison of head mounted display (HMD), desktop and projection display systems.</article-title> <source>Displays</source>, <volume>29</volume>(<issue>2</issue>), <fpage>58</fpage>&#8211;<lpage>69</lpage>. <pub-id pub-id-type="doi">10.1016/j.displa.2007.09.005</pub-id><issn>0141-9382</issn></mixed-citation></ref>
<ref id="b30"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Steiger</surname>, <given-names>J. H.</given-names></name></person-group> (<year>2007</year>). <article-title>Understanding the limitations of global fit assessment in structural equation modeling.</article-title> <source>Personality and Individual Differences</source>, <volume>42</volume>(<issue>5</issue>), <fpage>893</fpage>&#8211;<lpage>898</lpage>. <pub-id pub-id-type="doi">10.1016/j.paid.2006.09.017</pub-id><issn>0191-8869</issn></mixed-citation></ref>
<ref id="b22"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Togami</surname>, <given-names>H.</given-names></name></person-group> (<year>1984</year>). <article-title>Affects on visual search performance of individual differences in fixation time and number of fixations.</article-title> <source>Ergonomics</source>, <volume>27</volume>(<issue>7</issue>), <fpage>789</fpage>&#8211;<lpage>799</lpage>. <pub-id pub-id-type="doi">10.1080/00140138408963552</pub-id><issn>0014-0139</issn></mixed-citation></ref>
<ref id="b13"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Unema</surname>, <given-names>P. J. A.</given-names></name>, <name><surname>Pannasch</surname>, <given-names>S.</given-names></name>, <name><surname>Joos</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Velichkovsky</surname>, <given-names>B. M.</given-names></name></person-group> (<year>2005</year>). <article-title>Time course of information processing during scene perception: The relationship between saccade amplitude and fixation duration.</article-title> <source>Visual Cognition</source>, <volume>12</volume>(<issue>3</issue>), <fpage>473</fpage>&#8211;<lpage>494</lpage>. <pub-id pub-id-type="doi">10.1080/13506280444000409</pub-id><issn>1350-6285</issn></mixed-citation></ref>
<ref id="b20"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Walshe</surname>, <given-names>R. C.</given-names></name>, &#x26; <name><surname>Nuthmann</surname>, <given-names>A.</given-names></name></person-group> (<year>2014</year>). <article-title>Asymmetrical control of fixation durations in scene viewing.</article-title> <source>Vision Research</source>, <volume>100</volume>, <fpage>38</fpage>&#8211;<lpage>46</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2014.03.012</pub-id><pub-id pub-id-type="pmid">24726565</pub-id><issn>0042-6989</issn></mixed-citation></ref>
</ref-list>
</back>
</article>
