<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.11.4.3</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Using Eye-Movement Events to Determine the Mental Workload of Surgical Residents</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Menekse Dalveren</surname>
						<given-names>Gonca Gokce</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Cagiltay</surname>
						<given-names>Nergiz Ercil</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>				
        <aff id="aff1">
		<institution>Atilim University, Faculty of Engineering, Department of Software Engineering, Ankara</institution>,   <country>Turkey</country>
        </aff>
		</contrib-group>   

		
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>24</day>  
		<month>8</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2018</year>
	</pub-date>
      <volume>11</volume>
      <issue>4</issue>
	 <elocation-id>10.16910/jemr.11.4.3</elocation-id> 
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Menekse Dalveren, G. G., Cagiltay, N. E.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>Eye-tracking is a promising technology used in fields as
    diverse as aviation, the arts, sports, psychology, and driving.
    Although it has been applied for health purposes, eye-tracking
    studies remain rare in the field of endo-neurosurgery. This study
    uses the technology to improve our understanding of how
    computer-based instructional materials affect the mental workload of
    endo-neurosurgery residents. Four computer-based simulation
    scenarios were developed based on the skill-development requirements
    of endo-neurosurgery residents: two were designed as general models
    and the other two as simulated surgical models. During these surgery
    procedures, in real settings, surgical residents need to use both
    hands simultaneously to control the endoscope and the operational
    tool in a coordinated fashion. Therefore, to shed light on the
    participants’ behaviors, the scenarios were performed with the
    dominant hand, the non-dominant hand and, finally, both hands using
    haptic interfaces. Twenty-three residents volunteered for this
    study, and their eye movements were recorded while they performed
    the scenarios. The results show an increase in the participants’
    mental workload when performing the simulated surgical models
    compared with the other scenarios. It can therefore be concluded
    that the eye movements of surgical residents can provide insight
    into the anticipated difficulty of skill-based tasks. This
    information may be critical for properly designing and organizing
    instructional materials for endo-neurosurgery, and for better
    guiding and evaluating the progress of trainees in computer
    simulation-based skill-training environments.</p>
      </abstract>
      <kwd-group>
        <kwd>eye-tracking</kwd>
        <kwd>eye-movement events</kwd>
        <kwd>mental workload</kwd>
        <kwd>task difficulty</kwd>
        <kwd>surgical virtual environment</kwd>
        <kwd>endo-neurosurgery</kwd>		
      </kwd-group>
    </article-meta>
  </front>	
  <body>

    <sec id="S1">
      <title>Introduction</title>

  <p>Technology-enhanced educational environments provide several
  benefits for improving surgical education programs. For instance,
  simulation is one of the technologies that allows trainees to perform
  clinical activities interactively by recreating such operations in a
  computer-based system without exposing patients to the associated
  risks (<xref ref-type="bibr" rid="b27 b34">27, 34</xref>). 
  However, there is still a need for research to develop
  strategies for improving the curriculum integration of these systems
  and for creating standardized approaches. In this respect, mental
  workload theory and eye-tracking technology are two important
  concepts that can be implemented in surgical education programs.</p>
  
  <p>The mental workload concept has long been accepted as an essential
  aspect of individual performance within complex systems (<xref ref-type="bibr" rid="b51">51</xref>). It is reported that mental workload can change the
  performance of individuals (<xref ref-type="bibr" rid="b52">52</xref>) and further affect the competence of the whole system
  (<xref ref-type="bibr" rid="b51">51</xref>). Accordingly, system developers need
  models to assess the mental workload imposed on individuals at
  an early stage so that alternative system designs can be appraised
  (<xref ref-type="bibr" rid="b51">51</xref>). At the same time, mental workload can
  negatively affect performance and increase the probability of errors
  (<xref ref-type="bibr" rid="b52">52</xref>), and researchers have spent a great deal of
  effort developing measures and probes of mental workload (<xref ref-type="bibr" rid="b1">1</xref>). Supportively, Moray (<xref ref-type="bibr" rid="b33">33</xref>) stated that
  adjusting the allocation of mental workload could reduce human errors,
  improve system safety, and increase productivity. In earlier studies,
  three types of mental workload have been defined: intrinsic load,
  extraneous or ineffective load, and germane or effective load
  (<xref ref-type="bibr" rid="b44">44</xref>). Intrinsic load arises from the
  interaction between the nature of the material being learned and the
  expertise of the learners (<xref ref-type="bibr" rid="b37 b44">37, 44</xref>). Extraneous load results mainly from
  poorly designed instruction, and germane load is related to processes
  that contribute to the construction and automation of schemas (<xref ref-type="bibr" rid="b37">37</xref>).</p>
  
  <p>Eye-tracking provides a valuable source of information, and events
  such as fixations, blinks, and pupil diameter can be used to assess
  mental workload (<xref ref-type="bibr" rid="b47">47</xref>). Accordingly, several studies have been conducted on the
  assessment of mental workload using eye-tracking technology
  (<xref ref-type="bibr" rid="b31">31</xref>). A precise evaluation of
  mental workload is essential for developing systems that manage
  user attention (<xref ref-type="bibr" rid="b3 b14 b21">3, 14, 21</xref>). Researchers have used eye-movement events found to
  correlate with cognitive demands (<xref ref-type="bibr" rid="b1">1</xref>).
  For instance, Benedetto et al. (<xref ref-type="bibr" rid="b5">5</xref>) examined the changes in blink
  duration and blink rate in a simple driving task and stated that blink
  events reflect the effects of visual workload. Another study evaluated
  mental workload by developing combined measures based on various
  physiological indices (<xref ref-type="bibr" rid="b41">41</xref>). To determine mental
  workload, three physiological signals were recorded: alpha
  rhythm, eye blink interval, and heart rate variability (<xref ref-type="bibr" rid="b41">41</xref>). The study of de Greef, Lafeber, van Oostendorp, and
  Lindenberg (<xref ref-type="bibr" rid="b15">15</xref>) describes an approach for the objective assessment of
  mental workload by analyzing differences in pupil diameter and
  several aspects of eye movement under different levels of mental
  workload. Eye-movement events are also used in medicine for diagnosis,
  treatment and training purposes (<xref ref-type="bibr" rid="b22">22</xref>) and for clinical applications such as Alzheimer’s disease (<xref ref-type="bibr" rid="b12">12</xref>), HIV-1-infected patients with eye-movement dysfunction
  (<xref ref-type="bibr" rid="b42">42</xref>), and schizophrenia
  (<xref ref-type="bibr" rid="b18">18</xref>). Studies show that
  these events provide crucial information about how users interact with
  complex visual displays (<xref ref-type="bibr" rid="b29">29</xref>). Radiology and
  visual search (<xref ref-type="bibr" rid="b36">36</xref>) and laparoscopic surgery
  training (<xref ref-type="bibr" rid="b25 b46">25, 46</xref>) are among the areas of medicine where the
  eye-tracking approach has been adopted. For example, in
  the study by Zheng, Jiang, and Atkins (<xref ref-type="bibr" rid="b53">53</xref>),
  participants performed a simulated laparoscopic procedure, and when
  task difficulty increased, task completion time and pupil size
  increased as well.</p>
  
  <p>Previous studies mostly examined pupil size changes, but other
  eye-movement events, such as fixations, can also be informative for
  understanding mental workload. A fixation occurs when the eyes are
  nearly still in order to assemble necessary information. Accordingly,
  in this study the fixation number and fixation duration events are
  used to assess the mental workload imposed by different scenarios. As
  changes in eye-movement events such as fixation number and fixation
  duration accompany changes in mental workload driven by the nature of
  the scenarios (<xref ref-type="bibr" rid="b47">47</xref>), understanding the surgical resident’s mental workload while
  performing surgical operations is crucial for assessing task
  difficulties (<xref ref-type="bibr" rid="b2">2</xref>).
  Just and Carpenter (<xref ref-type="bibr" rid="b23">23</xref>) stated that longer fixation durations are related to
  difficulty in interpreting the presented information or to greater
  involvement in its exploration. Accordingly, it was found that more
  complex problems result in higher fixation numbers and longer fixation
  durations (<xref ref-type="bibr" rid="b4 b31 b38">4, 31, 38</xref>). Other studies stated that fixation duration may
  be related to mental workload: when mental workload increases,
  longer fixation durations for observation occur (<xref ref-type="bibr" rid="b6 b19 b49 b50">6, 19, 49, 50</xref>). Hence, this
  study attempts to understand changes in the participants’ mental
  workload through their eye-movement events, namely fixation number
  and fixation duration, while they perform tasks of different
  difficulty levels in four surgical scenarios. The
  scenarios were developed at different fidelity levels (high and
  low), which are expected to affect the participants’ mental workload.
  Additionally, in each scenario, the effect of hand condition
  on mental workload is also investigated. Hence, in this study
  it is hypothesized that, because of the changes in mental workload
  under these conditions (different hand conditions, fidelity levels and
  task difficulties of the scenarios), the eye-tracking data will display
  different patterns. The authors believe that this information is
  critical for better understanding the mental workload of the
  participants in these situations, and that it provides insights
  for instructional system designers to better order and adapt
  related computer-based simulation technologies according to the skill
  levels and progress of the trainees.</p>
</sec>

<sec id="S2">
  <title>Methods </title>

  <p>In this experimental study, 23 surgical residents performed the
  tasks assigned in four different computer-based simulation scenarios
  with their dominant hand, non-dominant hand and both hands. During
  this process, their eye-movement data were recorded by an eye-tracker.
  The results were analyzed using statistical methods to better
  understand the surgical residents’ behaviors in these different
  simulation scenarios.</p>

  <sec id="S2a">
    <title>Participants </title>

    <p>Twenty-three volunteer surgical residents from the Neurosurgery
    and Ear-Nose-Throat (ENT) surgery departments of a medical school
    participated in this study. The majority of the participants
    were male (87.0%) and did not wear eyeglasses (73.9%).</p>
  </sec>

  <sec id="S2b">
    <title>Apparatus </title>

    <p>The eye-movement data of the surgical residents were recorded
    with an eye-tracker device while the scenarios were performed under
    different hand conditions with haptic devices. The data was recorded
    by The Eye Tribe (<xref ref-type="bibr" rid="b17">17</xref>) eye-tracker at 60
    Hz with a screen resolution of 1920×1080 pixels. The Eye Tribe is a
    Danish start-up company that produces eye-tracking technology and
    offers the product to software developers to be incorporated into
    different applications and programs. The company focuses on a sleek
    appearance and a portable structure. The Eye Tribe Eye Tracker is an
    affordable device, making it an accessible tool
    for research. According to Coyne and Sibley (<xref ref-type="bibr" rid="b11">11</xref>), the Eye Tribe
    system results are quite promising for human factors researchers.
    Dalmaijer (<xref ref-type="bibr" rid="b13">13</xref>) stated that researchers on a budget can use the Eye
    Tribe tracker for the evaluation of fixation events and pupil
    size.</p>

    <p>Since haptic devices enable participants to perform movements in
    the simulated environments, the tasks were performed with the
    Geomagic Touch (<xref ref-type="bibr" rid="b45">45</xref>), a mid-range professional haptic
    device from 3D Systems that provides real 3D navigation and force
    feedback.</p>
  </sec>

  <sec id="S2c">
    <title>Scenarios </title>    

    <p>Four scenarios were developed for the collection of surgical
    residents’ eye-movement data. These scenarios were implemented using
    the Unity platform and the C# programming language. The scenarios
    were performed with the dominant hand, then with the non-dominant
    hand and, finally, with both hands within a given fixed period of
    time. To provide more objectivity, 12 of the participants started
    the tasks with their dominant hand, and the remaining
    participants started with their non-dominant hand. Increasing 3D
    depth perception, using the surgical instruments efficiently,
    quickly following up objects, and improving the ability to plan and
    strategize were the learning outcomes of these scenarios.
    Accordingly, different tasks were defined in each scenario to reach,
    move and control objects in 3D environments simulating real surgical
    conditions. Current development technologies allow the recreation of
    real-life operations with adequate fidelity, thus profoundly
    improving the training environment (<xref ref-type="bibr" rid="b34">34</xref>).
    Accordingly, in this study two of the scenarios were simulated
    surgical models and can be considered higher-fidelity; the other
    two were based on general models and can be considered
    lower-fidelity. The high-fidelity scenarios simulated a human
    nose with the view of a real surgical operation and realistic skin
    texture. Also, the tasks performed in the high-fidelity scenarios
    were more complex than those in the low-fidelity scenarios. In
    addition, it is critical for surgical residents to improve their
    hand skills: in real operations they have to use both hands
    simultaneously. Accordingly, the simulated surgical tasks in this
    study were performed under different hand conditions (dominant,
    non-dominant and both) to represent different complexity levels of
    the tasks. Hence, since intrinsic load is defined as the mental load
    caused by the internal complexity of the learning materials
    (<xref ref-type="bibr" rid="b43">43</xref>), it is expected
    to increase in the scenarios with higher complexity levels.</p>

    <p>In Scenario-1, the task is to catch the red ball (Figure 1:
    A) that appears at random places in a room with a surgical tool.
    After catching the red ball, the aim is to place it on the cube,
    which also appears at random places (Figure 1: B). This scenario is
    a general simulation model aimed at developing the ability to use
    the surgical instrument and depth perception, and the process
    has to be completed 10 times within a given fixed period of time.</p>

    <fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1.</label>
					<caption>
						<p>Scenario-1</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-11-04-c-figure-01.png"/>
				</fig>

    <p>In Scenario-2, the task is to remove tumor-like objects, using a
    surgical tool, from a model designed based on the inside of a human
    nose within a given fixed period of time. These tumor objects are
    located in 10 different places (Figure 2: A &#x26; B). This
    scenario is a simulated surgical model, which makes it possible
    for surgical residents to feel as if they are in a surgical setting.
    Surgical residents can move the endoscopic device through the nose
    using the haptic device and feel the tissue, as the device gives
    force feedback upon collision with any surface. By using the
    surgical tool in the most accurate way, participants are expected to
    complete the operation by carefully removing the tumors from their
    locations.</p>

    <fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2.</label>
					<caption>
						<p>Scenario-2</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-11-04-c-figure-02.png"/>
				</fig>

    <p>In Scenario-3, the aim is to approach the red ball at the
    correct angle and explode it within a given fixed period of time.
    The ball appears 10 times in different cubes at random (Figure 3: A &#x26;
    B). If the correct angle is achieved, the ball explodes;
    otherwise it does not. This scenario aims to develop depth
    perception and improve the ability to approach a certain point at
    the correct angle. It is a simulation of a general model.</p>

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3.</label>
					<caption>
						<p>Scenario-3</p>
					</caption>
					<graphic id="graph03" xlink:href="jemr-11-04-c-figure-03.png"/>
				</fig>

    <p>In Scenario-4, surgical residents are expected to move the ball
    over a certain path in the nose model by approaching it at the
    correct angle within a given fixed period of time (Figure 4: A &#x26; B).
    This scenario is a simulated surgical model designed like the inside
    of a human nose with a similar texture, simulating a surgical
    resident’s field of vision during an actual operation.</p>

<fig id="fig04" fig-type="figure" position="float">
					<label>Figure 4.</label>
					<caption>
						<p>Scenario-4</p>
					</caption>
					<graphic id="graph04" xlink:href="jemr-11-04-c-figure-04.png"/>
				</fig>
  </sec>

  <sec id="S2d">
    <title>Procedure </title>

    <p>First, instructions describing the procedure were given
    individually, and the personal information of the participants was
    recorded. Volunteers were seated and centered in front of the
    monitor at a distance of 70 cm, and 9 calibration points were
    presented for calibration of the eye-tracker. The scenarios were
    performed in the order 1, 3, 2 and 4. Twelve randomly selected
    participants performed the scenarios first with their dominant hand,
    and the other group started with their non-dominant hand.
    Afterwards, they performed the tasks with both hands; under this
    condition they used the operational tool with their dominant hand
    and the camera tool with their non-dominant hand to light up the
    operation area. The recorded raw eye data were classified into
    fixation number and fixation duration using an open-source
    eye-movement classification algorithm, the Binocular-Individual
    Threshold (BIT). The BIT algorithm (<xref ref-type="bibr" rid="b48">48</xref>) is a velocity-based algorithm,
    implemented in MATLAB, that classifies fixations from the data with
    individual-specific thresholds. To verify fixations, the algorithm
    uses the velocity thresholds of both eyes. BIT is a
    parameter-free fixation-identification algorithm that automatically
    identifies task- and individual-specific velocity thresholds by
    optimally exploiting the statistical properties of the eye-movement
    data across different eyes and directions of eye movements (<xref ref-type="bibr" rid="b48">48</xref>). The BIT algorithm has advantages over
    existing algorithms in that it considers binocular viewing and uses
    the information about fixations and the co-variation between the
    movements of both eyes to identify saccades; it estimates rather
    than pre-sets the velocity threshold used to identify fixations and
    saccades, and it permits the threshold to vary between eye-movement
    directions, tasks and individuals. Also, records exceeding the
    threshold value that merely reflect the spontaneous stochasticity of
    eye movements are not labeled as saccades (<xref ref-type="bibr" rid="b48">48</xref>). Another important feature is that the BIT algorithm is
    independent of the eye-tracker and its sampling frequency, and can
    be easily adapted to data from eye-trackers with different
    sensitivities and sampling frequencies (<xref ref-type="bibr" rid="b48">48</xref>). For
    the evaluation of differences based on scenario difficulties, the
    fixation number and fixation duration event values were
    analyzed.</p>
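    <p>The BIT implementation itself is published in MATLAB, but the core
    velocity-threshold idea can be illustrated with a short sketch. The
    following Python code is a deliberately simplified, monocular
    illustration, not the BIT algorithm: it estimates a data-driven speed
    threshold from the recording itself and groups consecutive slow
    samples into fixation events, returning the fixation number and the
    per-fixation durations. The function name and the
    mean-plus-three-standard-deviations threshold rule are illustrative
    assumptions.</p>
    <preformat>
```python
import numpy as np

def classify_fixations(x, y, t):
    """Toy velocity-threshold fixation classifier (illustrative only;
    a monocular simplification of the idea behind BIT, which instead
    estimates binocular, individual-specific thresholds).

    x, y: gaze coordinates per sample; t: timestamps in seconds.
    Returns (fixation_number, list of (onset, duration) tuples).
    """
    # point-to-point gaze speed between successive samples
    v = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    # data-driven threshold (assumed rule: mean + 3 SD), echoing
    # BIT's principle of estimating rather than pre-setting it
    thresh = v.mean() + 3.0 * v.std()
    is_fix = thresh >= v          # slow samples count toward fixations
    fixations, start = [], None
    for i, slow in enumerate(is_fix):
        if slow and start is None:
            start = i             # fixation onset
        elif not slow and start is not None:
            fixations.append((t[start], t[i] - t[start]))
            start = None
    if start is not None:         # close a fixation running to the end
        fixations.append((t[start], t[len(is_fix)] - t[start]))
    return len(fixations), fixations
```
    </preformat>
    <p>On a 60 Hz recording such as the one used here, the fixation
    number and durations per trial would then feed directly into the
    per-scenario comparisons reported below.</p>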
  </sec>

  <sec id="S2e">
    <title>Measures </title>

    <p>Eye-tracking has been widely used to measure mental workload
    from eye-movement data in order to analyze the cognitive processes
    underlying visual behavior (<xref ref-type="bibr" rid="b47">47</xref>). Eye-tracking
    provides a valuable source of physiological data on the allocation
    of information-processing resources through ocular activity, which
    is closely linked to the underlying neural networks in the brain (<xref ref-type="bibr" rid="b7">7</xref>). To understand the mental workload of surgical
    residents in the previously described scenarios, specific
    eye-tracking measures were used, namely the fixation number and
    fixation duration events (<xref ref-type="bibr" rid="b48">48</xref>). A fixation is a slow
    event during which the eye movement is almost still, with small
    dispersion and velocity; in other words, it comprises the eye
    movements that occur while the gaze dwells on an object (<xref ref-type="bibr" rid="b24">24</xref>). Eye-movement classification algorithms can
    classify fixation events into fixation number and fixation
    duration. Sequences of eye fixations are basic components
    of eye movements in these fields for understanding visual
    behavior. Different algorithms have been proposed to identify
    fixations from the recordings of the point of regard (POR) that
    eye-tracking equipment provides (<xref ref-type="bibr" rid="b48">48</xref>).</p>
  </sec>
</sec>

<sec id="S3">
  <title>Results </title>

  <p>In all, 276 datasets (23 surgical residents × 4 scenarios × 3 hand
  conditions) were recorded, providing a substantial basis for the
  analyses. To evaluate and compare the differences among the
  difficulty levels of the scenarios and the effect of hand condition,
  the eye-movement events fixation number and fixation duration were
  analyzed.</p>
  
  <p>The analysis of the data was carried out with the SPSS 23 program
  at a 95% confidence level. The Friedman non-parametric test was used
  to observe the effect of the scenarios’ difficulty levels on the
  eye-movement events of the surgical residents. To compare the
  difficulty levels among the four scenarios under the dominant-,
  non-dominant- and both-hand conditions, post hoc analysis with
  Wilcoxon signed-rank tests was conducted with a Bonferroni correction
  applied, resulting in a significance level set at p &#x3C; 0.017.</p>
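  <p>The analysis pipeline described above can be sketched in a few
  lines. The following Python code uses SciPy rather than SPSS, and
  hypothetical random data in place of the study’s measurements; it runs
  a Friedman test across four scenarios and then the pairwise Wilcoxon
  signed-rank tests against the Bonferroni-corrected level of 0.017
  reported in this study.</p>
  <preformat>
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical stand-in data: fixation numbers for 23 residents
# in 4 scenarios (NOT the study's measurements)
fix = rng.normal(loc=[25.0, 56.0, 31.0, 32.0], scale=8.0, size=(23, 4))

# Friedman test: overall scenario effect on the repeated measures
chi2, p = stats.friedmanchisquare(*(fix[:, j] for j in range(4)))
print(f"Friedman: chi2 = {chi2:.2f}, p = {p:.4f}")

# post hoc pairwise Wilcoxon signed-rank tests, Bonferroni-corrected
alpha = 0.05 / 3
for a in range(4):
    for b in range(a + 1, 4):
        stat, p_ab = stats.wilcoxon(fix[:, a], fix[:, b])
        verdict = "significant" if alpha > p_ab else "n.s."
        print(f"Scenario-{a+1} vs Scenario-{b+1}: p = {p_ab:.4f} ({verdict})")
```
  </preformat>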

  <sec id="S3a">
    <title>Fixation Number </title>

    <p>A non-parametric Friedman test of differences among the repeated
    measures was conducted for the effect of scenario difficulty level
    on the fixation number. According to the results, the effect of the
    scenario on the fixation number was significant (all
    ps &#x3C; .05). With the hand condition fixed, the repeated
    measurements differ according to the scenarios. Based on
    the Friedman test for the different measurement groups, there is a
    statistically significant difference in fixation number across
    scenarios when using the dominant hand (χ<sup>2</sup> (3) = 37.08, p &#x3C; 0.05):
    Scenario-1 has the lowest mean rank for the
    fixation number (1.57), while Scenario-2 has the highest (3.78). When
    using the non-dominant hand (χ<sup>2</sup> (3) = 50.18, p &#x3C; 0.05),
    Scenario-1 has the lowest mean rank for the
    fixation number (1.26) while Scenario-2 has the highest (3.70).
    When using both hands
    (χ<sup>2</sup> (3) = 52.74, p &#x3C; 0.05),
    Scenario-1 has the lowest mean rank for the fixation number (1.07)
    while Scenario-2 has the highest mean rank (3.80).
    Across the three hand conditions, the scenario that produces the
    largest fixation number is reported in Figure 5; generally, the
    fixation number in Scenario-2 is larger than in the other
    scenarios.</p>

<fig id="fig05" fig-type="figure" position="float">
					<label>Figure 5.</label>
					<caption>
						<p>Fixation Number Differences Among Scenarios</p>
					</caption>
					<graphic id="graph05" xlink:href="jemr-11-04-c-figure-05.png"/>
				</fig>

    <p>Wilcoxon signed-rank tests were conducted to examine the
    difficulty-level differences between scenarios under the
    dominant-hand, non-dominant-hand and both-hands conditions with a
    Bonferroni correction (p &#x3C; 0.017). The mean and standard
    deviation values for each scenario under the dominant-hand,
    non-dominant-hand and both-hands conditions are given in Table 1.</p>

<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>Mean and standard deviation values for fixation number</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>
          <tr>
            <th>Scenario</th>
            <th colspan="2">Dominant Hand</th>
            <th colspan="2">Non-Dominant Hand</th>
            <th colspan="2">Both Hands</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Mean</td>
            <td>Std. Dev.</td>
            <td>Mean</td>
            <td>Std. Dev.</td>
            <td>Mean</td>
            <td>Std. Dev.</td>
          </tr>
          <tr>
            <td>1</td>
            <td>24.66</td>
            <td>4.84</td>
            <td>26.78</td>
            <td>4.11</td>
            <td>25.33</td>
            <td>5.29</td>
          </tr>
          <tr>
            <td>2</td>
            <td>55.87</td>
            <td>18.18</td>
            <td>61.29</td>
            <td>21.42</td>
            <td>64.40</td>
            <td>20.81</td>
          </tr>
          <tr>
            <td>3</td>
            <td>31.00</td>
            <td>9.93</td>
            <td>30.54</td>
            <td>7.13</td>
            <td>46.46</td>
            <td>13.48</td>
          </tr>
          <tr>
            <td>4</td>
            <td>32.36</td>
            <td>9.73</td>
            <td>46.81</td>
            <td>13.09</td>
            <td>45.60</td>
            <td>25.78</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>

    <p>According to the test results, there is a significant difference
    under the dominant hand condition between Scenario-1 and Scenario-2
    (Z = -4.14, p = 0.000), Scenario-1 and Scenario-3 (Z = -2.65, p =
    0.008), and Scenario-1 and Scenario-4 (Z = -3.10, p = 0.002).
    Similarly, there is a significant difference between Scenario-2 and
    Scenario-3 (Z = -4.05, p = 0.000) and between Scenario-2 and
    Scenario-4 (Z = -4.06, p = 0.000). However, the difference between
    Scenario-3 and Scenario-4 is not statistically significant (Z =
    -0.68, p = 0.497) under the dominant hand condition (Table 2).</p>

<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>Wilcoxon signed-rank test results (dominant hand)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>

          <tr>
            <th>Scenario</th>
            <th colspan="2">2</th>
            <th colspan="2">3</th>
            <th colspan="2">4</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
          </tr>
          <tr>
            <td>1</td>
            <td>-4.14</td>
            <td>0.000</td>
            <td>-2.65</td>
            <td>0.008</td>
            <td>-3.10</td>
            <td>0.002</td>
          </tr>
          <tr>
            <td>2</td>
            <td></td>
            <td></td>
            <td>-4.05</td>
            <td>0.000</td>
            <td>-4.06</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>3</td>
            <td></td>
            <td></td>
            <td></td>
            <td></td>
            <td>-0.68</td>
            <td>0.497</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>

    <p>According to the test results, under the non-dominant hand
    condition there is a significant difference between Scenario-1 and
    Scenario-2 (Z = -4.20, p = 0.000) and between Scenario-1 and
    Scenario-4 (Z = -4.13, p = 0.000). Similarly, there is a
    significant difference between Scenario-2 and Scenario-3 (Z =
    -4.17, p = 0.000), between Scenario-2 and Scenario-4 (Z = -2.71, p
    = 0.007), and between Scenario-3 and Scenario-4 (Z = -3.96, p =
    0.000). However, the difference between Scenario-1 and Scenario-3
    (Z = -2.28, p = 0.022) does not reach the Bonferroni-corrected
    significance level under the non-dominant hand condition (Table 3).</p>

<table-wrap id="t03" position="float">
					<label>Table 3.</label>
					<caption>
						<p>Wilcoxon signed-rank test results (non-dominant hand)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>

          <tr>
            <th>Scenario</th>
            <th colspan="2">2</th>
            <th colspan="2">3</th>
            <th colspan="2">4</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
          </tr>
          <tr>
            <td>1</td>
            <td>-4.20</td>
            <td>0.000</td>
            <td>-2.28</td>
            <td>0.022</td>
            <td>-4.13</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>2</td>
            <td></td>
            <td></td>
            <td>-4.17</td>
            <td>0.000</td>
            <td>-2.71</td>
            <td>0.007</td>
          </tr>
          <tr>
            <td>3</td>
            <td></td>
            <td></td>
            <td></td>
            <td></td>
            <td>-3.96</td>
            <td>0.000</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>

    <p>According to the test results, under the both hands condition
    there is a significant difference between Scenario-1 and Scenario-2
    (Z = -4.20, p = 0.000), Scenario-1 and Scenario-3 (Z = -4.21, p =
    0.000), and Scenario-1 and Scenario-4 (Z = -3.97, p = 0.000).
    Similarly, there is a significant difference between Scenario-2 and
    Scenario-3 (Z = -3.97, p = 0.000) and between Scenario-2 and
    Scenario-4 (Z = -3.45, p = 0.001). However, the difference between
    Scenario-3 and Scenario-4 is not statistically significant (Z =
    -0.71, p = 0.48) under the both hands condition (Table 4).</p>

<table-wrap id="t04" position="float">
					<label>Table 4.</label>
					<caption>
						<p>Wilcoxon signed-rank test results (both hands)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

        <thead>
          <tr>
            <th>Scenario</th>
            <th colspan="2">2</th>
            <th colspan="2">3</th>
            <th colspan="2">4</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
          </tr>
          <tr>
            <td>1</td>
            <td>-4.20</td>
            <td>0.000</td>
            <td>-4.21</td>
            <td>0.000</td>
            <td>-3.97</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>2</td>
            <td></td>
            <td></td>
            <td>-3.97</td>
            <td>0.000</td>
            <td>-3.45</td>
            <td>0.001</td>
          </tr>
          <tr>
            <td>3</td>
            <td></td>
            <td></td>
            <td></td>
            <td></td>
            <td>-0.71</td>
            <td>0.48</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>
  </sec>

  <sec id="S3b">
    <title>Fixation Duration </title>

    <p>A non-parametric Friedman test of differences among the repeated
    measures was conducted for the scenario effect on fixation duration
    (msec). According to the results, the effect of scenario on the
    fixation duration was significant (all ps &#x3C; .05). While the
    hand condition is fixed, the results of the repeated-measures
    analysis differ according to the scenarios. According to the
    Friedman test for the different measurement groups, there is a
    statistically significant difference in fixation duration between
    the scenarios when the dominant hand is used (&#x3C7;<sup>2</sup>(3)
    = 52.41, p &#x3C; 0.05). Scenario-1 has the lowest mean rank for
    the fixation duration (1.04), while Scenario-2 has the highest
    (3.70). When the non-dominant hand is used (&#x3C7;<sup>2</sup>(3)
    = 54.49, p &#x3C; 0.05), Scenario-1 has the lowest mean rank for
    the fixation duration (1.04), while Scenario-4 has the highest
    (3.52). In the both hands condition (&#x3C7;<sup>2</sup>(3) =
    65.56, p &#x3C; 0.05), Scenario-1 has the lowest mean rank for the
    fixation duration (1.00), while Scenario-2 has the highest (3.96).
    For each of the three hand conditions, the scenario that lengthens
    the fixation duration is reported in Figure 6. In Scenario-2 and
    Scenario-4 the fixation duration becomes longer compared to the
    other scenarios.</p>
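    <p>A Friedman test of this kind, together with per-scenario mean
    ranks such as those quoted above, can be sketched with SciPy; the
    duration matrix below (rows are residents, columns are Scenarios
    1&#8211;4) is a hypothetical placeholder, not the study&#8217;s
    measurements.</p>

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# Hypothetical fixation durations (msec) for 5 residents x 4 scenarios;
# rows are participants, columns are Scenarios 1-4 (placeholders only).
durations = np.array([
    [25000, 60000, 43000, 50000],
    [26000, 58000, 44000, 51000],
    [24500, 61000, 42000, 49500],
    [25500, 59000, 43500, 50500],
    [26500, 62000, 44500, 52000],
])

# Friedman test across the four repeated measures (one column per scenario).
chi2, p = friedmanchisquare(*durations.T)

# Mean rank of each scenario, ranking within each participant's row.
mean_ranks = rankdata(durations, axis=1).mean(axis=0)
print(f"chi2(3) = {chi2:.2f}, p = {p:.4f}, mean ranks = {mean_ranks}")
```

    <p>With a perfectly consistent ordering across participants, as in
    this toy matrix, the mean ranks directly indicate which scenario
    demanded the longest fixation durations.</p>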

<fig id="fig06" fig-type="figure" position="float">
					<label>Figure 6.</label>
					<caption>
						<p>Fixation Duration Differences Among Scenarios</p>
					</caption>
					<graphic id="graph06" xlink:href="jemr-11-04-c-figure-06.png"/>
				</fig>

    <p>Wilcoxon signed-rank tests were conducted to examine the
    difficulty levels between scenarios under the dominant hand,
    non-dominant hand, and both hands conditions with a Bonferroni
    correction (p &#x3C; 0.017). The mean and standard deviation values
    for each scenario under the dominant hand, non-dominant hand, and
    both hands conditions are given in Table 5.</p>
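    <p>The post-hoc procedure described here (pairwise Wilcoxon tests
    judged against a Bonferroni-corrected threshold of 0.05/3 &#x2248;
    0.017) can be sketched as below; the per-resident values are
    hypothetical placeholders, not the study&#8217;s data.</p>

```python
from itertools import combinations
from scipy.stats import wilcoxon

# Hypothetical per-resident fixation durations (msec) for Scenarios 1-4;
# illustrative placeholders only, not the study's measurements.
data = {
    1: [25000 + 100 * j for j in range(8)],
    2: [60000 + 110 * j for j in range(8)],
    3: [43000 + 120 * j for j in range(8)],
    4: [50000 + 130 * j for j in range(8)],
}

alpha = 0.05 / 3  # Bonferroni-corrected threshold (~0.017, as in the text)

results = {}
for a, b in combinations(data, 2):
    stat, p = wilcoxon(data[a], data[b])  # paired, two-sided by default
    results[(a, b)] = (stat, p, p < alpha)
    print(f"Scenario-{a} vs Scenario-{b}: W = {stat}, p = {p:.4f}")
```

    <p>Each of the six scenario pairs is tested on the same residents,
    and a pair is declared significant only when its p-value falls
    below the corrected threshold.</p>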

<table-wrap id="t05" position="float">
					<label>Table 5.</label>
					<caption>
						<p>Mean and Standard Deviation values for Fixation Duration</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>

          <tr>
            <th></th>
            <th colspan="2">Dominant Hand</th>
            <th colspan="2">Non-Dominant Hand</th>
            <th colspan="2">Both Hands</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td>Scenario</td>
            <td>Mean</td>
            <td>Std. Dev.</td>
            <td>Mean</td>
            <td>Std. Dev.</td>
            <td>Mean</td>
            <td>Std. Dev.</td>
          </tr>
          <tr>
            <td>1</td>
            <td>25695.94</td>
            <td>5001.86</td>
            <td>26697.35</td>
            <td>2907.38</td>
            <td>32369.00</td>
            <td>8352.35</td>
          </tr>
          <tr>
            <td>2</td>
            <td>59112.81</td>
            <td>14626.78</td>
            <td>64860.64</td>
            <td>16724.14</td>
            <td>101785.26</td>
            <td>10333.82</td>
          </tr>
          <tr>
            <td>3</td>
            <td>43239.00</td>
            <td>10899.84</td>
            <td>44321.84</td>
            <td>9344.41</td>
            <td>59903.76</td>
            <td>9284.96</td>
          </tr>
          <tr>
            <td>4</td>
            <td>50363.92</td>
            <td>12175.10</td>
            <td>65791.68</td>
            <td>13479.19</td>
            <td>77301.20</td>
            <td>13700.74</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>

    <p>According to the test results, under the dominant hand condition
    there is a significant difference between Scenario-1 and Scenario-2
    (Z = -4.19, p = 0.000), Scenario-1 and Scenario-3 (Z = -3.68, p =
    0.000), and Scenario-1 and Scenario-4 (Z = -4.20, p = 0.000).
    Similarly, there is a significant difference between Scenario-2 and
    Scenario-3 (Z = -3.86, p = 0.000) and between Scenario-2 and
    Scenario-4 (Z = -2.92, p = 0.003). However, the difference between
    Scenario-3 and Scenario-4 (Z = -2.16, p = 0.030) does not reach the
    Bonferroni-corrected significance level under the dominant hand
    condition (Table 6).</p>

<table-wrap id="t06" position="float">
					<label>Table 6.</label>
					<caption>
						<p>Wilcoxon signed-rank test results (dominant hand)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>

          <tr>
            <th>Scenario</th>
            <th colspan="2">2</th>
            <th colspan="2">3</th>
            <th colspan="2">4</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
          </tr>
          <tr>
            <td>1</td>
            <td>-4.19</td>
            <td>0.000</td>
            <td>-3.68</td>
            <td>0.000</td>
            <td>-4.20</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>2</td>
            <td></td>
            <td></td>
            <td>-3.86</td>
            <td>0.000</td>
            <td>-2.92</td>
            <td>0.003</td>
          </tr>
          <tr>
            <td>3</td>
            <td></td>
            <td></td>
            <td></td>
            <td></td>
            <td>-2.16</td>
            <td>0.030</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>

    <p>According to the test results, under the non-dominant hand
    condition there is a significant difference between Scenario-1 and
    Scenario-2 (Z = -4.20, p = 0.000), Scenario-1 and Scenario-3 (Z =
    -4.08, p = 0.000), and Scenario-1 and Scenario-4 (Z = -4.20, p =
    0.000). Similarly, there is a significant difference between
    Scenario-2 and Scenario-3 (Z = -3.71, p = 0.000) and between
    Scenario-3 and Scenario-4 (Z = -4.17, p = 0.000) (Table 7).
    However, the difference between Scenario-2 and Scenario-4 is not
    statistically significant (Z = -0.06, p = 0.951) under the
    non-dominant hand condition.</p>

<table-wrap id="t07" position="float">
					<label>Table 7.</label>
					<caption>
						<p>Wilcoxon signed-rank test results (non-dominant hand)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>

          <tr>
            <th>Scenario</th>
            <th colspan="2">2</th>
            <th colspan="2">3</th>
            <th colspan="2">4</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
          </tr>
          <tr>
            <td>1</td>
            <td>-4.20</td>
            <td>0.000</td>
            <td>-4.08</td>
            <td>0.000</td>
            <td>-4.20</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>2</td>
            <td></td>
            <td></td>
            <td>-3.71</td>
            <td>0.000</td>
            <td>-0.06</td>
            <td>0.951</td>
          </tr>
          <tr>
            <td>3</td>
            <td></td>
            <td></td>
            <td></td>
            <td></td>
            <td>-4.17</td>
            <td>0.000</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>

    <p>According to the test results, under the both hands condition
    there is a significant difference between Scenario-1 and Scenario-2
    (Z = -4.20, p = 0.000), Scenario-1 and Scenario-3 (Z = -4.21, p =
    0.000), and Scenario-1 and Scenario-4 (Z = -4.21, p = 0.000).
    Similarly, there is a significant difference between Scenario-2 and
    Scenario-3 (Z = -4.21, p = 0.000), between Scenario-2 and
    Scenario-4 (Z = -3.69, p = 0.000), and between Scenario-3 and
    Scenario-4 (Z = -4.14, p = 0.000) (Table 8).</p>

<table-wrap id="t08" position="float">
					<label>Table 8.</label>
					<caption>
						<p>Wilcoxon signed-rank test results (both hands)</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
        <thead>

          <tr>
            <th>Scenario</th>
            <th colspan="2">2</th>
            <th colspan="2">3</th>
            <th colspan="2">4</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td></td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
            <td>Z</td>
            <td>p</td>
          </tr>
          <tr>
            <td>1</td>
            <td>-4.20</td>
            <td>0.000</td>
            <td>-4.21</td>
            <td>0.000</td>
            <td>-4.21</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>2</td>
            <td></td>
            <td></td>
            <td>-3.21</td>
            <td>0.000</td>
            <td>-3.69</td>
            <td>0.000</td>
          </tr>
          <tr>
            <td>3</td>
            <td></td>
            <td></td>
            <td></td>
            <td></td>
            <td>-4.14</td>
            <td>0.000</td>
          </tr>
        </tbody>
      </table>
    </table-wrap>
  </sec>
</sec>

<sec id="S4">
  <title>Discussion </title>

  <p>This research describes an approach for the objective assessment
  of mental workload by analyzing the differences in fixation number
  and fixation duration under different levels of mental workload while
  surgical residents perform simulated scenarios. The eye-movement data
  were collected with an eye-tracking device and classified into
  fixation number and fixation duration events with an eye-movement
  classification algorithm (BIT). These eye-movement events were
  selected because they seem best suited to provide insight into
  changes in mental workload (<xref ref-type="bibr" rid="b16">16</xref>).
  There are many other eye-movement classification algorithms, but in
  this study the open-source BIT algorithm was used because it is
  eye-tracker independent and easy to implement and use. The aim of
  this study is to examine whether the fixation number and fixation
  duration events can, indeed, be indicators of mental workload and
  whether there are any differences among the mental workloads imposed
  by the different scenarios. According to the results, both the
  fixation number and the fixation duration show a significant increase
  as the mental workload increases. To examine the differences between
  the scenarios, four scenarios were developed in this study; two were
  based on simulated surgical models and two on general models. The
  results can be summarized as follows:</p>

  <list list-type="bullet">
    <list-item>
      <p>In the dominant hand condition, Scenario-1 has the lowest mean
      rank for the fixation number (1.47) and fixation duration (1.04)
      while Scenario-2 has the highest mean rank for the fixation number
      (3.78) and fixation duration (3.70).</p>
    </list-item>
    <list-item>
      <p>When using the non-dominant hand, Scenario-1 has the lowest
      mean rank for the fixation number (1.26) and fixation duration
      (1.04), while Scenario-2 has the highest mean rank for fixation
      number (3.70) and Scenario-4 has the highest mean rank for
      fixation duration (3.52).</p>
    </list-item>
    <list-item>
      <p>When using both hands, Scenario-1 has the lowest mean rank for
      the fixation number (1.07) and fixation duration (1.00), whereas
      Scenario-2 has the highest mean rank for fixation number (3.80)
      and fixation duration (3.96).</p>
    </list-item>
  </list>

  <p>In general, it can be concluded that in the scenarios designed
  using models that simulate the operational area (Scenarios 2 &#x26;
  4), the fixation duration and fixation number values are higher than
  in the other group of scenarios (Scenarios 1 &#x26; 3).</p>
 
  <p>In previous studies, it has been stated that fixation time shows
  a general significant increase as the mental workload increases
  (<xref ref-type="bibr" rid="b15">15</xref>). Another study stated that the pupil size
  increases in response to task difficulty (<xref ref-type="bibr" rid="b35">35</xref>). Iqbal et al. (<xref ref-type="bibr" rid="b21">21</xref>) also stated that more difficult
  tasks demand longer processing times, induce higher subjective
  ratings of mental workload, and reliably evoke greater pupillary
  response at corresponding subtasks than less difficult tasks.
  Additionally, Zheng et al. (<xref ref-type="bibr" rid="b53">53</xref>) stated that the pupil size of surgical
  residents increases as the task difficulty level elevates. It is also
  reported that the fidelity level is a crucial factor affecting mental
  workload (<xref ref-type="bibr" rid="b34">34</xref>). According to
  previous studies, fixation number and fixation duration are widely
  used eye-movement events and are generally believed to increase with
  increasing mental workload (<xref ref-type="bibr" rid="b20 b26 b28 b30 b32 b39 b40">20, 26, 28, 30, 32, 39, 40</xref>). In
  support of these studies, our results show that the scenarios based
  on simulated tasks using surgical models (a higher level of fidelity)
  increase surgical residents&#8217; mental workloads. Hence, it can be
  concluded that eye-movement events, such as fixation number and
  fixation duration, can be used to increase our knowledge of the
  mental workload of surgical trainees. Since the four scenarios were
  not performed in a randomized and balanced order among the surgical
  residents, there might be a training effect. Even with this training
  effect, the results show that the later-performed scenarios (2 and 4)
  are the ones with higher fixation events. Accordingly, this order
  effect can be considered acceptable for this study.</p>
  
  <p>Additionally, as there are very limited studies analyzing the
  eye-movement behaviors of endo-neurosurgery residents, there are no
  standards for classifying simulation content according to the level
  of surgical skills (<xref ref-type="bibr" rid="b8 b9">8, 9</xref>). Similarly, the metrics that can be used
  to evaluate the skill levels of these residents are also very
  limited, and there are no standards on these metrics, either (<xref ref-type="bibr" rid="b9">9</xref>). Hence, the results of this study encourage researchers to
  develop other standardized approaches for using objective metrics in
  surgical skill performance. Additionally, the results may guide
  instructional designers in better organizing the content of
  computer-based simulation scenarios through the eye-movement
  behaviors of the trainees. As reported in earlier studies, individual
  characteristics, situational characteristics, and training motivation
  explain incremental variance in training outcomes beyond the effects
  of cognitive ability (<xref ref-type="bibr" rid="b10">10</xref>). These
  individual differences are more influential in skill-based training
  environments such as endo-neurosurgery, which requires the
  development of both cognitive and psychomotor abilities. By using
  information collected from trainees&#8217; behaviors, such as
  eye-movement data, instructional designers can adapt the sequence and
  difficulty levels of the tasks for each trainee, providing a training
  opportunity matched to the skill and progress levels of each trainee.
  Hence, in the future, computer-based instructional software developed
  for skill-based training purposes will become more adaptive by using
  the data collected from the behaviors (such as eye movements) and
  performance of the trainees.</p>
  </sec>

<sec id="S5" sec-type="COI-statement">
  <title>Ethics and Conflict of Interest </title>

  <p>The author(s) declare(s) that the contents of the article are in
  agreement with the ethics described in
  <ext-link ext-link-type="uri" xlink:href="http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html" xlink:show="new">http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html</ext-link>
  and that there is no conflict of interest regarding the publication of
  this paper.</p>
  </sec>

<sec id="S6">
  <title>Acknowledgements </title>  
 
  <p>This study was conducted to improve the scenario designs of the
  educational materials developed for the endo-neurosurgery education
  project (ECE: TÜBİTAK 1001, Project No: 112K287). The authors would
  like to thank the TÜBİTAK 1001 program for its support in realizing
  this research. The researchers would also like to thank the ECE
  project team and the Hacettepe University Medical School for their
  valuable support throughout the research.</p>
</sec>
  </body>
<back>
<ref-list>
<ref id="b1"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ahlstrom</surname>, <given-names>U.</given-names></name>, &#x26; <name><surname>Friedman-Berg</surname>, <given-names>F. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Using eye movement activity as a correlate of cognitive workload.</article-title> <source>International Journal of Industrial Ergonomics</source>, <volume>36</volume>(<issue>7</issue>), <fpage>623</fpage>–<lpage>636</lpage>. <pub-id pub-id-type="doi">10.1016/j.ergon.2006.04.002</pub-id><issn>0169-8141</issn></mixed-citation></ref>
<ref id="b2"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Andrzejewska</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Stolińska</surname>, <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>Comparing the difficulty of tasks using eye tracking combined with subjective and behavioural criteria.</article-title> <source>Journal of Eye Movement Research</source>, <volume>9</volume>(<issue>3</issue>).<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b3"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Atkins</surname>, <given-names>M. S.</given-names></name>, <name><surname>Tien</surname>, <given-names>G.</given-names></name>, <name><surname>Khan</surname>, <given-names>R. S.</given-names></name>, <name><surname>Meneghetti</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Zheng</surname>, <given-names>B.</given-names></name></person-group> (<year>2013</year>). <article-title>What do surgeons see: Capturing and synchronizing eye gaze for surgery applications.</article-title> <source>Surgical Innovation</source>, <volume>20</volume>(<issue>3</issue>), <fpage>241</fpage>–<lpage>248</lpage>. <pub-id pub-id-type="doi">10.1177/1553350612449075</pub-id><pub-id pub-id-type="pmid">22696024</pub-id><issn>1553-3506</issn></mixed-citation></ref>
<ref id="b4"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Bałaj</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Szubielska</surname>, <given-names>M.</given-names></name></person-group> (<year>2014</year>). <article-title>Wpływ słuchania opisu katalogowego dzieła malarskiego na skaning wzrokowy obrazu15.</article-title> Studi@ Naukowe 20, 77.</mixed-citation></ref>
<ref id="b5"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Benedetto</surname>, <given-names>S.</given-names></name>, <name><surname>Pedrotti</surname>, <given-names>M.</given-names></name>, <name><surname>Minin</surname>, <given-names>L.</given-names></name>, <name><surname>Baccino</surname>, <given-names>T.</given-names></name>, <name><surname>Re</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Montanari</surname>, <given-names>R.</given-names></name></person-group> (<year>2011</year>). <article-title>Driver workload and eye blink duration.</article-title> <source>Transportation Research Part F: Traffic Psychology and Behaviour</source>, <volume>14</volume>(<issue>3</issue>), <fpage>199</fpage>–<lpage>208</lpage>. <pub-id pub-id-type="doi">10.1016/j.trf.2010.12.001</pub-id><issn>1369-8478</issn></mixed-citation></ref>
<ref id="b6"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Brookings</surname>, <given-names>J. B.</given-names></name>, <name><surname>Wilson</surname>, <given-names>G. F.</given-names></name>, &#x26; <name><surname>Swain</surname>, <given-names>C. R.</given-names></name></person-group> (<year>1996</year>). <article-title>Psychophysiological responses to changes in workload during simulated air traffic control.</article-title> <source>Biological Psychology</source>, <volume>42</volume>(<issue>3</issue>), <fpage>361</fpage>–<lpage>377</lpage>. <pub-id pub-id-type="doi">10.1016/0301-0511(95)05167-8</pub-id><pub-id pub-id-type="pmid">8652753</pub-id><issn>0301-0511</issn></mixed-citation></ref>
<ref id="b7"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Bröhl</surname>, <given-names>C.</given-names></name>, <name><surname>Theis</surname>, <given-names>S.</given-names></name>, <name><surname>Rasche</surname>, <given-names>P.</given-names></name>, <name><surname>Wille</surname>, <given-names>M.</given-names></name>, <name><surname>Mertens</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Schlick</surname>, <given-names>C. M.</given-names></name></person-group> (<year>2017</year>). <article-title>Neuroergonomic analysis of perihand space: Effects of hand proximity on eye-tracking measures and performance in a visual search task.</article-title> <source>Behaviour &#x26; Information Technology</source>, <volume>36</volume>(<issue>7</issue>), <fpage>1</fpage>–<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1080/0144929X.2016.1278561</pub-id><issn>0144-929X</issn></mixed-citation></ref>
<ref id="b8"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Cagiltay</surname>, <given-names>N. E.</given-names></name>, &#x26; <name><surname>Berker</surname>, <given-names>M.</given-names></name></person-group> (<year>2018</year>). <article-title>Technology-enhanced surgical education: attitudes and perceptions of the endoscopic surgery community in Turkey.</article-title> BMJ Simulation and Technology Enhanced Learning, bmjstel-2017-000238.</mixed-citation></ref>
<ref id="b9"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Cagiltay</surname>, <given-names>N. E.</given-names></name>, <name><surname>Ozcelik</surname>, <given-names>E.</given-names></name>, <name><surname>Sengul</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Berker</surname>, <given-names>M.</given-names></name></person-group> (<year>2017</year>). <article-title>Construct and face validity of the educational computer-based environment (ECE) assessment scenarios for basic endoneurosurgery skills.</article-title> <source>Surgical Endoscopy</source>, <volume>31</volume>(<issue>11</issue>), <fpage>4485</fpage>–<lpage>4495</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-017-5502-4</pub-id><pub-id pub-id-type="pmid">28389794</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b10"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Colquitt</surname>, <given-names>J. A.</given-names></name>, <name><surname>LePine</surname>, <given-names>J. A.</given-names></name>, &#x26; <name><surname>Noe</surname>, <given-names>R. A.</given-names></name></person-group> (<year>2000</year>). <article-title>Toward an integrative theory of training motivation: A meta-analytic path analysis of 20 years of research.</article-title> <source>The Journal of Applied Psychology</source>, <volume>85</volume>(<issue>5</issue>), <fpage>678</fpage>–<lpage>707</lpage>. <pub-id pub-id-type="doi">10.1037/0021-9010.85.5.678</pub-id><pub-id pub-id-type="pmid">11055143</pub-id><issn>0021-9010</issn></mixed-citation></ref>
<ref id="b11"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Coyne</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Sibley</surname>, <given-names>C.</given-names></name></person-group> (<year>2016</year>). <article-title>Investigating the use of two low cost eye tracking systems for detecting pupillary response to changes in mental workload.</article-title> Paper presented at the <source>Proceedings of the Human Factors and Ergonomics Society Annual Meeting</source>. <pub-id pub-id-type="doi">10.1177/1541931213601009</pub-id></mixed-citation></ref>
<ref id="b12"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Crawford</surname>, <given-names>T. J.</given-names></name>, <name><surname>Higham</surname>, <given-names>S.</given-names></name>, <name><surname>Renvoize</surname>, <given-names>T.</given-names></name>, <name><surname>Patel</surname>, <given-names>J.</given-names></name>, <name><surname>Dale</surname>, <given-names>M.</given-names></name>, <name><surname>Suriya</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Tetley</surname>, <given-names>S.</given-names></name></person-group> (<year>2005</year>). <article-title>Inhibitory control of saccadic eye movements and cognitive impairment in Alzheimer’s disease.</article-title> <source>Biological Psychiatry</source>, <volume>57</volume>(<issue>9</issue>), <fpage>1052</fpage>–<lpage>1060</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsych.2005.01.017</pub-id><pub-id pub-id-type="pmid">15860346</pub-id><issn>0006-3223</issn></mixed-citation></ref>
<ref id="b13"><mixed-citation publication-type="preprint" specific-use="parsed"><person-group person-group-type="author"><name><surname>Dalmaijer</surname>, <given-names>E.</given-names></name></person-group> (<year>2014</year>). <article-title>Is the low-cost EyeTribe eye tracker any good for research?</article-title> <source>PeerJ PrePrints</source>.</mixed-citation></ref>
<ref id="b14"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Dalveren</surname>, <given-names>G. G. M.</given-names></name>, <name><surname>Çağıltay</surname>, <given-names>N. E.</given-names></name>, <name><surname>Özçelik</surname>, <given-names>E.</given-names></name>, &#x26; <name><surname>Maraş</surname>, <given-names>H.</given-names></name></person-group> (<year>2017</year>). <article-title>Simulation-based environments for surgical practice.</article-title> Paper presented at the Control, Decision and Information Technologies (CoDIT), 2017 4th International Conference on. <pub-id pub-id-type="doi">10.1109/CoDIT.2017.8102755</pub-id></mixed-citation></ref>
<ref id="b15"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>de Greef</surname>, <given-names>T.</given-names></name>, <name><surname>Lafeber</surname>, <given-names>H.</given-names></name>, <name><surname>van Oostendorp</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Lindenberg</surname>, <given-names>J.</given-names></name></person-group> (<year>2009</year>). <article-title>Eye movement as indicators of mental workload to trigger adaptive automation.</article-title> <source>Foundations of Augmented Cognition. Neuroergonomics and Operational Neuroscience</source>, <fpage>219</fpage>–<lpage>228</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-642-02812-0_26</pub-id></mixed-citation></ref>
<ref id="b16"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>De Rivecourt</surname>, <given-names>M.</given-names></name>, <name><surname>Kuperus</surname>, <given-names>M. N.</given-names></name>, <name><surname>Post</surname>, <given-names>W. J.</given-names></name>, &#x26; <name><surname>Mulder</surname>, <given-names>L. J.</given-names></name></person-group> (<year>2008</year>). <article-title>Cardiovascular and eye activity measures as indices for momentary changes in mental effort during simulated flight.</article-title> <source>Ergonomics</source>, <volume>51</volume>(<issue>9</issue>), <fpage>1295</fpage>–<lpage>1319</lpage>. <pub-id pub-id-type="doi">10.1080/00140130802120267</pub-id><pub-id pub-id-type="pmid">18802817</pub-id><issn>0014-0139</issn></mixed-citation></ref>
<ref id="b17"><mixed-citation publication-type="web-page" specific-use="parsed"><person-group person-group-type="author"><collab>The Eye Tribe</collab></person-group>. (<year>2016</year>). Retrieved from <ext-link ext-link-type="uri" xlink:href="http://theeyetribe.com/theeyetribe.com/about/index.html">http://theeyetribe.com/theeyetribe.com/about/index.html</ext-link></mixed-citation></ref>
<ref id="b18"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Flechtner</surname>, <given-names>K.-M.</given-names></name>, <name><surname>Steinacher</surname>, <given-names>B.</given-names></name>, <name><surname>Sauer</surname>, <given-names>R.</given-names></name>, &#x26; <name><surname>Mackert</surname>, <given-names>A.</given-names></name></person-group> (<year>1997</year>). <article-title>Smooth pursuit eye movements in schizophrenia and affective disorder.</article-title> <source>Psychological Medicine</source>, <volume>27</volume>(<issue>6</issue>), <fpage>1411</fpage>–<lpage>1419</lpage>. <pub-id pub-id-type="doi">10.1017/S0033291797005709</pub-id><pub-id pub-id-type="pmid">9403912</pub-id><issn>0033-2917</issn></mixed-citation></ref>
<ref id="b19"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hankins</surname>, <given-names>T. C.</given-names></name>, &#x26; <name><surname>Wilson</surname>, <given-names>G. F.</given-names></name></person-group> (<year>1998</year>). <article-title>A comparison of heart rate, eye activity, EEG and subjective measures of pilot mental workload during flight.</article-title> <source>Aviation, Space, and Environmental Medicine</source>, <volume>69</volume>(<issue>4</issue>), <fpage>360</fpage>–<lpage>367</lpage>.<pub-id pub-id-type="pmid">9561283</pub-id><issn>0095-6562</issn></mixed-citation></ref>
<ref id="b20"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>He</surname>, <given-names>X.</given-names></name>, <name><surname>Wang</surname>, <given-names>L.</given-names></name>, <name><surname>Gao</surname>, <given-names>X.</given-names></name>, &#x26; <name><surname>Chen</surname>, <given-names>Y.</given-names></name></person-group> (<year>2012</year>). <article-title>The eye activity measurement of mental workload based on basic flight task.</article-title> Paper presented at the <source>2012 10th IEEE International Conference on Industrial Informatics (INDIN)</source>. <pub-id pub-id-type="doi">10.1109/INDIN.2012.6301203</pub-id></mixed-citation></ref>
<ref id="b21"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Iqbal</surname>, <given-names>S. T.</given-names></name>, <name><surname>Zheng</surname>, <given-names>X. S.</given-names></name>, &#x26; <name><surname>Bailey</surname>, <given-names>B. P.</given-names></name></person-group> (<year>2004</year>). <article-title>Task-evoked pupillary response to mental workload in human-computer interaction.</article-title> Paper presented at the <source>CHI’04 extended abstracts on Human factors in computing systems</source>. <pub-id pub-id-type="doi">10.1145/985921.986094</pub-id></mixed-citation></ref>
<ref id="b22"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Jarodzka</surname>, <given-names>H.</given-names></name>, <name><surname>Holmqvist</surname>, <given-names>K.</given-names></name>, &#x26; <name><surname>Gruber</surname>, <given-names>H.</given-names></name></person-group> (<year>2017</year>). <article-title>Eye tracking in Educational Science: Theoretical frameworks and research agendas.</article-title> <source>Journal of Eye Movement Research</source>, <volume>10</volume>(<issue>1</issue>).<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b23"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Just</surname>, <given-names>M. A.</given-names></name>, &#x26; <name><surname>Carpenter</surname>, <given-names>P. A.</given-names></name></person-group> (<year>1976</year>). <article-title>Eye fixations and cognitive processes.</article-title> <source>Cognitive Psychology</source>, <volume>8</volume>(<issue>4</issue>), <fpage>441</fpage>–<lpage>480</lpage>. <pub-id pub-id-type="doi">10.1016/0010-0285(76)90015-3</pub-id><issn>0010-0285</issn></mixed-citation></ref>
<ref id="b24"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Koh</surname>, <given-names>D. H.</given-names></name>, <name><surname>Munikrishne Gowda</surname>, <given-names>S. A.</given-names></name>, &#x26; <name><surname>Komogortsev</surname>, <given-names>O. V.</given-names></name></person-group> (<year>2009</year>). <article-title>Input evaluation of an eye-gaze-guided interface: kalman filter vs. velocity threshold eye movement identification.</article-title> Paper presented at the <source>Proceedings of the 1st ACM SIGCHI symposium on Engineering interactive computing systems</source>. <pub-id pub-id-type="doi">10.1145/1570433.1570470</pub-id></mixed-citation></ref>
<ref id="b25"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Law</surname>, <given-names>B.</given-names></name>, <name><surname>Atkins</surname>, <given-names>M. S.</given-names></name>, <name><surname>Kirkpatrick</surname>, <given-names>A. E.</given-names></name>, &#x26; <name><surname>Lomax</surname>, <given-names>A. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment.</article-title> Paper presented at the <source>Proceedings of the 2004 Symposium on Eye Tracking Research &#x26; Applications</source>. <pub-id pub-id-type="doi">10.1145/968363.968370</pub-id></mixed-citation></ref>
<ref id="b26"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Maltz</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Shinar</surname>, <given-names>D.</given-names></name></person-group> (<year>1999</year>). <article-title>Eye movements of younger and older drivers.</article-title> <source>Human Factors</source>, <volume>41</volume>(<issue>1</issue>), <fpage>15</fpage>–<lpage>25</lpage>. <pub-id pub-id-type="doi">10.1518/001872099779577282</pub-id><pub-id pub-id-type="pmid">10354803</pub-id><issn>0018-7208</issn></mixed-citation></ref>
<ref id="b27"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Maran</surname>, <given-names>N. J.</given-names></name>, &#x26; <name><surname>Glavin</surname>, <given-names>R. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Low- to high-fidelity simulation - a continuum of medical education?</article-title> <source>Medical Education</source>, <volume>37</volume>(<supplement>Suppl 1</supplement>), <fpage>22</fpage>–<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1046/j.1365-2923.37.s1.9.x</pub-id><pub-id pub-id-type="pmid">14641635</pub-id><issn>0308-0110</issn></mixed-citation></ref>
<ref id="b28"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Marquart</surname>, <given-names>G.</given-names></name>, <name><surname>Cabrall</surname>, <given-names>C.</given-names></name>, &#x26; <name><surname>de Winter</surname>, <given-names>J.</given-names></name></person-group> (<year>2015</year>). <article-title>Review of eye-related measures of drivers’ mental workload.</article-title> <source>Procedia Manufacturing</source>, <volume>3</volume>, <fpage>2854</fpage>–<lpage>2861</lpage>. <pub-id pub-id-type="doi">10.1016/j.promfg.2015.07.783</pub-id><issn>2351-9789</issn></mixed-citation></ref>
<ref id="b29"><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Marshall</surname>, <given-names>S. P.</given-names></name></person-group> (<year>2002</year>). <article-title>The index of cognitive activity: Measuring cognitive workload.</article-title> Paper presented at the <source>Proceedings of the 2002 IEEE 7th Conference on Human Factors and Power Plants</source>.</mixed-citation></ref>
<ref id="b30"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>May</surname>, <given-names>J. G.</given-names></name>, <name><surname>Kennedy</surname>, <given-names>R. S.</given-names></name>, <name><surname>Williams</surname>, <given-names>M. C.</given-names></name>, <name><surname>Dunlap</surname>, <given-names>W. P.</given-names></name>, &#x26; <name><surname>Brannan</surname>, <given-names>J. R.</given-names></name></person-group> (<year>1990</year>). <article-title>Eye movement indices of mental workload.</article-title> <source>Acta Psychologica</source>, <volume>75</volume>(<issue>1</issue>), <fpage>75</fpage>–<lpage>89</lpage>. <pub-id pub-id-type="doi">10.1016/0001-6918(90)90067-P</pub-id><pub-id pub-id-type="pmid">2260494</pub-id><issn>0001-6918</issn></mixed-citation></ref>
<ref id="b31"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Menekse Dalveren</surname>, <given-names>G. G.</given-names></name>, &#x26; <name><surname>Cagiltay</surname>, <given-names>N. E.</given-names></name></person-group> (<year>2018</year>). <article-title>Insights from surgeons’ eye-movement data in a virtual simulation surgical training environment: Effect of experience level and hand conditions.</article-title> <source>Behaviour &#x26; Information Technology</source>, <volume>37</volume>(<issue>5</issue>), <fpage>1</fpage>–<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1080/0144929X.2018.1460399</pub-id><issn>0144-929X</issn></mixed-citation></ref>
<ref id="b32"><mixed-citation publication-type="unknown" specific-use="parsed"><person-group person-group-type="author"><name><surname>Miura</surname>, <given-names>T.</given-names></name></person-group> (<year>1990</year>). <article-title>Active function of eye movement and useful field of view in a realistic setting.</article-title></mixed-citation></ref>
<ref id="b33"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Moray</surname>, <given-names>N.</given-names></name></person-group> (<year>1988</year>). <article-title>Mental workload since 1979.</article-title> <source>International Reviews of Ergonomics</source>, <volume>2</volume>, <fpage>123</fpage>–<lpage>150</lpage>.</mixed-citation></ref>
<ref id="b34"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Munshi</surname>, <given-names>F.</given-names></name>, <name><surname>Lababidi</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Alyousef</surname>, <given-names>S.</given-names></name></person-group> (<year>2015</year>). <article-title>Low-versus high-fidelity simulations in teaching and assessing clinical skills.</article-title> <source>Journal of Taibah University Medical Sciences</source>, <volume>10</volume>(<issue>1</issue>), <fpage>12</fpage>–<lpage>15</lpage>. <pub-id pub-id-type="doi">10.1016/j.jtumed.2015.01.008</pub-id><issn>1658-3612</issn></mixed-citation></ref>
<ref id="b35"><mixed-citation publication-type="conference" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Nakayama</surname>, <given-names>M.</given-names></name>, <name><surname>Takahashi</surname>, <given-names>K.</given-names></name>, &#x26; <name><surname>Shimizu</surname>, <given-names>Y.</given-names></name></person-group> (<year>2002</year>). <article-title>The act of task difficulty and eye-movement frequency for the ‘Oculo-motor indices’.</article-title> Paper presented at the <source>Proceedings of the 2002 Symposium on Eye Tracking Research &#x26; Applications</source>.</mixed-citation></ref>
<ref id="b36"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Nodine</surname>, <given-names>C. F.</given-names></name>, &#x26; <name><surname>Kundel</surname>, <given-names>H. L.</given-names></name></person-group> (<year>1987</year>). <article-title>Using eye movements to study visual search and to improve tumor detection.</article-title> <source>Radiographics</source>, <volume>7</volume>(<issue>6</issue>), <fpage>1241</fpage>–<lpage>1250</lpage>. <pub-id pub-id-type="doi">10.1148/radiographics.7.6.3423330</pub-id><pub-id pub-id-type="pmid">3423330</pub-id><issn>0271-5333</issn></mixed-citation></ref>
<ref id="b37"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Paas</surname>, <given-names>F.</given-names></name>, <name><surname>Tuovinen</surname>, <given-names>J. E.</given-names></name>, <name><surname>Tabbers</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Van Gerven</surname>, <given-names>P. W.</given-names></name></person-group> (<year>2003</year>). <article-title>Cognitive load measurement as a means to advance cognitive load theory.</article-title> <source>Educational Psychologist</source>, <volume>38</volume>(<issue>1</issue>), <fpage>63</fpage>–<lpage>71</lpage>. <pub-id pub-id-type="doi">10.1207/S15326985EP3801_8</pub-id><issn>0046-1520</issn></mixed-citation></ref>
<ref id="b38"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Rayner</surname>, <given-names>K.</given-names></name></person-group> (<year>1998</year>). <article-title>Eye movements in reading and information processing: 20 years of research.</article-title> <source>Psychological Bulletin</source>, <volume>124</volume>(<issue>3</issue>), <fpage>372</fpage>–<lpage>422</lpage>. <pub-id pub-id-type="doi">10.1037/0033-2909.124.3.372</pub-id><pub-id pub-id-type="pmid">9849112</pub-id><issn>0033-2909</issn></mixed-citation></ref>
<ref id="b39"><mixed-citation publication-type="unknown" specific-use="parsed"><person-group person-group-type="author"><name><surname>Rayner</surname>, <given-names>K.</given-names></name>, &#x26; <name><surname>Morris</surname>, <given-names>R. K.</given-names></name></person-group> (<year>1990</year>). <article-title>Do eye movements reflect higher order processes in reading?</article-title></mixed-citation></ref>
<ref id="b40"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Recarte</surname>, <given-names>M. A.</given-names></name>, &#x26; <name><surname>Nunes</surname>, <given-names>L. M.</given-names></name></person-group> (<year>2000</year>). <article-title>Effects of verbal and spatial-imagery tasks on eye fixations while driving.</article-title> <source>Journal of Experimental Psychology. Applied</source>, <volume>6</volume>(<issue>1</issue>), <fpage>31</fpage>–<lpage>43</lpage>. <pub-id pub-id-type="doi">10.1037/1076-898X.6.1.31</pub-id><pub-id pub-id-type="pmid">10937310</pub-id><issn>1076-898X</issn></mixed-citation></ref>
<ref id="b41"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ryu</surname>, <given-names>K.</given-names></name>, &#x26; <name><surname>Myung</surname>, <given-names>R.</given-names></name></person-group> (<year>2005</year>). <article-title>Evaluation of mental workload with a combined measure based on physiological indices during a dual task of tracking and mental arithmetic.</article-title> <source>International Journal of Industrial Ergonomics</source>, <volume>35</volume>(<issue>11</issue>), <fpage>991</fpage>–<lpage>1009</lpage>. <pub-id pub-id-type="doi">10.1016/j.ergon.2005.04.005</pub-id><issn>0169-8141</issn></mixed-citation></ref>
<ref id="b42"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Sweeney</surname>, <given-names>J. A.</given-names></name>, <name><surname>Brew</surname>, <given-names>B. J.</given-names></name>, <name><surname>Keilp</surname>, <given-names>J. G.</given-names></name>, <name><surname>Sidtis</surname>, <given-names>J. J.</given-names></name>, &#x26; <name><surname>Price</surname>, <given-names>R. W.</given-names></name></person-group> (<year>1991</year>). <article-title>Pursuit eye movement dysfunction in HIV-1 seropositive individuals.</article-title> <source>Journal of Psychiatry &#x26; Neuroscience</source>, <volume>16</volume>(<issue>5</issue>), <fpage>247</fpage>–<lpage>252</lpage>.<pub-id pub-id-type="pmid">1797099</pub-id><issn>1180-4882</issn></mixed-citation></ref>
<ref id="b43"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Sweller</surname>, <given-names>J.</given-names></name></person-group> (<year>1994</year>). <article-title>Cognitive load theory, learning difficulty, and instructional design.</article-title> <source>Learning and Instruction</source>, <volume>4</volume>(<issue>4</issue>), <fpage>295</fpage>–<lpage>312</lpage>. <pub-id pub-id-type="doi">10.1016/0959-4752(94)90003-5</pub-id><issn>0959-4752</issn></mixed-citation></ref>
<ref id="b44"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Sweller</surname>, <given-names>J.</given-names></name>, <name><surname>Van Merrienboer</surname>, <given-names>J. J.</given-names></name>, &#x26; <name><surname>Paas</surname>, <given-names>F. G.</given-names></name></person-group> (<year>1998</year>). <article-title>Cognitive architecture and instructional design.</article-title> <source>Educational Psychology Review</source>, <volume>10</volume>(<issue>3</issue>), <fpage>251</fpage>–<lpage>296</lpage>. <pub-id pub-id-type="doi">10.1023/A:1022193728205</pub-id><issn>1040-726X</issn></mixed-citation></ref>
<ref id="b45"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><collab>3D Systems</collab></person-group>. (<year>2018</year>). Retrieved from <ext-link ext-link-type="uri" xlink:href="https://www.3dsystems.com/haptics-devices/touch">https://www.3dsystems.com/haptics-devices/touch</ext-link></mixed-citation></ref>
<ref id="b46"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Tien</surname>, <given-names>G.</given-names></name>, <name><surname>Atkins</surname>, <given-names>M. S.</given-names></name>, <name><surname>Zheng</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Swindells</surname>, <given-names>C.</given-names></name></person-group> (<year>2010</year>). <article-title>Measuring situation awareness of surgeons in laparoscopic training.</article-title> Paper presented at the <source>Proceedings of the 2010 Symposium on Eye-Tracking Research &#x26; Applications</source>. <pub-id pub-id-type="doi">10.1145/1743666.1743703</pub-id></mixed-citation></ref>
<ref id="b47"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Tsai</surname>, <given-names>Y.-F.</given-names></name>, <name><surname>Viirre</surname>, <given-names>E.</given-names></name>, <name><surname>Strychacz</surname>, <given-names>C.</given-names></name>, <name><surname>Chase</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Jung</surname>, <given-names>T.-P.</given-names></name></person-group> (<year>2007</year>). <article-title>Task performance and eye activity: Predicting behavior relating to cognitive workload.</article-title> <source>Aviation, Space, and Environmental Medicine</source>, <volume>78</volume>(<issue>5</issue>, <supplement>Suppl</supplement>), <fpage>B176</fpage>–<lpage>B185</lpage>.<pub-id pub-id-type="pmid">17547318</pub-id><issn>0095-6562</issn></mixed-citation></ref>
<ref id="b48"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>van der Lans</surname>, <given-names>R.</given-names></name>, <name><surname>Wedel</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Pieters</surname>, <given-names>R.</given-names></name></person-group> (<year>2011</year>). <article-title>Defining eye-fixation sequences across individuals and tasks: The Binocular-Individual Threshold (BIT) algorithm.</article-title> <source>Behavior Research Methods</source>, <volume>43</volume>(<issue>1</issue>), <fpage>239</fpage>–<lpage>257</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-010-0031-2</pub-id><pub-id pub-id-type="pmid">21287116</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="b49"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Veltman</surname>, <given-names>J. A.</given-names></name>, &#x26; <name><surname>Gaillard</surname>, <given-names>A. W.</given-names></name></person-group> (<year>1998</year>). <article-title>Physiological workload reactions to increasing levels of task difficulty.</article-title> <source>Ergonomics</source>, <volume>41</volume>(<issue>5</issue>), <fpage>656</fpage>–<lpage>669</lpage>. <pub-id pub-id-type="doi">10.1080/001401398186829</pub-id><pub-id pub-id-type="pmid">9613226</pub-id><issn>0014-0139</issn></mixed-citation></ref>
<ref id="b50"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Wierwille</surname>, <given-names>W. W.</given-names></name>, <name><surname>Rahimi</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Casali</surname>, <given-names>J. G.</given-names></name></person-group> (<year>1985</year>). <article-title>Evaluation of 16 measures of mental workload using a simulated flight task emphasizing mediational activity.</article-title> <source>Human Factors</source>, <volume>27</volume>(<issue>5</issue>), <fpage>489</fpage>–<lpage>502</lpage>. <pub-id pub-id-type="doi">10.1177/001872088502700501</pub-id><issn>0018-7208</issn></mixed-citation></ref>
<ref id="b51"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Xie</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Salvendy</surname>, <given-names>G.</given-names></name></person-group> (<year>2000</year>). <article-title>Review and reappraisal of modelling and predicting mental workload in single-and multi-task environments.</article-title> <source>Work and Stress</source>, <volume>14</volume>(<issue>1</issue>), <fpage>74</fpage>–<lpage>99</lpage>. <pub-id pub-id-type="doi">10.1080/026783700417249</pub-id><issn>0267-8373</issn></mixed-citation></ref>
<ref id="b52"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Zheng</surname>, <given-names>B.</given-names></name>, <name><surname>Cassera</surname>, <given-names>M. A.</given-names></name>, <name><surname>Martinec</surname>, <given-names>D. V.</given-names></name>, <name><surname>Spaun</surname>, <given-names>G. O.</given-names></name>, &#x26; <name><surname>Swanström</surname>, <given-names>L. L.</given-names></name></person-group> (<year>2010</year>). <article-title>Measuring mental workload during the performance of advanced laparoscopic tasks.</article-title> <source>Surgical Endoscopy</source>, <volume>24</volume>(<issue>1</issue>), <fpage>45</fpage>–<lpage>50</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-009-0522-3</pub-id><pub-id pub-id-type="pmid">19466485</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b53"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Zheng</surname>, <given-names>B.</given-names></name>, <name><surname>Jiang</surname>, <given-names>X.</given-names></name>, &#x26; <name><surname>Atkins</surname>, <given-names>M. S.</given-names></name></person-group> (<year>2015</year>). <article-title>Detection of changes in surgical difficulty: Evidence from pupil responses.</article-title> <source>Surgical Innovation</source>, <volume>22</volume>(<issue>6</issue>), <fpage>629</fpage>–<lpage>635</lpage>. <pub-id pub-id-type="doi">10.1177/1553350615573582</pub-id><pub-id pub-id-type="pmid">25759398</pub-id><issn>1553-3506</issn></mixed-citation></ref>
</ref-list>
</back>
</article>
